meta-neural-network's Introduction

NNStreamer

Neural Network Support as GStreamer Plugins.

NNStreamer is a set of GStreamer plugins that allows GStreamer developers to adopt neural network models easily and efficiently, and allows neural network developers to manage neural network pipelines and their filters with the same ease and efficiency.
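
For illustration, a typical pipeline looks like the following gst-launch-1.0 sketch (the model file name is a placeholder): video frames are converted into tensors and pushed through a TensorFlow Lite model as if the model were an ordinary GStreamer filter.

# Minimal sketch: run a (hypothetical) TensorFlow Lite model on a camera stream.
gst-launch-1.0 v4l2src ! videoconvert ! videoscale ! \
  video/x-raw,width=224,height=224,format=RGB ! \
  tensor_converter ! \
  tensor_filter framework=tensorflow-lite model=mobilenet_v1.tflite ! \
  tensor_sink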

Architectural Description (WIP)

Toward Among-Device AI from On-Device AI with Stream Pipelines, IEEE/ACM ICSE 2022 SEIP
NNStreamer: Efficient and Agile Development of On-Device AI Systems, IEEE/ACM ICSE 2021 SEIP [media]
NNStreamer: Stream Processing Paradigm for Neural Networks ... [pdf/tech report]
GStreamer Conference 2018, NNStreamer [media] [pdf/slides]
Naver Tech Talk (Korean), 2018 [media] [pdf/slides]
Samsung Developer Conference 2019, NNStreamer [media]
ResearchGate Page of NNStreamer

Official Releases

|         | Tizen             | Ubuntu                  | Android     | Yocto     | macOS     |
| ------- | ----------------- | ----------------------- | ----------- | --------- | --------- |
|         | 5.5 M2 and later  | 16.04/18.04/20.04/22.04 | 9/P         | Kirkstone |           |
| arm     | Available         | Available               | Available   | Ready     | N/A       |
| arm64   | Available         | Available               | Available   | Available | N/A       |
| x64     | Available         | Available               | Ready       | Ready     | Available |
| x86     | Available         | N/A                     | N/A         | Ready     | N/A       |
| Publish | Tizen Repo        | PPA                     | Daily build | Layer     | Brew Tap  |
| API     | C / C# (Official) | C                       | Java        | C         | C         |
  • Ready: the CI system ensures buildability and unit testing. Users may easily build and execute. However, we do not have an automated release & deployment system for these targets.
  • Available: binary packages are released and deployed automatically and periodically along with CI tests.
  • Daily Release
  • SDK Support: Tizen Studio (5.5 M2+) / Android Studio (JCenter, "nnstreamer")
  • Enabled features of official releases

Objectives

  • Provide neural network framework connectivity (e.g., TensorFlow, Caffe) for GStreamer streams.

    • Efficient Streaming for AI Projects: Apply an efficient and flexible stream pipeline to neural networks.
    • Intelligent Media Filters!: Use a neural network model as a media filter / converter.
    • Composite Models!: Multiple neural network models in a single stream pipeline instance (see the sketch after this list).
    • Multi-Modal Intelligence!: Multiple sources and stream paths for neural network models.
  • Provide easy methods to construct media streams with neural network models using the de facto standard media stream framework, GStreamer.

    • GStreamer users: use neural network models as if they were just another media filter.
    • Neural network developers: manage media streams easily and efficiently.
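
As a rough sketch of the "Composite Models" objective above (the model file names are hypothetical), a single pipeline instance can feed one source into several models through a GStreamer tee:

# Sketch: two neural network models in one pipeline instance, fed from a single source.
gst-launch-1.0 v4l2src ! videoconvert ! videoscale ! \
  video/x-raw,width=224,height=224,format=RGB ! tensor_converter ! \
  tee name=t \
    t. ! queue ! tensor_filter framework=tensorflow-lite model=classifier.tflite ! tensor_sink \
    t. ! queue ! tensor_filter framework=tensorflow-lite model=detector.tflite ! tensor_sink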

Maintainers

Committers

Components

Note that this project has just started and many of the components are still in the design phase. The Component Description page describes NNStreamer components in the following three categories: data type definitions, GStreamer elements (plugins), and other miscellaneous components.

Getting Started

For more details, please refer to the following manuals; a quick Ubuntu installation sketch follows the list.

  • For Linux-like systems such as Tizen, Debian, and Ubuntu, see here.
  • For macOS systems, see here.
  • To build an API library for Android, see here.
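
For Ubuntu, installation is roughly the following sketch; the PPA and package names are assumptions to verify against the Linux manual linked above.

# Sketch, assuming the nnstreamer PPA and package name; see the Linux manual for authoritative steps.
sudo apt-add-repository ppa:nnstreamer/ppa
sudo apt-get update
sudo apt-get install nnstreamer
# Filter subplugins (e.g., for tensorflow-lite) are packaged separately in the same PPA.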

Applications

CI Server

AI Acceleration Hardware Support

Although a framework may provide acceleration transparently, as TensorFlow-GPU does, NNStreamer also provides various hardware acceleration subplugins; a usage sketch follows the list below.

  • Movidius-X via ncsdk2 subplugin: Released
  • Movidius-X via openVINO subplugin: Released
  • Edge-TPU via edgetpu subplugin: Released
  • ONE runtime via the nnfw subplugin (nnfw is an old name of ONE): Released
  • ARMNN via armnn subplugin: Released
  • Verisilicon-Vivante via vivante subplugin: Released
  • Qualcomm SNPE via snpe subplugin: Released
  • NVIDIA via TensorRT subplugin: Released
  • TRI-x NPUs: Released
  • NXP i.MX series: via the vendor
  • Others: TVM, TensorFlow, TensorFlow-lite, PyTorch, Caffe2, SNAP, ...
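
As a usage sketch, an accelerator can be requested through the tensor_filter accelerator property. The exact property syntax and the supported values are subplugin-dependent, so treat the value below as an assumption to verify against the tensor_filter documentation.

# Sketch: ask the tensorflow-lite subplugin to prefer GPU acceleration (syntax is subplugin-dependent).
gst-launch-1.0 videotestsrc ! videoconvert ! videoscale ! \
  video/x-raw,width=224,height=224,format=RGB ! tensor_converter ! \
  tensor_filter framework=tensorflow-lite model=model.tflite accelerator=true:gpu ! \
  tensor_sink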

Contributing

Contributions are welcome! Please see our Contributing Guide for more details.

meta-neural-network's People

Contributors

anyj0527, ibz6205, jijoongmoon, jimmyohn, kparichay, leigh-johnson, myungjoo, vivien, wooksong


meta-neural-network's Issues

Unexpected SSAT test results

In qemu (BusyBox),

callCompareTest returns PASS for files that do not exist.
When a file does not exist under BusyBox, the computed bufsize is null and both dd invocations produce the same error message, so the comparison unexpectedly reports PASS.
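
A minimal shell illustration of the reported failure mode (this is a reconstruction of the described behavior, not SSAT's actual code):

# Hypothetical reconstruction: when the file is missing, bufsize is empty and both
# dd invocations fail with the same BusyBox error message, so comparing their
# results "matches" and the test reports PASS.
BUFSIZE=$(stat -c %s missing_golden.raw 2>/dev/null)   # empty because the file does not exist
dd if=missing_golden.raw of=out_a bs=$BUFSIZE 2> err_a
dd if=missing_result.raw of=out_b bs=$BUFSIZE 2> err_b
cmp -s err_a err_b && echo "identical results -> unexpected PASS"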

Bitbaking tensorflow-lite fails with the error: 'IsOutRange' is not a member of 'flatbuffers'

Issue Description

I am trying to add tensorflow-lite to my Boot2Qt image for Jetson Nano.
My Yocto Release version is dunfell.

The bitbake tensorflow-lite command builds the meta-neural-network/recipes-tensorflow/tensorflow-lite/tensorflow-lite_2.3.bb recipe. The recipe does not assign anything to SRCREV for git://github.com/tensorflow/tensorflow;destsuffix=git/;branch=r2.3;protocol=https in the SRC_URI, so I created the following .bbappend.

unset SRC_URI
SRC_URI = " \
	git://github.com/tensorflow/tensorflow;destsuffix=git/;branch=r2.3;rev=v2.3.0;protocol=https \
	git://gitlab.com/libeigen/eigen.git;destsuffix=eigen/;branch=master;rev=cdb377d0cba4889fc909d1bbdd430b988db0db97;protocol=https \
	git://github.com/google/gemmlowp;destsuffix=gemmlowp/;branch=master;rev=2483d846ad865dd4190fe4a1a1ba2d9cfcea78e1;protocol=https \
	git://github.com/google/googletest;destsuffix=googletest/;branch=master;rev=release-1.10.0;protocol=https \
	git://github.com/abseil/abseil-cpp;destsuffix=abseil-cpp/;branch=master;rev=20190808;nobranch=1;protocol=https \
	git://github.com/intel/ARM_NEON_2_x86_SSE;destsuffix=neon_2_sse/;branch=master;rev=8dbe2461c89760ac4b204aa0eafb72413a97957d;protocol=https \
	git://github.com/google/farmhash;destsuffix=farmhash/;branch=master;rev=0d859a811870d10f53a594927d0d0b97573ad06d;protocol=https \
	git://github.com/google/flatbuffers;destsuffix=flatbuffers/;branch=master;rev=v1.11.0;protocol=https \
	https://mirror.bazel.build/www.kurims.kyoto-u.ac.jp/~ooura/fft.tgz;md5sum=${MD5SUM_FFT};sha256sum=${SHA256SUM_FFT} \
	file://apply-modification-for-tflite-1.15-to-eigen.patch \
	file://tensorflow-lite.pc.in \
"

In the .bbappend, I have set rev=v2.3.0 for the tensorflow repo. When I run bitbake tensorflow-lite, the do_compile task fails with the following error:

In file included from ./tensorflow/lite/core/api/op_resolver.h:20,
                 from ./tensorflow/lite/model_builder.h:27,
                 from ./tensorflow/lite/model.h:22,
                 from ./tensorflow/lite/c/c_api_internal.h:25,
                 from tensorflow/lite/c/c_api_experimental.cc:24:
./tensorflow/lite/schema/schema_generated.h: In function 'const char* tflite::EnumNameTensorType(tflite::TensorType)':
./tensorflow/lite/schema/schema_generated.h:422:20: error: 'IsOutRange' is not a member of 'flatbuffers'

The above error is followed by the same error on many lines in the schema_generated.h where IsOutRange is used. I have pasted a link to the log file at the bottom of the post.

While looking for a solution, I came across this issue on the tensorflow repo. In that issue, the reason behind the error appears to be that IsOutRange is not available in flatbuffers v1.11.0. They seem to have solved the issue by moving to flatbuffers v1.12.0, as can be seen in this commit, which edits the third_party_downloads.inc file.

The ${BUILDDIR}/tmp/work/aarch64-poky-linux/tensorflow-lite/2.3-r0/git/tensorflow/lite/micro/tools/make/third_party_downloads.inc file has the following contents.
FLATBUFFERS_URL is set by the statement in the else branch:
FLATBUFFERS_URL := "https://github.com/google/flatbuffers/archive/v1.12.0.tar.gz"

I cannot figure out why the do_compile task throws this error. I would appreciate any help.

Update 1:
${BUILDDIR}/build-jetson-nano-qspi-sd/tmp/work/aarch64-poky-linux/tensorflow-lite/2.3-r0/flatbuffers/src/flatc.cpp has the following line: #define FLATC_VERSION "1.11.0". So the 1.11.0 version of flatbuffers is still being downloaded?

Update 2:
Changed the rev for flatbuffers in SRC_URI to git://github.com/google/flatbuffers;destsuffix=flatbuffers/;branch=master;rev=v1.12.0;protocol=https and now the build fails with a different error.

Expected Result

The tensorflow-lite package should get built.

How to Reproduce

As described in the issue description.

Further Information

  • A link to an output result showing the issue: log file
  • Exact OS version: Host for Yocto: Ubuntu 20.04. Yocto Release: dunfell.

Dunfell Yocto support for NNStreamer

NNStreamer currently supports the Kirkstone Yocto release (a newer Yocto release). Do we have Dunfell Yocto support for NNStreamer? Kindly let us know how to obtain NNStreamer support for the Dunfell Yocto release.
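
For reference, adding this layer to a Yocto build generally looks like the sketch below; whether a Dunfell-compatible branch or commit of meta-neural-network exists is exactly the open question of this issue, so no checkout target is given.

# Sketch: add meta-neural-network to an existing Yocto build (repository URL assumed).
git clone https://github.com/nnstreamer/meta-neural-network.git
# From your build directory, with the oe-init-build-env environment sourced:
bitbake-layers add-layer ../meta-neural-network
bitbake nnstreamer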

EPIC / Backlogs will be at nnsuite/nnstreamer

For easier maintenance (we are a few people with many git repositories), development-process-related items will be tracked at github.com/nnsuite/nnstreamer.

However, questions, discussions, and bug reports will remain in this GitHub repository.

NNtrainer Yocto build patch file "0001-Patch-for-yocto-build.patch" not updated for the latest meson.build file in the nntrainer repo

Hi,

We tried to build the meta-neural-network layer for NNStreamer. While building, we observed that the patch files for the Yocto build are not up to date with the latest meson.build files (for example, the Yocto build patch file for nntrainer).

Because of this, we receive errors during the build. Please find attached a txt file listing the errors.

Can you please provide the updated files so that the patch can be applied directly to the build?
build_failed.txt

Propose Tree of meta-neural-network

To avoid any confusion, I'm proposing the following tree for meta-neural-network.
If you need to add more packages, please add them to this tree.

├── conf
│   └── layer.conf (WIP)
├── lib
│   └── oeqa
│       └── runtime
│           └── cases
│               └── nnstreamer.py (WIP)
├── recipes-nnstreamer 
│   ├── nnstreamer (WIP)
│   │   └── nnstreamer_0.1.2.bb
│   ├── nnstreamer-example
│   │   └── nnstreamer-example.bb
│   └── nnstreamer-ros
│       └── nnstreamer-ros.bb
├── recipes-devtools
│   ├── ssat (WIP)
│   │   └── ssat.bb
│   ├── protobuf
│   │   └── protobuf.bb
│   ├── ....
│   │   └── ....bb
├── recipes-tensorflow
│   ├── tensorflow
│   │   └── tensorflow.bb
│   └── tensorflow-lite
│       └── tensorflow-lite.bb
├── recipes-caffe
│   └── caffe
│       └── caffe.bb
├── recipes-caffe2
│   └── caffe
│       └── caffe2.bb
└── recipes-...
     └── ...
          └── ... .bb
