arm-software / ethos-n-driver-stack

Driver stack (including user space libraries, kernel module and firmware) for the Arm® Ethos™-N NPU

CMake 0.08% C++ 73.77% Makefile 0.05% Python 3.56% C 19.84% Shell 0.05% HTML 1.31% JavaScript 0.87% Batchfile 0.01% CSS 0.01% Rust 0.46%

ethos-n-driver-stack's Issues

Compilation error with scons

Hi,

When I compile the ethos-n driver with scons, I get the following error:
support_library/src/nonCascading/Strategies.cpp:737:66: error: '*((void*)(& bestParams)+4).ethosn::support_library::Strategy6::TrySetupAnyBlockConfig(ethosn::support_library::TensorConfig&, ethosn::support_library::SramAllocator&, const TensorShape&, const TensorShape&, const TensorShape&, ethosn::support_library::DataFormat, const TensorShape&, const std::vector<ethosn::command_stream::BlockConfig>&, const ethosn::support_library::HardwareCapabilities&, const ethosn::support_library::utils::ShapeMultiplier&, const ethosn::support_library::utils::ShapeMultiplier&, std::pair<bool, unsigned int>, ethosn::support_library::CompilerMceAlgorithm, uint32_t)::Strategy6Params::blockHeight' may be used uninitialized in this function [-Werror=maybe-uninitialized]
    tensorConfig.blockHeight = bestParams.value().blockHeight;
support_library/src/nonCascading/Strategies.cpp:736:65: error: '*((void*)(& bestParams)+4).ethosn::support_library::Strategy6::TrySetupAnyBlockConfig(ethosn::support_library::TensorConfig&, ethosn::support_library::SramAllocator&, const TensorShape&, const TensorShape&, const TensorShape&, ethosn::support_library::DataFormat, const TensorShape&, const std::vector<ethosn::command_stream::BlockConfig>&, const ethosn::support_library::HardwareCapabilities&, const ethosn::support_library::utils::ShapeMultiplier&, const ethosn::support_library::utils::ShapeMultiplier&, std::pair<bool, unsigned int>, ethosn::support_library::CompilerMceAlgorithm, uint32_t)::Strategy6Params::blockWidth' may be used uninitialized in this function [-Werror=maybe-uninitialized]
    tensorConfig.blockWidth = bestParams.value().blockWidth;
cc1plus: all warnings being treated as errors
scons: *** [support_library/build/release_native/src/nonCascading/Strategies.o] Error 1
scons: building terminated because of errors.
Does anyone know how to solve these errors?
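
One common way to silence this class of GCC maybe-uninitialized diagnostic is to give the flagged struct members in-class default initializers (or otherwise ensure every code path assigns them before use). The snippet below is only a minimal, self-contained sketch of that pattern; Strategy6Params here is a stand-in modelled on the name in the error, not the driver's real definition, and building without -Werror or with a different GCC version is an alternative workaround.

// Minimal illustration of avoiding -Werror=maybe-uninitialized by
// default-initializing the members GCC complains about. Strategy6Params is a
// hypothetical stand-in, not the driver's actual struct.
#include <cstdint>
#include <iostream>
#include <optional>

struct Strategy6Params
{
    uint32_t blockWidth  = 0;   // in-class defaults guarantee the fields are
    uint32_t blockHeight = 0;   // defined on every path the analysis can see
};

int main()
{
    std::optional<Strategy6Params> bestParams;
    bestParams = Strategy6Params{16, 16};   // populated on some code path

    if (bestParams.has_value())
    {
        std::cout << bestParams.value().blockWidth << " x "
                  << bestParams.value().blockHeight << "\n";
    }
    return 0;
}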

Error message while executing inference with NPU

Hi,

We compiled our ArmNN inference code in two ways: the first via a Makefile and the other via CMakeLists. We logged the execution time for both builds and compared the results using the same inference code, test image, and test model; the logs are shown in the attachment.
logs.zip

However, the execution results contain the following error message:

Error: An error occurred attempting to execute a workload: An error has occurred waiting for the inference of a pre-compiled object: Timed out while waiting for the inference to complete
Why does execution on the NPU produce the error above? Since it takes time to reset the status of the NPU, how can we resolve this issue?

Best Regards

Execution error of Convolution2d on EthosNAcc mode

Hi,
We have an int8 tflite model based on the BiSeNetv2 architecture. When we ran the model on the NPU, we got the following error:
WARNING: Layer of type Convolution2d is not supported on requested backend EthosNAcc for input data type QAsymmS8 and output data type QAsymmS8(reason: Convolution: Overall scale (of the input* weights/ output) should be in range [2.328306e-10, 1)), falling back to the next backend.
Warning: ERROR: Layer of type Convolution2d is not supported on any preferred backend[EthosNAcc]
terminate called after throwing an instance of 'armnn::InvalidArgumentException'
what(): Failed to assign a backend to each layer

However, the same model can be executed in GPUAcc mode and with the SPA tool (with NPU evaluation). Does the error above mean that the weight values of the Convolution2d layer must be in the range [2.328306e-10, 1)?
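
Note that the constraint quoted in the warning is on the combined quantization scale rather than the raw weight values: the backend checks that input_scale * weight_scale / output_scale lies inside [2.328306e-10, 1). A minimal sketch of that check, with invented scales for illustration:

// Sketch of the constraint quoted in the warning: the product of the input and
// weight quantization scales divided by the output scale must lie in
// [2.328306e-10, 1). The example scales below are made up for illustration.
#include <iostream>

bool OverallScaleSupported(float inputScale, float weightScale, float outputScale)
{
    const float overall = inputScale * weightScale / outputScale;
    return overall >= 2.328306e-10f && overall < 1.0f;
}

int main()
{
    const float inputScale  = 0.0235f;   // hypothetical values read from a tflite model
    const float weightScale = 0.0042f;
    const float outputScale = 0.0187f;

    std::cout << (OverallScaleSupported(inputScale, weightScale, outputScale)
                      ? "overall scale supported\n"
                      : "overall scale out of range [2.328306e-10, 1)\n");
    return 0;
}

So a layer can be rejected even when every weight is in a normal range, if its output scale is small relative to the product of the input and weight scales.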

Best Regards,
Rahn

[BUG] Unable to build TVM with ethos-n-driver-stack support

I want to build the Ethos-N backend for TVM in a minimal way. I build only the driver libraries, which generates four files: libEthosNDriver.a, libEthosNDriver.so, libEthosNSupport.a, and libEthosNSupport.so, and I install them into the default directory, using the most recent version (20.11) of the Ethos-N driver stack. Another problem then arises; I get the following errors:

/home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc: In member function ‘tvm::relay::contrib::ethosn::NetworkWithIDs tvm::relay::contrib::ethosn::ConstructNetworkVisitor::Construct(const tvm::relay::Function&)’:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc:202:32: error: too few arguments to function ‘std::shared_ptr<ethosn::support_library::Network> ethosn::support_library::CreateNetwork(const std::vector<char>&)’
  202 |   network_ = sl::CreateNetwork();
      |                                ^
In file included from /home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api_version.h:20,
                 from /home/dyn/tvm/src/relay/backend/contrib/ethosn/capabilities.h:34,
                 from /home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc:27:
/usr/local/include/ethosn_support_library/Support.hpp:1116:26: note: declared here
 1116 | std::shared_ptr<Network> CreateNetwork(const std::vector<char>& caps);
      |                          ^~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc: In static member function ‘static ethosn::support_library::CompilationOptions tvm::relay::contrib::ethosn::EthosnCompiler::CreateOptions()’:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc:564:64: error: no matching function for call to ‘ethosn::support_library::CompilationOptions::CompilationOptions(std::vector<char>&)’
  564 |   sl::CompilationOptions options(variants[cfg.value()->variant]);
      |                                                                ^
In file included from /home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api_version.h:20,
                 from /home/dyn/tvm/src/relay/backend/contrib/ethosn/capabilities.h:34,
                 from /home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc:27:
/usr/local/include/ethosn_support_library/Support.hpp:69:8: note: candidate: ‘constexpr ethosn::support_library::CompilationOptions::CompilationOptions()’
   69 | struct CompilationOptions
      |        ^~~~~~~~~~~~~~~~~~
/usr/local/include/ethosn_support_library/Support.hpp:69:8: note:   candidate expects 0 arguments, 1 provided
/usr/local/include/ethosn_support_library/Support.hpp:69:8: note: candidate: ‘ethosn::support_library::CompilationOptions::CompilationOptions(const ethosn::support_library::CompilationOptions&)’
/usr/local/include/ethosn_support_library/Support.hpp:69:8: note:   no known conversion for argument 1 from ‘std::vector<char>’ to ‘const ethosn::support_library::CompilationOptions&’
/usr/local/include/ethosn_support_library/Support.hpp:69:8: note: candidate: ‘ethosn::support_library::CompilationOptions::CompilationOptions(ethosn::support_library::CompilationOptions&&)’
/usr/local/include/ethosn_support_library/Support.hpp:69:8: note:   no known conversion for argument 1 from ‘std::vector<char>’ to ‘ethosn::support_library::CompilationOptions&&’
/home/dyn/tvm/src/relay/backend/contrib/ethosn/codegen.cc:579:55: error: cannot convert ‘const bool’ to ‘ethosn::support_library::CompilationOptions::DebugLevel’ in assignment
  579 |   options.m_DebugInfo.m_DumpDebugFiles = cfg.value()->dump_debug_files;
      |                                          ~~~~~~~~~~~~~^~~~~~~~~~~~~~~~
      |                                                       |
      |                                                       const bool
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:638:27: error: ‘IsDepthwiseConvolutionSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  638 |         *rv = !err && sl::IsDepthwiseConvolutionSupported(params.bias_info, params.weights_info,
      |                           ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:641:27: error: ‘IsConvolutionSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  641 |         *rv = !err && sl::IsConvolutionSupported(params.bias_info, params.weights_info,
      |                           ^~~~~~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:651:25: error: ‘IsFullyConnectedSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  651 |       *rv = !err && sl::IsFullyConnectedSupported(params.bias_info, params.weights_info,
      |                         ^~~~~~~~~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:660:25: error: ‘IsPoolingSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  660 |       *rv = !err && sl::IsPoolingSupported(params.pool_info, params.input_info);
      |                         ^~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:668:25: error: ‘IsPoolingSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  668 |       *rv = !err && sl::IsPoolingSupported(params.pool_info, params.input_info);
      |                         ^~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:676:25: error: ‘IsReshapeSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  676 |       *rv = !err && sl::IsReshapeSupported(params.new_shape, params.input_info);
      |                         ^~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:684:25: error: ‘IsAdditionSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  684 |       *rv = !err && sl::IsAdditionSupported(params.lhs_info, params.rhs_info,
      |                         ^~~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:693:25: error: ‘IsSigmoidSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  693 |       *rv = !err && sl::IsSigmoidSupported(params.input_info);
      |                         ^~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:701:25: error: ‘IsConcatenationSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  701 |       *rv = !err && sl::IsConcatenationSupported(params.input_infos, params.concat_info);
      |                         ^~~~~~~~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:709:25: error: ‘IsSplitSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  709 |       *rv = !err && sl::IsSplitSupported(params.input_info, params.split_info);
      |                         ^~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:717:25: error: ‘IsDepthToSpaceSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  717 |       *rv = !err && sl::IsDepthToSpaceSupported(params.input_info, params.depth_info);
      |                         ^~~~~~~~~~~~~~~~~~~~~~~
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc: In lambda function:
/home/dyn/tvm/src/relay/backend/contrib/ethosn/ethosn_api.cc:725:25: error: ‘IsReluSupported’ is not a member of ‘tvm::relay::contrib::ethosn::sl’
  725 |       *rv = !err && sl::IsReluSupported(params.relu_info, params.input_info);
      |                         ^~~~~~~~~~~~~~~
make[2]: *** [CMakeFiles/tvm_objs.dir/build.make:4340: CMakeFiles/tvm_objs.dir/src/relay/backend/contrib/ethosn/ethosn_api.cc.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[2]: *** [CMakeFiles/tvm_objs.dir/build.make:4327: CMakeFiles/tvm_objs.dir/src/relay/backend/contrib/ethosn/codegen.cc.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:193: CMakeFiles/tvm_objs.dir/all] Error 2
make: *** [Makefile:130: all] Error 2

When I check out the 20.08 version of the Ethos-N driver stack and build it, I get the following errors:

g++ -o support_library/build/release/src/Support.o -c -std=c++14 -O3 -Werror -Wall -Wextra -Wformat=2 -Wno-format-nonliteral -Wctor-dtor-privacy -Woverloaded-virtual -Wsign-promo -Wstrict-overflow=2 -Wswitch-default -Wlogical-op -Wnoexcept -Wstrict-null-sentinel -Wconversion -fPIC -Icommand_stream/include -Iutils/include -Isupport_library/build/release/src -Isupport_library/src -Isupport_library/build/release/include -Isupport_library/include support_library/src/Support.cpp
In file included from support_library/src/Graph.hpp:9,
                 from support_library/src/Compiler.hpp:10,
                 from support_library/src/Support.cpp:9:
/usr/include/c++/9/ext/new_allocator.h: In instantiation of 'void __gnu_cxx::new_allocator<_Tp>::construct(_Up*, _Args&& ...) [with _Up = ethosn::support_library::Operand::Consumer; _Args = {ethosn::support_library::Operation&, const long unsigned int&}; _Tp = ethosn::support_library::Operand::Consumer]':
/usr/include/c++/9/bits/alloc_traits.h:482:2:   required from 'static void std::allocator_traits<std::allocator<_CharT> >::construct(std::allocator_traits<std::allocator<_CharT> >::allocator_type&, _Up*, _Args&& ...) [with _Up = ethosn::support_library::Operand::Consumer; _Args = {ethosn::support_library::Operation&, const long unsigned int&}; _Tp = ethosn::support_library::Operand::Consumer; std::allocator_traits<std::allocator<_CharT> >::allocator_type = std::allocator<ethosn::support_library::Operand::Consumer>]'
/usr/include/c++/9/bits/vector.tcc:115:30:   required from 'void std::vector<_Tp, _Alloc>::emplace_back(_Args&& ...) [with _Args = {ethosn::support_library::Operation&, const long unsigned int&}; _Tp = ethosn::support_library::Operand::Consumer; _Alloc = std::allocator<ethosn::support_library::Operand::Consumer>]'
support_library/src/Network.hpp:46:50:   required from here
support_library/src/Network.hpp:28:19: error: but 'constexpr ethosn::support_library::Operand::Consumer::Consumer(ethosn::support_library::Operation&, size_t)' does not throw; perhaps it should be declared 'noexcept' [-Werror=noexcept]
   28 |         constexpr Consumer(Operation& operation, const size_t inputIndex)
      |                   ^~~~~~~~
/usr/include/c++/9/ext/new_allocator.h: In instantiation of 'void __gnu_cxx::new_allocator<_Tp>::construct(_Up*, _Args&& ...) [with _Up = ethosn::support_library::Network; _Args = {}; _Tp = ethosn::support_library::Network]':
/usr/include/c++/9/bits/alloc_traits.h:482:2:   required from 'static void std::allocator_traits<std::allocator<_CharT> >::construct(std::allocator_traits<std::allocator<_CharT> >::allocator_type&, _Up*, _Args&& ...) [with _Up = ethosn::support_library::Network; _Args = {}; _Tp = ethosn::support_library::Network; std::allocator_traits<std::allocator<_CharT> >::allocator_type = std::allocator<ethosn::support_library::Network>]'
/usr/include/c++/9/bits/shared_ptr_base.h:548:39:   required from 'std::_Sp_counted_ptr_inplace<_Tp, _Alloc, _Lp>::_Sp_counted_ptr_inplace(_Alloc, _Args&& ...) [with _Args = {}; _Tp = ethosn::support_library::Network; _Alloc = std::allocator<ethosn::support_library::Network>; __gnu_cxx::_Lock_policy _Lp = __gnu_cxx::_S_atomic]'
/usr/include/c++/9/bits/shared_ptr_base.h:679:16:   required from 'std::__shared_count<_Lp>::__shared_count(_Tp*&, std::_Sp_alloc_shared_tag<_Alloc>, _Args&& ...) [with _Tp = ethosn::support_library::Network; _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {}; __gnu_cxx::_Lock_policy _Lp = __gnu_cxx::_S_atomic]'
/usr/include/c++/9/bits/shared_ptr_base.h:1344:71:   required from 'std::__shared_ptr<_Tp, _Lp>::__shared_ptr(std::_Sp_alloc_shared_tag<_Tp>, _Args&& ...) [with _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {}; _Tp = ethosn::support_library::Network; __gnu_cxx::_Lock_policy _Lp = __gnu_cxx::_S_atomic]'
/usr/include/c++/9/bits/shared_ptr.h:359:59:   required from 'std::shared_ptr<_Tp>::shared_ptr(std::_Sp_alloc_shared_tag<_Tp>, _Args&& ...) [with _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {}; _Tp = ethosn::support_library::Network]'
/usr/include/c++/9/bits/shared_ptr.h:701:14:   required from 'std::shared_ptr<_Tp> std::allocate_shared(const _Alloc&, _Args&& ...) [with _Tp = ethosn::support_library::Network; _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {}]'
/usr/include/c++/9/bits/shared_ptr.h:717:39:   required from 'std::shared_ptr<_Tp> std::make_shared(_Args&& ...) [with _Tp = ethosn::support_library::Network; _Args = {}]'
support_library/src/Support.cpp:405:38:   required from here
support_library/src/Network.hpp:81:5: error: but 'ethosn::support_library::Network::Network(bool)' does not throw; perhaps it should be declared 'noexcept' [-Werror=noexcept]
   81 |     Network(bool estimatePerformance = false)
      |     ^~~~~~~
/usr/include/c++/9/ext/new_allocator.h: In instantiation of 'void __gnu_cxx::new_allocator<_Tp>::construct(_Up*, _Args&& ...) [with _Up = ethosn::support_library::Network; _Args = {bool}; _Tp = ethosn::support_library::Network]':
/usr/include/c++/9/bits/alloc_traits.h:482:2:   required from 'static void std::allocator_traits<std::allocator<_CharT> >::construct(std::allocator_traits<std::allocator<_CharT> >::allocator_type&, _Up*, _Args&& ...) [with _Up = ethosn::support_library::Network; _Args = {bool}; _Tp = ethosn::support_library::Network; std::allocator_traits<std::allocator<_CharT> >::allocator_type = std::allocator<ethosn::support_library::Network>]'
/usr/include/c++/9/bits/shared_ptr_base.h:548:39:   required from 'std::_Sp_counted_ptr_inplace<_Tp, _Alloc, _Lp>::_Sp_counted_ptr_inplace(_Alloc, _Args&& ...) [with _Args = {bool}; _Tp = ethosn::support_library::Network; _Alloc = std::allocator<ethosn::support_library::Network>; __gnu_cxx::_Lock_policy _Lp = __gnu_cxx::_S_atomic]'
/usr/include/c++/9/bits/shared_ptr_base.h:679:16:   required from 'std::__shared_count<_Lp>::__shared_count(_Tp*&, std::_Sp_alloc_shared_tag<_Alloc>, _Args&& ...) [with _Tp = ethosn::support_library::Network; _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {bool}; __gnu_cxx::_Lock_policy _Lp = __gnu_cxx::_S_atomic]'
/usr/include/c++/9/bits/shared_ptr_base.h:1344:71:   required from 'std::__shared_ptr<_Tp, _Lp>::__shared_ptr(std::_Sp_alloc_shared_tag<_Tp>, _Args&& ...) [with _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {bool}; _Tp = ethosn::support_library::Network; __gnu_cxx::_Lock_policy _Lp = __gnu_cxx::_S_atomic]'
/usr/include/c++/9/bits/shared_ptr.h:359:59:   required from 'std::shared_ptr<_Tp>::shared_ptr(std::_Sp_alloc_shared_tag<_Tp>, _Args&& ...) [with _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {bool}; _Tp = ethosn::support_library::Network]'
/usr/include/c++/9/bits/shared_ptr.h:701:14:   required from 'std::shared_ptr<_Tp> std::allocate_shared(const _Alloc&, _Args&& ...) [with _Tp = ethosn::support_library::Network; _Alloc = std::allocator<ethosn::support_library::Network>; _Args = {bool}]'
/usr/include/c++/9/bits/shared_ptr.h:717:39:   required from 'std::shared_ptr<_Tp> std::make_shared(_Args&& ...) [with _Tp = ethosn::support_library::Network; _Args = {bool}]'
support_library/src/Support.cpp:410:42:   required from here
support_library/src/Network.hpp:81:5: error: but 'ethosn::support_library::Network::Network(bool)' does not throw; perhaps it should be declared 'noexcept' [-Werror=noexcept]
cc1plus: all warnings being treated as errors
scons: *** [support_library/build/release/src/Support.o] Error 1
scons: building terminated because of errors.

How can I fix this problem?
This is a continuation of the discussion in apache/tvm#7261.
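
The compiler notes point at an API difference: in the 20.11 Support Library, CreateNetwork() takes the capabilities vector and CompilationOptions is default-constructed, so a TVM integration written against an older Support Library API will not compile against the 20.11 headers. A minimal sketch of the 20.11-style calls implied by those notes (how the capabilities blob is obtained is an assumption; LoadCapabilities below is a placeholder, not a real Support Library function):

// Sketch of the 20.11 Support Library signatures referenced in the errors above.
#include <memory>
#include <vector>
#include <ethosn_support_library/Support.hpp>

namespace sl = ethosn::support_library;

std::vector<char> LoadCapabilities();   // placeholder: supply the target variant's capabilities blob

std::shared_ptr<sl::Network> MakeNetwork()
{
    const std::vector<char> caps = LoadCapabilities();

    // 20.11: the capabilities vector is passed to CreateNetwork (see Support.hpp:1116 above).
    std::shared_ptr<sl::Network> network = sl::CreateNetwork(caps);

    // 20.11: CompilationOptions no longer takes the capabilities vector.
    sl::CompilationOptions options;
    (void)options;

    return network;
}

The failing Is*Supported calls in ethosn_api.cc suggest the same mismatch: the free functions the TVM code expects at namespace scope are not present in the 20.11 headers. Practically, that means either building against a driver-stack release the TVM Ethos-N integration was written for, or updating the integration to the newer API.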

Ethos-N driver + simulated Ethos-N NPU?

Hi all,

I want to add some security protections to the Arm NPU. This requires modifying the NPU driver so that it cooperates with access-control logic in TF-A. Currently, I want to implement this on the Arm Ethos-N NPU driver, but I do not have a real Ethos-N NPU.

Thus, if I use an NPU simulator, such as SCALE-Sim (https://github.com/ARM-software/SCALE-Sim), to simulate the NPU, will the driver work with the simulator?

Orange Pi 5B: no device file created after loading the kernel module

Using Ubuntu 22.04 (5.10.160-rockchip kernel) with the SMMU config options as per your tutorial.

The firmware file was copied to the right folder, and the ethosn.ko module was loaded with sudo insmod ethosn.ko; this appears to work:

dmesg | tail shows:
[ 8232.407774] Registering ethosn-main_allocator
[ 8232.410038] Registering ethosn-memory
[ 8232.411936] Registering ethosn-core

and lsmod shows:
ethosn 110592 0

and yet there is no device file:
$ ls /dev/ethosn*
ls: cannot access '/dev/ethosn*': No such file or directory

and so unit tests don't work etc.

What could be wrong?

How can I get a device with an Ethos NPU?

Hello, I am interested in running several networks on an Arm Ethos NPU and checking its power usage on an Android device.
However, I could not find a mobile device that has an Ethos NPU in it.
Is there any way I can get a device with Ethos NPU?

Build fails on Centos7/manylinux2014 and Ubuntu:latest

The build on CentOS 7/manylinux2014 fails at the build stage:

g++ -o support_library/build/release/src/Support.o -c -std=c++14 -O3 -Werror -Wall -Wextra -Wformat=2 -Wno-format-nonliteral -Wctor-dtor-privacy -Woverloaded-virtual -Wsign-promo -Wstrict-overflow=2 -Wswitch-default -Wlogical-op -Wnoexcept -Wstrict-null-sentinel -Wconversion -fPIC -Icommand_stream/include -Iutils/include -Isupport_library/build/release/src -Isupport_library/src -Isupport_library/build/release/include -Isupport_library/include support_library/src/Support.cpp

With warnings that include:

In file included from support_library/src/Graph.hpp:9,
                 from support_library/src/Compiler.hpp:10,
                 from support_library/src/Support.cpp:9:
/opt/rh/devtoolset-9/root/usr/include/c++/9/ext/new_allocator.h: In instantiation of 'void __gnu_cxx::new_allocator<_Tp>::construct(_Up*, _Args&& ...) [with _Up = ethosn::support_library::Operand::Consumer; _Args = {ethosn::support_library::Operation&, const long unsigned int&}; _Tp = ethosn::support_library::Operand::Consumer]':
/opt/rh/devtoolset-9/root/usr/include/c++/9/bits/alloc_traits.h:482:2:   required from 'static void std::allocator_traits<std::allocator<_CharT> >::construct(std::allocator_traits<std::allocator<_CharT> >::allocator_type&, _Up*, _Args&& ...) [with _Up = ethosn::support_library::Operand::Consumer; _Args = {ethosn::support_library::Operation&, const long unsigned int&}; _Tp = ethosn::support_library::Operand::Consumer; std::allocator_traits<std::allocator<_CharT> >::allocator_type = std::allocator<ethosn::support_library::Operand::Consumer>]'
/opt/rh/devtoolset-9/root/usr/include/c++/9/bits/vector.tcc:115:30:   required from 'void std::vector<_Tp, _Alloc>::emplace_back(_Args&& ...) [with _Args = {ethosn::support_library::Operation&, const long unsigned int&}; _Tp = ethosn::support_library::Operand::Consumer; _Alloc = std::allocator<ethosn::support_library::Operand::Consumer>]'
support_library/src/Network.hpp:46:50:   required from here
support_library/src/Network.hpp:28:19: error: but 'constexpr ethosn::support_library::Operand::Consumer::Consumer(ethosn::support_library::Operation&, size_t)' does not throw; perhaps it should be declared 'noexcept' [-Werror=noexcept]
   28 |         constexpr Consumer(Operation& operation, const size_t inputIndex)
      |                   ^~~~~~~~

Resulting in

cc1plus: all warnings being treated as errors
scons: *** [support_library/build/release/src/Support.o] Error 1
scons: building terminated because of errors.

Similar warnings appear regardless of the gcc toolchain version installed (7 or 8).
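
The diagnostic itself suggests the fix: with -Wnoexcept (promoted to an error by -Werror), GCC 9 complains that the constructors it names cannot throw but are not declared noexcept. Below is a self-contained sketch of the pattern and of the compiler-suggested change; Consumer here is a stand-in, not the driver's code, and building without -Werror is the other obvious workaround.

// Illustration of the -Werror=noexcept pattern from the log: a constructor that
// cannot throw but is not declared noexcept, used through std::vector machinery.
// Adding 'noexcept', as the compiler suggests, is the minimal change.
#include <cstddef>
#include <vector>

struct Operation {};

struct Consumer
{
    constexpr Consumer(Operation& operation, const std::size_t inputIndex) noexcept
        : m_Operation(operation)
        , m_InputIndex(inputIndex)
    {}

    Operation& m_Operation;
    std::size_t m_InputIndex;
};

int main()
{
    Operation op;
    std::vector<Consumer> consumers;
    consumers.emplace_back(op, std::size_t{0});   // the instantiation path seen in the log
    return 0;
}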

Support of elementwise multiplication

Hi,
Arm NN Ethos-N backend 21.08 supports multiplication of a variable by a scalar constant. Does this mean that Ethos-N backend 21.08 currently does not support elementwise multiplication of two tensors?

Best regard,
Rahn

SPA test generates warning "Layer of type is not supported"

Hi,
We have a license to use the SPA program, and we analyze modified BiSeNetv2 models with it to evaluate the NPU. While analyzing the modified BiSeNetv2 int8 and uint8 models, we got the following warnings, respectively:

(uint8): Layer of type Resize is not supported on requested backend EthosNAcc for input data type QAsymmU8 and output data type QAsymmU8

(int8): Layer of type DepthwiseConvolution2d is not supported on requested backend EthosNAcc for input data type QAsymmS8 and output data type QAsymmS8

We want to ask:

  1. Do the warning messages above affect the results when using a physical NPU?
  2. The warning messages say that the layer falls back to the next backend; will the unsupported layer be executed on the GPU or the CPU?

Best Regards,
Rahn

An exception occurred using the cache network function

I first changed the optimization options to save the compiled network, and then changed the optimization options again to load the network that had been serialized to the file in the previous step. However, an exception was thrown, which seems to be caused by a format check failing during deserialization.

The execution log is as follows:

./object_detection_example --model-file-path ./22W27-yolov5s-conv3-conv2-relu_640-int8.tflite --label-path ./label.txt --preferred-backends EthosNAcc --img-path ./test.jpg --num-runs 1

Info: ObjectDetection example is running...
Info: ObjectDetection example profiling is running...
Info: ArmNN v29.0.0
Info: Initialization time: 0.41 ms.
Info: Network parsing time: 303.61 ms.
Debug: OptimizerOptions:
ReduceFp32ToFp16: 0
ReduceFp32ToBf16: 0
Debug: 0
ShapeInferenceMethod: ValidateOnly
ImportEnabled: 0
ProfilingEnabled: 0
ModelOptions:
Backend: EthosNAcc
Option: SaveCachedNetwork
Value: false
Backend: EthosNAcc
Option: CachedNetworkFilePath
Value: /mnt/run/EthosN-CachingEndToEnd-TempFile.bin

Info: Optimization time: 304.33 ms.
Error: An error occurred when preparing the network workloads: Not a serialized CompiledNetwork
Error: An error occurred when preparing the network workloads: Not a serialized CompiledNetwork
terminate called after throwing an instance of 'armnn::Exception'
what(): An error occurred when preparing the network workloads: Not a serialized CompiledNetwork
Aborted
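
For reference, the option names in the log map onto Arm NN's per-backend ModelOptions, and the caching flow is two passes over the same model: one optimize run with SaveCachedNetwork enabled to write the file, then a second run with it disabled to load the file. A rough sketch under those assumptions (the path is taken from the log; the surrounding setup is illustrative, and the save pass must complete and write a valid file before the load pass):

// Sketch of setting the EthosNAcc caching options seen in the log via Arm NN's
// BackendOptions. saveCachedNetwork == true writes the cache file during
// optimization; false loads it on the next run.
#include <armnn/ArmNN.hpp>

armnn::IOptimizedNetworkPtr OptimizeWithCache(const armnn::INetwork& network,
                                              armnn::IRuntime& runtime,
                                              bool saveCachedNetwork)
{
    armnn::OptimizerOptions options;

    armnn::BackendOptions ethosnOptions("EthosNAcc",
    {
        { "SaveCachedNetwork",     saveCachedNetwork },
        { "CachedNetworkFilePath", std::string("/mnt/run/EthosN-CachingEndToEnd-TempFile.bin") }
    });
    options.m_ModelOptions.push_back(ethosnOptions);

    return armnn::Optimize(network,
                           { armnn::BackendId("EthosNAcc") },
                           runtime.GetDeviceSpec(),
                           options);
}

If the load pass still throws "Not a serialized CompiledNetwork", the file being read is not a cached network produced by a matching backend and driver version, so it is worth checking that the save pass actually ran and wrote the file at that path.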

Fused pad and convolution is not supported on the NPU

pad: pad_left: 1, pad_right: 1, pad_top: 1, pad_bottom: 1
convolution: dilation: 1, padding: VALID, stride: 2, kernel_size: 3x3
In actual testing, the convolution above is supported by the NPU on its own, but the NPU does not support the fusion of the two operators. The problem is that when the two operators are fused, the padding mode of the convolution becomes SAME, which the NPU documentation says is supported.

The NPU documentation is described as follows:

Convolution 2D

  • HWIO format weights are supported.
  • The supported kernel heights and widths (the kernel does not have to be square) are: { 1, 2, 3, 5, 7, 9 }.
  • The supported strides (the height and width stride have to match) are: { 1, 2 }.
  • For kernels with height or width > 7, only a stride of 1 is supported.
  • SAME and VALID padding are supported.
  • I*W/O must be between 2.328306e-10 and 1, where:
    - I is the input quantization scale.
    - W is the weight quantization scale.
    - O is the output quantization scale.
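
As a quick check of the output-size arithmetic for the parameters above: explicit padding of 1 on every side followed by a VALID 3x3 stride-2 convolution gives the same spatial output size as a SAME-padded 3x3 stride-2 convolution, which is the sense in which the fused operator falls under the SAME case in the documentation. (Whether the border values also match exactly depends on how the framework splits SAME padding, which is an assumption here.)

// Output-size arithmetic only: pad-by-1 + VALID 3x3/stride-2 versus SAME 3x3/stride-2.
#include <cmath>
#include <iostream>

int ValidOutputSize(int in, int pad, int kernel, int stride)
{
    return (in + 2 * pad - kernel) / stride + 1;   // floor division
}

int SameOutputSize(int in, int stride)
{
    return static_cast<int>(std::ceil(static_cast<double>(in) / stride));
}

int main()
{
    for (int in : { 224, 225, 640 })
    {
        std::cout << "input " << in
                  << ": pad1+VALID -> " << ValidOutputSize(in, 1, 3, 2)
                  << ", SAME -> "       << SameOutputSize(in, 2) << "\n";
    }
    return 0;
}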

Kernel support

Hello!

The README says that the ethos-n driver has been tested with kernels v5.4 and v5.10.

Is there a roadmap for newer kernels to be supported/tested?

Thanks, Chris

Best practice for creating a quantized tflite model

Can you share a recommended way of producing a quantized tflite model for the SPA tool to consume? I have followed the instructions from TensorFlow to quantize a simple mnist model, but the generated tflite model cannot be processed by the SPA tool (AnalyzeNetwork command):
(screenshot of the AnalyzeNetwork error attached)

The example code can be found here. I used TensorFlow 2.3.4 to run the code, which matches the version I compiled the SPA with: armnn 21.05 and tensorflow r2.3. I have tried both tf.lite.TFLiteConverter.from_keras_model and tf.lite.TFLiteConverter.from_saved_model; neither of the quantized models can be consumed by the SPA tool. Can you advise?

Conv2D could not be estimated

Hi,

We have a license for the SPA tool (compiled specifically with armnn 21.05 and tensorflow r2.3). We tried to evaluate a quantized network. The AnalyzeNetwork command executed without exceptions, but the perfreport it generated listed 3 unsupported operators:
(screenshot of the perfreport attached)
However, the Conv2D it complains about is a normal convolution layer just like the rest of the network's Conv2D layers, as can be seen in the attachment. I have attached the model here.

Can you guys let me know what I am missing?

Inference result of int8 model on NPU

Hi,
We've tested an int8 model with ArmNN 21.08 and the Compute Library 21.08 patch, which solves the issue of wrong results in int8 models.

To further evaluate the results on the NPU, we want to ask: if we run the same tflite model on the NPU, will the inferred results contain noise, or will they be identical to those from the TFLite API?

Best regards,
Rahn
