bhuman / compilednn

A JIT Compiler for Neural Network Inference

License: MIT License
Models for sequence data need 1D variants of the convolutional and pooling layers.
Acceptance criterion: WhistleNetMk16 can be run in CompiledNN.
Hey,

I'm currently trying to compile my network with your lib. As the title says, I get an ASSERT(result) failed in NeuralNetwork::Model::parseJSONModel.

The example you give says:

```cpp
Model model;
model.load("model.h5");
```

However, I could not find a load(...) function and thus used:

```cpp
Model model;
model.loadKerasHDF5("model.h5");
```

My network is saved using model.save(...); is there anything else needed?

I am using Keras version 2.1.3. Which version are you using, and have you experienced incompatible versions yet?
Currently, softmax is only implemented on flat tensors and is thus only useful as the last layer of a classification model.
In the future, the softmax operation should allow multi-dimensional input and choosing an arbitrary axis along which the operation is applied.
Is the README the only documentation? In particular, I'm looking to create a tiny_dnn-compatible 2D convolutional layer using CompiledNN in order to benchmark it against a new one I created, "tiny_jnn", but I don't want to replace the rest of our pipeline, just the one layer.
Hey,

I just tried your repo, and the command make install fails with the following error:

```
[100%] Built target CompiledNN
Install the project...
-- Install configuration: ""
-- Installing: /usr/local/lib/libCompiledNN.a
-- Installing: /usr/local/include/CompiledNN/CompiledNN.h
-- Installing: /usr/local/include/CompiledNN/Model.h
-- Installing: /usr/local/include/CompiledNN/SimpleNN.h
-- Installing: /usr/local/include/CompiledNN/Tensor.h
-- Installing: /usr/local/include/CompiledNN/CompiledNN/CompilationSettings.h
-- Installing: /usr/local/include/CompiledNN/Platform/BHAssert.h
CMake Error at cmake_install.cmake:67 (file):
  file INSTALL cannot find
  "/tmp/CompiledNN/3rdParty/B-Human/Tools/Math/BHMath.h": No such file or
  directory.
make: *** [Makefile:100: install] Error 1
```
This file indeed does not exist in my folder structure, but there is a B-Human/MathBase/ directory, and commit b6ecd81 might be related.
At least the individual operation compilers need to be automatically tested for correctness. This has already been started for the UpSampling2D operation. A further step would be testing the merging of layers.
Summary of the original issue description in the B-Human repository:
We do not know whether inference is always correct. Some bugs are known, but there are probably more that just do not occur in the models we use. We need systematic, unit-test-like tests which create all kinds of network architectures, apply them to random data and compare the result of CompiledNN to SimpleNN. The goal is 100% path coverage of all compiler classes, also considering that some layers can sometimes be fused. The tests should be a standalone executable in the CompiledNN repository. A nice addition would be the evaluation of execution times of different model configurations. E.g., once the implementations of Im2Col and Conv1x1 are done, there need to be systematic comparisons between Conv2D and Im2Col + Conv1x1.
Currently, loading ONNX files results in the following output:
/tmp/CompiledNN/Src/CompiledNN/Formats/ONNX.cpp:37: FAIL: Not yet implemented.
ONNX support would be nice, as it is less dependent on the training framework, and some code already seems to be there.
Hello,

I'm trying to compile a simple model:

```python
from keras.layers import Input, Conv2D, MaxPooling2D, UpSampling2D, Concatenate
from keras.models import Model

inputs = Input((120, 160, 1))
conv1 = Conv2D(64, 3, activation='relu', padding='same')(inputs)
pool1 = MaxPooling2D(pool_size=(2, 2))(conv1)
conv2 = Conv2D(128, 3, activation='relu', padding='same')(pool1)
up1 = UpSampling2D(size=(2, 2))(conv2)
merge = Concatenate()([up1, conv1])
conv3 = Conv2D(64, 3, activation='relu', padding='same')(merge)
outputs = Conv2D(64, (1, 1))(conv3)
model = Model(input=inputs, output=outputs)
```

but I get the following error, which I don't fully understand:

```
/home/barth/Workspace/CompiledNN/Src/CompiledNN/CompiledNN/Operations/UpSampling2D.cpp:25: ASSERT(input.dims(2) <= settings.xmmRegs() * 4) failed
Aborted (core dumped)
```
At HULKs, we are experiencing sporadic segmentation faults at CompiledNN/Src/CompiledNN/Tensor.h, line 225 (at commit f785096).

In our code, we get a segmentation fault when we are using nn.input(0)[528]. The following assertions hold true in our use case:

```cpp
assert(nn.input(0).size() == 1024);
assert(nn.input(0).dims().size() == 3);
assert(nn.input(0).dims()[0] == 32);
assert(nn.input(0).dims()[1] == 32);
assert(nn.input(0).dims()[2] == 1);
assert(nn.input(0).rank() == 3);
assert(nn.input(0).capacity() == 8195);
```

The internal Tensor::buffer has length 32795, and alignmentShift has an upper bound of 64. I therefore conclude that the buffer should be large enough for our nn.input(0)[528].
CompiledNN is used in a multithreaded program where we have separate loaded models and compiled nets per thread (which should therefore be thread-safe). It seems that some input data triggers the segmentation fault more often than others, but we don't know why.

This is because the store operations write four elements even when only two or three remain, which overwrites data that needs to be read later.
and use it for Dense layers and 1x1 convolutions
GlobalPooling2D is currently only implemented for channel counts between 3 and 64 (or 32 on 32-bit systems). The number of channels needs to be handled at runtime, not at compile time; the current implementation limits it to at most one full batch of registers (plus another non-full batch).
I'm trying to add a new target with an example usage of this library to the CMake file, but linking against the CompiledNN library yields this error. If I understand the CMake setup correctly, this is because the B-Human subdirectory is included as PRIVATE, so targets depending on CompiledNN don't transitively get those include directories. To fix this, I think one has to either include the B-Human subdirectory as PUBLIC or remove its uses from the public headers of CompiledNN.
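A minimal sketch of the first option, assuming the include path is attached to the CompiledNN target in the top-level CMakeLists.txt (the directory name here follows the install error above and may differ in the actual file):

```cmake
# Hypothetical fix: export the B-Human include directory to dependents so
# that CompiledNN's public headers can resolve their B-Human includes.
target_include_directories(CompiledNN
  PUBLIC "${CMAKE_CURRENT_SOURCE_DIR}/3rdParty/B-Human")
```

With PUBLIC, the path is used both for building CompiledNN itself and for anything that links against it; PRIVATE applies it only to CompiledNN's own compilation, which is exactly why dependents currently fail.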
Could someone help me set up the project?

I tried making a main.cpp file with the code from the README. I extended the CMakeLists.txt with:

```cmake
add_executable(main main.cpp)
target_link_libraries(main PRIVATE CompiledNN)
```

I build with:

```shell
mkdir _build && cd _build
cmake .. -G "Unix Makefiles"
cmake --build .
```

but I get the following error (translated from German):

```
CMakeFiles/main.dir/main.cpp.o: In function 'NeuralNetwork::CompiledNN::apply() const':
main.cpp:(.text._ZNK13NeuralNetwork10CompiledNN5applyEv[_ZNK13NeuralNetwork10CompiledNN5applyEv]+0x3c): warning: undefined reference to 'Assert::print(char const*, int, char const*, ...)'
main.cpp:(.text._ZNK13NeuralNetwork10CompiledNN5applyEv[_ZNK13NeuralNetwork10CompiledNN5applyEv]+0x41): warning: undefined reference to 'Assert::abort()'
collect2: error: ld returned 1 exit status
CMakeFiles/main.dir/build.make:90: recipe for target 'main' failed
make[2]: *** [main] Error 1
CMakeFiles/Makefile2:104: recipe for target 'CMakeFiles/main.dir/all' failed
make[1]: *** [CMakeFiles/main.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2
```