
xilinx / bnn-pynq

Stars: 645 · Watchers: 53 · Forks: 306 · Repository size: 186.11 MB

Quantized Neural Networks (QNNs) on PYNQ

Home Page: https://xilinx.github.io/finn/

License: BSD 3-Clause "New" or "Revised" License

Python 1.86% Tcl 6.08% Jupyter Notebook 89.96% C++ 1.69% C 0.18% Shell 0.17% Dockerfile 0.07%

bnn-pynq's Introduction

BNN-PYNQ PIP INSTALL Package

This repo contains the pip-install package for Quantized Neural Networks (QNNs) on PYNQ. Two network topologies are included, CNV and LFC, as described in the FINN paper. Multiple implementations are available, supporting different precisions for weights and activations:

  • 1 bit weights and 1 bit activation (W1A1) for CNV and LFC
  • 1 bit weights and 2 bit activation (W1A2) for CNV and LFC
  • 2 bit weights and 2 bit activation (W2A2) for CNV

Three boards are supported for hardware acceleration: Pynq-Z1, Pynq-Z2 and Ultra96 (with the PYNQ image).

Note: this repository has been archived and is no longer actively maintained. If you rely on it, we strongly recommend switching to the FINN compiler.

Citation

If you find BNN-PYNQ useful, please cite the FINN paper:

@inproceedings{finn,
author = {Umuroglu, Yaman and Fraser, Nicholas J. and Gambardella, Giulio and Blott, Michaela and Leong, Philip and Jahre, Magnus and Vissers, Kees},
title = {FINN: A Framework for Fast, Scalable Binarized Neural Network Inference},
booktitle = {Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays},
series = {FPGA '17},
year = {2017},
pages = {65--74},
publisher = {ACM}
}

Quick Start

Please refer to the PYNQ Getting Started guide to set up your PYNQ board.

To install the package on your PYNQ board, connect to the board, open a terminal and type one of the following:

sudo pip3 install git+https://github.com/Xilinx/BNN-PYNQ.git (on PYNQ v2.3 and later versions, tested up to v2.5)
sudo pip3.6 install git+https://github.com/Xilinx/BNN-PYNQ.git (on PYNQ v2.2 and earlier)

This will install the BNN package to your board, and create a bnn directory in the Jupyter home area. You will find the Jupyter notebooks to test the networks in this directory.
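Once installed, the networks can be exercised from the notebooks or from a Python prompt. The snippet below mirrors the CIFAR-10 transcript quoted in one of the issues further down this page; the image path is illustrative and the exact constructor arguments can differ between package versions, so treat this as a sketch rather than a reference.

import bnn
from PIL import Image

classifier = bnn.CnvClassifier('cifar10')   # load the CNV overlay with the CIFAR-10 weights (hardware runtime)
im = Image.open('./img/cifar10/car.jpg')    # any small RGB image; path is illustrative
class_out = classifier.classify_image(im)   # returns the index of the winning class
print(class_out)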

Repo organization

The repo is organized as follows:

  • bnn: contains the LfcClassifier and CnvClassifier Python class descriptions
    • src: contains the sources of the different precision networks, the libraries to rebuild them, and scripts to train and pack the weights:
      • library: FINN library with the HLS QNN descriptions, host code, rebuild scripts and drivers for the PYNQ and Ultra96 (please refer to its README for more details)
      • network: HLS top functions for QNN topologies (CNV and LFC) with different implementations for weight and activation precision, host code and make script for HW and SW build (please refer to README for more details)
      • training: scripts to train on the Cifar10, GTSRB and MNIST datasets, and scripts to pack the weights into a binary format readable by the overlay (a worked example of this flow follows the list)
    • bitstreams: contains the bitstreams for the 5 overlays
      • pynqZ1-Z2: bitstreams for Pynq devices
      • ultra96: bitstreams for Ultra96 devices
    • libraries: pre-compiled shared objects with the low-level drivers for the 5 overlays, one each for the hardware and software runtimes
      • pynqZ1-Z2: shared objects used by Pynq devices
      • ultra96: shared objects used by ultra96
    • params: set of trained parameters for the 5 overlays:
      • MNIST and NIST datasets for the LFC network. Note that the NIST dataset is only applicable to LFC-W1A1 by default.
      • Cifar10, SVHN and German Road Signs datasets for the CNV network. Note that the SVHN and German Road Signs datasets are only applicable to CNV-W1A1 by default.
  • notebooks: a set of example Python notebooks, which are copied to the /home/xilinx/jupyter_notebooks/bnn/ folder during installation
  • tests: contains test script and test images
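For orientation, the training-and-packing flow for the CIFAR-10 case is sketched below, based on the commands quoted in the issues later on this page. The script names exist in bnn/src/training, but the exact arguments and output locations may differ between versions, so treat this as a sketch.

cd <clone_path>/BNN-PYNQ/bnn/src/training
python cifar10.py                       # train the CNV network; writes a *_parameters.npz file
python cifar10-gen-binary-weights.py    # pack the trained weights into .bin files readable by the overlay
# copy the resulting binparam-* folder onto the board, next to the existing params for that dataset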

Hardware design rebuild

To rebuild the hardware designs, clone the repo on a machine with the Vivado Design Suite installed (tested with 2018.2) and follow these step-by-step instructions:

  1. Clone the repository on your Linux machine: git clone https://github.com/Xilinx/BNN-PYNQ.git --recursive;
  2. Move to <clone_path>/BNN-PYNQ/bnn/src/network/
  3. Set the XILINX_BNN_ROOT environment variable to <clone_path>/BNN-PYNQ/bnn/src/
  4. Launch the shell script make-hw.sh, passing the target network, target platform and mode as parameters: ./make-hw.sh {network} {platform} {mode} (a worked example follows this list), where:
    • network can be cnvW1A1, cnvW1A2, cnvW2A2, lfcW1A1 or lfcW1A2;
    • platform can be pynqZ1-Z2 or ultra96;
    • mode can be h to launch Vivado HLS synthesis, b to launch the Vivado bitstream build (requires the HLS synthesis results), or a to launch both;
  5. The results will be visible in <clone_path>/BNN-PYNQ/bnn/src/network/output/, which is organized as follows:
    • bitstream: contains the generated bitstream(s);
    • hls-syn: contains the Vivado HLS generated RTL and IP (in the subfolder named as the target network and target platform);
    • report: contains the Vivado and Vivado HLS reports;
    • vivado: contains the Vivado project;
  6. Copy the generated bitstream, hwh and tcl script to pip_installation_path/bnn/bitstreams/ on the PYNQ board.
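As a concrete example, a complete rebuild of the cnvW1A1 overlay for the Pynq boards could look like the following; paths are illustrative and the final copy step depends on how the package was installed on the board.

git clone https://github.com/Xilinx/BNN-PYNQ.git --recursive
export XILINX_BNN_ROOT=<clone_path>/BNN-PYNQ/bnn/src/
cd <clone_path>/BNN-PYNQ/bnn/src/network/
./make-hw.sh cnvW1A1 pynqZ1-Z2 a        # HLS synthesis followed by the Vivado bitstream build
# the bitstream, hwh and tcl end up under output/bitstream/; copy them to
# <pip_installation_path>/bnn/bitstreams/pynqZ1-Z2/ on the PYNQ board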


bnn-pynq's Issues

Why should we convert image file to bin file in the dataset

Hi! I just want to ask a simple question that has confused me for a couple of days. Why do we convert the image files to bin files? How can I open a bin file? If I don't know exactly what the input data in the bin file is, it is hard for me to understand the whole data flow in the BNN architecture. Looking forward to your reply! Thank you!

After training my own datasets and generating the binary weights

After training my own datasets and generating the binary weights, I get binparam-lfc-pynq.
How can I use these parameters on my PYNQ? My data was trained with the LFC topology. Do I need to rename the folder
to lfcW1A1 and replace the existing one in bnn\params\mnist? If not, what should I do?

Running BNN-PYNQ on a PYNQ-Z2 board

Hi,
I tried to run BNN-PYNQ on a PYNQ-Z2 board, but there is an error:

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-7068xha9-build/

Questions about hardware design rebuild

Hi!
I am trying to rebuild the hardware design, but I have run into some problems.
My system is CentOS 7 while the recommended system is Ubuntu, so I am not sure whether the OS is the issue. My Vivado version is 2017.4 WebPack.
When I run make-hw.sh it says it cannot find tiny_cnn, so I downloaded tiny_cnn from https://github.com/Xilinx/xilinx-tiny-cnn and copied the tiny_cnn folder to XILINX_BNN_ROOT/library/host.
I am not sure whether my tiny_cnn version is right or whether it works correctly; the original problem disappears when I do this, but then a new problem appears (see the attached screenshot).
I am not very familiar with Vivado HLS, please give me some advice.
Thanks a lot!

set up question

Hello,
I am now trying to set up this BNN project on my PYNQ-Z1, but I have met some issues. Since I am not allowed to connect my PYNQ to the network router at my university, I connect it directly to my computer, so I could not use "sudo pip3.6 install git+https://github.com/Xilinx/BNN-PYNQ.git" to install this package on my board. Instead, I downloaded the whole BNN-PYNQ archive, extracted it and copied it to \\192.168.2.99\xilinx. I added the Vivado HLS include folder under \\192.168.2.99\xilinx\BNN-PYNQ-master\bnn, then added one line in setup.py: os.environ['VIVADOHLS_INCLUDE_PATH'] = site.getsitepackages()[0] + "/bnn/"
Then I used the terminal to run the setup. What I typed in the terminal on the PYNQ was: sudo python3.6 setup.py install. It did show "running install" at first, but at the end it shows a SyntaxError. Could you help me figure out the problem? The details are below.
Thank you!

byte-compiling build/bdist.linux-armv7l/egg/bnn/src/training/characters-gen-binary-weights.py to characters-gen-binary-weights.cpython-36.pyc
byte-compiling build/bdist.linux-armv7l/egg/bnn/src/training/lfc.py to lfc.cpython-36.pyc
byte-compiling build/bdist.linux-armv7l/egg/bnn/src/training/augmentors.py to augmentors.cpython-36.pyc
byte-compiling build/bdist.linux-armv7l/egg/bnn/src/training/gtsrb.py to gtsrb.cpython-36.pyc
byte-compiling build/bdist.linux-armv7l/egg/bnn/src/training/finnthesizer.py to finnthesizer.cpython-36.pyc
File "build/bdist.linux-armv7l/egg/bnn/src/training/finnthesizer.py", line 60
print "Layer %d: %d x %d, SIMD = %d, PE = %d" % (l, paddedH, paddedW, simdCount, peCount)
^
SyntaxError: invalid syntax

byte-compiling build/bdist.linux-armv7l/egg/bnn/__init__.py to __init__.cpython-36.pyc
installing package data to build/bdist.linux-armv7l/egg
running install_data
creating build/bdist.linux-armv7l/egg/EGG-INFO
copying bnn_pynq.egg-info/PKG-INFO -> build/bdist.linux-armv7l/egg/EGG-INFO
copying bnn_pynq.egg-info/SOURCES.txt -> build/bdist.linux-armv7l/egg/EGG-INFO
copying bnn_pynq.egg-info/dependency_links.txt -> build/bdist.linux-armv7l/egg/EGG-INFO
copying bnn_pynq.egg-info/top_level.txt -> build/bdist.linux-armv7l/egg/EGG-INFO
writing build/bdist.linux-armv7l/egg/EGG-INFO/native_libs.txt
zip_safe flag not set; analyzing archive contents...
bnn.__pycache__.bnn.cpython-36: module references __file__
creating 'dist/bnn_pynq-0.1-py3.6.egg' and adding 'build/bdist.linux-armv7l/egg' to it
removing 'build/bdist.linux-armv7l/egg' (and everything under it)
Processing bnn_pynq-0.1-py3.6.egg
removing '/opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg' (and everything under it)
creating /opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg
Extracting bnn_pynq-0.1-py3.6.egg to /opt/python3.6/lib/python3.6/site-packages
File "/opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg/bnn/src/training/gtsrb-gen-binary-weights.py", line 105
print "Using peCount = %d simdCount = %d for engine %d" % (peCount, simdCount, convl)
^
SyntaxError: invalid syntax

File "/opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg/bnn/src/training/cifar10-gen-binary-weights.py", line 60
print "Using peCount = %d simdCount = %d for engine %d" % (peCount, simdCount, convl)
^
SyntaxError: invalid syntax

File "/opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg/bnn/src/training/binary_net.py", line 58
def c_code(self, node, name, (x,), (z,), sub):
^
SyntaxError: invalid syntax

File "/opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg/bnn/src/training/finnthesizer.py", line 60
print "Layer %d: %d x %d, SIMD = %d, PE = %d" % (l, paddedH, paddedW, simdCount, peCount)
^
SyntaxError: invalid syntax

bnn-pynq 0.1 is already the active version in easy-install.pth

Installed /opt/python3.6/lib/python3.6/site-packages/bnn_pynq-0.1-py3.6.egg
Processing dependencies for bnn-pynq==0.1
Finished processing dependencies for bnn-pynq==0.1
bash: /opt/python3.6/lib/python3.6/site-packages/bnn/src/network//make-sw.sh: No such file or directory
Traceback (most recent call last):
File "setup.py", line 76, in
run_make(XILINX_BNN_ROOT+"/bnn/src/network/", "cnv-pynq" ,"python_sw")
File "setup.py", line 65, in run_make
status = subprocess.check_call(["bash", src_path + "/make-sw.sh", network, output_type])
File "/opt/python3.6/lib/python3.6/subprocess.py", line 291, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['bash', '/opt/python3.6/lib/python3.6/site-packages/bnn/src/network//make-sw.sh', 'cnv-pynq', 'python_sw']' returned non-zero exit status 127.

Misc questions

  1. I noticed an accuracy drop of up to 7% between training and inference when porting a 62-class dataset onto the LFC model. What could be the possible reasons for this?

  2. Does the ConvLayer support padding/ 'same' convolution? (I believe not)

  3. Does the MaxPooling Layer support odd dimension input images? (Again, I believe it does not).

Warning: Cannot find design file

Hi, I am trying to set up this package and I am running into a few issues.

I cloned the repository, set the XILINX_BNN_ROOT variable, and launched the hw shell script for both cnv-pynq and lfc-pynq, but I get the "Cannot find design file" warning shown in the attached screenshot.

Another issue is that I'm not clear on the quick start instructions for the Pynq board. It asks to copy the include folder from Windows to the Pynq board, correct? I was wondering if there is an easy way to copy the folder, since there are over 200 files and Jupyter does not let me upload folders, only files.

Thanks,
Jonathan

Cifar10 cannot be launched in H/W after capturing Video through HDMI-in

Hi,
Now I am trying to launch Cifar10 in H/W after capturing Video through HDMI-in.
The software flow I want to implement is as follows:

  1. Capture an image through HDMI-in
  2. Classify the image with CNV (Cifar-10)
  3. Output the image with the classification result through HDMI-out


At first, when I tried classification on a local JPEG image before configuring the HDMI I/O, everything was fine:

>>> import bnn
>>> classifier = bnn.CnvClassifier('cifar10') # Launched in H/W
Setting network weights and thresholds in accelerator...
>>> from PIL import Image
>>> im = Image.open('./img/cifar10/car.jpg')
>>> class_out = classifier.classify_image(im)
Packing and interleaving CIFAR-10 inputs...
Running prebuilt CIFAR-10 test for 1 images...
Inference took 1584 microseconds, 1584 usec per image
Classification rate: 631.313 images per second
Inference took 1584.00 microseconds
Classification rate: 631.31 images per second
>>>


Second, I configured the overlay for HDMI I/O and again tried classification on an image captured by a camera through HDMI-in. But it does not work; it freezes in the following state:

>>> from pynq import Overlay
>>> from pynq.lib.video import *
>>> base = Overlay("base.bit")
>>> base.download()
>>> hdmi_in = base.video.hdmi_in
>>> hdmi_out = base.video.hdmi_out
>>> hdmi_in.configure(PIXEL_RGB)
<contextlib._GeneratorContextManager object at 0x2fc232b0>
>>> hdmi_out.configure(hdmi_in.mode, PIXEL_RGB)
<contextlib._GeneratorContextManager object at 0x2fc234f0>
>>> hdmi_in.start()
<contextlib._GeneratorContextManager object at 0x33d14110>
>>> hdmi_out.start()
<contextlib._GeneratorContextManager object at 0x33ee5790>
>>> inframe = hdmi_in.readframe()
>>> im = Image.fromarray(inframe)
>>> class_out = classifier.classify_image(im)
Packing and interleaving CIFAR-10 inputs...
Running prebuilt CIFAR-10 test for 1 images...

Where did I go wrong?
Sorry for my poor English; it would be greatly appreciated if you could explain the details.

Best regards

BNN Compatibility with MYiR Zynq XC7Z020

Hello,

I have a MYiR Zynq XC7Z020 board with two DDR3 ICs and I want to run the BNN there. Unfortunately I
only found a .bit file for the PYNQ board. Can this project be ported to other Zynq-based boards?

Thank you in advance.

Kevin

some questions about Hardware design rebuild

Hi!
In order to integrate the BNN IP core with my base overlay, I tried to rebuild the hardware designs following the instructions. After ./make-hw.sh cnvW1A1 pynqZ1-Z2 a finished, I opened the Vivado project and there are some problems.

WARNING: [IP_Flow 19-3664] IP 'procsys_BlackBoxJam_0_0' generated file not found '/home/sxw228/BNN_PYNQ/bnn/src/network/output/vivado/cnvW1A1-pynqZ1-Z2/cnvW1A1-pynqZ1-Z2.srcs/sources_1/bd/procsys/ip/procsys_BlackBoxJam_0_0/procsys_BlackBoxJam_0_0.dcp'. Please regenerate to continue.
WARNING: [IP_Flow 19-3664] IP 'procsys_BlackBoxJam_0_0' generated file not found '/home/sxw228/BNN_PYNQ/bnn/src/network/output/vivado/cnvW1A1-pynqZ1-Z2/cnvW1A1-pynqZ1-Z2.srcs/sources_1/bd/procsys/ip/procsys_BlackBoxJam_0_0/procsys_BlackBoxJam_0_0_stub.v'. Please regenerate to continue.
WARNING: [IP_Flow 19-3664] IP 'procsys_BlackBoxJam_0_0' generated file not found '/home/sxw228/BNN_PYNQ/bnn/src/network/output/vivado/cnvW1A1-pynqZ1-Z2/cnvW1A1-pynqZ1-Z2.srcs/sources_1/bd/procsys/ip/procsys_BlackBoxJam_0_0/procsys_BlackBoxJam_0_0_stub.vhdl'. Please regenerate to continue.
WARNING: [IP_Flow 19-3664] IP 'procsys_BlackBoxJam_0_0' generated file not found '/home/sxw228/BNN_PYNQ/bnn/src/network/output/vivado/cnvW1A1-pynqZ1-Z2/cnvW1A1-pynqZ1-Z2.srcs/sources_1/bd/procsys/ip/procsys_BlackBoxJam_0_0/procsys_BlackBoxJam_0_0_sim_netlist.v'. Please regenerate to continue.
WARNING: [IP_Flow 19-3664] IP 'procsys_BlackBoxJam_0_0' generated file not found '/home/sxw228/BNN_PYNQ/bnn/src/network/output/vivado/cnvW1A1-pynqZ1-Z2/cnvW1A1-pynqZ1-Z2.srcs/sources_1/bd/procsys/ip/procsys_BlackBoxJam_0_0/procsys_BlackBoxJam_0_0_sim_netlist.vhdl'. Please regenerate to continue.
WARNING: [BD 41-1663] The design 'procsys.bd' cannot be modified due to following reason(s):

  • Block design BD file is read-only. Please check file-permissions.
  • Project is read-only. Please run 'save_project_as' before making any change to the bd-design.
  • Block design BXML file is read-only. Please check file-permissions.

WARNING: [BD 41-1661] One or more IPs have been locked in the design 'procsys.bd'. Please run report_ip_status for more details and recommendations on how to fix this issue.
List of locked IPs:
procsys_auto_pc_0
procsys_BlackBoxJam_0_0
procsys_ps7_0
procsys_rst_ps7_100M_0
procsys_ps7_axi_periph_0
procsys_axi_mem_intercon_0
procsys_auto_pc_1

It says that procsys_BlackBoxJam_0_0_synth_1 failed.
So what's wrong, and how can I fix this problem?

Environment setting

Hi,
When doing the hardware design rebuild, how can I set the XILINX_BNN_ROOT environment variable to clone_path/BNN_PYNQ/bnn/src/? I am not good with Ubuntu; please give me some advice.

Why does this code execute at a slow speed? (PYNQ-Z2 board with the PYNQ 2.3 image)

This code is from http://192.168.2.99:9090/notebooks/bnn/CNV-BNN_Cifar10.ipynb on my PYNQ-Z2 (PYNQ 2.3 image).
Concretely, on my board I get:

Inference took 1582.00 microseconds
Classification rate: 632.11 images per second
[CLASS] [RANKING]
Airplane 256
Automobile 379
Bird 165
Cat 169
Deer 163
Dog 244
Frog 249
Horse 244

Ship 267
Truck 268

But in this repository's example output I see:
Inference took 527.00 microseconds
Classification rate: 1897.53 images per second
[CLASS] [RANKING]
Airplane 256
Automobile 379
Bird 165
Cat 169
Deer 163
Dog 244
Frog 249
Horse 244
Ship 267
Truck 268

As you can see, the speed on my board is three times slower.

Running BNN on a PYNQ-Z2 board

Hi,
I tried to run BNN-PYNQ on a PYNQ-Z2 board, but there is an error:

Complete output from command python setup.py egg_info:
Only supported on a Pynq Z1 Board
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-7068xha9-build/

What should I do ?
Thanks.

After bitstream generation

Once the bitstream is generated, do you program the FPGA with the bitstream using Vivado? How do you have the FPGA take an MNIST file and measure its performance? I have the bitstream created, as well as Linux running on the PYNQ and the mnist_parameters.npz/binparam-lfc-pynq created. What are the next steps?

Making a training set with my own images

Hi, I have a set of images and I want to train on them.
I saw (as an example) that mnist.py (from BNN-PYNQ/bnn/src/training) imports the dataset from pylearn2 (from pylearn2.datasets.mnist import MNIST), and inside of this I can see:

t10k-images-idx3-ubyte
train-images-idx3-ubyte
t10k-labels-idx1-ubyte
train-labels-idx1-ubyte

So here is my question: how can I make my own "ubyte" files from my own set of images in order to train on them?
Do you have a script to do that?
Did you use pylearn2? (I read that it has been obsolete since May 2017.)
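For reference, the t10k-/train-*-ubyte files listed above are in the MNIST IDX format, which is simple enough to write yourself. The sketch below shows that layout, assuming 28x28 grayscale uint8 images; whether pylearn2's MNIST loader will pick up custom files without further changes is not confirmed here.

import struct
import numpy as np

def write_idx3_images(filename, images):
    # images: uint8 array of shape (N, rows, cols), e.g. 28x28 grayscale digits
    n, rows, cols = images.shape
    with open(filename, 'wb') as f:
        f.write(struct.pack('>IIII', 2051, n, rows, cols))    # magic 0x00000803 plus dims, big-endian
        f.write(np.ascontiguousarray(images, dtype=np.uint8).tobytes())

def write_idx1_labels(filename, labels):
    # labels: array of shape (N,), values 0..255
    labels = np.asarray(labels, dtype=np.uint8)
    with open(filename, 'wb') as f:
        f.write(struct.pack('>II', 2049, labels.size))        # magic 0x00000801 plus count
        f.write(labels.tobytes())

# toy example: two blank 28x28 images labelled 0 and 1
write_idx3_images('train-images-idx3-ubyte', np.zeros((2, 28, 28), dtype=np.uint8))
write_idx1_labels('train-labels-idx1-ubyte', np.array([0, 1]))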

Inference result different after training the mnist dataset again

Running make-hw.sh with csim on a fresh git clone of the repo produces the correct inference results.

However, when I go through the training process on MNIST (which results in a 1.6% test error) and use the resulting weight and threshold bin files to re-run csim, the inference results do not match.

I wonder what step I am missing? Has anyone successfully re-run BNN-PYNQ from training through inference and csim?

ImportError

When I run "sudo pip install" for this BNN-PYNQ.git with PYNQ 2.3 on a PYNQ-Z2 board, I get the following error:
File "bnn/bnn.py", line 30, in <module>
from pynq import Overlay, PL
ImportError: No module named pynq

Python 3.6

The current code on the master branch seems to have some compilation issues with the print function and tuple arguments (PEP 3113) in Python 3.6.

I include my fixes here. FYI.
fix_python3.6.txt
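For readers hitting the same errors (they also appear in the install log of the "set up question" issue above), the two Python 2 constructs and their Python 3 rewrites are sketched below. This only illustrates the kind of change needed; it is not the content of fix_python3.6.txt.

# Python 2:  print "Layer %d" % l                          -> print() is a function in Python 3
# Python 2:  def c_code(self, node, name, (x,), (z,), sub) -> PEP 3113 removed tuple parameters

def report(l, paddedH, paddedW, simdCount, peCount):
    # print statement rewritten as a function call
    print("Layer %d: %d x %d, SIMD = %d, PE = %d" % (l, paddedH, paddedW, simdCount, peCount))

def c_code(node, name, inputs, outputs, sub):
    # tuple parameters replaced by plain arguments unpacked inside the body
    (x,) = inputs
    (z,) = outputs
    return x, z, sub

report(1, 64, 64, 32, 16)
print(c_code(None, "op", ("x0",), ("z0",), {}))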

Theano Lasagne Inference

Hi,
Apart from using HLS to do inference, I would like a sample using Theano/Lasagne to perform inference and cross-check against HLS.
Can any example be given?

Thanks

Could you please tell me how FINN works to translate Python code to C++?

Dear Giuliogamba,
Hello!
I am working with neural networks on FPGAs, and I'm very interested in your project. Could you please tell me how you generate "top.cpp"? Did you write some Python code and use "setup.py" to translate it?
I am a beginner; please excuse my simple question.
Sincerely,
Thomas Zhang
4. Sep

Source code for low level driver

Hi,

I am porting the LFC to a smaller device and found that it cannot fit.
So I modified the whole thing into an SFC and it is already working in HLS CSIM.

Now I have a problem loading the *.so files and would like to ask for your help.
It would be very nice if you could at least release the API definitions of the functions in the *.so files.

If possible, the source code would definitely be good for further customization.

Parameters on-chip

Hi,
the FINN paper says that the parameters are kept on-chip, removing the off-chip memory bottleneck.
Where in your code is it possible to see that?
I saw the allocAccelBuffer function in xlnkdriver.hpp, but I think it allocates memory in /dev/mem for accelBufIn and accelBufOut.
Can you clarify how the parameters are loaded on-chip?
Thanks,

Sara

Running other BNNs on PYNQ using FINN

Hi!
I have finished installing on the PYNQ and ran the example Python notebooks successfully on the board. On the other hand, in order to understand how the BNN works on the PYNQ, I read the FINN paper carefully, but I still have many difficulties with it. I have two questions:

  1. Is there a further way to study FINN in more detail?
  2. Can I run BNNs that I designed myself, with different topologies, on PYNQ using FINN, and how do I do it?

Problem about hardware design rebuilt

Hi!
I'm rebuilding the hardware design following the guide, but I've met a problem.

As you can see in the attached screenshot, the problem occurs when running the hardware synthesis. It seems that the "-compile" argument doesn't work for the "csim_design" command. I tried deleting the argument, but it still fails with a compilation error.
Could you please help me to solve the problem?

CNV-PYNQ from train to inference

Hi,

Has anyone tried to retrain the CIFAR-10 network and run it in HLS simulation?
The newly generated parameters cannot correctly classify the deer image.

Steps to reproduce

  1. python cifar10.py in training folder
  2. python cifar10-gen-binary-weights.py
  3. copy the param to param/cifar10/*
  4. run ./make-hw.sh cnv-pynq pynq h

Installing BNN library and files on Pynq-Z1 using pip3: Command "python setup.py egg_info" failed with error code 1

Dear all, I really need help with this issue: installing the BNN library and files on Pynq-Z1 using pip3.

I am having problems with the pip install command. The sudo pip3 install git+..... command gives this error:
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-req-build-s4880beh/

I also downloaded and uncompressed BNN-PYNQ into a directory and ran pip3 directly on the directory, with the same error:

root@pynq:/home/xilinx/jupyter_notebooks/BNN-PYNQ/bnn# sudo pip3 install ./../
Processing /home/xilinx/jupyter_notebooks/BNN-PYNQ
Complete output from command python setup.py egg_info: running egg_info
creating pip-egg-info/bnn_pynq.egg-info
writing dependency_links to pip-egg-info/bnn_pynq.egg-info/dependency_links.txt
writing pip-egg-info/bnn_pynq.egg-info/PKG-INFO
writing top-level names to pip-egg-info/bnn_pynq.egg-info/top_level.txt
writing manifest file 'pip-egg-info/bnn_pynq.egg-info/SOURCES.txt'
reading manifest file 'pip-egg-info/bnn_pynq.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'pip-egg-info/bnn_pynq.egg-info/SOURCES.txt'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-req-build-s4880beh/setup.py", line 97, in <module>
if os.path.isdir(os.environ["PYNQ_JUPYTER_NOTEBOOKS"]+"/bnn/"):
File "/usr/lib/python3.4/os.py", line 633, in __getitem__
raise KeyError(key) from None
KeyError: 'PYNQ_JUPYTER_NOTEBOOKS'

----------------------------------------

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-req-build-s4880beh/
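The traceback shows setup.py reading the PYNQ_JUPYTER_NOTEBOOKS environment variable, which is normally defined on a PYNQ image but can be missing in a sudo/root shell. One thing worth trying, assuming the default notebook location on the PYNQ image, is to define it explicitly and preserve the environment across sudo:

export PYNQ_JUPYTER_NOTEBOOKS=/home/xilinx/jupyter_notebooks
sudo -E pip3 install git+https://github.com/Xilinx/BNN-PYNQ.git   # -E keeps the exported variable visible to pip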

Two questions about hls code

Thank you for your great work.
I'm trying to understand your HLS code.
I have two questions about it.

The first is about the first conv layer for cnv-pynq.
I think "InPerImage" should be 32*32*3*8/24 in line 77 of convlayer.h, because it has to process the input image (32 (H) * 32 (W) * 3 (C) * 8 (precision bits)).
Why do you use 32*32*3/24 for it?

The other is about slidingwindow.h.
In lines 111, 141 and 160, you seem to drop the last element of the cycles.
For example, you use (counter_internal_block < cycles_write_block-1) instead of (counter_internal_block < cycles_write_block).
Could you explain the reason?
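For reference, the two expressions compared in the first question evaluate to quite different word counts (this only spells out the arithmetic; it does not settle which value convlayer.h should use):

\[
\frac{32 \times 32 \times 3 \times 8}{24} = 1024,
\qquad
\frac{32 \times 32 \times 3}{24} = 128.
\]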

Undefined Latency and Interval values in Vivado HLS

When I run the "make-hw.sh" script, the compilation finishes successfully. However, Vivado HLS reports the Latency and Interval values as undefined (marked as "?"). I would appreciate it if you could let me know how I can get the throughput reports.

Utilization Estimates of Synthesis

Hi,
Sorry for bothering you again, but I've met another confusing problem.
The utilization of my synthesis result (screenshot attached) is much higher than yours, especially the LUT utilization. I wonder how this happens.
My environment is Vivado 2017.1 on Ubuntu 16.04.
Looking forward to your reply.

Question regarding modification of number of output classes

Hi,

I am trying out the HLS source code together with the SW host code to run a binary neural network on a Zynq via SDSoC. So far everything has worked out well with MNIST and CIFAR10 (running both in SW and HW without a Python overlay); however, now I would like to try using a dataset with more than 10 classes.

In short, this means that I have to align the Lasagne neural network output with the FPGA implementation output, but I cannot find the part of the FINN synthesis that makes this alignment.

Could you briefly explain how I should modify the CNV-PYNQ network to get, e.g., 20 16-bit slices instead of the 10 16-bit slices in the current state? How do these 10 16-bit values couple with the 10 float values that you get from the Lasagne network output?

Finally, in the CNV-PYNQ example you read out 16 64-bit values, each containing 4 16-bit output results, of which 10 are used, so a lot of data goes unused. But it seems like there are still weights connected to the unused outputs, or am I wrong about this? Does the FINN synthesizer produce weight padding somewhere?

Making a network with a new dataset

In BNN-PYNQ/bnn/src/training/lfc.py it says that changing the file will likely result in a design that does not fit the PYNQ overlay. Is it possible to change the units, layers, and shape without having to worry? I am currently running it with changes but am unsure whether it will fail later on.

New model HW layer acceleration

Hello,
I have added some extra layers to a CIFAR-10 model. I managed to simulate the model successfully and execute software inference on the PYNQ board. However, if I try to run HW acceleration, the results for all classes are 1. I guess the weights are not properly loaded on the FPGA, or some configuration in main_python.cpp is wrong (in the OFFLOAD sections, as software inference runs just fine).
Could you suggest how to fix this issue?

After Binary Weights Files Generated

Hi, I am training the CNV network with the cifar10 dataset. I got the 'binparam-cnv-pynq' folder after using 'python cifar10-gen-binary-weights.py'; then you said to place the resulting folder into the XILINX_BNN_ROOT/data directory on the Pynq device. Do you mean I should put the .bin files on the PYNQ board?
I find there are no .bin files under /home/xilinx/jupyter_notebooks/bnn/data using the terminal connected to the PYNQ board. Do I put the resulting folder in this folder? Where are the original weights for cifar10 on the board?

Question about driver and host code

Hi!
I am a beginner with PYNQ and I find BNN-PYNQ an excellent demo for me.
My goal is to write a new neural network accelerator, also using the PYNQ platform.
I now understand the HLS code; however, the driver code and host code frustrate me, so I want to ask:
1. How do I write the driver code and host code in the library folder? (I have no background in ARM.)
2. What software did you use for the driver code and host code? I know the HLS code is used in Vivado HLS, but I don't know which software to use to implement the driver and host code, or how to debug and simulate them. Did you use SDK or SDSoC?
3. I didn't see testbench code for BNN-PYNQ in Vivado HLS. Is there any way to simulate your work?

Thank you so much! Looking forward to your reply!

AttributeError: module 'bnn' has no attribute 'NETWORK_LFCW1A1'

Excuse me, regarding the project you posted on GitHub named BNN-PYNQ: when I tried to execute the code you posted, there is an AttributeError telling me that the module 'bnn' has no attribute 'NETWORK_LFCW1A1'.
I have checked the file named BNNs.py on GitHub, and there actually is an attribute named NETWORK_LFCW1A1. However, I have no idea where it has been installed on the PYNQ.
Would you give me any advice to solve this problem?
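A generic way to check which copy of the package Python is actually importing and which network constants it exposes (a diagnostic sketch, not a confirmed fix for this error):

import bnn
print(bnn.__file__)   # filesystem location of the bnn module Python actually imports
print(sorted(name for name in dir(bnn) if name.startswith('NETWORK')))   # constants exposed by that copy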

Problems after Training

Hi, I did the MNIST training (python mnist.py) and now I have a file called "mnist_parameters.npz". As the README says:

"they need converted from real floating point values into binary values and packed into .bin files.

$ python DATASET-gen-binary-weights.py

These scripts will process the weights for the given dataset and place them into a new directory."

But when I ran python mnist-gen-binary-weights.py I got an error:

BNN-PYNQ-master/bnn/src/training$ python mnist-gen-binary-weights.py
Traceback (most recent call last):
  File "mnist-gen-binary-weights.py", line 46, in <module>
    fth.convertFCNetwork(npzFile, targetDirBin, simdCounts, peCounts)
TypeError: convertFCNetwork() takes exactly 11 arguments (4 given)


I used this mnist-gen-binary-weights.py (from a previous commit):


import os
import sys
import finnthesizer as fth

if __name__ == "__main__":
    bnnRoot = "."
    npzFile = bnnRoot + "/mnist_parameters.npz"
    targetDirBin = bnnRoot + "/binparam-lfc-pynq"

    simdCounts = [64, 32, 64, 8]
    peCounts = [32, 64, 32, 16]

    classes = map(lambda x: str(x), range(9))

    fth.convertFCNetwork(npzFile, targetDirBin, simdCounts, peCounts)

    with open(targetDirBin + "/classes.txt", "w") as f:
        f.write("\n".join(classes))


Training in Pytorch

Hi, Your work is amazing!

I am trying to use finnthesizer with binary weights trained in PyTorch, from the MNIST example given here. I converted the trained PyTorch weights to an .npz file which finnthesizer reads. Moreover, I ran inference in Lasagne, after a slight modification to this script, with my converted .npz weights file and confirmed that the .npz is correct: the test accuracy was the same as reported in PyTorch.

But finnthesizer gives 8-10 warnings about zero or negative thresholds being detected, and the generated .bin files fail in HLS simulation.

Why are the .bin files not classifying correctly, while the Lasagne model with PyTorch weights worked fine?
Are these finnthesizer warnings generated because Lasagne tracks the instd while PyTorch tracks running_var, or is it something else?

rebuild error

Hi! When I used Vivado 2018.2 to rebuild the repo, I got the following errors:
root@wfl-ThinkPad-T440:/home/wfl/BNN-PYNQ/bnn/src/network# ./make-hw.sh cnvW1A1 pynqZ1-Z2 h
xilinx-tiny-cnn already cloned
Calling Vivado HLS for hardware synthesis...

****** Vivado(TM) HLS - High-Level Synthesis from C, C++ and SystemC v2018.2 (64-bit)
**** SW Build 2258646 on Thu Jun 14 20:02:38 MDT 2018
**** IP Build 2256618 on Thu Jun 14 22:10:49 MDT 2018
** Copyright 1986-2018 Xilinx, Inc. All Rights Reserved.

source /opt/Xilinx/Vivado/2018.2/scripts/vivado_hls/hls.tcl -notrace
INFO: [HLS 200-10] Running '/opt/Xilinx/Vivado/2018.2/bin/unwrapped/lnx64.o/vivado_hls'
INFO: [HLS 200-10] For user 'root' on host 'wfl-ThinkPad-T440' (Linux_x86_64 version 4.15.0-29-generic) on Mon Oct 29 07:45:10 CST 2018
INFO: [HLS 200-10] On os Ubuntu 18.04.1 LTS
INFO: [HLS 200-10] In directory '/home/wfl/BNN-PYNQ/bnn/src/network/output/hls-syn'
HLS project: cnvW1A1-pynqZ1-Z2
HW source dir: /home/wfl/BNN-PYNQ/bnn/src/network/cnvW1A1/hw
BNN HLS library: /home/wfl/BNN-PYNQ/bnn/src/library/hls
INFO: [HLS 200-10] Opening project '/home/wfl/BNN-PYNQ/bnn/src/network/output/hls-syn/cnvW1A1-pynqZ1-Z2'.
INFO: [HLS 200-10] Adding design file '/home/wfl/BNN-PYNQ/bnn/src/network/cnvW1A1/hw/top.cpp' to the project
INFO: [HLS 200-10] Adding test bench file '/home/wfl/BNN-PYNQ/bnn/src/network/cnvW1A1/hw/../sw/main_python.cpp' to the project
INFO: [HLS 200-10] Adding test bench file '/home/wfl/BNN-PYNQ/bnn/src/library/host/foldedmv-offload.cpp' to the project
INFO: [HLS 200-10] Adding test bench file '/home/wfl/BNN-PYNQ/bnn/src/library/host/rawhls-offload.cpp' to the project
INFO: [HLS 200-10] Opening solution '/home/wfl/BNN-PYNQ/bnn/src/network/output/hls-syn/cnvW1A1-pynqZ1-Z2/sol1'.
INFO: [SYN 201-201] Setting up clock 'default' with a period of 5ns.
INFO: [HLS 200-10] Setting target device to 'xc7z020clg400-1'
INFO: [SIM 211-2] *************** CSIM start ***************
INFO: [SIM 211-4] CSIM will launch CLANG as the compiler.
ERROR: [SIM 211-100] 'csim_design' failed: compilation error(s).
INFO: [SIM 211-3] *************** CSIM finish ***************
4
while executing
"source [lindex $::argv 1] "
("uplevel" body line 1)
invoked from within
"uplevel #0 { source [lindex $::argv 1] } "

INFO: [Common 17-206] Exiting vivado_hls at Mon Oct 29 07:45:11 2018...
ERROR: [SIM 211-100] 'csim_design' failed: compilation error(s).
Error in Vivado_HLS

Please help, thank you!

Which version of Vivado HLS was used to rebuild the whole design?

Hi,

I am still working on this project and now face another problem regarding the Vivado HLS generated IP.

It seems like the generated IP hangs after writing ap_start (no ap_done is asserted).

I would like to know which version of the Vivado tools is suggested for re-running everything in this project.
