nervanasystems / ngraph-python
Original Python version of Intel® Nervana™ Graph
Home Page: http://ngraph.nervanasys.com/docs/legacy/
License: Apache License 2.0
Hi,
The performance of the VGG16 network imported into ngraph via the TF frontend appears to be very slow. For perspective, I have three implementations of VGG16:
For a batch size of 64 on a Skylake machine, I get the following performance:
Implementation 1: 1561 GFlops/s
Implementation 2: 1309 GFlops/s (using TensorFlow 1.4.0-dev on Intel Python)
Implementation 3: 51.82 GFlops/s.
I am trying to investigate why the TF frontend in ngraph is slow. Any input would be very helpful.
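For scale, the reported figures put the ngraph TF-frontend path roughly 30x behind the fastest implementation. A quick sanity check on the numbers above (plain Python arithmetic, nothing ngraph-specific):

```python
# Throughput figures copied from the measurements above.
reference = 1561.0   # GFlop/s, implementation 1
tf_native = 1309.0   # GFlop/s, implementation 2
ngraph_tf = 51.82    # GFlop/s, implementation 3 (ngraph via TF frontend)

# Slowdown of the ngraph TF frontend relative to the other two paths.
print(f"vs implementation 1: {reference / ngraph_tf:.1f}x slower")
print(f"vs implementation 2: {tf_native / ngraph_tf:.1f}x slower")
```

A gap of this size usually points at the imported graph falling back to unoptimized kernels rather than a constant-factor overhead.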
In the GitHub code, there are many maxas and SASS functions, but I can't find any code that calls them. Are they there for future use? Yet it is said that nGraph currently uses hand-written SASS kernels assembled using the maxas assembler.
I'm not familiar with how exactly the SASS kernels are invoked by the GPU backend.
Could you provide some information about how to use hand-written SASS kernels assembled with the maxas assembler in ngraph?
I got the error NotImplementedError("Reshape can only support flatten") from importer.import_graph_def(tf.get_default_graph().as_graph_def()) when I used tf.reshape in my TensorFlow model.
I also got the same error when using tf.contrib.layers.flatten().
Can't we pass a 4-D array into tf.reshape or tf.contrib.layers.flatten() with ngraph's Reshape implementation?
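For comparison, the one case the importer's error message says it does support ("flatten") corresponds to keeping the batch axis and collapsing everything else. A minimal NumPy sketch of that shape transformation (not ngraph code):

```python
import numpy as np

# A 4-D activation tensor: (batch, height, width, channels).
x = np.arange(2 * 3 * 4 * 5, dtype=np.float32).reshape(2, 3, 4, 5)

# "Flatten" in the tf.contrib.layers.flatten sense: keep the batch
# dimension and collapse the remaining axes into one.
flat = x.reshape(x.shape[0], -1)
print(flat.shape)  # (2, 60)
```

Any tf.reshape whose output is not of this (batch, -1) form would hit the NotImplementedError above.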
Dear Nervana team,
Given that you are now part of the Intel family, are there any plans to leverage the work in TBB Flow Graph Designer? The ability to build a network graphically (drag and drop) would be priceless.
Thanks,
Pedro N.
Addresses #1. Will expose graph creation, transformer compilation, and compiled graph execution in a language neutral way.
Hi guys,
I have always been very happy about Nervana and your work. Really, kudos to you for managing such an amazing breakthrough in low-level infrastructure for DL.
Now, this is not really an issue, but it is the only place to write to the devs directly. I was wondering why you started this in Python rather than C++/C. Let me elaborate on why I think that would be a better idea: with C++ you can easily build frontends for any other language. Additionally, once you get to properly optimizing a graph, speed actually matters, since the task is equivalent to a full-blown compiler. I understand the user base's need for Python, but I do think a more optimized language is needed for the graph traversal.
Since you did this, I was hoping you would take a look at a project I started quite recently, which provides a Graph IR, used mainly for the autodiff part, in C++. I was hoping this would bring more people onto an LLVM-style architecture, as you mention, where we optimize in this representation. One of the key benefits of my project is that it provides "compile"-time guarantees that if your inputs are valid the whole graph can execute, which is done by clever metadata propagation. Obviously, I cannot drive this by myself, so if any of you find it interesting, please let me know.
Here is a link
Addresses #1.
Hello,
ngraph has the Stochastic, GDM, and Adam optimizers implemented. Are there any plans to include other optimizers? Alternatively, would you be interested in a pull request for Adagrad? I needed Adagrad, so I have implemented it.
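For context, the Adagrad update rule itself is small. A minimal NumPy sketch (this is an illustration of the algorithm, not the ngraph implementation):

```python
import numpy as np

def adagrad_step(param, grad, accum, lr=0.01, eps=1e-8):
    """One Adagrad update: scale the learning rate per parameter by
    the square root of the accumulated squared gradients."""
    accum += grad ** 2
    param -= lr * grad / (np.sqrt(accum) + eps)
    return param, accum

# Usage: minimize f(w) = w^2 starting from w = 1.0.
w = np.array([1.0])
acc = np.zeros_like(w)
for _ in range(100):
    grad = 2 * w                       # df/dw
    w, acc = adagrad_step(w, grad, acc, lr=0.5)
print(w)  # w converges toward 0
```

The per-parameter accumulator is the only extra state Adagrad needs, which keeps such a PR fairly self-contained.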
Thanks,
Kaushik
Dear Nervana team,
First and foremost, great work!
If someone would like to use ngraph for GPP, instead of solely for DL, how could he/she add additional graph functions? Is there a document / example that describes the steps?
Thanks,
Pedro N.
Source documentation for ngraph needs a copy edit and branding edit as the first step to updating and improving the docs.
ngraph/examples/walk_through/Logistic_Regression_Part_2.ipynb
says: "The axes creation is conecptually the same as before, except we now add a new axes H to represent the new feature space."
should say: "The axes creation is conceptually the same as before, except we now add a new axes H to represent the new feature space."
Good evening,
I would like to use the following configs:
Nervana Ngraph + Intel Python Distribution via Anaconda
However, according to the installation procedure, I need to use virtualenv (from neon) in order to install ngraph. When neon builds a venv, it points to a different Python binary than the one I want.
Could you help me make neon create a virtualenv with Intel Python, so I can install ngraph on top of it?
Thanks!
requirements.txt needs numpy==1.11, but gpu_requirements.txt implicitly needs numpy==1.13.
That's why I get a numpy error when importing ngraph after executing the commands below.
make install
make gpu_prepare
Thanks
Hi,
I am trying to implement a function that calls the sign(x) function on a tensor, and I get the error that the name out is not defined. How can I define such out_axes?
The function sign(x) in op_graph.py calls the class SignOp; this UnaryElementWiseOp subclass, however, doesn't have the generate_adjoints function implemented. Could this be related?
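For what it's worth, the mathematical adjoint of sign(x) is zero almost everywhere (and undefined at 0), which may explain why SignOp has no generate_adjoints. A NumPy sketch of the forward/backward pair under that interpretation (an assumption about the intended semantics, not ngraph code):

```python
import numpy as np

def sign_forward(x):
    # Elementwise sign: -1, 0, or +1.
    return np.sign(x)

def sign_backward(x, upstream):
    # d/dx sign(x) = 0 for x != 0 (undefined at 0), so the
    # adjoint passes no gradient back regardless of upstream.
    return np.zeros_like(upstream)

x = np.array([-2.0, 0.0, 3.5])
print(sign_forward(x))               # [-1.  0.  1.]
print(sign_backward(x, np.ones(3)))  # [0. 0. 0.]
```

If you need a nonzero gradient through sign (e.g. a straight-through estimator), that would be a deliberate approximation rather than the true derivative.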
Thanks,
Pedro N.