prcastro / nervoso.jl
Simple Neural Networks library written in Julia
License: Other
Write a suite of tests, probably using FactCheck
Implement AdaGrad
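A minimal sketch of what an AdaGrad optimizer could look like (the names `AdaGrad` and `update!` are hypothetical, not part of Nervoso's API). AdaGrad accumulates a per-parameter sum of squared gradients and scales each step by its inverse square root, so frequently-updated parameters get smaller effective learning rates:

```julia
# Sketch of AdaGrad; names are illustrative, not the library's actual interface.
mutable struct AdaGrad
    lr::Float64
    eps::Float64              # small constant to avoid division by zero
    cache::Vector{Float64}    # running sum of squared gradients
end

AdaGrad(n::Int; lr=0.01, eps=1e-8) = AdaGrad(lr, eps, zeros(n))

function update!(opt::AdaGrad, w::Vector{Float64}, grad::Vector{Float64})
    opt.cache .+= grad .^ 2
    w .-= opt.lr .* grad ./ (sqrt.(opt.cache) .+ opt.eps)
    return w
end
```

Since the cache only grows, the effective learning rate decays over training, which is AdaGrad's main practical property.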
While I don't want to make MLBase a dependency, I believe we should document how to use this library alongside MLBase, and also delegate performance-evaluation functionality to MLBase.
This would not only make this lib more useful, but also reduce the amount of code to maintain.
Implement batch GD
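A sketch of the batch variant, to contrast with a per-sample update (all names here are illustrative, not the library's API): the gradient is averaged over the whole dataset, and the weights are updated once per epoch.

```julia
# Batch gradient descent sketch: one update per epoch, using the mean gradient.
# `grad(w, x, y)` is a user-supplied per-sample gradient (hypothetical signature).
function batch_gd!(w::Vector{Float64}, xs, ys, grad; lr=0.1, epochs=100)
    n = length(xs)
    for _ in 1:epochs
        g = sum(grad(w, x, y) for (x, y) in zip(xs, ys)) ./ n  # mean gradient
        w .-= lr .* g
    end
    return w
end
```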
This is what Theano does when used as in Deep Learning with Python. We can do it by using packages from JuliaDiff to compute the derivatives instead of using a derivative dictionary. While this could cause some slowdown in differentiation (and at import time, since we would depend on an extra package), the cost should be small, and it would make it much easier to extend the library with new cost/activation functions.
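To illustrate the idea behind the JuliaDiff packages (e.g. ForwardDiff.jl), here is a self-contained toy version of forward-mode AD with dual numbers — not the actual package code, just a sketch of why a derivative dictionary becomes unnecessary:

```julia
# Toy dual-number forward-mode AD. A new activation function written in plain
# Julia gets its derivative "for free", with no dictionary entry to maintain.
struct Dual
    val::Float64   # primal value
    der::Float64   # tangent (derivative) value
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:+(c::Real, a::Dual) = Dual(c + a.val, a.der)
Base.:*(c::Real, a::Dual) = Dual(c * a.val, c * a.der)
Base.exp(a::Dual) = Dual(exp(a.val), exp(a.val) * a.der)
Base.inv(a::Dual) = Dual(inv(a.val), -a.der / a.val^2)

# Derivative of f at x: seed the tangent with 1 and read off the dual part.
derivative(f, x::Real) = f(Dual(float(x), 1.0)).der

# Example: a sigmoid activation needs no hand-written derivative.
sigmoid(x) = inv(1.0 + exp(-1.0 * x))
```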
We should make regularization available.
Would it be feasible to make it extensible? The user could define a function of the weights to be minimized alongside the error, and we would use its derivative to update the weights. Or should we hardcode some regularizers (L1, L2, and Dropout, for instance) into Nervoso?
Useful link:
http://neuralnetworksanddeeplearning.com/chap3.html#regularization
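A minimal sketch of the extensible-regularizer idea above, using L2 as the example (all names here are hypothetical, not part of Nervoso): the user supplies a penalty on the weights, and its gradient is simply added to the error gradient during the update.

```julia
# Sketch: a regularizer is a penalty function of the weights plus its gradient.
l2_penalty(w; lambda=0.01) = lambda * sum(abs2, w)
l2_grad(w; lambda=0.01) = 2lambda .* w

function regularized_step!(w::Vector{Float64}, error_grad::Vector{Float64};
                           lr=0.1, lambda=0.01)
    # The penalty gradient is added to the error gradient (weight decay).
    w .-= lr .* (error_grad .+ l2_grad(w; lambda=lambda))
    return w
end
```

With this shape, hardcoded L1/L2 would just be predefined `(penalty, grad)` pairs, and user-defined regularizers plug in the same way.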
We should add worked examples on the documentation
Change the package name to something nicer
Should we add the line
VERSION >= v"0.4.0-dev+6521" && __precompile__(true)
to the beginning of the module file (NeuralNetsLite.jl) so that the module is precompiled? What would we gain by doing this?
We must get both Coveralls and Codecov working. By working, I mean a badge on the README showing the correct code-coverage percentage.
An idea is to change `someerrorprime` to be the derivative of the error with respect to the output, and handle the chain rule outside (as done here: https://github.com/prcastro/NeuralNetsLite.jl/blob/master/src/feedforward/network.jl#L103)
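A sketch of the proposed split (function names are hypothetical, standing in for `someerrorprime`): the error-prime function returns only dE/d(output), and the caller multiplies by the activation derivative itself.

```julia
# Error derivative w.r.t. the network output only -- no chain rule inside.
quaderror(output, target)      = 0.5 * (output - target)^2
quaderrorprime(output, target) = output - target

sigmoid(x)      = 1.0 / (1.0 + exp(-x))
sigmoidprime(x) = sigmoid(x) * (1.0 - sigmoid(x))

# The caller (e.g. the backprop loop) applies the chain rule explicitly:
delta(z, target) = quaderrorprime(sigmoid(z), target) * sigmoidprime(z)
```

This keeps each error function ignorant of the activation, so new error/activation combinations compose without extra code.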
Not only insert a nice/helpful docstring on each function/type, but also document the interface expected by the package (how to create new error/activation functions, etc) and conceptually how it works (like a PDF deriving the algorithm).
I should implement this and see if it helps find the optimal weights.
We need to write a reference documenting each function/type/etc. exported by the library. However, it should be automated, extracting the metadata from each function and putting it into a markdown file. This way, we don't need to update the documentation by hand every time we write a new function.
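A rough sketch of what such a generator could look like, walking a module's exported names and dumping their docstrings to markdown (a hand-rolled alternative; Documenter.jl is the established tool for this today):

```julia
# Sketch: auto-generate a markdown reference from a module's docstrings.
function write_reference(mod::Module, path::String)
    open(path, "w") do io
        println(io, "# Reference\n")
        for name in sort!(collect(names(mod)))
            isdefined(mod, name) || continue
            println(io, "## `", name, "`\n")
            # Base.Docs.doc returns the attached docstring as Markdown.
            println(io, string(Base.Docs.doc(getfield(mod, name))), "\n")
        end
    end
end
```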
Having problems with both the XOR and digits datasets