tonygrad

yeahhhhh

A tiny scalar-valued autograd engine with a small PyTorch-like neural network library on top. Implements backpropagation (reverse-mode autodiff) over a dynamically built DAG. The DAG only operates over scalar values, so e.g. we chop up each neuron into all of its individual tiny adds and multiplies. However, this is enough to build up entire deep neural nets doing binary classification, as the demo notebook shows.

As it turns out, 90% of what you need for neural networks is a decent autograd/tensor library. Throw in an optimizer, a data loader, and some compute, and you have all you need.

What's next? Building a Tensor library?

Maybe.

Thanks to karpathy/micrograd for the lessons :).

Installation

pip install tonygrad

Example usage

Below is a simple example showing a number of the supported operations:

from tonygrad.engine import Value

"""tanh() VERSION"""

# inputs x1,x2
x1 = Value(2.0, label='x1')
x2 = Value(0.0, label='x2')
# weights w1,w2
w1 = Value(-3.0, label='w1')
w2 = Value(1.0, label='w2')
# bias of the neuron
b = Value(6.8813735870195432, label='b')
# x1*w1 + x2*w2 + b
x1w1 = x1*w1; x1w1.label = 'x1w1'
x2w2 = x2*w2; x2w2.label = 'x2w2'
x1w1x2w2 = x1w1 + x2w2; x1w1x2w2.label = 'x1*w1 + x2*w2'
# output
n = x1w1x2w2 + b; n.label = 'n'
# apply tanh to the output
o = n.tanh(); o.label = 'o'
# launch backprop on the built graph
o.backward()
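
After backward() returns, every node in the graph holds its gradient in .grad, as in micrograd. As a quick sanity check, here is what this particular input should produce (a sketch, assuming Value exposes .data and .grad attributes):

# inspect the forward value and the accumulated gradients
print(f"o       = {o.data:.4f}")   # tanh(0.8814) ≈ 0.7071
print(f"x1.grad = {x1.grad:.4f}")  # ≈ -1.5 = w1 * (1 - o.data**2)
print(f"w1.grad = {w1.grad:.4f}")  # ≈  1.0 = x1 * (1 - o.data**2)
print(f"x2.grad = {x2.grad:.4f}")  # ≈  0.5 = w2 * (1 - o.data**2)
print(f"w2.grad = {w2.grad:.4f}")  # ≈  0.0 = x2 * (1 - o.data**2)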

Tracing / visualization

For added convenience, the trace_graph.py module produces Graphviz visualizations. Here we draw the expression graph built in the example above.

from tonygrad.trace_graph import draw_dot

draw_dot(o)

[Graphviz rendering of the expression graph for o]
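
In a notebook, the returned object renders inline. Outside one, assuming draw_dot returns a graphviz.Digraph as micrograd's does, you can write the drawing to a file:

dot = draw_dot(o)
dot.render('graph', format='svg')  # writes 'graph' (DOT source) and 'graph.svg'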

Training a neural net

The notebook demo.ipynb provides a full demo of training a 2-layer neural network (MLP) binary classifier. This is achieved by initializing a neural net from the tonygrad.nn module, implementing a simple SVM "max-margin" binary classification loss, and using stochastic gradient descent for optimization. As shown in the notebook, a 2-layer neural net with two 16-node hidden layers achieves the following decision boundary on the moons dataset (a minimal sketch of the training loop follows the figure):

[decision boundary of the trained 2-layer MLP on the moons dataset]
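
The training loop itself is short. The sketch below is illustrative, not the notebook's exact code: it assumes tonygrad.nn mirrors micrograd's nn module (an MLP class and a parameters() method) and that Value supports relu(); the toy X, y data is made up, so see demo.ipynb for the real thing.

from tonygrad.nn import MLP

# toy data: 2-d points with +1 / -1 labels (illustrative only)
X = [[0.0, 1.0], [1.0, 0.0], [-1.0, -0.5], [0.5, -1.0]]
y = [1, 1, -1, -1]

model = MLP(2, [16, 16, 1])  # 2 inputs, two 16-node hidden layers, 1 output

for step in range(100):
    # forward pass: one scalar score per point
    scores = [model(xi) for xi in X]
    # SVM "max-margin" loss: hinge on the signed score
    losses = [(1 + -yi * si).relu() for yi, si in zip(y, scores)]
    loss = sum(losses) * (1.0 / len(losses))

    # backward pass (zero old grads first, since backward() accumulates)
    for p in model.parameters():
        p.grad = 0.0
    loss.backward()

    # stochastic gradient descent update
    for p in model.parameters():
        p.data -= 0.05 * p.grad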

License

MIT
