
fuath / neurocat


This project forked from mandubian/neurocat


From neural networks to the Category of composable supervised learning algorithms in Scala with compile-time matrix checking based on singleton-types

License: Apache License 2.0

Scala 100.00%

neurocat's Introduction

Neurocat

Neurocat is an experimental toy library studying 2 things:

  • The link between category theory, supervised learning algorithms and neural networks, through the concepts described in this amazing paper, Backprop as Functor: A compositional perspective on supervised learning, which tries to unify (at least partly) category theory, supervised learning concepts and simple neural networks.

  • How to represent matrices (and thus neural networks) whose dimensions are checked at compile time using the new singleton-types feature described in SIP-23, which allows the integer 3 to be manipulated as the value 3 but also as the type 3. My matrix experiments use the little & really cool singleton-ops library by Frank S. Thomas, also the author of the great refined library. Shapeless Nat is a nice idea but not suited to big naturals, because it is a recursive structure (counting down to 0) checked at compile time, so a number like 100000 becomes impractical to represent that way. (A tiny sketch of the idea follows below.)
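
To make that concrete, here is a minimal sketch of compile-time dimension checking written with plain Scala 2.13 literal types rather than singleton-ops; the names (Mat, zeros, mul) are hypothetical and not neurocat's actual API:

```scala
// A minimal sketch: row/column counts live in the types, so a mismatched
// multiplication simply does not compile.
object DimChecked {
  final case class Mat[R <: Int, C <: Int](rows: Int, cols: Int, data: Array[Double])

  def zeros[R <: Int, C <: Int](implicit r: ValueOf[R], c: ValueOf[C]): Mat[R, C] =
    Mat(valueOf[R], valueOf[C], new Array[Double](valueOf[R] * valueOf[C]))

  // Only defined when the inner dimension K agrees in both types.
  def mul[R <: Int, K <: Int, C <: Int](a: Mat[R, K], b: Mat[K, C]): Mat[R, C] = {
    val out = new Array[Double](a.rows * b.cols)
    for (i <- 0 until a.rows; j <- 0 until b.cols; k <- 0 until a.cols)
      out(i * b.cols + j) += a.data(i * a.cols + k) * b.data(k * b.cols + j)
    Mat(a.rows, b.cols, out)
  }

  val a: Mat[2, 3]  = zeros[2, 3]
  val b: Mat[3, 4]  = zeros[3, 4]
  val ab: Mat[2, 4] = mul(a, b)
  // mul(b, a)  // does not compile: inner dimensions 4 and 2 differ
}
```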

Very superficially, the idea of the paper is quite simple (minus a few details):

  • A supervised learning algorithm can be seen as a structure able to approximate a function A -> B relying on parameters P which are updated through an optimization/training process using a set of training samples.
  • The paper shows that the set of supervised learning algorithms, equipped with 3 functions (implement, update-params, request-input), forms a symmetric monoidal category Learn, and hence that supervised learning algorithms can be composed.
  • It also shows that there is a Functor from the category ParaFn of parametrised functions P -> A -> B to the Learn category.

ParaFn -> Learn

  • Then it shows that a (trained) neural network can be seen as an approximation of a function InputLayer -> OutputLayer parametrised by the weights W.
  • Thus it demonstrates there is also a Functor from the category of neural networks (W, InputLayer, OutputLayer) to the category of parametrised functions (W -> InputLayer -> OutputLayer).

I: NNet -> ParaFn
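
As an illustration of that reading, here is a small Scala sketch of a dense layer as a parametrised function and of composition in ParaFn, which pairs the parameters; the names (ParaFn, denseLayer, andThen) are hypothetical and not neurocat's encoding:

```scala
// Sketch: a network layer read as a parametrised function W -> InputLayer -> OutputLayer,
// i.e. an arrow of the ParaFn category.
object ParaFnSketch {
  final case class ParaFn[P, A, B](run: P => A => B)

  // One dense layer: the parameter is a weight matrix (one row per output unit).
  def denseLayer(activation: Double => Double): ParaFn[Array[Array[Double]], Array[Double], Array[Double]] =
    ParaFn { w => x =>
      w.map(row => activation(row.zip(x).map { case (wi, xi) => wi * xi }.sum))
    }

  // Composition in ParaFn: pair the parameters, compose the functions.
  // This is how multi-layer networks arise.
  def andThen[P, Q, A, B, C](f: ParaFn[P, A, B], g: ParaFn[Q, B, C]): ParaFn[(P, Q), A, C] =
    ParaFn { case (p, q) => a => g.run(q)(f.run(p)(a)) }
}
```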

  • By simple functor composition, you then have a Functor from neural networks to supervised learning algorithms:

NNet -> Learn : (ParaFn -> Learn) ∘ (NNet -> ParaFn)
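
For concreteness, here is a minimal Scala sketch of the paper's Learn structure and of its composition; the names are hypothetical and this is not neurocat's actual API. In the paper, the functor ParaFn -> Learn then sends a differentiable parametrised function, together with a learning rate and an error function, to such a learner via gradient descent.

```scala
// Sketch: a learner from A to B carries a parameter type P, an implementation,
// a parameter update and an input request; two learners compose as in Learn.
object LearnSketch {
  trait Learn[A, B] {
    type P
    def implement(p: P)(a: A): B       // I : P x A -> B
    def update(p: P)(a: A, b: B): P    // U : P x A x B -> P
    def request(p: P)(a: A, b: B): A   // r : P x A x B -> A
  }

  // Sequential composition: the composite parameter is the pair of parameters,
  // and the second learner's request feeds the first learner's update/request.
  def compose[A, B, C](f: Learn[A, B], g: Learn[B, C]): Learn[A, C] =
    new Learn[A, C] {
      type P = (f.P, g.P)
      def implement(p: P)(a: A): C =
        g.implement(p._2)(f.implement(p._1)(a))
      def update(p: P)(a: A, c: C): P = {
        val b         = f.implement(p._1)(a)
        val requested = g.request(p._2)(b, c)
        (f.update(p._1)(a, requested), g.update(p._2)(b, c))
      }
      def request(p: P)(a: A, c: C): A = {
        val b         = f.implement(p._1)(a)
        val requested = g.request(p._2)(b, c)
        f.request(p._1)(a, requested)
      }
    }
}
```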

I'll stop there for now, but this work has just started: there are more concepts about the bimonoidal aspects of neural networks under Euclidean space constraints, and pending studies about recurrent networks and more.

Discovering that formulation, I just said: "Whoaaa, that's cool, exactly what I had in mind without being able to put it into words."

Why? Because everything I've seen about neural networks looks like programming from the 70s, not like the way I program nowadays, with Functional Programming, types & categories.

This starts unifying concepts, which is exactly the raison d'être of category theory in maths. I think programming learning algorithms will change a lot in the future, exactly as backend programming has changed a lot over the last 10 years.

I'm just scratching the surface of all of these concepts. I'm not a neural-network expert at all, nor a good mathematician, so I just want to open this field of study in a language which now has singleton-types, allowing really cool new ways of manipulating data structures.

So first, have a look at the sample code in the repository.

For info: to manipulate matrices, I used ND4J as the array abstraction so that I could test in both CPU and GPU modes, but any library providing this could naturally be used.
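
For illustration only (hypothetical names, not the repository's actual code), here is how ND4J could sit behind a dimension-typed facade like the Mat sketch above:

```scala
import org.nd4j.linalg.api.ndarray.INDArray
import org.nd4j.linalg.factory.Nd4j

// Sketch: shapes are checked by the types, the numerical work (and the choice
// of CPU or GPU backend) is delegated to ND4J.
object NDMatSketch {
  final case class NDMat[R <: Int, C <: Int](value: INDArray)

  def zeros[R <: Int, C <: Int](implicit r: ValueOf[R], c: ValueOf[C]): NDMat[R, C] =
    NDMat(Nd4j.zeros(valueOf[R], valueOf[C]))

  def mul[R <: Int, K <: Int, C <: Int](a: NDMat[R, K], b: NDMat[K, C]): NDMat[R, C] =
    NDMat(a.value.mmul(b.value))
}
```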

neurocat's People

Contributors

mandubian
