nnet

Erlang Neural Networks (nnet) is an application for representing neural networks as Erlang terms.

Installation

Create your own project with rebar3.

$ rebar3 new app yourapp

Then, in your project's rebar.config file, add nnet as a dependency under the deps key:

{deps, 
    [
        {nnet, {git, "https://github.com/BorjaEst/nnet.git", {tag, "<version>"}}}
    ]}.

Then run the compile command; rebar3 will fetch the declared dependencies and compile them along with your application.

$ rebar3 compile

Finally, to make a release, first create the release structure and then build the release with the following commands:

$ rebar3 new release yourrel
$ rebar3 release

You can find more information about dependencies in rebar3 - dependencies.

Usage

Load the app using your preferred method. For example, in the project folder, run rebar3 shell:

$ rebar3 shell
===> Booted nnet

All user-facing functions are defined in the module src/nnet. Here is an example:

Start tables

Since nnet is a library without an OTP supervision tree, the best way to start is to ensure the tables are created:

1> nnet:start_tables().
ok

The function will not fail if the tables are already created and correct.
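Because the call is idempotent, it can simply be wrapped in your application's startup code. A minimal sketch (myapp_nn is a hypothetical module name, not part of nnet):

```erlang
%% Hypothetical bootstrap module: ensure the nnet tables exist before
%% the rest of the application uses them. Safe to call more than once,
%% since nnet:start_tables/0 succeeds if the tables already exist.
-module(myapp_nn).
-export([init/0]).

init() ->
    ok = nnet:start_tables(),
    ok.
```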

Create a network from a model

Creating a network from a model is easy: define a map where each layer is a key (inputs and outputs are mandatory) and the values describe:

  • "connections": The type of connection and the density between layers.
  • "units": The number of neurons in the layer.
  • "data": The information stored on each node of the layer.
2> M = #{inputs  => #{connections => #{
2>                    layer1  => sequential,
2>                    layer2  => {sequential,0.5}}, 
2>                   units => 2, data=>#{}},
2>       layer1  => #{connections => #{
2>                    layer2  => sequential,
2>                    outputs => {sequential,0.5}}, 
2>                   units => 4, data=>#{}},
2>       layer2  => #{connections => #{
2>                    layer2  => recurrent,
2>                    outputs => sequential}, 
2>                   units => 4, data=>#{}},
2>       outputs => #{connections => #{
2>                    inputs  => {recurrent, 0.5},
2>                    layer2  => recurrent}, 
2>                   units => 2, data=>#{}}}.
...
3> {atomic,Id} = mnesia:transaction(fun() -> nnet:from_model(M) end).
{atomic,{network,#Ref<0.1530882211.2808086529.180929>}}

Get the network nodes, inputs and outputs

Using the function nnet:to_map/1 you can retrieve the network data.

4> mnesia:transaction(fun() -> nnet:to_map(Id) end).
{atomic,#{{network,#Ref<0.1530882211.2808086529.180929>} =>
              [{{network,#Ref<0.1530882211.2808086529.180929>},
                {nnode,#Ref<0.1530882211.2808086529.180893>}},
               {{network,#Ref<0.1530882211.2808086529.180929>},
                {nnode,#Ref<0.1530882211.2808086529.180896>}}],
          {nnode,#Ref<0.1530882211.2808086529.180893>} =>
              [{{nnode,#Ref<0.1530882211.2808086529.180893>},
                {nnode,#Ref<0.1530882211.2808086529.180899>}},
               {{nnode,#Ref<0.1530882211.2808086529.180893>},
                {nnode,#Ref<0.1530882211.2808086529.180902>}},
               {{nnode,#Ref<0.1530882211.2808086529.180893>},
                {nnode,#Ref<0.1530882211.2808086529.180905>}},
               {{nnode,#Ref<0.1530882211.2808086529.180893>},
                {nnode,#Ref<0.1530882211.2808086529.180908>}},
               {{nnode,#Ref<0.1530882211.2808086529.180893>},
                {nnode,#Ref<0.1530882211.2808086529.180911>}},
               {{nnode,#Ref<0.1530882211.2808086529.180893>},
                {nnode,#Ref<0.1530882211.2808086529.180914>}}],
...
          {nnode,#Ref<0.1530882211.2808086529.180926>} =>
              [{{nnode,#Ref<0.1530882211.2808086529.180926>},
                {network,#Ref<0.1530882211.2808086529.180929>}}]}}

Note the network inputs are given by out(Network): Network -> NNode

Note the network outputs are given by in(Network): NNode -> Network
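Since the to_map/1 result is a map from each node to its outgoing links, the input nodes can be extracted with plain map operations. A minimal sketch, assuming Id is bound as in the example above and the map has the shape shown:

```erlang
%% Sketch: extract the input nnodes of a network from the nnet:to_map/1
%% result. The network key maps to a list of {Network, NNode} links,
%% so the input nodes are the targets of those links.
{atomic, Map} = mnesia:transaction(fun() -> nnet:to_map(Id) end),
InputNodes = [NNode || {_Network, NNode} <- maps:get(Id, Map)].
```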

Get a nnode inputs and outputs

After a network is created, you can modify it using the following functions:

%% NNode operations (run inside 'fun edit/1') 
-export([rnode/1, wnode/2, rlink/1, wlink/2]).
-export([out/1, out_seq/1, out_rcc/1, in/1, lx/1]).
%% Connections operations (run inside 'fun edit/1')
-export([connect/1, connect_seq/1, connect_rcc/1, disconnect/1]).
-export([move/2, reset/1]).
%% Network operations (run inside 'fun edit/1')
-export([copy/2, clone/2, divide/2, split/2, delete/2, join/2]).
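For example, a node's sequential out-links could be read inside a transaction. This is only a sketch; the exact argument shapes are documented in the specs in ./src/nnet.erl:

```erlang
%% Hypothetical sketch: list the sequential out-links of a nnode inside
%% a transaction. NNode is assumed to be one of the nodes returned by
%% nnet:to_map/1; check the specs in src/nnet.erl for the exact types.
{atomic, OutLinks} =
    mnesia:transaction(fun() -> nnet:out_seq(NNode) end).
```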

Visualization

There is a secret (not so secret) module "umlnn" which prints/formats the neural network in UML format (component diagram). You can use the output of that module together with PlantUML to display your network.

To display the sequential connections use the output from: umlnn:print_seq(Network_id)

Image of sequential connections

To display the recurrent connections use the output from: umlnn:print_rcc(Network_id)

Image of recurrent connections
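Continuing the shell session above (a usage sketch; the diagram is printed to standard output, and the text can be pasted into PlantUML):

```erlang
5> umlnn:print_seq(Id).
```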

More examples

Inside the module ./src/nnet.erl you will find the spec and comments for each function. Note that most of the functions are Mnesia operations and therefore should run inside a mnesia:transaction/1 context.

For examples of usage you can take a look at the test suite ./test/nnet_SUITE.erl.

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

Improvement ideas and requests

All ideas and suggestions are welcome.

License

This software is licensed under the GPL-3.0 license.
