
Minimalistic Multiple Layer Neural Network from Scratch in Python

  • Author: Umberto Griffo

Inspired by [1] and [2], I implemented a minimalistic multiple-layer neural network from scratch in Python. You can use it to better understand the core concepts of neural networks.

Software Environment

  • Python 3.0 - 3.5

Features

  • Backpropagation algorithm with stochastic gradient descent: during training, a single training example is used for each forward/backward pass.
  • Support for multiple hidden layers.
  • Classification (MultilayerNnClassifier.py).
  • Regression (MultilayerNnRegressor.py).
  • Activation functions: Linear, ReLU, Sigmoid, Tanh.
  • Classification evaluator: Accuracy.
  • Regression evaluators: Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Coefficient of Determination (R^2).
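
The listed activation functions and their derivatives (needed by backpropagation) can be sketched as below. This is a minimal illustration, not the repository's actual code; the function names are hypothetical:

```python
import math

# Sketch of the four supported activations and their derivatives.
# Backpropagation multiplies the error signal by the derivative of
# the activation at each neuron. Names here are illustrative only.

def linear(x):
    return x

def linear_derivative(x):
    return 1.0

def relu(x):
    return max(0.0, x)

def relu_derivative(x):
    return 1.0 if x > 0 else 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_derivative(x):
    return 1.0 - math.tanh(x) ** 2
```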

Demo

If you run Test.py, you will see the following textual menu:

Please enter one of following numbers: 
 0 - Classification on Seed Dataset
 1 - Classification on Wine Red Dataset
 2 - Classification on Pokemon Dataset
 3 - Regression on Wine White Dataset
 4 - Regression on Wine Red Dataset 

If you choose 2, a classification task is performed on the Pokemon dataset:

2
You entered 2
>epoch=0, lrate=0.100, error=0.396
>epoch=100, lrate=0.100, error=0.087
>epoch=200, lrate=0.100, error=0.083
>epoch=300, lrate=0.100, error=0.081
>epoch=400, lrate=0.100, error=0.081
>epoch=500, lrate=0.100, error=0.080
>accuracy=95.450
>epoch=0, lrate=0.100, error=0.353
>epoch=100, lrate=0.100, error=0.092
>epoch=200, lrate=0.100, error=0.085
>epoch=300, lrate=0.100, error=0.083
>epoch=400, lrate=0.100, error=0.082
>epoch=500, lrate=0.100, error=0.081
>accuracy=95.400
>epoch=0, lrate=0.100, error=0.415
>epoch=100, lrate=0.100, error=0.087
>epoch=200, lrate=0.100, error=0.083
>epoch=300, lrate=0.100, error=0.082
>epoch=400, lrate=0.100, error=0.081
>epoch=500, lrate=0.100, error=0.080
>accuracy=95.520
>epoch=0, lrate=0.100, error=0.401
>epoch=100, lrate=0.100, error=0.089
>epoch=200, lrate=0.100, error=0.084
>epoch=300, lrate=0.100, error=0.083
>epoch=400, lrate=0.100, error=0.082
>epoch=500, lrate=0.100, error=0.081
>accuracy=95.280
>epoch=0, lrate=0.100, error=0.395
>epoch=100, lrate=0.100, error=0.093
>epoch=200, lrate=0.100, error=0.087
>epoch=300, lrate=0.100, error=0.085
>epoch=400, lrate=0.100, error=0.084
>epoch=500, lrate=0.100, error=0.083
>accuracy=94.900
Scores: [95.45, 95.39999999999999, 95.52000000000001, 95.28, 94.89999999999999]
Mean Accuracy: 95.310%
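
The last two lines of the run above average the accuracies of the five cross-validation folds. The computation amounts to this trivial sketch (the variable names are illustrative, not the repository's code):

```python
# Per-fold accuracies as reported in the demo run above
scores = [95.45, 95.4, 95.52, 95.28, 94.9]

# Mean accuracy across the cross-validation folds
mean_accuracy = sum(scores) / len(scores)
print('Mean Accuracy: %.3f%%' % mean_accuracy)  # Mean Accuracy: 95.310%
```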

Possible Extensions:

  • Early stopping.
  • Experiment with different weight initialization techniques (such as small random numbers).
  • Batch Gradient Descent. Change the training procedure from online to batch gradient descent and update the weights only at the end of each epoch.
  • Mini-Batch Gradient Descent. More info here.
  • Momentum. More info here.
  • Annealing the learning rate. More info here.
  • Dropout Regularization, Batch Normalization. More info here.
  • Model Ensembles. More info here.
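
The mini-batch extension mentioned above changes only the training schedule: instead of updating the weights after every single example (the current stochastic version) or once per epoch (full batch), updates happen once per small batch. A hedged sketch of the batching itself, with hypothetical names:

```python
import random

def minibatches(data, batch_size):
    """Yield shuffled mini-batches of the given size (last one may be smaller)."""
    data = list(data)
    random.shuffle(data)  # reshuffling each epoch decorrelates the batches
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

# With mini-batch gradient descent, gradients would be accumulated over
# each batch and the weights updated once per batch.
for batch in minibatches(range(10), batch_size=4):
    print(len(batch))  # batch sizes: 4, 4, 2
```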

References:

  • [1] How to Implement the Backpropagation Algorithm From Scratch in Python.
  • [2] Implementing a Multiple Layer Neural Network from Scratch.
  • [3] Andrew Ng's lecture on Gradient Descent.
  • [4] Andrew Ng's lecture on the Backpropagation Algorithm.
  • [5] P. Cortez, A. Cerdeira, F. Almeida, T. Matos and J. Reis. Modeling wine preferences by data mining from physicochemical properties. Decision Support Systems, Elsevier, 47(4):547-553, 2009.
  • [6] Seeds Data Set.
