
sentdex / nnfs_book


Sample code from the Neural Networks from Scratch book.

Home Page: https://nnfs.io

License: MIT License

Python 100.00%
deep-learning machine-learning neural-networks python python3

nnfs_book's People

Contributors

sentdex



nnfs_book's Issues

Outputs not matching on newer versions of numpy.

With NumPy 1.21.0, the outputs do not match those printed in the book. I was able to rectify this by installing the version used in the first video of the companion series on your YouTube channel, 1.18.2. I am not sure whether you would want to add a note to the readme, update the printed outputs, or update the nnfs package to work with the newer version; a readme note would require the least maintenance in the future.
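One low-maintenance option along those lines would be a version check at the top of the example scripts. A minimal sketch (not something the repository currently does), using the 1.18.2 version reported above:

import numpy as np

# Per this issue, the book's printed outputs were generated with
# NumPy 1.18.2; newer releases such as 1.21.0 can print slightly
# different numbers.
if np.__version__ != '1.18.2':
    print(f"Note: the book's outputs were generated with NumPy 1.18.2; "
          f"this run uses {np.__version__}, so printed values may differ.")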

Ch6_final.py does not run

Ch6_final.py imports spiral_data but then calls vertical_data(), which raises a NameError. The import should instead be:

from nnfs.datasets import vertical_data
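For reference, a minimal sketch of the consistent import and call (vertical_data takes samples and classes, like the other nnfs datasets):

import nnfs
from nnfs.datasets import vertical_data

nnfs.init()  # seeds the RNG so results are reproducible
X, y = vertical_data(samples=100, classes=3)
print(X.shape, y.shape)  # (300, 2) (300,)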

Learning process visualization

You mention Matplotlib in some chapters, but I haven't seen you make use of it. It might be easier to understand the learning process if you added something to visualize it live, like you showed in the video. (Sorry for my poor English)
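A minimal sketch of what live visualization with Matplotlib's interactive mode could look like; the loss values below are placeholders standing in for a real training loop:

import matplotlib.pyplot as plt
import numpy as np

plt.ion()  # interactive mode: the figure updates while "training" runs
fig, ax = plt.subplots()
losses = []

for epoch in range(100):
    loss = 1.0 / (epoch + 1) + np.random.rand() * 0.05  # placeholder loss
    losses.append(loss)
    ax.clear()
    ax.plot(losses)
    ax.set_xlabel('epoch')
    ax.set_ylabel('loss')
    plt.pause(0.01)  # give the GUI event loop a chance to redraw

plt.ioff()
plt.show()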

Chapter 3 - no nnfs.datasets

I just started chapter 3, and the code will not execute because it cannot find nnfs.datasets. I have installed the nnfs library globally using pip3.
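For reference, the expected usage once the package is importable (spiral_data is the dataset chapter 3 uses); if this fails, the installed nnfs may be outdated, or pip3 may target a different interpreter than the one running the script:

import nnfs
from nnfs.datasets import spiral_data

nnfs.init()  # sets the random seed and the default float type
X, y = spiral_data(samples=100, classes=3)
print(X.shape, y.shape)  # (300, 2) (300,)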

Chapter 14 - Problem in regularization loss

Hi, in the final code we calculate the regularization loss and then sum it with the data loss. The problem, I think, is that we never add the total loss as the final output of the last layer and multiply the partial gradients with it; instead we just backpropagate the loss as if there were no regularization loss. (In summary, we didn't include the effect of the regularization loss in the forward pass.)
I think we should use

loss_activation.backward(loss, y)

instead of

loss_activation.backward(loss_activation.output, y).

Would be glad to receive your opinion.

Best Regards
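For context, a minimal sketch of how an L2 penalty's gradient can enter the backward pass directly through the parameter gradients, so that the loss activation's backward() still receives the model outputs rather than the scalar loss (all values below are placeholders):

import numpy as np

weights = np.array([[0.2, -0.5], [0.1, 0.3]])
lambda_l2 = 5e-4  # L2 regularization strength

# gradient of the data loss w.r.t. the weights (placeholder values)
dweights_data = np.array([[0.01, -0.02], [0.03, 0.00]])

# d/dw of lambda * sum(w**2) is 2 * lambda * w; it is added on top of
# the data-loss gradient, with no change to the backward() call
dweights = dweights_data + 2 * lambda_l2 * weights
print(dweights)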

Ch14_Final.py Output doesn't match up

The output in the NNFS book for Chapter 14, also included as a comment at the end of Ch14_Final.py, is as follows:

epoch: 10000, acc: 0.947, loss: 0.217 (data_loss: 0.157, reg_loss: 0.060), lr: 0.019900507413187767
validation, acc: 0.830, loss: 0.435

When I run this file on my machine, some of the values are slightly different:

epoch: 10000, acc: 0.947, loss: 0.217 (data_loss: 0.160, reg_loss: 0.057), lr: 0.019900507413187767
validation, acc: 0.837, loss: 0.509

I am running this in a Docker container with a requirements.txt file ensuring the versions align with those documented in the book:
Python 3.7.5
NumPy 1.15.0
Matplotlib 3.1.1

Is the output documented correctly?

Ch17 Data

What if I want to load my own data set?
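One possible approach, assuming your data is a CSV with feature columns followed by an integer class label (the filename below is hypothetical), is to load it into the same (X, y) array shapes the nnfs datasets produce:

import numpy as np

data = np.loadtxt('my_dataset.csv', delimiter=',')  # hypothetical file
X = data[:, :-1].astype(np.float32)  # feature columns
y = data[:, -1].astype(np.int64)     # class label in the last column
print(X.shape, y.shape)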

Ch 10 Optimizer SGD parameter wrong in the book

In the book (pp. 251, 271 & 276), learning_rate begins with a capital "L" in the class definition, but it doesn't in this code, and it shouldn't.

def __init__(self, learning_rate=1., decay=0., momentum=0.):
    self.learning_rate = learning_rate

CH03_p61_inputs

Thanks for the great book; just a very minor remark. In

inputs = [[1, 2, 3, 2.5],
          [2., 5., -1., 2],  # <----- mixed formatting
          [-1.5, 2.7, 3.3, -0.8]]

the second row could be written with explicit zeroes for consistency:

inputs = [[1, 2, 3, 2.5],
          [2.0, 5.0, -1.0, 2.0],
          [-1.5, 2.7, 3.3, -0.8]]

Chapter 10 Adagrad text.

In chapter 10, after the square-root example, the text reads:

Overall, the impact is the learning rates for parameters with smaller gradients are decreased slowly, while the parameters with larger gradients have their learning rates decreased faster

I am confused by this: we are not updating the learning rate anywhere (other than the rate decay). Yes, the weights of parameters with bigger gradients will be updated faster, but much more slowly than they would be if no normalization were used.

Am I correct, or am I misunderstanding it?
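For what it's worth, the "learning rate" in that sentence is the effective per-parameter step scale, not the learning_rate attribute itself, which never changes. A minimal sketch of the AdaGrad update with placeholder values:

import numpy as np

learning_rate = 1.0
epsilon = 1e-7

# two parameters with different gradient histories (placeholder values)
cache = np.array([0.01, 4.0])    # accumulated squared gradients so far
gradient = np.array([0.5, 0.5])  # identical current gradients

cache += gradient ** 2  # accumulate squared gradients
step = learning_rate * gradient / (np.sqrt(cache) + epsilon)
print(step)  # the parameter with the larger gradient history takes a
             # much smaller step: its effective learning rate has been
             # decreased more, even though learning_rate is untouched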
