codeplea / genann
1.9K stars · 86 watchers · 223 forks · 75 KB

simple neural network library in ANSI C

Home Page: https://codeplea.com/genann

License: zlib License

Languages: C 98.20%, Makefile 1.80%
backpropagation genetic-algorithm artificial-neural-networks ann neurons hidden-layers neural-network neural-networks neural ansi

genann's People

Contributors

amboar, codeplea, crclark96, dickby, jflopezfernandez, timgates42


genann's Issues

Different results

Hi,

I am using genann in a project. With the same training data, I get a different result after each run. Is there a way to get roughly the same result each time?

input data

Should the input data be in the range 0 to 1, or -1 to +1?

Non-linear regression

I'm not skilled with machine learning and I'm trying to study its practical applications.
I'm trying to use an ANN for non-linear regression of the function sin(x) from analytical points, and I have run into some problems.

Creating an ANN with 10 hidden layers of 4 neurons each, and running a training loop 1 million times over 100 points with x values evenly distributed between 0 and 2*pi, I obtain this:

[plot: network output failing to fit sin(x)]

The ANN was created with this setting: ann->activation_output = genann_act_linear;
and the output doesn't change when I try different ANN configurations.

Can someone help me understand where I'm going wrong (also theoretically)?
I've seen that the sigmoid activation function is not the best choice for this purpose compared to relu, but is that the whole story?

A stack buffer overflow has been found.

A stack buffer overflow has been found in genann.c:299:

=================================================================
==12375==ERROR: AddressSanitizer: stack-buffer-overflow on address 0x7ffd0e981ce0 at pc 0x0000004081b4 bp 0x7ffd0e981b50 sp 0x7ffd0e981b40
READ of size 8 at 0x7ffd0e981ce0 thread T0
    #0 0x4081b3 in genann_train /home/mfc_fuzz/genann/genann.c:299
    #1 0x40147c in main /home/mfc_fuzz/genann/example1.c:36
    #2 0x7f823226182f in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x2082f)
    #3 0x4018a8 in _start (/home/mfc_fuzz/genann/example1+0x4018a8)

Address 0x7ffd0e981ce0 is located in stack of thread T0 at offset 160 in frame
    #0 0x40110f in main /home/mfc_fuzz/genann/example1.c:5

  This frame has 3 object(s):
    [64, 80) 'a'
    [128, 160) 'output' <== Memory access at offset 160 overflows this variable
    [192, 256) 'input'
HINT: this may be a false positive if your program uses some custom stack unwind mechanism or swapcontext
      (longjmp and C++ exceptions *are* supported)
SUMMARY: AddressSanitizer: stack-buffer-overflow /home/mfc_fuzz/genann/genann.c:299 genann_train
Shadow bytes around the buggy address:
  0x100021d28340: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d28350: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d28360: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d28370: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d28380: 00 00 00 00 00 00 00 00 f1 f1 f1 f1 f1 f1 f1 f1
=>0x100021d28390: 00 00 f4 f4 f2 f2 f2 f2 00 00 00 00[f2]f2 f2 f2
  0x100021d283a0: 00 00 00 00 00 00 00 00 f3 f3 f3 f3 f3 f3 f3 f3
  0x100021d283b0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d283c0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d283d0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  0x100021d283e0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
Shadow byte legend (one shadow byte represents 8 application bytes):
  Addressable:           00
  Partially addressable: 01 02 03 04 05 06 07 
  Heap left redzone:       fa
  Heap right redzone:      fb
  Freed heap region:       fd
  Stack left redzone:      f1
  Stack mid redzone:       f2
  Stack right redzone:     f3
  Stack partial redzone:   f4
  Stack after return:      f5
  Stack use after scope:   f8
  Global redzone:          f9
  Global init order:       f6
  Poisoned by user:        f7
  Container overflow:      fc
  Array cookie:            ac
  Intra object redzone:    bb
  ASan internal:           fe
==12375==ABORTING

The program I ran was example1, but I made some changes to that file.
My modified example1 is at: https://github.com/fCorleone/fuzz_programs/blob/master/genann/example1.c
The input file is here:
https://github.com/fCorleone/fuzz_programs/blob/master/genann/testcase

Implement relu

Hello,

I started to implement the relu function for the genann library on a fork under my name (https://github.com/kasey-/genann) before sending you a PR:

double inline genann_act_relu(const struct genann *ann unused, double a) {
    return (a > 0.0) ? a : 0.0;
}

But I am a bit lost in how you compute the backpropagation of the neural network. The derivative of relu is trivial: (a > 0.0) ? 1.0 : 0.0. But I cannot see where I should plug it into your formula, as I do not understand how you compute your backpropagation. Did you implement only the derivative of the sigmoid?

Fails to learn a simple example

I like your library, but unfortunately it doesn't seem to learn the simple examples I've tried. See y = x*x below. I'm sorry I can't help with a fix. Let me know if I've done something wrong. In this example, genann_run() returns 1 for any input instead of something close to the square.

#include <stdio.h>
#include <stdlib.h>
#include "genann.h"

int main(int argc, char *argv[])
{
    printf("GENANN example x^2.\n");
    printf("Train a small ANN to the x^2 function using backpropagation.\n");

    /* New network with 1 input,
     * 1 hidden layer of 2 neurons,
     * and 1 output. */
    genann *ann = genann_init(1, 1, 2, 1);

    /* Train. */
    for (int i = 0; i < 10000; ++i) {
        double x = rand() % 10;
        double y = x * x;
        printf("Train with %.0f -> %.0f\n", x, y);
        genann_train(ann, &x, &y, .1);
    }

    /* Run the network and see what it predicts. */
    for (double x = 0; x < 10; x += .5) {
        printf("%.1f^2 -> %.0f\n", x, *genann_run(ann, &x));
    }

    genann_free(ann);
    return 0;
}

Genann test suite error

After building with make, I ran the test program with the following output:

GENANN TEST SUITE
	basic         pass: 7   fail: 0      0ms
	xor           pass: 5   fail: 0      0ms
	backprop      pass: 1   fail: 0      0ms
	train and     pass: 4   fail: 0      0ms
	train or      pass: 4   fail: 0      0ms
	train xor     test.c:189 (0.000000 != 1.000000)
pass: 3   fail: 1      0ms
	persist       pass:60765   fail: 0     93ms
	copy          pass:60765   fail: 0      0ms
	sigmoid       pass:400001   fail: 0      5ms
SOME TESTS FAILED (521555/521556)

Any clues as to why this is?

Prediction

Hi, I just started working with genann and I've been trying to fix this issue for days.
This is my first training run:
[image: training output closely tracking the target]

Seems to work pretty well.
If I save the net and use it for a prediction with nearly the same input, this is the output:
[image: prediction diverging from the target]

The purple graph is the target and the green one is the output of my NN.

DOS port

Without reinventing from scratch, is an MS C 5.1 build of genann available for DOS coders?
thx
eltoneo

There is no srand in the examples

Hi, first of all, thank you for this simple and interesting software!
I get a problem like in issue 2: example1 doesn't work right (but the tests pass):

$ ./example1
GENANN example 1.
Train a small ANN to the XOR function using backpropagation.
Output for [0, 0] is 0.
Output for [0, 1] is 1.
Output for [1, 0] is 1.
Output for [1, 1] is 1.

This result is always the same and depends only on the training duration.
I found that the srand function is never used in the examples or the library:

$ ack-grep srand
test.c
261:    srand(100)

In test.c, srand has a constant argument, so the neural network is always initialized with the same data.
The C library function void srand(unsigned int seed) seeds the random number generator used by rand.
I think srand should be included in every example:

#include <time.h>
...
srand(time(NULL));

Do the algorithms used by default work on complex problems?

Forgive me for this very beginner question, but I noticed when reading about neural networks there are a lot of different training approaches and a lot of different signal activation types used, with apparently ReLU used a lot. Not that I would know, I know C but not that much about neural networks.

Anyway, I naively tried to use genann on a word classification problem. Three sets of English words: 1000 words in category A, 1000 in category B, and 900ish in category C. I truncated all words to 10 characters and used 10 inputs mapped to the ASCII values (so a range of roughly 32.0 to 127.0, with the exact value being the respective letter), with shorter words having the remaining inputs set to 0.0. I used two outputs, with the three categories encoded as 1.0 1.0, 0.0 1.0, and 0.5 1.0.

No matter how I did this, though, I couldn't get even the words in the original training set to come back with an approximately correct categorization. It just doesn't work at all. I tried 300 and more training repetitions; I tried 10 layers and 50 neurons (which I'm guessing could be too few to map it all, but shouldn't it still do better than returning nonsense on almost everything in the training set?); nothing. Is there an approximate rule for how many neurons or layers I would need to map this? Is that the culprit?

Or is this a more fundamental issue beyond just the parameter choice? Like, is a simple training loop no longer doing this justice? Is there some limitation of this library, like the sigmoid activation function genann defaults to may not be capable of this? Should I be using a different way of input mapping of text entirely? Sorry again for this being such a beginner question.

Issue with changing activation functions

I was wondering how to change the default sigmoid activation function to something else. I've tried changing it to tanh and it's not working. I've also tried using the linear activation function on the given examples, and it fails those as well.

Excellent code ever!

Your work is totally awesome! The code seems really efficient! I would just like to ask a question about the training weights associated with each layer. Suppose the numbers of inputs, hidden neurons (with 1 hidden layer), and outputs are 2, 3, and 4 respectively. As I understand it, there should be 2x3 + 3x4 weights in total in this case; however, your code actually sets the number of weights to (2+1)*3 + (3+1)*4. I tried to understand why by writing the code from scratch, but I wasn't able to find the answer.
I'd really appreciate your answer :)

Cross Entropy?

Looking through the code, I'm not sure if it's using this. If not, why not?

Error when trying to compile with G++ + bionic

The following error appears when trying to compile with G++ + bionic:
/genann-master/genann.c: In function 'genann* genann_init(int, int, int, int)':
/genann-master/genann.c:123:25: error: invalid conversion from 'void*' to 'genann*' [-fpermissive]
compilation terminated due to -Wfatal-errors.

genann_run() always return constant value

I trained a very simple stock price predictor.
The inputs and outputs that go into genann_train() seem correct,
but as soon as training is completed, a call to genann_run with any set of inputs always produces the same, constant value.

const auto layers = 40;
const auto neurons = 40;
const auto N = 100;
genann* ann = genann_init(5, layers, neurons, 1);

... 

for (auto& p: inputs) {
    const double o = outputs[i_outputs] / maxValue[targetFilename];
    genann_train(ann, p.second.data(), &o, 0.01);
    i_outputs++;
}

... 

for (... test inputs ...) {
   const auto prediction_raw = *genann_run(ann, input);
   LOG(INFO) << "prediction_raw: " << prediction_raw;
}

the prediction_raw value will be the same, regardless of the input.

OCR example?

How can I present the pixel data to genann so that it can tell me whether an image is a specific letter? Could you give me an example?

example1 not enough training.

with a value of 300 in the training loop I see this output:

Output for [0, 0] is 0.
Output for [0, 1] is 1.
Output for [1, 0] is 1.
Output for [1, 1] is 1.

changing the loop to 350 gives:

Output for [0, 0] is 0.
Output for [0, 1] is 1.
Output for [1, 0] is 1.
Output for [1, 1] is 0.

Was this done on purpose to show some kind of limitation of back propagation ?

Anything about C++?

I am trying to use a simple NN library in a C++ program, but genann is for C, not C++.
Can you please make a C++ version or recommend something similar?

possible memleak in example4.c?

Hi, I really love this project and was reading through everything because it is so well written and easy to understand.

I noticed in example 4 the possibility of a memory leak from the double *input and double *class variables, since they are allocated with malloc on lines 35 and 36 but are not subsequently freed after the test completes.

Please let me know if I have overlooked something, but I believe this is correct.

Feature Extraction

In example4.c there is an example (l. 96) where features are extracted column by column from a file in the genann_train function. Is there a way to extract only selected features? Does anyone have ideas or suggestions on how to implement that?

link issue using msvc

Great work on this, just wanted to make note of a minor issue when using MSVC(compiler v19, linker v14) to compile and link w/ the test file for windows. (gcc on linux worked fine.)

It seems the 'inline' definitions are causing unresolved external symbol errors (e.g. genann_act_threshold). I messed around w/ different optimization switches (/GL, /LTCG, /Ob{0|1|2} etc.) but it doesn't seem to help. Making the declarations explicitly 'extern' fixes it, but I'm not sure if that will cause issues elsewhere, or make the 'inline' superfluous.

Or maybe I'm missing something obvious?

Why rand() has no seed?

I am sure there must be a good reason why srand is not used to seed rand(). What is the reason?

Basically, without using srand, untrained neural networks will always output the same value given the same input.

Beautiful work

Not an issue; I just wanted to commend you for your excellent contribution. It's hard to find stuff as small and powerful as this.

Well done.

Change type double to int16_t

Hi,

I want to change all data types from double to int16_t. In genann.h I add:

typedef int16_t ann_t;

Then I change every occurrence of double to ann_t, and modify example1.c:

const ann_t input[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
const ann_t output[4] = {0, 1, 1, 0};

I also use %hd for printf.

And then, compiling example1.c, I get the output:

$ ./example1
GENANN example 1.
Train a small ANN to the XOR function using backpropagation.
Output for [0, 0] is 0.
Output for [0, 1] is 0.
Output for [1, 0] is 0.
Output for [1, 1] is 0.

What can I do to get the provided examples to work correctly with the new type?

Is it possible to generate source code?

I need a function in C:
output = answer(data)

Can you write me this functionality? When I train my network, I would like to use it in my program.

-nan Assertion Error

When the activation_output and activation_hidden functions are set to sigmoid_cached, every once in a while an assertion error pops up with -nan.
P.S. I'm quite new to neural networks and was using your code to learn.
