sferes2 / nn2
**Sferes2 module**: generic toolbox for evolving neural networks
In the test_dnn.cpp example you have these parameters:
struct parameters {
  // minimum value of parameters
  SFERES_CONST float min = -5.0f;
  // maximum value of parameters
  SFERES_CONST float max = 5.0f;
};
struct dnn {
  SFERES_CONST size_t nb_inputs = 4;
  SFERES_CONST size_t nb_outputs = 1;
  SFERES_CONST size_t min_nb_neurons = 4;
  SFERES_CONST size_t max_nb_neurons = 5;
  SFERES_CONST size_t min_nb_conns = 100;
  SFERES_CONST size_t max_nb_conns = 101;
  SFERES_CONST float max_weight = 2.0f;
  SFERES_CONST float max_bias = 2.0f;
  SFERES_CONST float m_rate_add_conn = 1.0f;
  SFERES_CONST float m_rate_del_conn = 1.0f;
  SFERES_CONST float m_rate_change_conn = 1.0f;
  SFERES_CONST float m_rate_add_neuron = 1.0f;
  SFERES_CONST float m_rate_del_neuron = 1.0f;
  SFERES_CONST int io_param_evolving = true;
  SFERES_CONST init_t init = random_topology;
};
Here, min and max set the bounds for the NN's weights. So what is max_weight for? I could not find it used anywhere in the code; is this parameter defunct? The same goes for io_param_evolving and max_bias.
Also, if max_bias is not used, how is the bias parameter of the activation function bounded?
Hi,
In the mentioned example, there are parameters controlling the number of neurons and connections (such as min_nb_neurons etc.). Unfortunately, after reading the implementation, I conclude they seem to have no effect under dnn::ff: the control flow never uses them. Am I missing something here, or is this just a mistake? If the latter, the same confusion propagates through the tests.
As a side question, are those parameters only meant as initialization ranges? That is, they don't actually limit the number of neurons or connections during evolution if the add/del rates are non-zero?
The recently added example_dnn_nsga2
causes two kinds of errors when compiling with g++:
../modules/nn2/example_dnn_nsga2.cpp: In member function 'void FitXOR<Params, Exact>::eval(Indiv&)':
../modules/nn2/example_dnn_nsga2.cpp:156:25: error: 'powf' is not a member of 'std'
fit -= std::powf(outf[0] - outputs[i], 2.0);
../modules/nn2/example_dnn_nsga2.cpp:216:1: required from here
/git/sferes2/sferes/ea/crowd.hpp:171:23: error: ... FitXOR, Params> >' has no member named 'data'
assert((*it)->data().size() == (*it2)->data().size());
I'm not sure this is an issue, but when I save a network both through boost serialization and through the show function (dot output), I get the same number of connections, inputs, and outputs, yet the weights differ.
For instance, in my dot file I can see:
i0 -> o0[label=" -1.32414"]
But I cannot find 1.32414 anywhere in the XML file. Why?
I've put together a minimal example in the file test_dnn_ff_io.cpp here: https://github.com/matthieu637/nn2/blob/09ca77d267f094348a80f0932c03eac77cdac1ec/test_dnn_ff_io.cpp
The activation function in AfSigmoidBias, defined in af.hpp, seems wrong. As you can see below, in the current implementation only the bias (trait<P>::single_value(this->_params)) is multiplied by lambda:
return 1.0 / (exp(-p + trait<P>::single_value(this->_params) * lambda) + 1);
However, I would assume that the sum of (-p + bias) should be multiplied by lambda. In code:
return 1.0 / (exp((-p + trait<P>::single_value(this->_params)) * lambda) + 1);
Also, since it is called a bias, shouldn't it be added to p before the negation? In its current form, the bias functions more like a threshold.