Neuroevolution in F#
License: GNU Affero General Public License v3.0
On my machine, the exporter tests take the longest. This is probably due to the intensive record generation that needs to be done, which is unnecessary: the data is merely duplicated into a different structure.
If the node record is a part of the neuron instance, then the conversion to node records will be considerably sped up by removing the need to crawl neurons for record instantiation. Instead, the neurons will be crawled and node records requested from the neurons.
InfoLog is a rather ugly hack at the moment. Passing this function via the configuration records will allow NeuralFish to adapt to other systems.
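As a sketch of the idea, assuming a hypothetical configuration record (none of these names are NeuralFish's actual API):

```fsharp
// Sketch: passing the logging function in via a configuration record
// instead of hard-coding InfoLog. Record and field names are hypothetical.
type NeuralFishConfiguration =
  { InfoLog : string -> unit }

// Default behavior: write to stdout.
let defaultConfiguration =
  { InfoLog = fun message -> printfn "%s" message }

// A host system can adapt NeuralFish to its own logging sink:
let silentConfiguration = { InfoLog = ignore }
```

A host application would then supply its own sink (file, test collector, structured logger) without touching NeuralFish internals.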
Need to optimize the RemoveSensorLink mutation
Large neural networks reveal this bottleneck
Some of the neuron instance operations can be made async, which should speed up neural network response times. Any operation that does not modify the loop parameters (and is not depended on to modify them) should be eligible for this optimization.
Experiment with care
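A minimal illustration of the pattern, with hypothetical names: an operation that only reads state can be fired off asynchronously so the message loop continues immediately.

```fsharp
// Sketch: offloading an operation that neither modifies the loop
// parameters nor is depended on to modify them. Names are illustrative.
let handleReadOnlyOperation (reportOutput : float -> unit) (output : float) =
  // Fire-and-forget: the neuron's message loop is free to continue.
  async { reportOutput output }
  |> Async.Start
```

This is only safe when the offloaded work cannot race with the loop's own state updates, hence "experiment with care".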
Exporter creates nodes synchronously, which is a potential bottleneck.
The current structure of the neuron barrier (which simulates the axon hillock) does not support recurrent connections, because neurons do not know the layers of their inputs.
Adding awareness of which layer a neuron is in, and which layer a connection comes from, will allow recurrent connections to be handled.
Remove this to use initial weights instead
Synchronization is currently a message sent to all nodes in the NN but only processed by the sensors. Collecting the sensors during construction and passing that data to the cortex should reduce initial network activity by (totalNodes - numberOfSensors) messages.
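A sketch of the collection step, with hypothetical types; the cortex would then address Sync messages to these ids only:

```fsharp
// Sketch: collecting sensor ids once, at construction time, so the cortex
// can send Sync to sensors only. Types and names here are hypothetical.
type NodeType = Sensor | Neuron | Actuator

let collectSensorIds (nodes : (int * NodeType) list) =
  nodes
  |> List.filter (fun (_, nodeType) -> nodeType = Sensor)
  |> List.map fst
```

Synchronizing over only these ids saves (totalNodes - numberOfSensors) messages per cycle.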
The original implementation of NodeRecordConnections reflected the fact that weights were tied to outbound connections in neuron instances. This was changed to accommodate neural plasticity (and to reflect the proper implementation of a neural net). NodeRecordConnections should be changed so that weights are tied to the node record with the inbound connection, not the outbound one.
Issue #3 needs a test
Evolution should have a thinking phase dedicated to fine-tuning weights. This should allow new neurons, actuators, or sensors to become favorable during the artificial selection process.
Everything is logged via infoLog. Having levels of logging is useful for filtering noise.
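One possible shape for this, layered on top of the existing infoLog (the LogLevel cases and threshold mechanism here are hypothetical):

```fsharp
// Sketch: adding log levels on top of infoLog for filtering noise.
type LogLevel =
  | Debug
  | Info
  | Warning
  | Error

let levelRank =
  function
  | Debug -> 0
  | Info -> 1
  | Warning -> 2
  | Error -> 3

// Wrap an existing infoLog sink with a minimum-level filter.
let filteredLog (minimumLevel : LogLevel) (infoLog : string -> unit) =
  fun (level : LogLevel) (message : string) ->
    if levelRank level >= levelRank minimumLevel then
      infoLog (sprintf "[%A] %s" level message)
```

Existing call sites would gain a level argument; everything below the configured threshold is dropped before it reaches the sink.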
ConnectionIds are GUIDs but should be integer-based for a smaller memory footprint.
Sometimes the xUnit tests will hang for about 20 seconds; if two tests hang, the run takes 40 seconds longer. I spent a little time trying to work through the problem and identify a pattern. Initially, killNeuralNetwork seemed to be the problem, but after removing it entirely the problem remained.
It seems to have something to do with the neural network itself. Further investigation is required.
I recently switched to NixOS and tried to run NeuralFish today. Below is the Mono stack trace generated when running NeuralFish in F# Interactive.
Version
Xbuild: 14.0
Mono: 4.6.0
Fsharp: 4.0.1
* Assertion at sgen-alloc.c:460, condition `*p == NULL' not met
Stacktrace:
at <unknown> <0xffffffff>
at (wrapper managed-to-native) object.__icall_wrapper_mono_object_new_specific (intptr) <0xffffffff>
at <StartupCode$FSI_0002>.$FSI_0002.main@ () <0x0006b>
at (wrapper runtime-invoke) object.runtime_invoke_void (object,intptr,intptr,intptr) <0xffffffff>
at <unknown> <0xffffffff>
at (wrapper managed-to-native) System.Reflection.MonoMethod.InternalInvoke (System.Reflection.MonoMethod,object,object[],System.Exception&) <0xffffffff>
at System.Reflection.MonoMethod.Invoke (object,System.Reflection.BindingFlags,System.Reflection.Binder,object[],System.Globalization.CultureInfo) <0x000d7>
at System.MonoType.InvokeMember (string,System.Reflection.BindingFlags,System.Reflection.Binder,object,object[],System.Reflection.ParameterModifier[],System.Globalization.CultureInfo,string[]) <0x0043d>
at System.Reflection.Emit.TypeBuilder.InvokeMember (string,System.Reflection.BindingFlags,System.Reflection.Binder,object,object[],System.Reflection.ParameterModifier[],System.Globalization.CultureInfo,string[]) <0x00069>
at System.Type.InvokeMember (string,System.Reflection.BindingFlags,System.Reflection.Binder,object,object[],System.Globalization.CultureInfo) <0x0005c>
at Microsoft.FSharp.Compiler.AbstractIL.ILRuntimeWriter/[email protected] (Microsoft.FSharp.Core.Unit) <0x00093>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Core.Unit, Microsoft.FSharp.Core.FSharpOption`1<System.Exception>>) <0x00022>
at Microsoft.FSharp.Primitives.Basics.List.iter<T> (Microsoft.FSharp.Core.FSharpFunc`2<T, Microsoft.FSharp.Core.Unit>,Microsoft.FSharp.Collections.FSharpList`1<T>) <0x00049>
at Microsoft.FSharp.Collections.ListModule.Iterate<T> (Microsoft.FSharp.Core.FSharpFunc`2<T, Microsoft.FSharp.Core.Unit>,Microsoft.FSharp.Collections.FSharpList`1<T>) <0x0002f>
at Microsoft.FSharp.Compiler.Interactive.Shell.arg10@888 (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompiler,Microsoft.FSharp.Collections.FSharpList`1<Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Core.Unit, Microsoft.FSharp.Core.FSharpOption`1<System.Exception>>>,Microsoft.FSharp.Core.Unit) <0x00067>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompiler.ProcessInputs (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,Microsoft.FSharp.Collections.FSharpList`1<Microsoft.FSharp.Compiler.Ast/ParsedInput>,bool,bool,bool,Microsoft.FSharp.Collections.FSharpList`1<Microsoft.FSharp.Compiler.Ast/Ident>) <0x0072b>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompiler.EvalParsedDefinitions (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,bool,bool,Microsoft.FSharp.Collections.FSharpList`1<Microsoft.FSharp.Compiler.Ast/SynModuleDecl>) <0x001df>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x0179b>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.InteractiveCatch<a> (Microsoft.FSharp.Core.FSharpFunc`2<a, System.Tuple`2<a, Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionStepStatus>>,a) <0x00050>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.ExecInteraction (bool,Microsoft.FSharp.Compiler.CompileOps/TcConfig,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,Microsoft.FSharp.Compiler.Ast/ParsedFsiInteraction) <0x0006b>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.ExecInteractions (bool,Microsoft.FSharp.Compiler.CompileOps/TcConfig,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,Microsoft.FSharp.Core.FSharpOption`1<Microsoft.FSharp.Compiler.Ast/ParsedFsiInteraction>) <0x009b7>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.MainThreadProcessParsedInteraction (bool,Microsoft.FSharp.Core.FSharpOption`1<Microsoft.FSharp.Compiler.Ast/ParsedFsiInteraction>,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x00197>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x0002f>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState, System.Tuple`2<Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState, Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionStepStatus>>,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x0001d>
at Microsoft.FSharp.Core.OptimizedClosures/[email protected] (T2) <0x00029>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x001c8>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.InteractiveCatch<a> (Microsoft.FSharp.Core.FSharpFunc`2<a, System.Tuple`2<a, Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionStepStatus>>,a) <0x00050>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.ParseAndProcessAndEvalOneInteractionFromLexbuf (bool,Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState, System.Tuple`2<Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState, Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionStepStatus>>, Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState, System.Tuple`2<Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState, Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionStepStatus>>>,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,Microsoft.FSharp.Compiler.LexFilter/LexFilter) <0x00163>
at Microsoft.FSharp.Compiler.Interactive.Shell.run@1835 (Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor,bool,Microsoft.FSharp.Compiler.LexFilter/LexFilter,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x00067>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Core.Unit) <0x00047>
at Microsoft.FSharp.Compiler.Interactive.Shell.WithImplicitHome<a> (Microsoft.FSharp.Compiler.CompileOps/TcConfigBuilder,string,Microsoft.FSharp.Core.FSharpFunc`2<Microsoft.FSharp.Core.Unit, a>) <0x0004a>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.EvalIncludedScript (bool,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,string,Microsoft.FSharp.Compiler.Range/range) <0x000b7>
at Microsoft.FSharp.Compiler.Interactive.Shell/[email protected] (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x00043>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.InteractiveCatch<a> (Microsoft.FSharp.Core.FSharpFunc`2<a, System.Tuple`2<a, Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionStepStatus>>,a) <0x00050>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.EvalIncludedScripts (Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,bool,Microsoft.FSharp.Collections.FSharpList`1<string>) <0x0008b>
at Microsoft.FSharp.Compiler.Interactive.Shell.consume@1865 (Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor,bool,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState,Microsoft.FSharp.Collections.FSharpList`1<System.Tuple`2<string, bool>>) <0x000eb>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiInteractionProcessor.LoadInitialFiles (bool,Microsoft.FSharp.Compiler.Interactive.Shell/FsiDynamicCompilerState) <0x0002f>
at Microsoft.FSharp.Compiler.Interactive.Shell/FsiEvaluationSession.Run () <0x00e57>
at Microsoft.FSharp.Compiler.Interactive.Shell.evaluateSession@2420 (string[],Microsoft.FSharp.Core.Unit) <0x00093>
at Microsoft.FSharp.Compiler.Interactive.Shell.MainMain (string[]) <0x0013b>
at Microsoft.FSharp.Compiler.Interactive.Main.FsiMain (string[]) <0x0000f>
at (wrapper runtime-invoke) <Module>.runtime_invoke_int_object (object,intptr,intptr,intptr) <0xffffffff>
Native stacktrace:
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono() [0x4b790a]
/nix/store/jm1n87rp8vr90j9ahcrfzr57nc2r8vgf-glibc-2.24/lib/libpthread.so.0(+0x10e90) [0x7f7ae5a0ae90]
/nix/store/jm1n87rp8vr90j9ahcrfzr57nc2r8vgf-glibc-2.24/lib/libc.so.6(gsignal+0x104) [0x7f7ae54781d4]
/nix/store/jm1n87rp8vr90j9ahcrfzr57nc2r8vgf-glibc-2.24/lib/libc.so.6(abort+0x16a) [0x7f7ae547963a]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono() [0x63dc79]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono() [0x63df0c]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono() [0x63e0a3]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono() [0x5f9151]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono() [0x5f978f]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono(mono_object_new_alloc_specific+0x14) [0x5b7b04]
/nix/store/g2qv8fwwzwabg1xvl8c3wnfm3jpmkhqd-mono-4.0.4.1/bin/mono(mono_object_new_specific+0x98) [0x5b7bf8]
[0x41b944b2]
Debug info from gdb:
=================================================================
Got a SIGABRT while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries
used by your application.
=================================================================
Aborted
The code currently works but the map and sequence iterations need optimization.
To test whether this issue has been resolved, performance benchmarks will need to be generated.
The idea of Dropout in neural networks could potentially be adapted to NeuralFish. Need to experiment.
Cortex registration is currently done in a separate step, specifically when the live neurons are attached to a cortex. Registering a cortex at neuron creation will remove the need for the additional step. Also, the neuron wouldn't have to send out recurrent connections, since those are controlled by the cortex.
Potential optimization. Need metrics.
Early in development, I noticed that the tests take ~20 seconds to execute (the longest test is 10 seconds).
Running the tests on Windows took only 0.5 seconds.
This issue needs further investigation.
I've added the minimal number of mutations needed for a functioning, diverse neural network. There are still more to be added.
1. add_bias:
Choose a random neuron A, check if it has a bias in its weights list, if it does
not, add the bias value. If the neuron already has a bias value, do nothing.
2. remove_bias:
Choose a random neuron A, check if it has a bias in its weights list, if it does,
remove it. If the neuron does not have a bias value, do nothing.
3. mutate_weights:
Choose a random neuron A, and perturb each weight in its weight list with a
probability of 1/sqrt(length(weights)), with the perturbation intensity randomly
chosen between -Pi/2 and Pi/2.
4. reset_weights:
Choose a random neuron A, and reset all its synaptic weights to random values
ranging between -Pi/2 and Pi/2.
5. mutate_af:
Choose a random neuron A, and change its activation function to a new random
activation function chosen from the af_list in the constraint record belonging to
the NN.
6. add_inlink:
Choose a random neuron A, and an element B, and then add a connection from
element B (possibly an existing sensor) to neuron A.
7. add_outlink:
Choose a random neuron A, and an element B, and then add a connection from
neuron A to element B (possibly an existing actuator). The difference between
this mutation operator and the add_inlink mutation operator, is that in one we
choose a random neuron and then choose a random element from which we
make a connection to the chosen neuron. While in the other we choose a random
neuron, and then choose a random element to which the neuron makes a
connection. The first (add_inlink) is capable of making links to sensors, while
the second (add_outlink) is capable of potentially making links to actuators.
8. add_neuron:
Create a new neuron A, giving it a unique id and positioning it in a randomly
selected layer of the NN. Then give the neuron A a randomly chosen activation
function. Then choose a random neuron B in the NN and connect neuron A’s
inport to the neuron B’s outport. Then choose a random neuron C in the NN
and connect neuron A’s outport to the neuron C’s inport.
9. splice: There are 2 versions of this mutation operator, outsplice, and insplice:
– outsplice: Create a new neuron A with a random activation function. Then
choose a random neuron B in the NN. Randomly select neuron B’s outport
leading to some element C’s (neuron or actuator) inport. Then disconnect
neuron B from element C, and reconnect them through the newly created
neuron A.
– insplice: Create a new neuron A with a random activation function. Then
choose a random neuron B in the NN. Randomly select neuron B’s inport
from some element C’s (neuron or sensor) outport. Then disconnect neuron
B from element C, and reconnect them through the newly created neuron
A. The reason for having an outsplice and an insplice, is that the
outsplice can insert a new neuron between some random element and an
actuator, while the insplice can insert a new neuron between an element
and a sensor.
10. add_sensorlink:
Compared to the number of neurons, there are very few sensors, and so the
probability of the add_inlink connecting a neuron to a sensor is very low. To
increase the probability that the NN connects to a sensor, we can create the
add_sensorlink mutation operator. This mutation operator first chooses a random
existing sensor A, it then chooses a random neuron B to which A is not yet
connected, and then connects A to B.
11. add_actuatorlink:
As in add_sensorlink, when compared to the number of neurons, there are very
few actuators, and so the probability of the add_outlink connecting a neuron to
an actuator is very low. Thus, we can implement the add_actuatorlink to increase
the probability of connecting a neuron to an actuator. In this mutation
operator, first a random actuator A is chosen which is connected to fewer neurons
than its vl element dictates (an incompletely connected actuator). Then a random
neuron B is chosen to which the actuator is not yet connected. Then A is
connected from B.
12. remove_sensorlink:
First a random sensor A is chosen. From the sensor’s fanout_ids list, a random
neuron id is chosen, and then the sensor is disconnected from the corresponding
neuron.
13. remove_actuatorlink:
First a random actuator A is chosen. From the actuator’s fanin_ids list, a random
neuron id is chosen, and then the actuator is disconnected from the corresponding
neuron.
14. add_sensor:
Choose a random sensor from the sensor list belonging to the NN’s morphology,
but which is not yet used. Then connect the sensor to a random neuron A in
the NN, thus adding a new sensory organ to the NN system.
15. add_actuator:
Choose a random actuator from the actuator list belonging to the NN’s morphology,
but which is not yet used. Then connect a random neuron A in the NN
to this actuator, thus adding a new morphological feature to the NN that can be
used to interact with the world.
16. remove_inlink:
Choose a random neuron A, and disconnect it from a randomly chosen element
in its input_idps list.
17. remove_outlink:
Choose a random neuron A, and disconnect it from a randomly chosen element
in its output_ids list.
18. remove_neuron:
Choose a random neuron A in the NN, and remove it from the topology. Then
fix the presynaptic neuron B’s and postsynaptic neuron C’s outports and inports
respectively to accommodate the removal of the connection with neuron A.
19. desplice: There are 2 versions of this operator, deoutsplice, and deinsplice:
– deoutsplice: Choose a random neuron B in the NN, such that B’s outport
is connected to an element (neuron or actuator) C through some neuron A.
Then delete neuron A and reconnect neuron B and element C directly.
– deinsplice: Choose a random neuron B in the NN, such that B’s inport is
connected to by an element (neuron or sensor) C through some neuron A.
Then delete neuron A and connect neuron B and element C directly.
20. remove_sensor:
If a NN has more than one sensor, choose a random sensor belonging to the
NN, and remove it by first disconnecting it from the neurons it is connected to,
and then removing the tuple representing it from the genotype altogether.
21. remove_actuator:
If a NN has more than one actuator, choose a random actuator belonging to the
NN, and remove it by first disconnecting it from the neurons it is connected
from, and then removing the tuple representing it from the genotype altogether.
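As an illustration, the mutate_weights operator (number 3 above) could be sketched in F# like this; the function and value names are illustrative, not NeuralFish's actual API:

```fsharp
// Sketch of mutate_weights: perturb each weight with probability
// 1/sqrt(length weights), with an intensity drawn uniformly from
// [-pi/2, pi/2].
let random = System.Random()

let mutateWeights (weights : float list) =
  let perturbationProbability = 1.0 / sqrt (float (List.length weights))
  let maxIntensity = System.Math.PI / 2.0
  weights
  |> List.map (fun weight ->
    if random.NextDouble() < perturbationProbability then
      // Perturbation intensity uniform in [-pi/2, pi/2].
      weight + ((random.NextDouble() * 2.0) - 1.0) * maxIntensity
    else
      weight)
```

The other operators follow the same pattern of picking random nodes and rewriting the genotype's connection structure.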
Once the #2 Evolution Chamber is done, evolution of NeuralFish Neural Networks will be possible.
It should be possible to create a process that is able to handle live situations while constantly evolving under specified parameters.
https://arxiv.org/pdf/1610.06258v2.pdf
Fast weights promise an effective short-term memory. Experimenting with this idea may improve the efficiency of the NN.
The format of the data needs to be simple for both humans and machines to parse. (Once NeuralFish has reached enough milestones, we could bootstrap and teach a machine how to read the debug information... jk, that's unnecessary.)
Events
Data
NeuralFish needs benchmark metrics.
How does it perform on varying topologies?
This will help set goals and indicate when features degrade performance.
Neuron connection weights should decay, to compensate for connections that would otherwise 'overpower' other neuron connections.
This code may benefit from a computation expression:
https://github.com/JeremyBellows/NeuralFish/blob/23b31279ec643bad9026c3099ee80b3de0991772/NeuralFish/NeuralFish.fs#L102-L106
It's worth creating a prototype that validates the idea.
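A minimal sketch of the weight-decay idea above; the decay factor and where it is applied in the sync cycle are assumptions:

```fsharp
// Sketch: exponential decay of a connection weight toward zero, so no
// single connection can permanently overpower the rest.
let decayWeight (decayFactor : float) (weight : float) =
  weight * decayFactor
```

Applied once per cycle with a factor just below 1.0 (e.g. `decayWeight 0.99`), connections that keep receiving weight updates stay strong while idle ones fade.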
Sensors expect a certain vector length to operate the neural network correctly. Instead of dictating this vector length, it'd be really cool to have the genotype aware of the maximum possible size of a vector and adapt. When a sensor encounters a new maximum vector length, it could record the new length and report it to the evolution chamber when the node record is collected.
To prevent neural network freezes from not meeting the threshold for expected data vector length, a system could be put in place to inflate the vector with data necessary to meet the expected vector length.
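The inflation step could be sketched like this; padding with 0.0 is an assumption, and another neutral value may suit a given morphology better:

```fsharp
// Sketch: inflating a data vector with neutral values so it meets the
// expected length instead of freezing the network.
let inflateToLength (expectedLength : int) (dataVector : float list) =
  let missing = expectedLength - List.length dataVector
  if missing <= 0 then dataVector
  else dataVector @ List.replicate missing 0.0
```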
NeuralFish needs way more tests than it currently has. There are pieces of 'pocket logic' that are not being tested or asserted.
Overall system integrity is currently checked; tests that explore the various pieces of pocket logic would mitigate the risk of system enhancements.
Test writers: Please comment on this post or reach out to me with any questions! Your work is greatly appreciated 👍
Implementing a die message into the network allows control over the life of a neural network. This is important for being able to exit the asynchronous process.
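A minimal sketch of the idea using a MailboxProcessor; the message type and its cases are hypothetical, not NeuralFish's actual protocol:

```fsharp
// Sketch: a Die message that lets the owner terminate a node's message
// loop, ending the asynchronous process cleanly.
type NeuronMessage =
  | Signal of float
  | Die of AsyncReplyChannel<unit>

let createNeuronInstance () =
  MailboxProcessor.Start(fun inbox ->
    let rec loop accumulatedSignal =
      async {
        let! message = inbox.Receive()
        match message with
        | Signal value ->
          return! loop (accumulatedSignal + value)
        | Die replyChannel ->
          // Acknowledge, then fall out of the loop: the process ends here.
          replyChannel.Reply()
      }
    loop 0.0)
```

Because the recursion simply stops in the `Die` branch, the agent's async loop completes and the process can be shut down deterministically.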
The Evolution Chamber will be a module that allows evolving neural networks through genetic algorithms.
Not all of the mutation operators quoted above (from the book, pages 263-265) are needed, due to the modifications in the architecture of this system.
If a NeuralNetwork enters a deadlocked state or crashes, the cortex should restart the network so processing can continue. The timeout should be configurable.
Synchronization should post back when finished. This should remove the need to sleep in the cortex.
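Sketched as a request/reply interaction, with hypothetical message names: the cortex blocks on the reply instead of sleeping for a fixed interval.

```fsharp
// Sketch: synchronization as a request/reply so completion is signaled
// explicitly rather than awaited with a sleep.
type SensorMessage =
  | Sync of AsyncReplyChannel<unit>

let createSensor (gatherAndFeedData : unit -> unit) =
  MailboxProcessor.Start(fun inbox ->
    let rec loop () =
      async {
        let! (Sync replyChannel) = inbox.Receive()
        gatherAndFeedData ()
        // Post back when finished; the cortex's PostAndReply unblocks here.
        replyChannel.Reply()
        return! loop ()
      }
    loop ())
```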
Logging builds messages via string concatenation before the infoLog function is called. Circumventing that concatenation when infoLog output is not desired would be useful for performance.
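One way to avoid paying the concatenation cost, with a hypothetical configuration shape: accept a thunk and only build the string when logging is enabled.

```fsharp
// Sketch: logging with a thunk so the message string is only built when
// logging is actually enabled.
let makeLazyLog (loggingEnabled : bool) (sink : string -> unit) =
  fun (buildMessage : unit -> string) ->
    if loggingEnabled then sink (buildMessage ())
```

Call sites would change from `infoLog (sprintf ...)` to `lazyLog (fun () -> sprintf ...)`, so the sprintf never runs when logging is off.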
The cortex is responsible for synchronizing neural network input and output.
Specifically, the cortex needs to handle