lava-nc / lava-dnf
Dynamic Neural Fields with Lava
License: BSD 3-Clause "New" or "Revised" License
As a user, I want to use lava-dnf with the newest version of lava and lava-loihi to have access to new features and performance optimizations.
Relational networks enable the representation of relative space. For instance, they enable representing the relative position of an object with respect to another object. This can be implemented with two DNFs, A and B, of the same dimensionality N that both project into a transformational DNF, T, of dimensionality 2*N. The output of T is then projected onto another DNF, C, which has the original dimensionality N. For C to express the relative position of A with respect to B, the projection from T to C must sum across the diagonal of T. This type of projection must be added to lava-dnf to support implementing relational networks.
The projection should probably be implemented as an operation ProjectDiagonal, to be used as part of the connect() function of lava-dnf.
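As a rough illustration of the summation involved, here is a minimal numpy sketch for the 1-D case (N = 1, so T is 2-D). The name project_diagonal is hypothetical and not part of the lava-dnf API; the mapping from diagonal offset to output index is one possible convention:

```python
import numpy as np

def project_diagonal(t: np.ndarray) -> np.ndarray:
    """Sum a square 2-D field T along its diagonals (1-D case, T of size m x m)."""
    m = t.shape[0]
    out = np.zeros(2 * m - 1)
    for d in range(-(m - 1), m):
        # np.trace with offset=d sums the entries t[i, i + d],
        # i.e. all cells sharing the same relative offset between the two axes
        out[d + m - 1] = np.trace(t, offset=d)
    return out

t = np.zeros((5, 5))
t[4, 1] = 1.0  # activation at position 4 of field A, position 1 of field B
# the single active cell lands on diagonal offset 1 - 4 = -3, output index 1
result = project_diagonal(t)
```

This collapses the 2*N-dimensional transformational field T back to an N-dimensional field whose peak encodes the relative position of A with respect to B.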
Pull request & issue templates to standardize how people open pull requests and align with lava-nc/lava.
In the absence of ReshapePorts in Lava, we have implemented reshaping with dedicated Reshape Processes (ReshapeBool, ReshapeInt) and changed the implementation of the connect() function to insert these Processes:
Population (Source) -> ReshapeBool -> Dense -> ReshapeInt -> Population (Destination)
Once ReshapePorts are available, use them and remove the Reshape Processes.
# make connections from the source port to the connections process
# TODO (MR) workaround in absence of ReshapePorts
con_ip = connections.s_in
rs1 = ReshapeBool(shape_in=src_op.shape, shape_out=con_ip.shape)
src_op.connect(rs1.s_in)
rs1.s_out.connect(con_ip)
# make connections from the connections process to the destination port
# TODO (MR) workaround in absence of ReshapePorts
con_op = connections.a_out
rs2 = ReshapeInt(shape_in=con_op.shape, shape_out=dst_ip.shape)
con_op.connect(rs2.s_in)
rs2.s_out.connect(dst_ip)
should be replaced by:
# make connections from the source port to the connections process
src_op_flat = src_op.reshape(connections.in_ports.s_in.shape)
src_op_flat.connect(connections.in_ports.s_in)
# make connections from the connections process to the destination port
con_op = connections.out_ports.a_out.reshape(dst_ip.shape)
con_op.connect(dst_ip)
# check whether 'source' is connected to 'connections'
src_op = source.out_ports.s_out
con_ip = connections.in_ports.s_in
# TODO (MR): remove this after switching to Reshape ports
rs1_op = src_op.get_dst_ports()[0].process.out_ports.s_out
self.assertEqual(rs1_op.get_dst_ports(), [con_ip])
self.assertEqual(con_ip.get_src_ports(), [rs1_op])
# check whether 'connections' is connected to 'target'
con_op = connections.out_ports.a_out
dst_op = destination.in_ports.a_in
# TODO (MR): remove this after switching to Reshape ports
rs2_op = con_op.get_dst_ports()[0].process.out_ports.s_out
self.assertEqual(rs2_op.get_dst_ports(), [dst_op])
self.assertEqual(dst_op.get_src_ports(), [rs2_op])
should be replaced by:
# check whether 'source' is connected to 'connections'
src_op = source.out_ports.s_out
con_ip = connections.in_ports.s_in
self.assertEqual(src_op.get_dst_ports(), [con_ip])
self.assertEqual(con_ip.get_src_ports(), [src_op])
# check whether 'connections' is connected to 'target'
con_op = connections.out_ports.a_out
dst_op = destination.in_ports.a_in
self.assertEqual(con_op.get_dst_ports(), [dst_op])
self.assertEqual(dst_op.get_src_ports(), [con_op])
The file lava-dnf/tutorials/lava/lib/dnf/dnf_regimes/utils.py produces deprecation warnings due to its use of np.float.
Suggested fix: replace all instances of np.float with float.
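For reference, np.float was merely a deprecated alias for the builtin float (it has emitted a DeprecationWarning since NumPy 1.20 and was removed in NumPy 1.24), so the replacement is behavior-preserving:

```python
import numpy as np

# Before (deprecated): values = np.zeros(3, dtype=np.float)
# After (equivalent -- the builtin float maps to np.float64 here):
values = np.zeros(3, dtype=float)
```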
Objective of issue:
State that currently, we cannot simply make use of lava-dnf's connect method to connect child Neural Processes of a Hierarchical Process without storing the Dense + 2 Reshape Processes as instance variables.
Background:
lava-dnf's connect method is supposed to be a high-level helper method the user uses to create connections between two Neural Processes, with a connectivity matrix (synaptic weights) derived from a list of Operation objects given as argument.
Roughly, it creates a Dense Process holding the derived connectivity and returns that created Dense Process. The user chooses whether to use it (store it) or not.
That being said:
Neural Processes currently available in the proc library (e.g. the LIF Process) can be of any shape (N-D arrays).
The Dense Process from the proc library represents weights as 2-D arrays, which implicitly means the Neural Processes connected at both ends have to be 1-D arrays.
Ultimately, ReshapePorts should be added either:
to the OutPort of Neural Processes, to "virtually" reshape output with arbitrary shape to 1-D (N-D arrays to 1-D arrays), or
to the InPort and OutPort of the Dense Process, to "virtually" reshape input with arbitrary shape to 1-D (N-D arrays to 1-D arrays) and to "virtually" reshape output with 1-D shape to arbitrary shape (1-D arrays to M-D arrays).
As stated in #7, connect currently implements reshaping with dedicated Reshape Processes, i.e. it additionally creates 2 Reshape Processes for each Dense Process created.
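To illustrate why a 2-D weight matrix implies 1-D ports, here is a minimal numpy sketch (the shapes are made up for illustration): an N-D source output must be flattened before the matrix product, and the result reshaped back to the destination's N-D shape. This is exactly the role the Reshape Processes (or future ReshapePorts) play around Dense:

```python
import numpy as np

src_shape = (4, 3)   # hypothetical 2-D source population
dst_shape = (2, 5)   # hypothetical 2-D destination population

# Dense stores weights as a 2-D matrix: (num_dst_neurons, num_src_neurons)
weights = np.ones((np.prod(dst_shape), np.prod(src_shape)))

spikes = np.ones(src_shape)                      # N-D output of the source
flat_activation = weights @ spikes.ravel()       # matrix product needs a 1-D vector
activation = flat_activation.reshape(dst_shape)  # back to the destination's shape
```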
Lava version:
Lava-dnf version:
I'm submitting a ...
Current behavior:
I want to create a Hierarchical Process (with a ProcessModel inheriting from AbstractSubProcessModel).
Inside its __init__ definition, I want to connect child Neural Processes with the connect method.
Code snippet 1 contains the definition of an example of what I want to build.
Code snippet 2 contains the definition of a SourceProcess (created for testing purposes) that always sends an array full of ones.
When SourceProcess is connected to DNF, and the whole is run as is done in Code snippet 3, the ones sent from SourceProcess never reach the input_conn of DNF.
However, if connect is tweaked so as to also return the created Reshape Processes, and DNFProcessModel is tweaked so as to store the Reshape Processes as instance variables, then the ones do reach the Dense Process.
I suspect this has something to do with the way the Builder creates instances of AbstractSubProcessModel and their child Processes at compile time.
I think that when the overall network is compiled, if an instance of AbstractSubProcessModel defines child Processes, it has to store them as instance variables; otherwise they are discarded from compilation.
Namely, I suspect that is what happens to the Reshape Processes created in the connect method in our case.
If the Reshape Process at the input side of the Dense Process created by connect doesn't get compiled, the data sent by SourceProcess never reaches the Dense Process.
Related code:
Code snippet 1:

class DNF(AbstractProcess):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        shape = validate_shape(kwargs.pop("shape"))
        self.u = Var(shape=shape, init=np.zeros(shape))
        self.s_in = InPort(shape=shape)
        self.s_out = OutPort(shape=shape)


@implements(proc=DNF, protocol=LoihiProtocol)
class DNFProcessModel(AbstractSubProcessModel):
    def __init__(self, proc):
        shape = proc.init_args.get("shape")
        self.population = LIF(shape=shape, du=409, dv=2047, vth=200)
        self.input_conn = connect(proc.in_ports.s_in, self.population.a_in,
                                  [Weights(25)])
        kernel = MultiPeakKernel(amp_exc=83, width_exc=3.75,
                                 amp_inh=-70, width_inh=7.5)
        self.recurrent_conn = connect(self.population.s_out,
                                      self.population.a_in,
                                      [Convolution(kernel)])
        self.population.s_out.connect(proc.out_ports.s_out)
        proc.vars.u.alias(self.population.vars.u)
Code snippet 2:

class SourceProcess(AbstractProcess):
    """Process that sends arbitrary vectors.

    Parameters
    ----------
    shape: tuple, shape of the process
    """
    def __init__(self,
                 **kwargs: ty.Union[ty.Tuple, np.ndarray]) -> None:
        super().__init__(**kwargs)
        shape = kwargs.get("shape")
        data = kwargs.get("data")
        self.data = Var(shape=shape, init=data)
        self.a_out = OutPort(shape=shape)


@implements(proc=SourceProcess, protocol=LoihiProtocol)
@requires(CPU)
@tag("bit_accurate_loihi")
class SourceProcessBoolModel(PyLoihiProcessModel):
    data: np.ndarray = LavaPyType(np.ndarray, bool)
    a_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, bool)

    def run_spk(self) -> None:
        """Send data every timestep."""
        self.a_out.send(self.data)
Code snippet 3:

def main():
    shape = (20,)
    run_cfg = Loihi1SimCfg(select_tag="bit_accurate_loihi",
                           select_sub_proc_model=True)
    condition = RunSteps(num_steps=10)

    source = SourceProcess(shape=shape, data=np.ones(shape, dtype=bool))
    dnf = DNF(shape=shape)
    source.a_out.connect(dnf.s_in)

    dnf.run(condition=condition, run_cfg=run_cfg)
    dnf.stop()


if __name__ == '__main__':
    main()
Objective of issue: Fix Dependabot alert
Lava-DNF version:
Lava version:
I'm submitting a ...
Current behavior:
Expected behavior:
Steps to reproduce:
Related code:
insert short code snippets here
Other information:
insert the output from lava debug here
The lava-dnf repository has so far been empty, except for a README file and CI/CD. With the first release, we would like to add support for a connect function that enables users to generate connectivity between neural populations.
The link in requirements.txt (https://github.com/lava-nc/lava/releases/download/v0.2.0/lava-nc-0.3.0.tar.gz) refers to v0.2.0 in one spot and 0.3.0 later. This file does not exist and causes an error. I think they should both show the same number but I'm not sure which one.
Hi,
I get some errors when lava-dnf is being built. Tests pass without problems, but the build fails when running the Twine check for generated artifacts.
Lava is already installed and working (version 0.2.0), and lava-dl as well. See the error output below:
PyBuilder version 0.13.3
Build started at 2022-05-25 17:32:58
------------------------------------------------------------
[INFO] Installing or updating plugin "pypi:pybuilder_bandit, module name 'pybuilder_bandit'"
[INFO] Processing plugin packages 'pybuilder_bandit' to be installed with {}
[INFO] Activated environments: unit
[INFO] Building lava-dnf version 0.1.0
[INFO] Executing build in /home/garciaal/.local/share/Trash/files/lava-dnf
[INFO] Going to execute tasks: analyze, publish
[INFO] Processing plugin packages 'coverage~=5.2' to be installed with {'upgrade': True}
[INFO] Processing plugin packages 'flake8~=3.7' to be installed with {'upgrade': True}
[INFO] Processing plugin packages 'pypandoc~=1.4' to be installed with {'upgrade': True}
[INFO] Processing plugin packages 'setuptools>=38.6.0' to be installed with {'upgrade': True}
[INFO] Processing plugin packages 'sphinx_rtd_theme' to be installed with {}
[INFO] Processing plugin packages 'sphinx_tabs' to be installed with {}
[INFO] Processing plugin packages 'twine>=1.15.0' to be installed with {'upgrade': True}
[INFO] Processing plugin packages 'unittest-xml-reporting~=3.0.4' to be installed with {'upgrade': True}
[INFO] Processing plugin packages 'wheel>=0.34.0' to be installed with {'upgrade': True}
[INFO] Creating target 'build' VEnv in '/home/garciaal/.local/share/Trash/files/lava-dnf/target/venv/build/cpython-3.9.7.final.0'
[INFO] Processing dependency packages 'lava' from https://github.com/lava-nc/lava/releases/download/v0.2.0/lava-nc-0.2.0.tar.gz to be installed with {'force_reinstall': True}
[INFO] Processing dependency packages 'requirements.txt' to be installed with {}
[INFO] Creating target 'test' VEnv in '/home/garciaal/.local/share/Trash/files/lava-dnf/target/venv/test/cpython-3.9.7.final.0'
[INFO] Processing dependency packages 'requirements.txt' to be installed with {}
[INFO] Requested coverage for tasks: pybuilder.plugins.python.unittest_plugin:run_unit_tests
[INFO] Running unit tests
[INFO] Executing unit tests from Python modules in /home/garciaal/.local/share/Trash/files/lava-dnf/tests/lava
[INFO] Executed 207 unit tests
[INFO] All unit tests passed.
[INFO] Executing flake8 on project sources.
[INFO] Building distribution in /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0
[INFO] Copying scripts to /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/scripts
[INFO] Writing setup.py as /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/setup.py
[INFO] Collecting coverage information for 'pybuilder.plugins.python.unittest_plugin:run_unit_tests'
[WARN] ut_coverage_branch_threshold_warn is 0 and branch coverage will not be checked
[WARN] ut_coverage_branch_partial_threshold_warn is 0 and partial branch coverage will not be checked
[INFO] Running unit tests
[INFO] Executing unit tests from Python modules in /home/garciaal/.local/share/Trash/files/lava-dnf/tests/lava
[INFO] Executed 207 unit tests
[INFO] All unit tests passed.
[WARN] Test coverage below 70% for lava.lib.dnf.inputs.rate_code_spike_gen.models: 45%
[INFO] Overall pybuilder.plugins.python.unittest_plugin.run_unit_tests coverage is 93%
[INFO] Overall pybuilder.plugins.python.unittest_plugin.run_unit_tests branch coverage is 93%
[INFO] Overall pybuilder.plugins.python.unittest_plugin.run_unit_tests partial branch coverage is 95%
[INFO] Overall lava-dnf coverage is 93%
[INFO] Overall lava-dnf branch coverage is 93%
[INFO] Overall lava-dnf partial branch coverage is 95%
[INFO] Building binary distribution in /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0
[INFO] Running Twine check for generated artifacts
------------------------------------------------------------
BUILD FAILED - Error while executing Twine ['check', '/home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/dist/lava-dnf-0.1.0.linux-x86_64.tar.gz', '/home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/dist/lava-dnf-0.1.0.tar.gz', '/home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/dist/lava_dnf-0.1.0-py3-none-any.whl']. See /home/garciaal/.local/share/Trash/files/lava-dnf/target/reports/distutils/twine_check.log for full details:
Checking /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0
.1.0/dist/lava_dnf-0.1.0-py3-none-any.whl: FAILED
ERROR `long_description` has syntax errors in markup and would not be
rendered on PyPI.
No content rendered from RST source.
WARNING `long_description_content_type` missing. defaulting to `text/x-rst`.
Checking /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0
.1.0/dist/lava-dnf-0.1.0.linux-x86_64.tar.gz: PASSED with warnings
WARNING `long_description_content_type` missing. defaulting to `text/x-rst`.
WARNING `long_description` missing.
Checking /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0
.1.0/dist/lava-dnf-0.1.0.tar.gz: PASSED with warnings
WARNING `long_description_content_type` missing. defaulting to `text/x-rst`.
WARNING `long_description` missing. (site-packages/pybuilder/plugins/python/distutils_plugin.py:421)
------------------------------------------------------------
Build finished at 2022-05-25 17:34:04
Build took 66 seconds (66761 ms)
Thanks in advance,
Alex.
When calling the gauss() function with a shape argument that contains axes of size 1, the numpy array that is returned is missing these axes.
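A minimal sketch of the expected behavior (gauss_sketch is a stand-in for illustration, not the actual lava-dnf implementation): building the Gaussian over an index grid with np.meshgrid(..., indexing="ij") naturally preserves size-1 axes in the output:

```python
import numpy as np

def gauss_sketch(shape, width=1.0):
    # Hypothetical re-implementation: a Gaussian centered in an N-D grid
    # whose output always has exactly the requested shape.
    center = tuple((s - 1) / 2.0 for s in shape)
    grids = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    sq_dist = sum((g - c) ** 2 for g, c in zip(grids, center))
    return np.exp(-sq_dist / (2.0 * width ** 2))

gauss_sketch((5, 1)).shape  # (5, 1) -- the size-1 axis is kept
```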
We should probably switch back to depending on the tip of Lava rather than version 0.2.0.
Or is there a better way of getting around that with every release, @mgkwill ?
Replace the current DVSFileInput class used in the motion tracking demo (demos/motion_tracking/dvs_file_input/process.py) with the InivationProcess class (from lava-peripherals) to keep maintenance of the demo to a minimum while increasing compatibility with future I/O-related changes.
Hello,
It seems this library requires the previous Lava version, 0.2.0. If I remove the requirement on the release version, would this library still work correctly?
When RateCodeSpikeGen receives a pattern containing an amplitude higher than 12000, the expected behavior would be that the neuron corresponding to that amplitude spikes every time step (saturates).
However, this is currently not the case: said neuron will not fire at all.
This is due to the fact that, when computing distances upon new pattern arrival, we do the following:
distances[idx_non_negligible] = \
np.rint(TIME_STEPS_PER_MINUTE / pattern[idx_non_negligible])\
.astype(int)
This results in distances having a 0 at indices where the pattern is higher than 12000 (since TIME_STEPS_PER_MINUTE is set to 6000 right now). Since a 0 represents an infinite distance in the way RateCodeSpikeGen is currently implemented, this is not correct.
The distance should saturate to 1 in that case.
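A possible fix (a sketch, not the actual lava-dnf code) is to clamp the computed distances to a minimum of 1 after the division, so that amplitudes above TIME_STEPS_PER_MINUTE * 2 map to "fire every time step" rather than to the 0 that currently means "never fire":

```python
import numpy as np

TIME_STEPS_PER_MINUTE = 6000  # value used in the issue

def compute_distances(pattern):
    # rint(6000 / 24000) == 0, which currently encodes an infinite distance;
    # clamping to 1 makes large amplitudes saturate instead.
    distances = np.rint(TIME_STEPS_PER_MINUTE / pattern).astype(int)
    return np.maximum(distances, 1)

compute_distances(np.array([3000.0, 12000.0, 24000.0]))  # -> [2, 1, 1]
```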