lava-nc / lava-dnf

Dynamic Neural Fields with Lava

License: BSD 3-Clause "New" or "Revised" License

Languages: Jupyter Notebook 99.10%, Python 0.90%
Topics: dynamic-neural-fields, neuromorphic, neuromorphic-computing, python

lava-dnf's People

Contributors

awintel, dependabot[bot], joyeshmishra, mathisrichter, mgkwill, michaelbeale-il, philippplank, sveameyer13, tobias-fischer

lava-dnf's Issues

Upgrade lava-dnf to newest Lava version for release

User story

As a user, I want to use lava-dnf with the newest version of lava and lava-loihi to have access to new features and performance optimizations.

Conditions of satisfaction

  • All tests are passing on lava-dnf when using the newest to-be-released version of lava and lava-loihi.

Tasks

  • Check out newest versions of Lava, Lava-Loihi, and lava-dnf (MR)
  • Run all tests locally (1h, MR)
  • Fix hanging tutorial tests (1h, MR)
  • Fix any incompatibilities (2h, MR)
  • Code review (1h, MR)
  • Merge code (MR)

Relational networks require an operation that sums across a diagonal of a DNF

Relational networks enable the representation of relative space. For instance, they enable representing the relative position of an object with respect to another object. This can be implemented with two DNFs, A and B, of the same dimensionality N that both project into a transformational DNF, T, of dimensionality 2*N. The output of T is then projected onto another DNF, C, which has the original dimensionality N. For C to express the relative position of A with respect to B, the projection from T to C must sum across the diagonal of T. This type of projection must be added to lava-dnf to support implementing relational networks.

The projection should probably be implemented as an operation ProjectDiagonal, to be used as part of the connect() function of lava-dnf.
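As a rough sketch of what such a ProjectDiagonal operation could compute (hypothetical NumPy code, not part of lava-dnf; the function name and the i - j diagonal convention are assumptions), the following builds a weight matrix that sums a flattened (n, n) transformational DNF T along its diagonals, producing a (2*n - 1)-dimensional input for C:

```python
import numpy as np

def diagonal_projection_weights(n: int) -> np.ndarray:
    """Hypothetical sketch: weight matrix that sums a flattened (n, n)
    DNF T along its diagonals i - j = const, yielding 2*n - 1 outputs."""
    weights = np.zeros((2 * n - 1, n * n))
    for i in range(n):
        for j in range(n):
            k = i - j + (n - 1)  # shift so the diagonal index is >= 0
            weights[k, i * n + j] = 1.0
    return weights

# Summing a uniform T of ones recovers the diagonal lengths 1, 2, ..., n, ..., 2, 1.
n = 4
c_input = diagonal_projection_weights(n) @ np.ones((n, n)).ravel()
```

Such a matrix could then, in principle, be supplied as the connectivity between T and C.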

Replace reshaping workaround with ReshapePorts

In the absence of ReshapePorts in Lava, we have implemented reshaping with dedicated Reshape Processes (ReshapeBool, ReshapeInt) and changed the implementation of the connect() function to insert these Processes:
Population (Source) -> ReshapeBool -> Dense -> ReshapeInt -> Population (Destination)

Once ReshapePorts are available, use them and remove the Reshape Processes.

  1. In connect.py, replace
    # make connections from the source port to the connections process
    # TODO (MR) workaround in absence of ReshapePorts
    con_ip = connections.s_in
    rs1 = ReshapeBool(shape_in=src_op.shape, shape_out=con_ip.shape)
    src_op.connect(rs1.s_in)
    rs1.s_out.connect(con_ip)

    # make connections from the connections process to the destination port
    # TODO (MR) workaround in absence of ReshapePorts
    con_op = connections.a_out
    rs2 = ReshapeInt(shape_in=con_op.shape, shape_out=dst_ip.shape)
    con_op.connect(rs2.s_in)
    rs2.s_out.connect(dst_ip)

by

    # make connections from the source port to the connections process
    src_op_flat = src_op.reshape(connections.in_ports.s_in.shape)
    src_op_flat.connect(connections.in_ports.s_in)

    # make connections from the connections process to the destination port
    con_op = connections.out_ports.a_out.reshape(dst_ip.shape)
    con_op.connect(dst_ip)
  2. In test_connect.py, replace
        # check whether 'source' is connected to 'connections'
        src_op = source.out_ports.s_out
        con_ip = connections.in_ports.s_in
        # TODO (MR): remove this after switching to Reshape ports
        rs1_op = src_op.get_dst_ports()[0].process.out_ports.s_out
        self.assertEqual(rs1_op.get_dst_ports(), [con_ip])
        self.assertEqual(con_ip.get_src_ports(), [rs1_op])

        # check whether 'connections' is connected to 'target'
        con_op = connections.out_ports.a_out
        dst_op = destination.in_ports.a_in
        # TODO (MR): remove this after switching to Reshape ports
        rs2_op = con_op.get_dst_ports()[0].process.out_ports.s_out
        self.assertEqual(rs2_op.get_dst_ports(), [dst_op])
        self.assertEqual(dst_op.get_src_ports(), [rs2_op])

by

        # check whether 'source' is connected to 'connections'
        src_op = source.out_ports.s_out
        con_ip = connections.in_ports.s_in
        self.assertEqual(src_op.get_dst_ports(), [con_ip])
        self.assertEqual(con_ip.get_src_ports(), [src_op])

        # check whether 'connections' is connected to 'target'
        con_op = connections.out_ports.a_out
        dst_op = destination.in_ports.a_in
        self.assertEqual(con_op.get_dst_ports(), [dst_op])
        self.assertEqual(dst_op.get_src_ports(), [con_op])
  3. Remove directories lava/lib/dnf/connect/reshape_* including all content.

Need to store Processes created by `connect` as instance variables if it is used within a ProcessModel of a Hierarchical Process.

Objective of issue:
Currently, lava-dnf's connect method cannot simply be used to connect child Neural Processes of a Hierarchical Process: the Dense Process and the two Reshape Processes it creates must be stored as instance variables.

Background:
lava-dnf's connect method is supposed to be a high-level helper method that the user calls to create connections between two Neural Processes, with a connectivity matrix (synaptic weights) derived from a list of Operation objects passed as an argument.

Roughly, it does the following things :

  • Compute weights (connectivity matrix)
  • Create Dense Process

It returns the created Dense Process. The user chooses whether to use it (store it) or not.

That being said:
Neural Processes currently available in the proc library (the LIF Process) can be of any shape (N-D arrays).
The Dense Process from the proc library represents weights as 2-D arrays, which implicitly means that the Neural Processes connected at both ends have to be 1-D arrays.
Ultimately, ReshapePorts should be added either:

  • On the OutPort of Neural Processes to "virtually" reshape output with arbitrary shape to 1-D (N-D arrays to 1-D arrays), or
  • On the InPort and OutPort of the Dense Process to "virtually" reshape input with arbitrary shape to 1-D (N-D arrays to 1-D arrays) and "virtually" reshape output with 1-D shape to arbitrary shape (1-D arrays to M-D arrays).

As stated in #7, connect currently implements reshaping with dedicated Reshape Processes, i.e., it additionally creates two Reshape Processes for each Dense Process it creates.
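The constraint can be illustrated with plain NumPy (a hypothetical sketch, not Lava code; the shapes are arbitrary): the N-D source output is raveled to 1-D, multiplied by the 2-D weight matrix, and the result is reshaped back to the destination's shape, which is exactly the job the Reshape Processes (or, later, ReshapePorts) perform around Dense:

```python
import numpy as np

# Hypothetical NumPy illustration of why reshaping is needed around Dense:
# Dense weights are 2-D, so N-D populations must be flattened to 1-D on
# the way in and un-flattened on the way out.
src_shape = (3, 4)   # N-D source population (shapes chosen arbitrarily)
dst_shape = (2, 6)   # M-D destination population

weights = np.eye(int(np.prod(dst_shape)),
                 int(np.prod(src_shape)))  # 2-D weights, as in Dense

spikes = np.ones(src_shape)               # N-D output of the source
flat_out = weights @ spikes.ravel()       # flatten to 1-D, apply Dense
activation = flat_out.reshape(dst_shape)  # reshape 1-D back to M-D
```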

Lava version:

  • 0.2.0

Lava-dnf version:

  • 0.1.0

I'm submitting a ...

  • bug report

Current behavior:

I want to create a Hierarchical Process (with a ProcessModel inheriting from AbstractSubProcessModel).
Inside its __init__ definition, I want to connect its child Neural Processes with the connect method.

Code snippet 1 contains the definition of an example of what I want to build.

Code snippet 2 contains the definition of a SourceProcess (created for testing purposes) that always sends an array full of ones.

When SourceProcess is connected to DNF and the network is run as in Code snippet 3, the ones sent from SourceProcess never reach the input_conn of DNF.

However:

  • if connect is tweaked so as to also return the created Reshape Processes, and
  • if DNFProcessModel is tweaked so as to store the Reshape Processes as instance variables,

then the ones do reach the Dense Process.

I suspect this has something to do with the way the Builder creates instances of AbstractSubProcessModel and their child Processes at compile time.
I think that when the overall network is compiled, an instance of AbstractSubProcessModel that defines child Processes has to store them as instance variables; otherwise they are discarded from compilation.

Namely, I suspect that this is what happens to the Reshape Processes created by the connect method in our case.
If the Reshape Process at the input side of the Dense Process created by connect does not get compiled, the data sent by SourceProcess never reaches the Dense Process.
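The suspected mechanism can be mimicked in plain Python (a hypothetical analogy, not Lava's actual Builder code): if the compiler discovers child Processes by inspecting a model's instance attributes, anything kept only in a local variable is invisible to it:

```python
class Model:
    """Stand-in for an AbstractSubProcessModel subclass."""
    def __init__(self):
        self.dense = "Dense"     # stored as instance variable -> discoverable
        reshape = "ReshapeBool"  # local variable only -> lost after __init__
        del reshape

def discover_children(model):
    """Mimic a builder that finds child processes via instance attributes."""
    return sorted(v for v in vars(model).values() if isinstance(v, str))

# Only "Dense" is found; the Reshape process would be skipped at compile time.
children = discover_children(Model())
```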

Related code:

Code snippet 1 :

class DNF(AbstractProcess):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        shape = validate_shape(kwargs.pop("shape"))

        self.u = Var(shape=shape, init=np.zeros(shape))

        self.s_in = InPort(shape=shape)

        self.s_out = OutPort(shape=shape)



@implements(proc=DNF, protocol=LoihiProtocol)
class DNFProcessModel(AbstractSubProcessModel):
    def __init__(self, proc):
        shape = proc.init_args.get("shape")

        self.population = LIF(shape=shape, du=409, dv=2047, vth=200)

        self.input_conn = connect(proc.in_ports.s_in, self.population.a_in, [Weights(25)])
        kernel = MultiPeakKernel(amp_exc=83, width_exc=3.75, amp_inh=-70, width_inh=7.5)
        self.recurrent_conn = connect(self.population.s_out, self.population.a_in, [Convolution(kernel)])

        self.population.s_out.connect(proc.out_ports.s_out)

        proc.vars.u.alias(self.population.vars.u)

Code snippet 2 :

class SourceProcess(AbstractProcess):
    """
    Process that sends arbitrary vectors

    Parameters
    ----------
    shape: tuple, shape of the process
    """

    def __init__(self,
                 **kwargs: ty.Union[ty.Tuple, np.ndarray]) -> None:
        super().__init__(**kwargs)
        shape = kwargs.get("shape")
        data = kwargs.get("data")

        self.data = Var(shape=shape, init=data)
        self.a_out = OutPort(shape=shape)


@implements(proc=SourceProcess, protocol=LoihiProtocol)
@requires(CPU)
@tag("bit_accurate_loihi")
class SourceProcessBoolModel(PyLoihiProcessModel):
    data: np.ndarray = LavaPyType(np.ndarray, bool)
    a_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, bool)

    def run_spk(self) -> None:
        """Send data every timestep"""
        self.a_out.send(self.data)

Code snippet 3 :

def main():
    shape = (20, )

    run_cfg = Loihi1SimCfg(select_tag="bit_accurate_loihi",
                           select_sub_proc_model=True)
    condition = RunSteps(num_steps=10)

    source = SourceProcess(shape=shape, data=np.ones(shape, dtype=bool))
    dnf = DNF(shape=shape)

    source.a_out.connect(dnf.s_in)

    dnf.run(condition=condition, run_cfg=run_cfg)

    dnf.stop()


if __name__ == '__main__':
    main()

Dependabot security alert for PY

Objective of issue: Fix Dependabot alert

Lava-DNF version:

  • 0.1.0 (current version)

Lava version:

  • 0.4.0 (current version)

I'm submitting a ...

  • bug report

Current behavior:

  • Dependabot alerts that the py dependency has a vulnerability.

Expected behavior:

  • Dependabot does not alert that any dependencies have vulnerabilities.

Steps to reproduce:

  • Review Dependabot issues.

Related code:

insert short code snippets here

Other information:

insert the output from lava debug here

Initial release of lava-dnf

The lava-dnf repository has so far been empty, except for a README file and CI/CD. With the first release, we would like to add support for a connect function that enables users to generate connectivity between neural populations.

BUILD FAILED - Error while executing Twine

Hi,
I get some errors when building lava-dnf. All tests pass, but the build fails when running the Twine check for the generated artifacts.
Lava (version 0.2.0) is already installed and working, as is lava-dl. See the error output below:

PyBuilder version 0.13.3
Build started at 2022-05-25 17:32:58
------------------------------------------------------------
[INFO]  Installing or updating plugin "pypi:pybuilder_bandit, module name 'pybuilder_bandit'"
[INFO]  Processing plugin packages 'pybuilder_bandit' to be installed with {}
[INFO]  Activated environments: unit
[INFO]  Building lava-dnf version 0.1.0
[INFO]  Executing build in /home/garciaal/.local/share/Trash/files/lava-dnf
[INFO]  Going to execute tasks: analyze, publish
[INFO]  Processing plugin packages 'coverage~=5.2' to be installed with {'upgrade': True}
[INFO]  Processing plugin packages 'flake8~=3.7' to be installed with {'upgrade': True}
[INFO]  Processing plugin packages 'pypandoc~=1.4' to be installed with {'upgrade': True}
[INFO]  Processing plugin packages 'setuptools>=38.6.0' to be installed with {'upgrade': True}
[INFO]  Processing plugin packages 'sphinx_rtd_theme' to be installed with {}
[INFO]  Processing plugin packages 'sphinx_tabs' to be installed with {}
[INFO]  Processing plugin packages 'twine>=1.15.0' to be installed with {'upgrade': True}
[INFO]  Processing plugin packages 'unittest-xml-reporting~=3.0.4' to be installed with {'upgrade': True}
[INFO]  Processing plugin packages 'wheel>=0.34.0' to be installed with {'upgrade': True}
[INFO]  Creating target 'build' VEnv in '/home/garciaal/.local/share/Trash/files/lava-dnf/target/venv/build/cpython-3.9.7.final.0'
[INFO]  Processing dependency packages 'lava' from https://github.com/lava-nc/lava/releases/download/v0.2.0/lava-nc-0.2.0.tar.gz to be installed with {'force_reinstall': True}
[INFO]  Processing dependency packages 'requirements.txt' to be installed with {}
[INFO]  Creating target 'test' VEnv in '/home/garciaal/.local/share/Trash/files/lava-dnf/target/venv/test/cpython-3.9.7.final.0'
[INFO]  Processing dependency packages 'requirements.txt' to be installed with {}
[INFO]  Requested coverage for tasks: pybuilder.plugins.python.unittest_plugin:run_unit_tests
[INFO]  Running unit tests
[INFO]  Executing unit tests from Python modules in /home/garciaal/.local/share/Trash/files/lava-dnf/tests/lava
[INFO]  Executed 207 unit tests
[INFO]  All unit tests passed.
[INFO]  Executing flake8 on project sources.
[INFO]  Building distribution in /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0
[INFO]  Copying scripts to /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/scripts
[INFO]  Writing setup.py as /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/setup.py
[INFO]  Collecting coverage information for 'pybuilder.plugins.python.unittest_plugin:run_unit_tests'
[WARN]  ut_coverage_branch_threshold_warn is 0 and branch coverage will not be checked
[WARN]  ut_coverage_branch_partial_threshold_warn is 0 and partial branch coverage will not be checked
[INFO]  Running unit tests
[INFO]  Executing unit tests from Python modules in /home/garciaal/.local/share/Trash/files/lava-dnf/tests/lava
[INFO]  Executed 207 unit tests
[INFO]  All unit tests passed.
[WARN]  Test coverage below 70% for lava.lib.dnf.inputs.rate_code_spike_gen.models: 45%
[INFO]  Overall pybuilder.plugins.python.unittest_plugin.run_unit_tests coverage is 93%
[INFO]  Overall pybuilder.plugins.python.unittest_plugin.run_unit_tests branch coverage is 93%
[INFO]  Overall pybuilder.plugins.python.unittest_plugin.run_unit_tests partial branch coverage is 95%
[INFO]  Overall lava-dnf coverage is 93%
[INFO]  Overall lava-dnf branch coverage is 93%
[INFO]  Overall lava-dnf partial branch coverage is 95%
[INFO]  Building binary distribution in /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0
[INFO]  Running Twine check for generated artifacts
------------------------------------------------------------
BUILD FAILED - Error while executing Twine ['check', '/home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/dist/lava-dnf-0.1.0.linux-x86_64.tar.gz', '/home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/dist/lava-dnf-0.1.0.tar.gz', '/home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0.1.0/dist/lava_dnf-0.1.0-py3-none-any.whl']. See /home/garciaal/.local/share/Trash/files/lava-dnf/target/reports/distutils/twine_check.log for full details:
	Checking /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0
	.1.0/dist/lava_dnf-0.1.0-py3-none-any.whl: FAILED
	ERROR    `long_description` has syntax errors in markup and would not be        
	         rendered on PyPI.                                                      
	         No content rendered from RST source.                                   
	WARNING  `long_description_content_type` missing. defaulting to `text/x-rst`.   
	Checking /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0
	.1.0/dist/lava-dnf-0.1.0.linux-x86_64.tar.gz: PASSED with warnings
	WARNING  `long_description_content_type` missing. defaulting to `text/x-rst`.   
	WARNING  `long_description` missing.                                            
	Checking /home/garciaal/.local/share/Trash/files/lava-dnf/target/dist/lava-dnf-0
	.1.0/dist/lava-dnf-0.1.0.tar.gz: PASSED with warnings
	WARNING  `long_description_content_type` missing. defaulting to `text/x-rst`.   
	WARNING  `long_description` missing.                                             (site-packages/pybuilder/plugins/python/distutils_plugin.py:421)
------------------------------------------------------------
Build finished at 2022-05-25 17:34:04
Build took 66 seconds (66761 ms)

Thanks in advance,

Alex.

Refactor DVS input class of motion tracking demo

Replace the current DVSFileInput class used in the motion tracking demo (demos/motion_tracking/dvs_file_input/process.py) with the InivationProcess class (lava-peripherals) to reduce maintenance of the demo to a minimum while increasing compatibility with future I/O-related changes.

About the support of the new lava package

Hello,
It seems this library requires the older lava version 0.2.0. If I remove the version requirement, would this library still work correctly with a newer lava release?

`RateCodeSpikeGen` does not behave correctly when incoming pattern amplitude is higher than 12000

When RateCodeSpikeGen receives a pattern containing an amplitude higher than 12000, the expected behavior is that the neuron corresponding to that amplitude spikes every time step (saturates).

However, this is currently not the case: said neuron does not fire at all.

This is due to the fact that, when a new pattern arrives, distances are computed as follows:

distances[idx_non_negligible] = \
            np.rint(TIME_STEPS_PER_MINUTE / pattern[idx_non_negligible])\
            .astype(int)

This results in distances containing a 0 at indices where the pattern is higher than 12000 (since TIME_STEPS_PER_MINUTE is currently set to 6000). Because a 0 represents an infinite distance in the current RateCodeSpikeGen implementation, this is not correct; the distance should instead saturate to 1 in that case.
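A minimal sketch of a possible fix (assuming the variable names from the snippet above; compute_distances is a hypothetical wrapper): clip the computed distances to a minimum of 1, so amplitudes above TIME_STEPS_PER_MINUTE make the neuron spike every time step instead of never firing:

```python
import numpy as np

TIME_STEPS_PER_MINUTE = 6000  # value used in the current implementation

def compute_distances(pattern: np.ndarray,
                      idx_non_negligible: np.ndarray) -> np.ndarray:
    """Sketch of the proposed fix: inter-spike distances are clipped to a
    minimum of 1, so large amplitudes make the neuron spike every time
    step instead of never firing."""
    distances = np.zeros_like(pattern, dtype=int)
    distances[idx_non_negligible] = np.maximum(
        np.rint(TIME_STEPS_PER_MINUTE
                / pattern[idx_non_negligible]).astype(int),
        1)
    return distances

pattern = np.array([0.0, 3000.0, 12000.0, 20000.0])
idx = pattern > 0
# amplitudes 12000 and 20000 now saturate to a distance of 1 instead of 0
```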

Merge Motion Tracking into main of lava-dnf

  • Make it terminate gracefully
  • Use connect function again
  • Bug fixes
  • Check whether Loihi is available; if not, run in simulation, otherwise run on Loihi (follow the lava-dl example) (optional)
  • Save the executable and run demo from stored executable (users can choose to recompile in the code)
  • Unit test in lava-loihi tests/lava/lib/dnf
  • Merge PR in lava-dnf
  • Merge PR in lava-loihi (unit test)
