
Leabra

This repository holds a Python implementation of the Leabra (Local, Error-driven and Associative, Biologically Realistic Algorithm) framework. The reference implementation of Leabra is in emergent, developed by the Computational Cognitive Neuroscience Laboratory at the University of Colorado Boulder. This Python implementation targets emergent 8.1.0 and only implements the rate-coded mode, which includes some spiking behavior but is distinct from the discrete spiking mode (not implemented here).

This work is the fruit of a collaboration between the Computational Cognitive Neuroscience Laboratory at the University of Colorado Boulder and the Mnemosyne Project-Team at Inria Bordeaux, France.

Status & Roadmap

This is a work in progress. Most of the basic algorithms of Leabra are implemented, but some mechanisms are still missing. While the current implementation passes several quantitative tests of equivalence with the emergent implementation (8.1.1, r11060), the number and diversity of tests are still too low to guarantee that the implementation is correct.

  • Unit, Layer, Connection, Network classes
  • XCAL learning rule
  • Basic notebook examples
  • Quantitative equivalence with emergent
  • Neuron tutorial notebook
  • Inhibition tutorial notebook
  • Weight balance mechanism

Installation & Usage

Install dependencies:

pip install -r requirements.txt

Then, launch Jupyter to see usage examples:

jupyter notebook index.ipynb

Run Notebooks Online

Notebooks can be run online, without any installation, using the Binder service. Note that Binder is still experimental, and may be down or unstable.

Useful Resources

License

To be decided.

leabra's People

Contributors

apmon, benureau, dcw3


leabra's Issues

Quantitative behavior of `adapt`

The adapt behavior is close to, but not yet equivalent with, the emergent version. The decrease in I_net is not present in the Python version, and act decreases much more than in the emergent version. See the differences in the screenshots below:

emergent version:
[screenshot, 2016-09-12 6:04 PM]

python version:
[screenshot, 2016-09-12 6:11 PM]

Some of that difference is due to #2 (to solve first).

Quantitative equivalence of I_net

Between emergent r9985 and r10726, the behavior of I_net integration seems to have changed: small positive (but not negative) values are seemingly set to 0 when below 0.01, as the output of running unit_test.py shows:

I_net:000 [py] -0.0286363636 == -0.0286364000 [emergent] (no adapt)
I_net:001 [py] -0.0261513899 == -0.0261514000 [emergent] (no adapt)
I_net:002 [py] -0.0238820544 == -0.0238821000 [emergent] (no adapt)
I_net:003 [py] -0.0218096448 == -0.0218096000 [emergent] (no adapt)
I_net:004 [py] -0.0199170723 == -0.0199171000 [emergent] (no adapt)
I_net:005 [py] -0.0181887313 == -0.0181887000 [emergent] (no adapt)
I_net:006 [py] -0.0166103703 == -0.0166104000 [emergent] (no adapt)
I_net:007 [py] -0.0151689745 == -0.0151690000 [emergent] (no adapt)
I_net:008 [py] -0.0138526586 == -0.0138527000 [emergent] (no adapt)
I_net:009 [py] -0.0126505684 == -0.0126506000 [emergent] (no adapt)
I_net:010 [py]  0.1191805346 ==  0.1191810000 [emergent] (no adapt)
I_net:011 [py]  0.1358746595 ==  0.1358750000 [emergent] (no adapt)
I_net:012 [py]  0.1231849472 ==  0.1231850000 [emergent] (no adapt)
I_net:013 [py]  0.1054292917 ==  0.1054290000 [emergent] (no adapt)
I_net:014 [py]  0.0887004263 !=  0.0000000000 [emergent] (no adapt)
I_net:015 [py]  0.1908104279 ==  0.1908100000 [emergent] (no adapt)
I_net:016 [py]  0.1593441943 ==  0.1593440000 [emergent] (no adapt)
I_net:017 [py]  0.1330253965 ==  0.1330250000 [emergent] (no adapt)
I_net:018 [py]  0.1110427799 ==  0.1110430000 [emergent] (no adapt)
I_net:019 [py]  0.0926899493 !=  0.0000000000 [emergent] (no adapt)
I_net:020 [py]  0.1909089031 ==  0.1909090000 [emergent] (no adapt)
I_net:021 [py]  0.1593538509 ==  0.1593540000 [emergent] (no adapt)
I_net:022 [py]  0.1330144077 ==  0.1330140000 [emergent] (no adapt)
I_net:023 [py]  0.1110285648 ==  0.1110290000 [emergent] (no adapt)
I_net:024 [py]  0.0926767385 !=  0.0000000000 [emergent] (no adapt)
I_net:025 [py]  0.1909090906 ==  0.1909090000 [emergent] (no adapt)
I_net:026 [py]  0.1593538692 ==  0.1593540000 [emergent] (no adapt)
I_net:027 [py]  0.1330143868 ==  0.1330140000 [emergent] (no adapt)
I_net:028 [py]  0.1110285377 ==  0.1110290000 [emergent] (no adapt)
I_net:029 [py]  0.0926767133 !=  0.0000000000 [emergent] (no adapt)
[... the same five-value cycle repeats through I_net:159 ...]
I_net:160 [py]  0.0564935065 ==  0.0564935000 [emergent] (no adapt)
I_net:161 [py]  0.0110181196 ==  0.0110181000 [emergent] (no adapt)
I_net:162 [py] -0.0013182966 == -0.0013183000 [emergent] (no adapt)
I_net:163 [py] -0.0044451089 == -0.0044451100 [emergent] (no adapt)
I_net:164 [py] -0.0049864074 == -0.0049864100 [emergent] (no adapt)
I_net:165 [py] -0.0048191211 == -0.0048191200 [emergent] (no adapt)
I_net:166 [py] -0.0044769353 == -0.0044769400 [emergent] (no adapt)
I_net:167 [py] -0.0041102019 == -0.0041102000 [emergent] (no adapt)
I_net:168 [py] -0.0037597610 == -0.0037597500 [emergent] (no adapt)
I_net:169 [py] -0.0034352838 == -0.0034352800 [emergent] (no adapt)
I_net:170 [py] -0.0031376910 == -0.0031376800 [emergent] (no adapt)
I_net:171 [py] -0.0028655580 == -0.0028655600 [emergent] (no adapt)
I_net:172 [py] -0.0026169357 == -0.0026169300 [emergent] (no adapt)
I_net:173 [py] -0.0023898582 == -0.0023898500 [emergent] (no adapt)
I_net:174 [py] -0.0021824772 == -0.0021824800 [emergent] (no adapt)
I_net:175 [py] -0.0019930897 == -0.0019931000 [emergent] (no adapt)
I_net:176 [py] -0.0018201359 == -0.0018201400 [emergent] (no adapt)
I_net:177 [py] -0.0016621903 == -0.0016621900 [emergent] (no adapt)
I_net:178 [py] -0.0015179506 == -0.0015179500 [emergent] (no adapt)
I_net:179 [py] -0.0013862277 == -0.0013862300 [emergent] (no adapt)
I_net:180 [py] -0.0012659352 == -0.0012659300 [emergent] (no adapt)
I_net:181 [py] -0.0011560813 == -0.0011560900 [emergent] (no adapt)
I_net:182 [py] -0.0010557602 == -0.0010557600 [emergent] (no adapt)
I_net:183 [py] -0.0009641446 == -0.0009641470 [emergent] (no adapt)
I_net:184 [py] -0.0008804792 == -0.0008804800 [emergent] (no adapt)
I_net:185 [py] -0.0008040740 == -0.0008040730 [emergent] (no adapt)
I_net:186 [py] -0.0007342990 == -0.0007342990 [emergent] (no adapt)
I_net:187 [py] -0.0006705788 == -0.0006705790 [emergent] (no adapt)
I_net:188 [py] -0.0006123881 == -0.0006123930 [emergent] (no adapt)
I_net:189 [py] -0.0005592470 == -0.0005592500 [emergent] (no adapt)
I_net:190 [py] -0.0005107173 == -0.0005107280 [emergent] (no adapt)
I_net:191 [py] -0.0004663988 == -0.0004664090 [emergent] (no adapt)
I_net:192 [py] -0.0004259262 == -0.0004259350 [emergent] (no adapt)
I_net:193 [py] -0.0003889657 == -0.0003889740 [emergent] (no adapt)
I_net:194 [py] -0.0003552124 == -0.0003552230 [emergent] (no adapt)
I_net:195 [py] -0.0003243882 == -0.0003243950 [emergent] (no adapt)
I_net:196 [py] -0.0002962388 == -0.0002962500 [emergent] (no adapt)
I_net:197 [py] -0.0002705322 == -0.0002705450 [emergent] (no adapt)
I_net:198 [py] -0.0002470562 == -0.0002470670 [emergent] (no adapt)
I_net:199 [py] -0.0002256175 == -0.0002256270 [emergent] (no adapt)
F..
======================================================================
FAIL: test_emergent_neuron (__main__.UnitTestsBehavior)
Test quantitative equivalence with emergent on the neuron tutorial.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "unit_test.py", line 178, in test_emergent_neuron
    self.assertTrue(check)
AssertionError: False is not true

----------------------------------------------------------------------
Ran 8 tests in 0.617s

FAILED (failures=1)
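The suspected zeroing can be sketched as follows. This is a hypothetical reconstruction inferred from the log above, not code from either implementation; note that the threshold is an assumption (the issue text says 0.01, but the values zeroed in the log sit near 0.09, so 0.1 is used here):

```python
def clamp_small_positive(i_net, threshold=0.1):
    """Zero out small positive I_net values, leaving negative ones intact.

    Hypothetical reconstruction of the behavior observed in emergent
    r10726; the threshold value is an assumption.
    """
    if 0.0 < i_net < threshold:
        return 0.0
    return i_net
```

For example, `clamp_small_positive(0.0926767133)` returns 0.0, matching the `!=` lines in the log, while negative values such as -0.0286363636 pass through unchanged.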

Speed and stability information

Hello,

This is not a bug so much as a request for more information. At the moment I am using emergent and the PBWM model for my PhD project. However, communicating from a Python script with the TCP server implemented within emergent is very slow. Since the model is used for its reinforcement-learning capabilities, many iterations of the same task are required for it to learn, and at the moment it takes a virtual agent nearly 2 h to complete one task. You can see where my problem lies: I need the agent to complete this task at least 1000 times to get significant results.
This is why I would be quite interested in building the PBWM model using this Python implementation of the Leabra framework. My concern, however, is that this repository is still a work in progress. Would it be possible, in your opinion, to implement a stable version of the PBWM model? And how long does it take, for rather large networks (6 layers with at least 100 neurons each), to propagate one trial?

Thank you very much in advance for your help.
Sincerely

Quantitative behavior of Unit

The Python implementation behaves very similarly to the emergent version, but it is not quantitatively identical. For instance, in the two following runs, the maximum of v_m_eq (at t=121) is roughly 0.625 in the Python version, but 0.65 in the emergent one.

emergent:
[screenshot, 2016-09-12 6:22 PM]

Python:
[screenshot, 2016-09-12 6:20 PM]

A difference in parameters may be to blame (although they have been checked), or the half-step integration method used in emergent may be creating the discrepancy. Or something else entirely.

This might affect #1.
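The half-step suspicion can be illustrated with a generic numerical example: for the same membrane-potential ODE, a plain Euler step and a midpoint (half-step) method converge to the same fixed point at slightly different rates. This is only an illustration of how the integration scheme alone can shift trajectories, not the actual integration code of either implementation:

```python
def euler_step(v, f, dt):
    # One plain Euler step: v += dt * f(v).
    return v + dt * f(v)

def midpoint_step(v, f, dt):
    # Midpoint (half-step) method: evaluate f at a half Euler step.
    v_half = v + 0.5 * dt * f(v)
    return v + dt * f(v_half)

# Toy leak dynamics: dv/dt = -(v - v_rest). Both methods converge to
# v_rest, but along slightly different trajectories, enough to shift
# e.g. the maximum of v_m_eq between two implementations.
f = lambda v: -(v - 0.3)
v_euler = v_mid = 1.0
for _ in range(100):
    v_euler = euler_step(v_euler, f, 0.1)
    v_mid = midpoint_step(v_mid, f, 0.1)
```

After 100 steps both values are within 0.01 of the resting potential, yet they are not equal to each other, which is the kind of small quantitative divergence described above.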

Quantitative equivalence for weights update

The current Python code has a weight-update behavior somewhat similar to emergent's, but it is still far from quantitative equivalence. The test is conducted on a network of one input neuron and one output neuron. The input neuron is forced to an activation of 0.95, and the output neuron should produce an output as close to 1.0 as possible. The connection weight starts at 0.5.

λ:data § nosetests ../network_test.py
.F
======================================================================
FAIL: Quantitative test on the pair of neurons scenario
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/fabien/research/renc/projects/basal/leabra/tests/network_test.py", line 77, in test_simple_pattern_learning
    self.assertTrue(check)
AssertionError: False is not true
-------------------- >> begin captured stdout << ---------------------
wt:0 [py] 0.5650068270 != 0.5000000000 [emergent]
wt:1 [py] 0.6144305580 != 0.5882080000 [emergent]
wt:2 [py] 0.6505487449 != 0.6548680000 [emergent]
wt:3 [py] 0.6757719600 != 0.7033050000 [emergent]
wt:4 [py] 0.6921352764 != 0.7375150000 [emergent]
wt:5 [py] 0.7011855944 != 0.7608890000 [emergent]
wt:6 [py] 0.7040242682 != 0.7759490000 [emergent]
wt:7 [py] 0.7009828160 != 0.7844460000 [emergent]
wt:8 [py] 0.6921667615 != 0.7875390000 [emergent]
wt:9 [py] 0.6779418603 != 0.7855580000 [emergent]
wt:10 [py] 0.6585741393 != 0.7782380000 [emergent]
wt:11 [py] 0.6343030064 != 0.7658020000 [emergent]
wt:12 [py] 0.6054100360 != 0.7483020000 [emergent]
wt:13 [py] 0.5722791354 != 0.7257010000 [emergent]
wt:14 [py] 0.5354419339 != 0.6979620000 [emergent]
wt:15 [py] 0.4955992401 != 0.6651280000 [emergent]
wt:16 [py] 0.4536154399 != 0.6274090000 [emergent]
wt:17 [py] 0.4104754985 != 0.5852450000 [emergent]
wt:18 [py] 0.3672306142 != 0.5393680000 [emergent]
wt:19 [py] 0.3249049103 != 0.4907760000 [emergent]

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
Ran 2 tests in 1.230s

FAILED (failures=1)
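The ==/!= lines in these logs suggest an element-wise comparison with an absolute tolerance. A minimal sketch of such a check (a hypothetical helper, not the repository's actual test code; the tolerance value is an assumption based on emergent logging about six significant digits):

```python
def compare_traces(py_values, em_values, tol=1e-5):
    """Compare two value traces element-wise, printing ==/!= per step.

    Returns True only if every pair agrees within the absolute
    tolerance `tol`.
    """
    all_equal = True
    for t, (py, em) in enumerate(zip(py_values, em_values)):
        equal = abs(py - em) <= tol
        all_equal = all_equal and equal
        print('wt:{} [py] {:.10f} {} {:.10f} [emergent]'.format(
            t, py, '==' if equal else '!=', em))
    return all_equal
```

For instance, `compare_traces([0.5650068270], [0.5000000000])` prints a `!=` line and returns False, mirroring the first line of the captured output above.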

Cannot include Leabra as a dependency because it is not on PyPI

Hello,

the PsyNeuLink modeling library uses the Leabra package in part of our codebase, but we're unable to include it in our requirements.txt file because PyPI requires that all of a package's dependencies be uploaded to PyPI themselves (i.e. it does not allow direct dependencies).

Currently, we check whether a user has the package installed when they attempt to instantiate a LeabraMechanism. If it's not installed, we raise an error that points them to your git repository for installation.

Naturally, a more elegant solution would be to simply include the Leabra package in our requirements, so that it would be installed automatically, like our other dependencies, via pip. It would be very convenient for us if leabra were uploaded to PyPI, if it is possible and practical for you to do so.

Thank you.
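The install-time check described above can be sketched as a generic import-guard pattern. This is not PsyNeuLink's actual code, and the repository URL in the error message is illustrative only:

```python
try:
    import leabra  # only available if the user installed it manually
    HAS_LEABRA = True
except ImportError:
    HAS_LEABRA = False

def require_leabra():
    """Raise a helpful error when leabra is needed but not installed."""
    if not HAS_LEABRA:
        raise ImportError(
            "The 'leabra' package is required for LeabraMechanism. "
            "Until it is on PyPI, install it from its git repository, "
            "e.g.: pip install git+<repository-url>"
        )
```

Uploading the package to PyPI would make this guard unnecessary, since `leabra` could then be listed directly in requirements.txt.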
