Comments (18)
@ingablundell What is your opinion?
from nestml.
@Silmathoron Thank you for this! I am still not sure this is more intuitive. Your problem with the "curr_sum" notation was, if I remember this correctly, that it is unintuitive to write down a convolution of a function that you don't know. But when you write down the equation plus the initial values the solution will be unique (you won't be able to state anything weird with a computer). So from a mathematical point of view it makes a lot of sense to define the convolution. And I think it is nice to be able to see exactly what the equation is we are solving from looking at the ODE.
@ingablundell that was not really my argument: I was saying that if he provides an ODE and its initial conditions in the equation block, the user expects you to solve this ODE in a step by step fashion, which means he does not expect NESTML to use a step function, thus the use of the convolution.
My objective was:
- to reduce the number of necessary code lines (only two in addition to the function definition -- shape or ODE)
- to avoid apparent code redundancy (if you do not know the internals of NESTML, how can you understand the necessity for both L74 and L151 for the implicit definition?)
- to make the implicit and explicit cases identical in terms of function calls (only the apply_spikes(...) function)
- to make it more intuitive
The last point is arguably not guaranteed since it is based only on my gut feeling and a discussion with Sam during the workshop, which makes it a rather small user panel...
EDIT: maybe the best would be to send a form on NEST_USER to directly get the opinion of the guys who might use NESTML...
- yes, that would reduce the code a bit.
- L151 we don't actually need at all, and we are in the process of getting rid of it.
- In which way do they differ in our notation at the moment?
- Maybe you are right, and we should ask on NEST_USER. I also think, if your suggestion is more intuitive to you, it might well be to others. So, in my opinion, we should at least implement both ways. What do you think @jougs? But I am not sure I want to get rid of the convolution notation at the moment, because to me it is a nice clean way of writing it down.
Also: Am I right in assuming that you want no explicit initial values for I_syn_ex'?
@ingablundell ok, removing L151 would at the very least make both implementations more coherent! (which also addresses point 3)
I think asking the users is best... you could implement both, but I think this could easily lead to confusing code constructs...
So you should either make both statements (*_sum and apply_spikes) incompatible, or find another way to make sure they are not involved in several places. I must say I'm a priori against this.
Eventually, if you decide to keep these *_sum functions, maybe they should be fused into one unique convolve function which would have a return value of the same type as the shape function...
EDIT: and no, I do want to be able to set initial values for I_syn_ex' (we need them to solve the ODE anyway; it might even be better to make them compulsory, or at least raise a warning if one is not present, to say it was initialized at zero)
I just realized that my proposal might not be as simple to implement as I first thought...
There are two problems:
- can we always consider (in the implicit case) that the spikes are applied on the highest derivative? I think so, but there could be exceptions, so I'll have to look at other existing synaptic shapes (if there are some)
- for these custom shapes, we cannot necessarily find the formula of the peak value, so there might be the need for a third "normalization" parameter in the apply_spikes function for the implicit declaration.
Any thoughts/comment on that?
@Silmathoron
a) This problem is solved by stating all initial values (also those which equal 0). Basically, the problem statement is the same as with the curr_/cond_sum notation.
b) Can you give an example of the problem?
I'm sorry, I did not get your a) point... which issue were you referring to?
Maybe to clarify:
- for alpha synapses, the spikes must be applied to the first derivative of the current or conductance, and to normalize the peak value to either 1 pA or 1 nS, the spikes are weighted by e / tau_syn
- for exp synapses, the spikes are applied directly to the current/conductance, so this normalization constant is not needed.
This means that:
- as long as we are dealing with the standard alpha and exp, we can recognize them and add the spikes to the correct time derivative
- however, we cannot be sure that the spikes will always have to be added to the highest derivative in the GSL system, so we need to make it clear to the user that he/she should specify the right derivative in apply_spikes
- for the normalization constant, however, we have a problem because we do not know which parameter should be used as the synaptic time constant; therefore, I think the "normalization" parameter is necessary, which would give apply_spikes(I_syn_ex', spikes_ex, normalization=e/tau_syn_ex)
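The exp case can be checked with a quick plain-Python sketch (illustrative only, not NESTML code; the time constant and weight are assumed example values, integrated with simple Euler steps): adding the spike weight directly to the current makes the peak equal the weight, so no normalization constant is needed.

```python
import math

# Sketch of the exp-synapse case: the spike weight is added directly to
# the current, so the peak equals the weight and no normalization
# constant is needed. tau_syn and w are assumed example values.
tau_syn = 2.0   # synaptic time constant (ms)
dt = 1e-3       # integration step (ms)
w = 1.0         # spike weight (1 pA or 1 nS)

I = 0.0
I += w          # spike applied directly to I, as in apply_spikes(I_syn_ex, spikes_ex)
peak = I
for _ in range(int(10.0 / dt)):
    I += dt * (-I / tau_syn)   # I' = -I / tau_syn between spikes
    peak = max(peak, I)

print(peak)     # 1.0 -- the peak equals the spike weight
```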
To summarize, it would lead to:
- psc_exp implicit --> apply_spikes(I_syn_ex, spikes_ex) (normalization is not necessary)
- cond_exp implicit --> apply_spikes(g_ex, spikes_ex) (normalization is not necessary)
- psc_alpha implicit --> apply_spikes(I_syn_ex', spikes_ex, normalization=1pA*e/tau_syn_ex) (spikes applied to the 1st derivative)
- cond_alpha implicit --> apply_spikes(g_ex', spikes_ex, normalization=1nS*e/tau_syn_ex) (spikes applied to the 1st derivative)
- psc_exp explicit --> apply_spikes(I_syn_ex, spikes_ex, shape=I_shape_ex)
- cond_exp explicit --> apply_spikes(g_ex, spikes_ex, shape=g_shape_ex)
- psc_alpha explicit --> apply_spikes(I_syn_ex, spikes_ex, shape=I_shape_ex) (normalization is included in the shape function)
- cond_alpha explicit --> apply_spikes(g_ex, spikes_ex, shape=g_shape_ex) (normalization is included in the shape function)
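For the alpha cases, the claim that weighting the spike by e/tau_syn normalizes the peak to 1 can be verified numerically. This is a plain-Python sketch with an assumed example time constant, not NESTML code:

```python
import math

# Numerical check of the alpha-synapse normalization: applying a unit
# spike to the first derivative, weighted by e/tau, makes the resulting
# alpha-shaped current peak at 1. tau is an assumed example value.
tau = 2.0   # synaptic time constant (ms)
dt = 1e-4   # Euler integration step (ms)

# Integrate I'' = -I/tau**2 - 2*I'/tau with the spike applied to the
# first derivative: I(0) = 0, I'(0) = e/tau (the normalization weight).
I, dI = 0.0, math.e / tau
peak = 0.0
for _ in range(int(20.0 / dt)):
    I, dI = I + dt * dI, dI + dt * (-I / tau**2 - 2.0 * dI / tau)
    peak = max(peak, I)

# Cross-check: the closed-form solution e/tau * t * exp(-t/tau) peaks at
# t = tau with value e * exp(-1) = 1, regardless of tau.
analytic_peak = (math.e / tau) * tau * math.exp(-1.0)
print(round(peak, 3), round(analytic_peak, 3))
```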
- I prefer "convolve" too. We chose *_sum because a lot of people from our institute thought that users might not know what "convolve" means.
- This is what I was trying to say when you were in Juelich. Either you state
a) all initial values up to the order n-1 (n being the order of the maximum derivative), or
b) you must state which equations the sum of delta functions is added to, i.e. which derivatives the spikes are "applied to".
To me b) is a bit weird (maybe because I'm neither a modeler nor a physicist), and we would want it to be possible to apply spikes not only to the derivative of maximum order (i.e. n-1 here). This:
psc_alpha implicit --> apply_spikes(I_syn_ex', spikes_ex, normalization=1pA*e/tau_syn_ex) (spikes applied to the 1st derivative)
is definitely a possibility, but we would also have to allow calling apply_spikes several times for one "shape", right? E.g. we might want I(0)=0; I'(0)=a; I''(0)=b; a and b being some constants. Then we would have to call apply_spikes for I and I'.
- Ok... well I don't know about that, but anyway, even if we keep a name with _sum, I think it is weird to have 2 functions that do the same thing.
- I don't understand your "either"... you have to state both a) and b), don't you? I don't see how stating only one of the two would allow you to skip the other.
I think you try to answer this question already in your last sentence... but I really don't understand what you mean...
Why on earth would spikes apply to both I and I'?
- Yes, I agree. Two functions for the same thing is confusing.
- In your file, you don't state any initial values for I', right? And if I understood your answer above correctly, you don't want to or intend to? The reason for this not being necessary is that what apply_spikes(I', spikes, normalization=1pA*e/tau) implies is:
I'' = ... + \sum_{t_i \le t} \sum_{k \in \mathrm{spikes}} w_k \, \delta(t - t_i) \, \frac{e}{\tau}
I' = ...
Therefore we don't need to state any initial values for I'.
About applying spikes on both I and I': no idea. What I meant was just that I could conceive of people stating a number of initial values that don't equal zero. But maybe that wouldn't happen.
I completely agree with you until the "Therefore": we still need to state the initial value of I', otherwise the problem is not closed. Initial values are always necessary for all variables defined in the state block.
I'm not sure I understand your sentence
I could conceive people stating a number of initial values that don't equal zero.
Could you elaborate on that? Do you mean "initialize a state variable with a non-zero value" or "initialize a finite number of state variables"?
Well anyway, as I said before, I think all variables in the state block must be initialized.
What do you mean by "otherwise the problem is not closed"? Did you possibly overlook the \frac{e}{\tau} in the equation? By adding the deltas we implicitly use initial values 0 for I and e/tau for I'.
I'm a bit confused. Did you forget to state the initial values for I' in your test.nestml file? I am not actually trying to advocate not stating initial values.
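The equivalence at issue here can be demonstrated numerically: a delta of area e/tau added to I'' at t = 0 acts exactly like starting the spike-free ODE from I(0) = 0, I'(0) = e/tau, because the delta integrates to a jump of e/tau in I'. A plain-Python sketch (Euler integration, assumed example tau, not NESTML code):

```python
import math

# A delta pulse of area e/tau fed into I'' at t = 0 is the same as
# starting the spike-free ODE from I(0) = 0, I'(0) = e/tau: the delta
# integrates to a jump in I'. tau is an assumed example value.
tau = 2.0
dt = 1e-3
steps = int(10.0 / dt)

def simulate(i0, di0, kick):
    """Euler-integrate I'' = -I/tau**2 - 2*I'/tau; `kick` is added to I'
    in the first step (a discretized delta of area `kick`)."""
    i, di = i0, di0
    di += kick
    traj = []
    for _ in range(steps):
        i, di = i + dt * di, di + dt * (-i / tau**2 - 2.0 * di / tau)
        traj.append(i)
    return traj

with_delta = simulate(0.0, 0.0, kick=math.e / tau)   # spike as delta input
with_init  = simulate(0.0, math.e / tau, kick=0.0)   # spike as initial value

# The two trajectories are identical.
print(max(abs(a - b) for a, b in zip(with_delta, with_init)))  # 0.0
```

Either way, the full set of initial values (explicit or implied by the deltas) is what makes the problem uniquely determined.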
I meant that you're missing initial conditions and therefore that your system is not uniquely determined: diracs do not provide the equivalent of initial conditions, so I do not understand your
By adding the deltas we implicitly use initial values 0 for I and e/tau for I'.
Do you mean that in NESTML you set I = 0 and I' = e/tau if the user provides the deltas? If so, I must say I find it surprising to implicitly initialize variables with missing initial conditions to a non-zero value...
But since the problem is mathematically not uniquely defined without all initial conditions, I think that at least a warning should be raised, stating that, by default, the values are initialized (to zero, if possible).
I'll try to summarize the current state of the discussion regarding spike handling (#368 is related to the current issue). The idea is to change the current NESTML syntax with respect to initial values as follows:
a) For every differential equation from the equations block, all initial values (i.e. from order 0 to the highest order - 1) must be explicitly stated.
b) Buffers store the type, cf. #368.
c) The shape is associated with a buffer: the return value of the convolve function is determined through the buffer that is used as the second parameter.
initial_values:
# ok, it is just a shape. it should/can be unitless
g_in real = 0
g_in' 1/ms = e / tau_syn_in
...
end
equations:
g_in'' = (-1)/(tau_syn_in)**(2)*g_in+(-2)/tau_syn_in*g_in'
...
function I_syn_in pA = convolve(g_in, spikeInh) * (V_m - E_in)
end
@ingablundell @Silmathoron What do you think?
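To make the intended semantics concrete, here is a plain-Python sketch of one possible reading of convolve: the shape (here the alpha function implied by the initial values above) is convolved with the spike buffer, and the buffer fixes the physical type of the result. All names and values (tau_syn_in, V_m, E_in, the convolve helper itself) are illustrative assumptions, not NESTML's actual implementation:

```python
import math

tau_syn_in = 2.0          # ms, assumed parameter value
V_m, E_in = -70.0, -85.0  # mV, assumed membrane and reversal potentials
dt = 0.1                  # ms, time grid
n = 500

# Alpha shape implied by g_in(0) = 0, g_in'(0) = e/tau: closed form
# e/tau * t * exp(-t/tau), normalized to unit peak at t = tau.
kernel = [math.e / tau_syn_in * (k * dt) * math.exp(-k * dt / tau_syn_in)
          for k in range(n)]

spikeInh = [0.0] * n
spikeInh[100] = 1.0       # one inhibitory spike of weight 1 nS at t = 10 ms

def convolve(kernel, buf, step):
    """Causal discrete convolution of the shape with the spike buffer,
    evaluated at time index `step` (hypothetical helper, not NESTML API)."""
    return sum(kernel[step - j] * buf[j] for j in range(step + 1))

g_in = convolve(kernel, spikeInh, 120)   # conductance tau_syn_in after the spike
I_syn_in = g_in * (V_m - E_in)           # pA: convolve(g_in, spikeInh) * (V_m - E_in)
```

With a unit-peak alpha kernel and a 1 nS spike, the conductance reaches 1 nS at one tau_syn_in after the spike, and the current follows from the driving force as in the function line of the proposal.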
@DimitriPlotnikov this looks nice to me, except for the unitless part, for which I'm not sure this is a good idea.
Moreover, in your example, g_in''
is not a shape but the actual conductance and is given by a differential equation, so it should definitely be in nS ms^-2
hence it would make more sense to have g8in
in nS
@Silmathoron
It is sort of a discussion point. Actually, you get the conductance from convolve(g_in, spikeInh) (since you know the type of the buffer). Beforehand, it is actually just a different notation for an alpha function. But I'm open to any reasonable interpretation (the current state of the NESTML model corresponds to your suggestion).
I'm even starting to think that it would be a good idea to mark shape functions explicitly, since it could avoid the confusion and would align with the functional notation of the shape function. E.g.:
initial_values:
# ok, it is just a shape. it should/can be unitless
g_in real = 0
g_in' 1/ms = e / tau_syn_in
...
end
equations:
shape g_in'' = (-1)/(tau_syn_in)**(2)*g_in+(-2)/tau_syn_in*g_in'
...
function I_syn_in pA = convolve(g_in, spikeInh) * (V_m - E_in)
end
Done in #382