This library has hit an interesting point. We have all of the features needed to recreate things like `@ode_def`, including computation of quantities like the inverse of `W = I - gamma*J`, where `J` is the Jacobian.
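As a standalone illustration (plain numeric matrices here, not this library's graph types, and `f`/`J`/`u` are made up for the example), this is the quantity in question: an implicit solver builds `W = I - gamma*J` from the Jacobian of the RHS and inverts or factorizes it.

```julia
using LinearAlgebra

# Hypothetical RHS f and its hand-written Jacobian J (illustration only)
f(u) = [u[1] * u[2], -u[2]^2]
J(u) = [u[2] u[1]; 0.0 -2*u[2]]

u = [1.0, 2.0]
gamma = 0.1

# The matrix an implicit method factorizes each step
W = I - gamma * J(u)
Winv = inv(W)
```

Doing this symbolically, node by node in a computational graph, is exactly the stress test described below.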
But at the same time I have to admit defeat. The reason is that the computations are getting slow. Inverting matrices in computational graphs is quite a stress test, so some slowdown is expected, but it's taking tens of seconds instead of the roughly one second it took with ParameterizedFunctions.jl and SymEngine.jl. I'm afraid the current architecture will not scale well at all.
The issue goes back to our discussion of, and hopes for, contexts. We want to know a ton of information about our variables, but actually keeping that information inside of the computational graph is too much. The intent was for properties to be well-defined, but what has ended up happening is that the context is only used to gather the variables into specific arrays during the system's construction, and afterwards everything works from those arrays. So the computational graph for each equation holds information it never uses, solely to automate the discovery of variables, and in many cases that automation is overridden anyway so that the variables end up in a different order than what the automated finder gives (since it's not all that great...).
Now let me be clear: graph construction is cheap, but graph operations are expensive.
I think we have a clear path around this, though. The true format for the equations should become a Julia AST earlier in the pipeline. Operations on the graph should definitely be done on Julia ASTs (something @YingboMa was considering before); the question is how far back we go:
1. Do we have the user give us vectors of expressions, plus vectors of the variables each time, building the system more manually?
2. During system construction, do we turn the equations into Julia ASTs?

If we do 1, the interface changes quite a bit. If we do 2, we can keep the same user interface and switch up the internals to optimize them just for this goal. It would also let us keep the automation. I am inclined to go with route 2.
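A minimal sketch of route 2, using stand-in types (`Var` and `Op` are hypothetical here, not this package's actual types): the user keeps building a graph as today, but at system-construction time we lower it to a Julia `Expr` so all later manipulation happens on ASTs.

```julia
# Stand-in graph node types (illustration only)
struct Var
    name::Symbol
end
struct Op
    f::Symbol
    args::Vector{Any}
end

# Lower a graph node to a Julia Expr
to_expr(v::Var) = v.name
to_expr(c::Number) = c
to_expr(o::Op) = Expr(:call, o.f, map(to_expr, o.args)...)

# sigma*(y - x) as a graph, lowered to an AST
eq = Op(:*, Any[Var(:sigma), Op(:-, Any[Var(:y), Var(:x)])])
ex = to_expr(eq)
```

The point is that the lowering is a single cheap pass; everything expensive (differentiation, simplification, inversion) then runs on plain `Expr`s.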
Either way, we would need to rewrite our graph computations. But since they would all operate on Julia ASTs, we can make better use of existing tooling. I am considering using Reduce.jl here; the only issue is Windows packaging, which I've discussed with @chakravala. That would give us simplification, differentiation, etc. routines for free, and it would be better than what we had. Thus, with route 2, we can keep the same user-facing tooling and redo only the internal computation part.
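To make concrete what "routines for free" means, here is a toy hand-rolled version (this is not Reduce.jl's API; it's just an illustration of the kind of AST-level routine a CAS backend would supply) that differentiates a Julia `Expr`, handling only `+`, two-argument `*`, constants, and symbols:

```julia
# Toy symbolic differentiation of a Julia Expr with respect to x
deriv(ex::Symbol, x::Symbol) = ex === x ? 1 : 0
deriv(ex::Number, x::Symbol) = 0
function deriv(ex::Expr, x::Symbol)
    @assert ex.head === :call
    f = ex.args[1]
    args = ex.args[2:end]
    if f === :+
        Expr(:call, :+, (deriv(a, x) for a in args)...)
    elseif f === :* && length(args) == 2
        a, b = args
        # product rule: (a*b)' = a'*b + a*b'
        Expr(:call, :+,
             Expr(:call, :*, deriv(a, x), b),
             Expr(:call, :*, a, deriv(b, x)))
    else
        error("no derivative rule for $f")
    end
end

d = deriv(:(sigma * x), :x)  # :(0 * x + sigma * 1), before simplification
```

A real backend would also simplify `0 * x + sigma * 1` down to `sigma`, which is exactly the part we don't want to maintain ourselves.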
With a Julia AST, we will need to enforce that the symbols used for the names are unique. That's one thing to note. We can make the AST use componentized names, i.e. `comp1.a` would be the name of the variable in component 1, and that would solve #36. It is actually a valid Julia symbol: `x = Symbol("comp.a")`. Building the variable vectors would then require doing this kind of renaming.
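A sketch of what that renaming could look like (`namespaced` and `rename` are hypothetical helpers, not existing API): every symbol owned by a component gets prefixed with the component name, producing valid-but-unique symbols like `Symbol("comp1.a")`.

```julia
# Prefix a variable symbol with its component's name, e.g. :a -> Symbol("comp1.a")
namespaced(comp::Symbol, v::Symbol) = Symbol(comp, '.', v)

# Walk an Expr, renaming only the symbols owned by this component
rename(ex, comp, vars) = ex  # numbers, etc. pass through
rename(ex::Symbol, comp, vars) = ex in vars ? namespaced(comp, ex) : ex
rename(ex::Expr, comp, vars) =
    Expr(ex.head, (rename(a, comp, vars) for a in ex.args)...)

# comp1 owns a and b; c is external and the call head :+ is untouched
renamed = rename(:(a + b * c), :comp1, (:a, :b))
```

Note that function-call heads like `:+` are left alone because they aren't in the component's variable list.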
@AleMorales might have some useful input here. I've also mentioned this to @jrevels but I don't know if Cassette really does all that much here (?).