deshanadesai / acorns
An Easy-To-Use Code Generator for Gradients and Hessians (https://arxiv.org/pdf/2007.05094.pdf)
License: MIT License
Dear devs,

Recently I have been trying to differentiate complicated expressions that eventually result in nested calls to `pow`. I find that the forward-mode derivative is not computed correctly in these cases. Here is an MWE:
c_function = """
int function_test(double x){
    double x2 = pow(x,2);
    double H = pow(x2,2);
}
"""

deriv = acorns.autodiff(c_function, 'H', ['x'], func = 'function_test', output_filename = 'test',
                        output_func = 'compute_grad_nested_pow')
This gives as the derivative `(pow(pow,(2-1)) * (2 * 0 + pow * 0 * log(pow)))`, which looks like it is using `pow` as a variable instead of a function.
Computing directly the derivative of

c_function = """
int function_test(double x){
    double H = pow(x,4);
}
"""

gives the correct result, `(pow(x,(4-1)) * (4 * 1 + x * 0 * log(x)))`. Unfortunately my expressions are complicated enough that no such simplification is possible.
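For reference, the nested expression above simplifies to x**4, so the expected gradient is 4*x**3. A quick finite-difference sanity check (plain Python, independent of acorns) confirms what the generated code should produce:

```python
# Hypothetical sanity check, independent of acorns: the nested expression
# H = pow(pow(x, 2), 2) equals x**4, so the correct gradient is 4 * x**3.
def H(x):
    return (x ** 2) ** 2

def expected_grad(x):
    return 4 * x ** 3

x, h = 1.7, 1e-6
numeric = (H(x + h) - H(x - h)) / (2 * h)  # central finite difference
assert abs(numeric - expected_grad(x)) < 1e-4
```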
Dear devs,

First, thanks for the useful library! It hits the perfect sweet spot for the problem I am considering.

I have found that differentiating an expression inside a `pow` call does not work correctly. Here is an MWE:
c_function = "int function_test(double r, double L){ \
double blah = pow(L+r,0.5); \
return 0; \
}"
acorns.autodiff(c_function, 'blah', ['r','L'], func = 'function_test', output_filename = 'test_grad_back',
output_func = 'compute_grad_back')
This gives as the derivatives:
void compute_grad_back(double values[], int num_points, double ders[]){
for(int i = 0; i < num_points; ++i)
{
double r = values[i* 2 + 0 ];
double L = values[i* 2 + 1 ];
ders[i*2+0]= (pow((L + r),(0.5-1)) * (0.5 * 0 + (L + r) * 0 * log((L + r)))); // df/(r)
ders[i*2+1]= (pow((L + r),(0.5-1)) * (0.5 * 0 + (L + r) * 0 * log((L + r)))); // df/(L)
}
}
i.e. zero for both derivatives, which is of course not the correct answer. It appears that the issue lies in the `._forward_diff` of the base.
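For what it's worth, here is a minimal sketch of what a forward-mode rule for `pow` with a constant exponent would look like (the names here are assumptions for illustration, not the actual acorns internals); the key point is that the tangent of the base must be propagated rather than dropped:

```python
import math

def pow_forward(u, du, c):
    """Forward-mode rule for pow(u, c) with a constant exponent c:
    d/dt pow(u(t), c) = c * pow(u, c - 1) * du.
    (Hypothetical helper, not part of acorns.)"""
    return c * math.pow(u, c - 1) * du

# df/dr of pow(L + r, 0.5): the base is u = L + r, with du/dr = 1
L, r = 2.0, 1.0
grad_r = pow_forward(L + r, 1.0, 0.5)
# matches the analytic result 0.5 * (L + r)**(-0.5)
assert abs(grad_r - 0.5 / math.sqrt(L + r)) < 1e-12
```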
Another issue arises when the base of the `pow` call is negative. The current implementation then gives `nan` even when the answer should be sensible, e.g. when differentiating `pow(pr,4)` where `pr` is -1e-4.
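To illustrate the `nan` (plain Python, independent of acorns): the power rule for a constant integer exponent stays finite for a negative base, whereas the general variable-exponent formula involves the log of the base, which is undefined for negative values (`nan` from C's `log()`, an exception from Python's `math.log`):

```python
import math

pr = -1e-4

# Power rule for the constant exponent 4: d/dx pow(x, 4) = 4 * x**3.
# Perfectly well-defined for a negative base:
grad = 4 * math.pow(pr, 3)
assert math.isfinite(grad)

# The general formula for u**v involves log(u), which is undefined
# for u < 0 (nan in C; a ValueError in Python) -- hence the reported nan:
try:
    math.log(pr)
    log_defined = True
except ValueError:
    log_defined = False
assert not log_defined
```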
Could you please fix the license?

Line 3 in 59b208e

It is missing the year and authors. After that, could you please also include the license file in the package along with the source distribution?