On Tue, May 4, 2010 at 9:23 PM, josef.p...@gmail.com wrote:
In [2] I didn't see anything about higher derivatives, so to get the
Hessian I still had to take finite differences (a Jacobian) of
complex_step_grad. Even then, the results look pretty good.
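To make the approach above concrete, here is a minimal sketch of a complex-step gradient plus a finite-difference Jacobian of it for the Hessian. The function names echo the ones mentioned in this thread, but the implementation and defaults are illustrative, not taken from any package:

```python
import numpy as np

def complex_step_grad(f, x, h=1e-20):
    """Gradient via f'(x) ~= Im(f(x + i*h)) / h. There is no
    subtractive cancellation, so h can be tiny."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for k in range(x.size):
        xc = x.astype(complex)
        xc[k] += 1j * h          # perturb one coordinate into the complex plane
        g[k] = f(xc).imag / h
    return g

def hessian(f, x, eps=1e-6):
    """Hessian as a forward finite-difference Jacobian of the
    complex-step gradient, as described above."""
    x = np.asarray(x, dtype=float)
    g0 = complex_step_grad(f, x)
    H = np.empty((x.size, x.size))
    for k in range(x.size):
        xp = x.copy()
        xp[k] += eps
        H[:, k] = (complex_step_grad(f, xp) - g0) / eps
    return H
```

Note that the complex step requires f to be analytic (no abs(), comparisons on the perturbed variable, etc.), which is why only the outer Hessian step falls back to real finite differences.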
Yes, the traditional complex step does
Just to make this thread more useful for someone interested in these
topics: this seems to be the book on automatic differentiation (it's
one of the references on the autodiff website):
Evaluating Derivatives: Principles and Techniques of Algorithmic
Differentiation (2nd ed.), by Andreas Griewank and Andrea Walther
Hello, I have written some very simple code that computes the gradient of
any general function by finite differences. Keeping the same idea, I would
like to modify the code using numpy to make it faster.
Any ideas?
Thanks.
def grad_finite_dif(self, x, user_data=None):
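Since the original function body didn't come through, here is one way the numpy version could look; the name, signature, and step size below are illustrative, not the poster's actual code:

```python
import numpy as np

def grad_finite_diff(f, x, eps=1e-7):
    """Forward-difference gradient of a scalar function f.
    The perturbed points are built in a single numpy operation
    instead of element-by-element Python arithmetic."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    perturbed = x + eps * np.eye(x.size)   # row k perturbs only x[k]
    return (np.array([f(p) for p in perturbed]) - fx) / eps
```

The remaining Python loop is over calls to f; the big win comes if f itself can evaluate a whole batch of points at once, in which case the loop collapses to one call on the `perturbed` array.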
If your x data are equispaced, I would do something like this:

import numpy as np

def derive(func, x):
    """Approximate the first derivative of function func at points x."""
    # compute the values of y = func(x)
    y = func(x)
    # compute the step (assumes equispaced x)
    dx = x[1] - x[0]
    # kernel array for a second-order-accurate centered difference
    kernel = np.array([1.0, 0.0, -1.0]) / (2.0 * dx)
    # 'valid' drops the two endpoints the centered stencil cannot reach
    return np.convolve(y, kernel, mode='valid')
playing devil's advocate I'd say use Algorithmic Differentiation
instead of finite differences ;)
that would probably speed things up quite a lot.
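For readers who haven't seen algorithmic differentiation before, forward mode can be sketched in a few lines with dual numbers. This toy class is mine, not taken from any of the AD packages discussed here:

```python
import numpy as np

class Dual:
    """Carry a value and its derivative through arithmetic:
    a minimal forward-mode AD sketch."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule: sin(u)' = cos(u) * u'
    return Dual(np.sin(x.val), np.cos(x.val) * x.dot)

# d/dx [x*x + sin(x)] at x = 1.2, seeding dx/dx = 1
x = Dual(1.2, 1.0)
y = x * x + sin(x)
```

Unlike finite differences, the derivative in `y.dot` is exact to machine precision, and each overloaded operation costs only a small constant factor over the plain evaluation.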
On Tue, May 4, 2010 at 11:36 PM, Davide Lasagna lasagnadav...@gmail.com wrote:
If your x data are equispaced I would do something like this
def derive(func, x):
On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter
sebastian.wal...@gmail.com wrote:
playing devil's advocate I'd say use Algorithmic Differentiation
instead of finite differences ;)
that would probably speed things up quite a lot.
I would suggest that too, but aside from FuncDesigner[0]
I forgot to mention one thing: if you are doing optimization, a good
solution is a modeling package like AMPL (or GAMS or AIMMS, but I only
know AMPL, so I will restrict my attention to it). AMPL has a natural
modeling language and provides you with automatic differentiation.
It's not free, but
On Tue, May 4, 2010 at 8:23 PM, Guilherme P. de Freitas
guilhe...@gpfreitas.com wrote:
On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter
sebastian.wal...@gmail.com wrote:
playing devil's advocate I'd say use Algorithmic Differentiation
instead of finite differences ;)
that would probably speed things up quite a lot.