>
>  I assume it works with multi-dimensional functions f(x,y,z)?
>

Yes, in a sense. ForwardDiff only accepts univariate functions, but the 
single argument to those functions can be a Number or a Vector. So, if you 
wanted the gradient of the f you gave, and x, y, z are scalars, you could 
rewrite f to accept a Vector and do:

    julia> ForwardDiff.gradient(f, [x, y, z])
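For instance, a sketch of that rewrite (the f here is a made-up function, 
just for illustration):

```julia
using ForwardDiff

# hypothetical multi-argument function, for illustration only
f(x, y, z) = x^2 + sin(y) * z

# rewrite it to take a single Vector argument
fvec(v) = f(v[1], v[2], v[3])

# gradient with respect to all three inputs at once;
# analytically this is [2x, z*cos(y), sin(y)]
g = ForwardDiff.gradient(fvec, [1.0, 2.0, 3.0])
```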
 
Otherwise, if you wanted to do something like differentiate w.r.t. 
different arguments, you could simply use closures (see my comment in this 
issue 
<https://github.com/JuliaDiff/ForwardDiff.jl/issues/32#issuecomment-137987987>
).
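As a sketch of the closure approach (again with a made-up f, and the other 
arguments held at fixed values):

```julia
using ForwardDiff

# hypothetical function, for illustration only
f(x, y, z) = x^2 + sin(y) * z

x0, z0 = 1.0, 3.0

# differentiate with respect to y alone by closing over x0 and z0;
# analytically, df/dy = z * cos(y)
dy = ForwardDiff.derivative(y -> f(x0, y, z0), 2.0)
```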

>
> What are the limitations to it?
>

ForwardDiff supports up to a third-order generalization of the Hessian, but 
you can nest the results to go to higher orders. The usage example given in 
the README <https://github.com/JuliaDiff/ForwardDiff.jl#forwarddiffjl> 
demonstrates 
nested derivatives in ForwardDiff (in that specific example, we take the 
Jacobian of the gradient and then check its equivalence with the Hessian).
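Roughly, that nesting trick looks like the following (the f below is 
hypothetical, not the README's example):

```julia
using ForwardDiff

# hypothetical function of a Vector, for illustration only
f(v) = v[1]^2 * sin(v[2])

x = [1.0, 2.0]

# the Jacobian of the gradient is the Hessian, so the two
# results below should agree up to floating-point error
H1 = ForwardDiff.jacobian(v -> ForwardDiff.gradient(f, v), x)
H2 = ForwardDiff.hessian(f, x)
```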

ForwardDiff also only supports differentiation of functions where the 
involved number types are subtypes of Real. I think generalization to 
Complex numbers is possible, but we haven't played with it yet.

>
> Where would you still use analytic derivatives?
>

If you mean hard-coding the known derivative of a function, I imagine that, 
in most cases, you'll be able to write a faster function by hand than 
ForwardDiff can generate. But analytic higher-order expressions can 
sometimes be tough and time-consuming to code efficiently. I would say that 
ForwardDiff is worth trying out in any situation that normally calls for 
numerical differentiation.

Best,
Jarrett

On Saturday, September 5, 2015 at 1:16:36 PM UTC-4, Michael Prentiss wrote:
>
> This looks very impressive.  I assume it works with multi-dimensional 
> functions f(x,y,z)?
>
> It also looks very fast.  What are the limitations to it?  Where would you 
> still use analytic derivatives?
>
