Would you be willing to share a problem that you're interested in that 
isn't amenable to symbolic differentiation, maybe on a gist or something? 
I've been looking for non-trivial problems to try to apply PowerSeries to.

When I said numerical differentiation is "unstable", I meant something like 
"hard to do without losing several digits". For example, you point out that 
by choosing a different h, it's possible to improve over Calculus.jl's 
current behavior by 2-3 decimal digits, from 7e-7 to 2e-9 in absolute 
error. But the best precision you could hope for in a floating point answer 
to this problem is

julia> eps(sin(1.0))
1.1102230246251565e-16

so even with your improvements, you've lost a lot of precision. That might 
or might not be a problem depending on what you'd like to do next with the 
value you get out.
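To make the tradeoff concrete, here is a small sketch (mine, not from the thread, and in Python rather than Julia only because the floating-point behavior is language-independent): a central second difference for f = sin at x = 1 has truncation error shrinking like h^2 but roundoff error growing like eps/h^2, so no choice of h recovers all ~16 digits of double precision.

```python
import math

# Central second-difference approximation:
#   f''(x) ~ (f(x-h) - 2 f(x) + f(x+h)) / h^2
# Truncation error is O(h^2); roundoff error is O(eps / h^2),
# so the total error is minimized at some intermediate h and
# never reaches full double precision (~1e-16).
def d2_central(f, x, h):
    return (f(x - h) - 2.0 * f(x) + f(x + h)) / h**2

exact = -math.sin(1.0)  # exact second derivative of sin at x = 1

for h in (1e-2, 1e-4, 1e-6):
    err = abs(d2_central(math.sin, 1.0, h) - exact)
    print(f"h = {h:g}  error = {err:.2e}")
```

Running this shows the error first shrinking as h decreases (truncation dominates) and then growing again (roundoff dominates), which is the sense in which "several digits are lost" no matter how h is tuned.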


On Monday, January 20, 2014 12:30:38 PM UTC-8, Hans W Borchers wrote:
>
> Numerical differentiation is not nearly as unstable as you seem to think.
> And I have long experience using numerical derivatives for optimization 
> problems, where you don't stop to look up symbolic derivatives with a CAS.
> The function obviously was only an example.
> For most of the functions I have used, Julia's symbolic capabilities will 
> not be sufficient.
>
>
> On Monday, January 20, 2014 9:07:02 PM UTC+1, Jason Merrill wrote:
>>
>> This implementation could certainly use some love, but finite-difference 
>> differentiation is always unstable, and the situation gets worse as you 
>> take higher-order derivatives.
>>
>> You might want to consider using the differentiate method to take your 
>> derivatives symbolically (if this works for the functions you're using), or 
>> have a look at the differentiation example in PowerSeries.jl. To do a 
>> symbolic second derivative, you can do, e.g.
>>
>> julia> using Calculus
>> julia> @eval d2(x) = $(differentiate(differentiate(:(sin(x)))))
>> julia> d2(1.0)
>> -0.8414709848078965
>>  
>>
>