Think of a special function not (yet) implemented in Julia, like Lambert W, 
or Struve, or ... So you define it yourself, e.g. by applying 
Newton-Raphson. How would you differentiate that symbolically? Or your 
application forces you to define a procedure with if-else constructs and 
loops; how do you differentiate that? Even an advanced "automatic 
differentiation" approach will most often not be capable of solving this for you.
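
To make this concrete, here is a rough sketch of how such a definition 
might look. It is my own ad-hoc construction (the name lambertw, the 
starting value, and the tolerance included), not an existing API:

# Sketch: Lambert W (principal branch) via Newton-Raphson,
# i.e. solve w * exp(w) = x for w; iteration cap and tolerance are ad hoc.
function lambertw(x::Float64)
  w = x < 1.0 ? x : log(x)              # crude starting value
  for k = 1:100
      wnew = w - (w*exp(w) - x) / (exp(w)*(w + 1.0))
      if abs(wnew - w) < 1e-14; return wnew; end
      w = wnew
  end
  return w
end

There is no closed-form expression here to hand to a CAS; the function 
*is* the iteration.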

Or imagine having to minimize a function such as

# 15 points in the unit cube; x stores the coordinates as
# [x-coords; y-coords; z-coords], i.e. point i is (x[i], x[i+15], x[i+30]).
function repelling_balls(x::Array{Float64, 1})
  if length(x) != 45; error("Arg. 'x' must be a vector of length 45."); end
  d = 2.0                   # upper bound: the diameter of [0,1]^3 is sqrt(3)
  for i = 1:14, j = (i+1):15
      s = sqrt((x[i]-x[j])^2 + (x[i+15]-x[j+15])^2 + (x[i+30]-x[j+30])^2)
      if s < d; d = s; end  # track the smallest pairwise distance
  end
  return -d                 # minimizing -d maximizes the minimal distance
end

that models 15 repelling balls/points in [0, 1]^3.

As "minimax" problem this function is smooth except for a subset of lower 
dimension. Some optimization procedures using gradients will be able to 
circumvent this subset and find a (local) minimum. How will you 
differentiate that (i.e., build a gradient) if not numerically; and if yes, 
is it worth the effort?
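
For what it's worth, the kind of numerical gradient I have in mind is only 
a few lines. A sketch, with a rule-of-thumb step size rather than a tuned 
one:

# Central-difference gradient; h = cbrt(eps()) is a common rule of
# thumb. Perturbs x in place and restores it afterwards.
function numgrad(f::Function, x::Array{Float64, 1})
  h = cbrt(eps())
  g = similar(x)
  for i = 1:length(x)
      xi = x[i]
      x[i] = xi + h; fp = f(x)
      x[i] = xi - h; fm = f(x)
      x[i] = xi
      g[i] = (fp - fm) / (2.0*h)
  end
  return g
end

Called as numgrad(repelling_balls, rand(45)), say: away from the 
lower-dimensional exceptional set the function is smooth, and the central 
difference behaves well there.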

Consider that an optimization or root-finding procedure is a (hopefully) 
converging process that will not return an absolutely precise result. A 
function may itself be defined through such an optimization or root-finding 
task, and its derivative then has only a technical, numerical meaning.
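
A toy illustration of what I mean (my own construction, nothing from a 
package): let y(p) be defined implicitly as the root of y + exp(y) - p = 0, 
computed by Newton's method, and differentiate it numerically:

# y(p) is defined only through an inner root-finding iteration
function yroot(p::Float64)
  y = 0.0
  for k = 1:50
      y = y - (y + exp(y) - p) / (1.0 + exp(y))
  end
  return y
end

h  = cbrt(eps())
dy = (yroot(1.0 + h) - yroot(1.0 - h)) / (2.0*h)

Implicit differentiation gives dy/dp = 1/(1 + exp(y)), which is exactly 0.5 
at p = 1.0, so you can check that the central difference reproduces it to 
roughly ten digits, limited by the inner iteration and the difference 
formula alike.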

In real-world applications, the kind of tasks Matlab is made for, you do 
not care about 15 digits; gradients accurate to five, six, or seven digits 
are more than enough to solve your problem. Normally, your functions are 
not defined to eps() accuracy anyway.
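
A quick check of the scale involved, on a derivative whose exact value is 
known:

h   = cbrt(eps())
err = abs((sin(1.0 + h) - sin(1.0 - h)) / (2.0*h) - cos(1.0))

err comes out on the order of 1e-11, i.e. ten to eleven correct digits: far 
from eps(), but far more than the five to seven an optimizer needs.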

When I came here, I thought Julia was meant for "technical computing" and 
not so much for pure math exercises. In scientific computing, calculating 
numerical derivatives and gradients is essential for many applications.


On Tuesday, January 21, 2014 3:28:28 AM UTC+1, Jason Merrill wrote:
>
> Would you be willing to share a problem that you're interested in that 
> isn't amenable to symbolic differentiation, maybe on a gist or something? 
> I've been looking for non-trivial problems to try to apply PowerSeries to.
>
> When I said numerical differentiation is "unstable", I meant something 
> like "hard to do without losing several digits". For example, you point out 
> that by choosing a different h, it's possible to improve over Calculus.jl's 
> current behavior by 2-3 decimal digits, from 7e-7 to 2e-9 in absolute 
> error. But the best precision you could hope for in a floating point answer 
> to this problem is
>
> julia> eps(sin(1.0))
> 1.1102230246251565e-16
>
> so even with your improvements, you've lost a lot of precision. That might 
> or might not be a problem depending on what you'd like to do next with the 
> value you get out.
>
>
> On Monday, January 20, 2014 12:30:38 PM UTC-8, Hans W Borchers wrote:
>>
>> Numerical differentiation is by far not as unstable as you seem to think.
>> And I have long experience in using numerical derivatives for 
>> optimization problems, where you don't stop to look up symbolic 
>> derivatives with a CAS.
>> The function obviously was only an example.
>> For most of the functions I have used, Julia's symbolic capabilities will 
>> not be sufficient.
>>
>>
>> On Monday, January 20, 2014 9:07:02 PM UTC+1, Jason Merrill wrote:
>>>
>>> This implementation could certainly use some love, but finite-difference 
>>> differentiation is always unstable, and the situation gets worse as you 
>>> take higher-order derivatives.
>>>
>>> You might want to consider using the differentiate method to take your 
>>> derivatives symbolically (if this works for the functions you're using), or 
>>> have a look at the differentiation example in PowerSeries.jl. To do a 
>>> symbolic second derivative, you can do, e.g.
>>>
>>> julia> using Calculus
>>> julia> @eval d2(x) = $(differentiate(differentiate(:(sin(x)))))
>>> julia> d2(1.0)
>>> -0.8414709848078965
