Ivar, if I sounded impolite, I apologize. And I don't want to 'criticize' 
the Calculus package; I am just trying to understand the status of the 
package and where it is going. I'd love to use Julia as a technical 
computing platform in the future, but for that I would need a strong 
calculus or 'NumericalMath' package that covers lots of numerical 
functions. When you say that Calculus is not being developed much at the 
moment, maybe it's too early for me to switch.

Hans Werner


On Tuesday, January 21, 2014 2:02:49 PM UTC+1, Ivar Nesje wrote:
>
> When I came here, I thought Julia was meant for "technical computing" and 
>> not so much for pure Math exercises. In scientific computing, calculating 
>> numerical derivatives and gradients is essential for many applications.
>
>
> Julia is not meant to be unusable for pure Math exercises. It is designed 
> to be free, and it is early in its development, so things change as more 
> features get implemented and new opinions are expressed. I have followed the 
> development for 7 months, and I am really amazed at how polite the discussion 
> is, and how almost every question, hard or stupid, gets a (mostly) useful 
> response.
>
> The package you criticize is not part of the standard library; currently 
> it is more of a "proof of concept" package than anything else. 
> There seems to be lots of interest in using Calculus.jl, but unfortunately 
> the development of that package lacks a strong lead. In the last 7 
> months only small bugfixes have been added to that repository. 
>
> If you want to define a Julia function in terms of a Newton iteration, I 
> think the best option is to implement your own derivative 
> approximation, where you can control the step. I often find that iterative 
> solvers give a non-smooth response in the solution, so a finite 
> difference over a small step will be very inaccurate.
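Such a step-controlled difference can be sketched in a few lines (the name `central_diff` and the default step are my own choices, not part of Calculus.jl):

```julia
# Central finite difference with a user-controlled step h.
# For smooth functions, h near cbrt(eps()) (about 6e-6) roughly balances
# truncation and rounding error; for a function driven by an inner
# iterative solver, a larger h can step over the solver's noise.
function central_diff(f, x; h = cbrt(eps(Float64)))
    return (f(x + h) - f(x - h)) / (2h)
end
```

For example, `central_diff(sin, 1.0)` agrees with `cos(1.0)` to about ten digits, and an inner solver's tolerance can be respected by passing, say, `h = 1e-4`.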
>
> Ivar
>
> On Tuesday, 21 January 2014 at 13:05:15 UTC+1, Hans W Borchers wrote:
>>
>> Think of a special function not (yet) implemented in Julia, like Lambert 
>> W, or Struve, or ... So you define it yourself, e.g. by applying 
>> Newton-Raphson. How would you symbolically differentiate that? Or suppose 
>> your application forces you to define a procedure with if-else constructs 
>> and loops; how do you differentiate that? Even an advanced "automatic 
>> differentiation" approach will most often not be capable of solving this 
>> for you.
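To make this concrete, here is a toy sketch of my own (not a proposed library routine): Lambert W on x >= 0, defined implicitly by w*exp(w) = x and computed by Newton-Raphson. No symbolic differentiator can look inside the loop, yet a finite difference recovers the derivative:

```julia
# Lambert W (principal branch, x >= 0): solve w*exp(w) = x by Newton-Raphson.
function lambertw(x; tol = 1e-12)
    w = x < 1.0 ? x : log(x)            # crude starting value
    for i in 1:100
        f  = w * exp(w) - x             # residual
        fp = exp(w) * (w + 1.0)         # derivative of w*exp(w)
        dw = f / fp
        w -= dw
        abs(dw) < tol && break
    end
    return w
end

# Numerical derivative vs. the known identity W'(x) = W(x)/(x*(1 + W(x))):
h = 1e-6
x = 2.0
numeric  = (lambertw(x + h) - lambertw(x - h)) / (2h)
analytic = lambertw(x) / (x * (1 + lambertw(x)))
```

The two values agree to far more digits than any practical optimization routine needs.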
>>
>> Or imagine having to minimize a function such as
>>
>> function repelling_balls(x::Array{Float64, 1})
>>   if length(x) != 45; error("Arg. 'x' must be a vector of length 45."); end
>>   d = 2.0
>>   for i = 1:14, j = (i+1):15
>>       s = sqrt((x[i]-x[j])^2 + (x[i+15]-x[j+15])^2 + (x[i+30]-x[j+30])^2)
>>       if s < d; d = s; end
>>   end
>>   return -d
>> end
>>
>> that models 15 repelling balls/points in [0, 1]^3.
>>
>> As a "minimax" problem, this function is smooth except on a subset of 
>> lower dimension. Some optimization procedures using gradients will be able 
>> to circumvent this subset and find a (local) minimum. How will you 
>> differentiate that (i.e., build a gradient) if not numerically; and if you 
>> can, is it worth the effort?
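For such a function the gradient can only come from finite differences; a minimal forward-difference version (my own sketch, `fd_gradient` is not a Calculus.jl routine) looks like this:

```julia
# Forward-difference gradient: one extra function evaluation per coordinate.
function fd_gradient(f, x::Vector{Float64}; h = sqrt(eps(Float64)))
    g  = similar(x)
    fx = f(x)
    for i in eachindex(x)
        xi   = x[i]
        x[i] = xi + h
        g[i] = (f(x) - fx) / h          # one-sided difference in coordinate i
        x[i] = xi                       # restore the perturbed coordinate
    end
    return g
end

g = fd_gradient(x -> sum(x.^2), [1.0, 2.0, 3.0])   # gradient of |x|^2 is 2x
```

Away from the non-smooth subset, such a gradient is accurate to six or seven digits, which is plenty for a gradient-based optimizer.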
>>
>> Consider that an optimization or root-finding procedure is a (hopefully) 
>> converging process that will not return an absolutely precise result. A 
>> function may itself be defined through an optimization or root-finding 
>> task, and its derivative will still have a technical meaning.
>>
>> In real-world applications, and the tasks Matlab is made for, you do not 
>> care about 15 digits; gradients accurate to five, six, or seven digits are 
>> more than enough to solve your problem. Normally, your functions are not 
>> defined to eps() accuracy anyway.
>>
>> When I came here, I thought Julia was meant for "technical computing" and 
>> not so much for pure Math exercises. In scientific computing, calculating 
>> numerical derivatives and gradients is essential for many applications.
>>
>>
>> On Tuesday, January 21, 2014 3:28:28 AM UTC+1, Jason Merrill wrote:
>>>
>>> Would you be willing to share a problem that you're interested in that 
>>> isn't amenable to symbolic differentiation, maybe on a gist or something? 
>>> I've been looking for non-trivial problems to try to apply PowerSeries to.
>>>
>>> When I said numerical differentiation is "unstable", I meant something 
>>> like "hard to do without losing several digits". For example, you point out 
>>> that by choosing a different h, it's possible to improve over Calculus.jl's 
>>> current behavior by 2-3 decimal digits, from 7e-7 to 2e-9 in absolute 
>>> error. But the best precision you could hope for in a floating point answer 
>>> to this problem is
>>>
>>> julia> eps(sin(1.0))
>>> 1.1102230246251565e-16
>>>
>>> so even with your improvements, you've lost a lot of precision. That 
>>> might or might not be a problem depending on what you'd like to do next 
>>> with the value you get out.
>>>
>>>
>>> On Monday, January 20, 2014 12:30:38 PM UTC-8, Hans W Borchers wrote:
>>>>
>>>> Numerical differentiation is not nearly as unstable as you seem to 
>>>> think.
>>>> And I have long experience using numerical derivatives for 
>>>> optimization problems, where you don't stop to look up symbolic 
>>>> derivatives with a CAS.
>>>> The function obviously was only an example.
>>>> For most of the functions I have used, Julia's symbolic capabilities 
>>>> will not be sufficient.
>>>>
>>>>
>>>> On Monday, January 20, 2014 9:07:02 PM UTC+1, Jason Merrill wrote:
>>>>>
>>>>> This implementation could certainly use some love, but finite 
>>>>> difference differentiation is always unstable, and the situation gets 
>>>>> worse as you take higher-order derivatives.
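The degradation with derivative order is quick to sketch: for a second-order central difference the rounding error grows like eps()/h^2, so far fewer digits survive (my own illustration, not Calculus.jl code; error figures approximate):

```julia
# Second-order central difference for f''(x); the exact value is -sin(1.0).
h   = 1e-4
d2  = (sin(1.0 + h) - 2sin(1.0) + sin(1.0 - h)) / h^2
err = abs(d2 + sin(1.0))   # on the order of 1e-8: only seven or eight
                           # digits left, versus ten-plus for a first
                           # derivative with a comparable step
```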
>>>>>
>>>>> You might want to consider using the differentiate method to take your 
>>>>> derivatives symbolically (if this works for the functions you're using), 
>>>>> or 
>>>>> have a look at the differentiation example in PowerSeries.jl. To do a 
>>>>> symbolic second derivative, you can do, e.g.
>>>>>
>>>>> julia> using Calculus
>>>>> julia> @eval d2(x) = $(differentiate(differentiate(:(sin(x)))))
>>>>> julia> d2(1.0)
>>>>> -0.8414709848078965
>>>>>  
>>>>>
>>>>
