This is a bug; I've opened an issue here:
https://github.com/JuliaOpt/JuMP.jl/issues/695

As a workaround, if you replace sqrt(y0) with 0.0, the NaNs go away. Clearly it 
shouldn't affect the result, since y0 is a constant.
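
Concretely, only the f1 expression in the model quoted below needs to change (a 
sketch of the workaround; y0 is 0.0, so sqrt(y0) is just 0.0):

@defNLExpr(nlp, f1, s1 / (sqrt(y0) + sqrt(y[1])))   # original: NaN entries in the Hessian
@defNLExpr(nlp, f1, s1 / (0.0 + sqrt(y[1])))        # workaround: sqrt(y0) replaced by 0.0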

On Tuesday, March 8, 2016 at 2:46:12 AM UTC-5, JP Dussault wrote:
>
> Hi,
>
> I am seeing NaNs in second derivatives with JuMP 0.12 which were not 
> present with the previous version. I scaled down one example, a 
> Brachistochrone model (see 
> http://link.springer.com/article/10.1007%2Fs11081-013-9244-4), to n=4, 
> yielding 6 variables, and the Hessian clearly shows many NaN entries. I 
> double-checked with an "equivalent" AMPL version and the Hessian is clean 
> (no NaNs).
>
> Here is the model as I ran it within the MPB Newton solver notebook.
>
> n = 4
> nlp = Model(solver=NewtonSolver())
> eps = 1e-8
>
> xn = 5.0
> yn = 1.0
> x0 = 0.0
> y0 = 0.0
>
> @defVar(nlp, x[j=1:n-1], start = xn*(j/n))
> @defVar(nlp, y[j=1:n-1], start = (j/n))
>
> @defNLExpr(nlp, dx1, (x[1] - x0))
> @defNLExpr(nlp, dxn, (xn - x[n-1]))
> @defNLExpr(nlp, dx[j=2:n-1], (x[j] - x[j-1]))
>
> @defNLExpr(nlp, dy[j=2:n-1], (y[j] - y[j-1]))
> @defNLExpr(nlp, dy1, (y[1] - y0))
> @defNLExpr(nlp, dyn, (yn - y[n-1]))
>
> @defNLExpr(nlp, s[j=2:n-1], sqrt(dx[j]^2 + dy[j]^2))
> @defNLExpr(nlp, s1, sqrt(dx1^2 + dy1^2))
> @defNLExpr(nlp, sn, sqrt(dxn^2 + dyn^2))
>
> @defNLExpr(nlp, f[j=2:n-1], s[j] / (sqrt(y[j-1]) + sqrt(y[j])))
> @defNLExpr(nlp, f1, s1 / (sqrt(y0) + sqrt(y[1])))
> @defNLExpr(nlp, fn, sn / (sqrt(y[n-1]) + sqrt(yn)))
>
> @setNLObjective(nlp, Min, sum{f[i], i = 2:n-1} + f1 + fn)
>
> status = solve(nlp);
>
> Thx, 
>
>
> JPD
>
>
> On Saturday, February 27, 2016 at 11:14:12 PM UTC+1, Miles Lubin wrote:
>>
>> The JuMP team is happy to announce the release of JuMP 0.12.
>>
>> This release features a complete rewrite of JuMP's automatic 
>> differentiation functionality, which is the largest change in JuMP's 
>> nonlinear optimization support since JuMP 0.5. Most of the changes are 
>> under the hood, but as previously announced 
>> <https://groups.google.com/forum/#!topic/julia-opt/wt36Y6nzysY> there 
>> are a couple of syntax changes:
>> - The first parameter to @defNLExpr *and* @defExpr should now be the 
>> model object. All linear and nonlinear subexpressions are now attached to 
>> their corresponding model.
>> - If solving a sequence of nonlinear models, you should now use nonlinear 
>> parameters instead of relying on Julia's variable binding rules (see the 
>> sketch below).
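>>
>> For example, a minimal sketch of the new style (m, p, and x are placeholder 
>> names; this assumes the nonlinear-parameter macro is @defNLParam and that 
>> parameter values are updated with setValue):
>>
>> m = Model()
>> @defVar(m, x >= 0)
>> @defNLExpr(m, myexpr, x^2)     # the model is now the first argument
>> @defNLParam(m, p == 1.0)       # nonlinear parameter instead of rebinding a Julia variable
>> @setNLObjective(m, Min, myexpr + p*x)
>> solve(m)
>> setValue(p, 2.0)               # update the parameter and re-solve, rather than rebuilding the model
>> solve(m)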
>>
>> Many nonlinear models should see performance improvements in JuMP 0.12; 
>> let us know if you observe otherwise.
>>
>> We also now support user-defined functions 
>> <http://jump.readthedocs.org/en/latest/nlp.html#user-defined-functions> 
>> and *automatic differentiation of user-defined functions*. This is quite 
>> a significant new feature which allows users to integrate (almost) 
>> arbitrary code as a nonlinear function within JuMP expressions, thanks to 
>> ForwardDiff.jl <https://github.com/JuliaDiff/ForwardDiff.jl>. We're 
>> looking forward to seeing how this feature can be used in practice; please 
>> give us feedback on the syntax and any rough edges you run into.
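>>
>> A rough sketch of what this looks like (mysquare is a made-up example; this 
>> assumes the registration call documented at the link above has the form 
>> JuMP.register(name, nargs, f, autodiff=true)):
>>
>> mysquare(x) = (x - 2)^2                                # plain Julia code
>> JuMP.register(:mysquare, 1, mysquare, autodiff=true)   # derivatives via ForwardDiff.jl
>>
>> m = Model()
>> @defVar(m, x >= 0.5)
>> @setNLObjective(m, Min, mysquare(x))
>> solve(m)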
>>
>> Other changes include:
>> - Changed syntax for iterating over AffExpr objects
>> - Stopping the solver from within a callback now causes the solver to 
>> return :UserLimit instead of throwing an error.
>> - getDual() now works for conic problems (thanks to Emre Yamangil)
>>
>> Given the large number of changes, bugs are possible. Please let us know 
>> of any incorrect or confusing behavior.
>>
>> Miles, Iain, and Joey
>>
>
