On Wednesday, March 9, 2016 at 12:52:38 AM UTC-5, Evan Fields wrote:
>
> Great to hear. Two minor questions which aren't clear (to me) from the
> documentation:
> - Once a user defined function has been defined and registered, can it be
> incorporated into NL expressions via @defNLExpr?
>

Yes.

> - The documentation references both ForwardDiff.jl and
> ReverseDiffSparse.jl. Which is used where? What are the tradeoffs users
> should be aware of?
>

ForwardDiff is used only for user-defined functions with the autodiff=true option. ReverseDiffSparse is used for all other derivative computations. Note that using ForwardDiff to compute the gradient of a user-defined function is not particularly efficient for functions with high-dimensional input.

> Semi-unrelated: two days ago I was using JuMP 0.12 and NLopt to solve what
> should have been a very simple (2 variable) nonlinear problem. When I fed
> the optimal solution as the starting values for the variables, the
> solve(model) command (or NLopt) hung indefinitely. Perturbing my starting
> point by .0001 fixed that - solve returned a solution
> instantaneously-by-human-perception. Am I doing something dumb?
>

I've also observed hanging within NLopt but haven't had a chance to debug it (anyone is welcome to do so!). Hanging usually means that NLopt is iterating without converging, since NLopt has no output <https://github.com/JuliaOpt/NLopt.jl/issues/16>. Try setting an iteration limit.
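
To illustrate the first point, here is a minimal sketch of registering a user-defined function and then using it inside a nonlinear expression with the JuMP 0.12-era API (note that @defNLExpr was renamed to @NLexpression in 0.12; the function name, dimensions, and solver choice below are just made up for illustration):

```julia
using JuMP, NLopt

# A user-defined scalar function of two arguments (hypothetical example).
mydist(x, y) = sqrt(x^2 + y^2)

# Register it; autodiff=true asks JuMP to differentiate it via ForwardDiff.
JuMP.register(:mydist, 2, mydist, autodiff=true)

m = Model(solver=NLoptSolver(algorithm=:LD_MMA))
@variable(m, x >= 0.1)
@variable(m, y >= 0.1)

# The registered function can appear inside a nonlinear expression...
@NLexpression(m, penalty, mydist(x, y)^2)

# ...which can then be used in the objective or in constraints.
@NLobjective(m, Min, penalty + x + y)

solve(m)
```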

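On setting an iteration limit: NLopt's own stopping criteria include maxeval (a cap on the number of objective evaluations), and I believe the solver wrapper forwards it through the constructor — treat the exact keyword as an assumption and check the NLopt.jl README:

```julia
using JuMP, NLopt

# maxeval caps the number of objective evaluations so NLopt cannot
# iterate forever without converging (keyword per NLopt's stopping
# criteria; assumption that NLoptSolver forwards it unchanged).
m = Model(solver=NLoptSolver(algorithm=:LD_SLSQP, maxeval=1000))

# A slightly perturbed starting point, as in the workaround above.
@variable(m, x, start = 0.0001)
@NLobjective(m, Min, (x - 1)^2)

solve(m)
```

With a limit in place, a non-converging run returns (with a limit-reached status) instead of hanging, which makes the underlying issue much easier to diagnose.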