Very cool that you added user-defined functions (and AD). Congrats on the 
new version.

On Saturday, February 27, 2016 at 11:14:16 PM UTC+1, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.12.
>
> This release features a complete rewrite of JuMP's automatic 
> differentiation functionality, which is the largest change in JuMP's 
> nonlinear optimization support since JuMP 0.5. Most of the changes are 
> under the hood, but as previously announced 
> <https://groups.google.com/forum/#!topic/julia-opt/wt36Y6nzysY> there are 
> a couple of syntax changes:
> - The first parameter to @defNLExpr *and* @defExpr should now be the 
> model object. All linear and nonlinear subexpressions are now attached to 
> their corresponding model.
> - If solving a sequence of nonlinear models, you should now use nonlinear 
> parameters instead of Julia's variable binding rules.
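For anyone who hasn't seen the new syntax yet, here is my rough sketch of both changes from the docs (untested, and macro/function names like `@defNLParam` and `setValue` are as I understand the 0.12 API):

```julia
using JuMP

m = Model()
@defVar(m, x >= 0)

# The model is now the first argument to @defExpr and @defNLExpr,
# so subexpressions are attached to their model:
@defExpr(m, linexpr, 2x + 1)
@defNLExpr(m, nlexpr, sin(x)^2)

# A nonlinear parameter replaces Julia variable-binding tricks when
# solving a sequence of models that differ only in a constant:
@defNLParam(m, p == 1.0)
@addNLConstraint(m, nlexpr + p*x <= 10)

# Update the parameter and re-solve, instead of rebuilding the model:
setValue(p, 2.0)
```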
>
> Many nonlinear models should see performance improvements in JuMP 0.12; 
> let us know if you observe otherwise.
>
> We also now support user-defined functions 
> <http://jump.readthedocs.org/en/latest/nlp.html#user-defined-functions> 
> and *automatic differentiation of user-defined functions*. This is quite 
> a significant new feature which allows users to integrate (almost) 
> arbitrary code as a nonlinear function within JuMP expressions, thanks to 
> ForwardDiff.jl <https://github.com/JuliaDiff/ForwardDiff.jl>. We're 
> looking forward to seeing how this feature can be used in practice; please 
> give us feedback on the syntax and any rough edges you run into.
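Based on the linked docs, registering a user-defined function looks something like this (my sketch, not tested; `autodiff=true` is what triggers the ForwardDiff.jl-based differentiation):

```julia
using JuMP

# An ordinary Julia function; JuMP differentiates it automatically
# via ForwardDiff.jl when autodiff=true is passed at registration.
mysquare(a) = a^2

m = Model()
@defVar(m, x >= 0.5)

# Register the function with its name, number of arguments, and the
# Julia function itself:
JuMP.register(:mysquare, 1, mysquare, autodiff=true)

# It can then be used inside nonlinear expressions:
@setNLObjective(m, Min, mysquare(x) + 1)
```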
>
> Other changes include:
> - Changed syntax for iterating over AffExpr objects.
> - Stopping the solver from within a callback now causes the solver to 
> return :UserLimit instead of throwing an error.
> - getDual() now works for conic problems (thanks to Emre Yamangil).
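On the AffExpr change: if I'm reading the changelog right, iteration now goes through a `linearterms`-style accessor rather than over the expression directly, roughly like this (sketch, untested):

```julia
using JuMP

m = Model()
@defVar(m, x)
@defVar(m, y)
aff = 2x + 3y + 1

# New-style iteration over (coefficient, variable) pairs; the
# linearterms name is my reading of the 0.12 changelog:
for (coef, var) in linearterms(aff)
    println(coef, " * ", var)
end
```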
>
> Given the large number of changes, bugs are possible. Please let us know 
> of any incorrect or confusing behavior.
>
> Miles, Iain, and Joey
>
