[julia-users] Re: ANN: JuMP 0.12 released

2016-03-29 Thread Miles Lubin
Are you suggesting a clarification in the documentation or new functionality? The docs clearly explain when ForwardDiff is used and what the performance drawbacks are. If you know of a package in Julia which we could use to perform reverse mode AD on user-defined functions, we'll gladly accept

[julia-users] Re: ANN: JuMP 0.12 released

2016-03-29 Thread feza
I suggest a clarification in the documentation regarding which mode of automatic differentiation is used, since this can have a large impact on computation time. It seems that ForwardDiff is only used for user-defined functions with the autodiff=true option. ReverseDiffSparse is used for all other
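As a hedged illustration of the distinction being drawn (assuming JuMP 0.12-era syntax for @defVar, @setNLObjective, and JuMP.register; the function and variable names below are made up):

    using JuMP

    # A user-defined function; registering it with autodiff=true means its
    # derivatives come from ForwardDiff (forward mode).
    myfun(a) = a * log(a)
    JuMP.register(:myfun, 1, myfun, autodiff=true)

    m = Model()
    @defVar(m, x >= 1)
    @defVar(m, y >= 1)

    # The call to myfun is differentiated with ForwardDiff, while the rest
    # of the expression, written directly in the macro, is handled by
    # ReverseDiffSparse (reverse mode).
    @setNLObjective(m, Min, myfun(x) + y * log(y))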

[julia-users] Re: ANN: JuMP 0.12 released

2016-03-09 Thread Miles Lubin
On Wednesday, March 9, 2016 at 12:52:38 AM UTC-5, Evan Fields wrote:
> Great to hear. Two minor questions which aren't clear (to me) from the documentation:
> - Once a user defined function has been defined and registered, can it be incorporated into NL expressions via @defNLExpr?

Yes.
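A minimal sketch of what that answer implies (assuming JuMP 0.12-era syntax; mysquare and the model below are illustrative, not from the thread):

    using JuMP

    # A user-defined function, registered so it can be called from
    # nonlinear expressions.
    mysquare(a) = a^2
    JuMP.register(:mysquare, 1, mysquare, autodiff=true)

    m = Model()
    @defVar(m, 0.5 <= x <= 2)

    # The registered function can appear inside @defNLExpr ...
    @defNLExpr(m, myexpr, mysquare(x) + x)

    # ... and the resulting expression can then be used in the objective
    # (or in @addNLConstraint).
    @setNLObjective(m, Min, myexpr)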

[julia-users] Re: ANN: JuMP 0.12 released

2016-03-08 Thread Tony Kelman
I was also recently seeing NLopt hang during some of JuMP's tests, so I don't think it's just you.

On Tuesday, March 8, 2016 at 9:52:38 PM UTC-8, Evan Fields wrote:
> Great to hear. Two minor questions which aren't clear (to me) from the documentation:
> - Once a user defined function has

[julia-users] Re: ANN: JuMP 0.12 released

2016-03-08 Thread Evan Fields
Great to hear. Two minor questions which aren't clear (to me) from the documentation:
- Once a user defined function has been defined and registered, can it be incorporated into NL expressions via @defNLExpr?
- The documentation references both ForwardDiff.jl and ReverseDiffSparse.jl. Which is

[julia-users] Re: ANN: JuMP 0.12 released

2016-03-08 Thread Miles Lubin
This is a bug; I've opened an issue here: https://github.com/JuliaOpt/JuMP.jl/issues/695

As a workaround, if you replace sqrt(y0) with 0.0, the NaNs go away. Clearly it shouldn't affect the result, since y0 is a constant.

On Tuesday, March 8, 2016 at 2:46:12 AM UTC-5, JP Dussault wrote:
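For concreteness, a hypothetical sketch of the kind of substitution the workaround describes (this is not the model from the quoted report, which is not shown in this digest; the value of y0 and the expression are assumptions):

    using JuMP

    y0 = 0.0   # assumed value; in the original report y0 is a fixed constant
    m = Model()
    @defVar(m, x >= 0)

    # Form of the kind reported to produce NaNs: sqrt of a constant inside
    # the nonlinear macro.
    @setNLObjective(m, Min, (x - sqrt(y0))^2)

    # Workaround from this message: write the constant value 0.0 directly
    # in place of sqrt(y0), which leaves this particular problem unchanged.
    # @setNLObjective(m, Min, (x - 0.0)^2)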

[julia-users] Re: ANN: JuMP 0.12 released

2016-02-28 Thread cdm
a review of the REQUIRE files for each package gives some high-level sense of the differences:

https://github.com/JuliaOpt/JuMP.jl/blob/master/REQUIRE
https://github.com/JuliaOpt/Convex.jl/blob/master/REQUIRE

enjoy !!!

cdm

[julia-users] Re: ANN: JuMP 0.12 released

2016-02-27 Thread Patrick Kofod Mogensen
Very cool that you added user-defined functions (and AD). Congrats on the new version.

On Saturday, February 27, 2016 at 11:14:16 PM UTC+1, Miles Lubin wrote:
> The JuMP team is happy to announce the release of JuMP 0.12.
> This release features a complete rewrite of JuMP's automatic