Hello,
I updated my packages, and now I get the following error:
julia> include("Plotting.jl")
INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/JuMP.ji for module JuMP.
INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module ReverseDiffSparse.
INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/HDF5.ji for module HDF5.
ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: GradientNumber not defined
while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, in expression starting on line 433
while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in expression starting on line 19
while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, in expression starting on line 13
while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, in expression starting on line 22
The code that fails is the following:
"""
Helper function to extract the value from an optimization result, but which
also accepts plain real values.
"""
my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
my_value(value::Real) = value
my_value(val_vector::Vector) = [my_value(value) for value in val_vector]
Any idea how to fix this?
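My guess is that ForwardDiff 0.2 no longer defines GradientNumber and uses a single Dual type instead, so perhaps something like this (untested) would be the right replacement:

# untested guess: dispatch on ForwardDiff.Dual instead of the removed GradientNumber
my_value(value::ForwardDiff.Dual) = ForwardDiff.value(value)
my_value(value::Real) = value
my_value(val_vector::Vector) = [my_value(value) for value in val_vector]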
Uwe
On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.14. The release
> should clear most, if not all, deprecation warnings on Julia 0.5 and is
> compatible with ForwardDiff 0.2. The full release notes are here
> <https://github.com/JuliaOpt/JuMP.jl/blob/master/NEWS.md#version-0140-august-7-2016>,
> and I'd just like to highlight a few points:
>
> - *All JuMP users read this*: As previously announced
> <https://groups.google.com/d/msg/julia-opt/vUK1NHEHqfk/WD-6lSbMCAAJ>, we
> will be deprecating the sum{}, prod{}, and norm{} syntax in favor of using
> Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i in
> 1:N) instead of sum{x[i], i in 1:N}. In this release, the new syntax is
> available for testing if using Julia 0.5. No deprecation warnings are
> printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we will
> begin printing deprecation warnings for the old syntax.
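>
> For example, a small model written with the new generator syntax might look like this (an illustrative sketch, not taken from the release notes):
>
> using JuMP
> m = Model()
> c = [2.0, 3.0, 4.0]
> @variable(m, 0 <= x[1:3] <= 1)
> # new generator syntax on Julia 0.5:
> @objective(m, Min, sum(c[i]*x[i] for i in 1:3))
> @constraint(m, sum(x[i] for i in 1:3) <= 2)
> # equivalent old curly-brace syntax, to be deprecated in JuMP 0.15:
> # @objective(m, Min, sum{c[i]*x[i], i in 1:3})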
>
> - *Advanced JuMP users read this*: We have introduced a new syntax for
> "anonymous" objects, which means that when declaring an optimization
> variable, constraint, expression, or parameter, you may omit the name of
> the object within the macro. The macro will instead return the object
> itself which you can assign to a variable if you'd like. Example:
>
> # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
> x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i])
>
> This syntax should be comfortable for advanced use cases of JuMP (e.g.,
> within a library) and should clear up some of the confusion about JuMP's
> variable scoping rules.
>
> - We also have a new input form for nonlinear expressions that has the
> potential to extend JuMP's scope as an AD tool. Previously all nonlinear
> expressions needed to be input via macros, which isn't convenient if the
> expression is generated programmatically. You can now set nonlinear
> objectives and add nonlinear constraints by providing a Julia Expr object
> directly with JuMP variables spliced in. This means that you can now
> generate expressions via symbolic manipulation and add them directly to a
> JuMP model. See the example in the documentation
> <http://www.juliaopt.org/JuMP.jl/0.14/nlp.html#raw-expression-input>.
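>
> For instance, an expression built programmatically can be passed in along these lines (an illustrative sketch; see the linked docs for the exact form of the setNLobjective and addNLconstraint calls):
>
> using JuMP
> m = Model()
> @variable(m, x >= 0)
> @variable(m, y >= 0)
> # build an Expr symbolically, splicing in the JuMP variables:
> ex = :( $(x)^2 + $(y)^2 )
> JuMP.setNLobjective(m, :Min, ex)
> JuMP.addNLconstraint(m, :( $(x) * $(y) >= 1 ))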
>
> Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi
> Madani, and Jarrett Revels for contributions to this release which are
> cited in the release notes.
>
> Miles, Iain, and Joey