OK, I replaced all occurrences of GradientNumber in my code with Dual (not 
DualNumber!), and now the code works again.

But this is NOT an implementation detail. It could be one if a function for 
converting Dual or GradientNumber to a real number were part of the package, 
but it is not.
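For anyone hitting the same error: in ForwardDiff 0.2 the dual type is 
ForwardDiff.Dual, and ForwardDiff.value extracts the plain real number from 
it, so the my_value helper quoted further down in this thread can be updated 
along these lines (a sketch for ForwardDiff 0.2, not checked against later 
versions):

```julia
using ForwardDiff

# GradientNumber (ForwardDiff 0.1) is replaced by Dual (ForwardDiff 0.2);
# ForwardDiff.value returns the real (primal) part of a dual number.
my_value(value::ForwardDiff.Dual) = ForwardDiff.value(value)
my_value(value::Real) = value
my_value(val_vector::Vector) = [my_value(value) for value in val_vector]
```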

Therefore it would be nice if this information could be added to the 
upgrading guide.

Best regards,

Uwe

On Monday, August 8, 2016 at 10:18:10 PM UTC+2, Kristoffer Carlsson wrote:
>
> It is more of an implementation detail, but there is DualNumber now.
>
> On Monday, August 8, 2016 at 7:27:42 PM UTC+2, Uwe Fechner wrote:
>>
>> Well, but the upgrading guide mentions no replacement for 
>> GradientNumber.
>>
>> Any idea?
>>
>> Uwe
>>
>> On Monday, August 8, 2016 at 7:14:45 PM UTC+2, Miles Lubin wrote:
>>>
>>> ForwardDiff 0.2 introduced some breaking changes; you will need to 
>>> update your code (GradientNumber is no longer defined). See the upgrading 
>>> guide <http://www.juliadiff.org/ForwardDiff.jl/upgrade.html>.
>>>
>>> On Monday, August 8, 2016 at 11:10:50 AM UTC-6, Uwe Fechner wrote:
>>>>
>>>> Hello,
>>>> I updated, and now I get the following error:
>>>> julia> include("Plotting.jl")
>>>> INFO: Recompiling stale cache file 
>>>> /home/ufechner/.julia/lib/v0.4/JuMP.ji for module JuMP.
>>>> INFO: Recompiling stale cache file 
>>>> /home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module 
>>>> ReverseDiffSparse.
>>>> INFO: Recompiling stale cache file 
>>>> /home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
>>>> INFO: Recompiling stale cache file 
>>>> /home/ufechner/.julia/lib/v0.4/HDF5.ji for module HDF5.
>>>> ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: 
>>>> GradientNumber not defined
>>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, 
>>>> in expression starting on line 433
>>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in 
>>>> expression starting on line 19
>>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, 
>>>> in expression starting on line 13
>>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, 
>>>> in expression starting on line 22
>>>>
>>>> The code that fails is the following:
>>>> """
>>>> Helper function to convert the value of an optimization result, but also
>>>> handle simple real values.
>>>> """
>>>> my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
>>>> my_value(value::Real) = value
>>>> my_value(val_vector::Vector) = [my_value(value) for value in val_vector]
>>>>
>>>> Any idea how to fix this?
>>>>
>>>> Uwe
>>>>
>>>> On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:
>>>>>
>>>>> The JuMP team is happy to announce the release of JuMP 0.14. The 
>>>>> release should clear most, if not all, deprecation warnings on Julia 0.5 
>>>>> and is compatible with ForwardDiff 0.2. The full release notes are 
>>>>> here 
>>>>> <https://github.com/JuliaOpt/JuMP.jl/blob/master/NEWS.md#version-0140-august-7-2016>,
>>>>> and I'd just like to highlight a few points:
>>>>>
>>>>> - *All JuMP users read this*: As previously announced 
>>>>> <https://groups.google.com/d/msg/julia-opt/vUK1NHEHqfk/WD-6lSbMCAAJ>, we 
>>>>> will be deprecating the sum{}, prod{}, and norm{} syntax in favor of 
>>>>> using 
>>>>> Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i 
>>>>> in 1:N) instead of sum{x[i], i in 1:N}. In this release, the new 
>>>>> syntax is available for testing if using Julia 0.5. No deprecation 
>>>>> warnings 
>>>>> are printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we 
>>>>> will begin printing deprecation warnings for the old syntax.
>>>>>
>>>>> - *Advanced JuMP users read this*: We have introduced a new syntax 
>>>>> for "anonymous" objects, which means that when declaring an optimization 
>>>>> variable, constraint, expression, or parameter, you may omit the name of 
>>>>> the object within the macro. The macro will instead return the object 
>>>>> itself which you can assign to a variable if you'd like. Example:
>>>>>
>>>>> # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
>>>>> x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i]) 
>>>>>
>>>>> This syntax should be comfortable for advanced use cases of JuMP 
>>>>> (e.g., within a library) and should obviate some confusion about JuMP's 
>>>>> variable scoping rules.
>>>>>
>>>>> - We also have a new input form for nonlinear expressions that has the 
>>>>> potential to extend JuMP's scope as an AD tool. Previously all nonlinear 
>>>>> expressions needed to be input via macros, which isn't convenient if the 
>>>>> expression is generated programmatically. You can now set nonlinear 
>>>>> objectives and add nonlinear constraints by providing a Julia Expr 
>>>>> object directly with JuMP variables spliced in. This means that you can 
>>>>> now 
>>>>> generate expressions via symbolic manipulation and add them directly to a 
>>>>> JuMP model. See the example in the documentation 
>>>>> <http://www.juliaopt.org/JuMP.jl/0.14/nlp.html#raw-expression-input>.
>>>>>
>>>>> Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi 
>>>>> Madani, and Jarrett Revels for contributions to this release which are 
>>>>> cited in the release notes.
>>>>>
>>>>> Miles, Iain, and Joey
>>>>>
>>>>>
>>>>>
