Thanks, Ken. 

Julia: Version 0.2.0-rc2+54
Optim: v0.1.3

Yes, I type `using Optim`.
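
For reference, a minimal way to double-check those versions from the REPL (a sketch, assuming the built-in Pkg functions of this Julia release):

julia> VERSION          # running Julia version, e.g. v"0.2.0-rc2+54"
julia> Pkg.status()     # lists installed packages with their versions, Optim included
julia> Pkg.update()     # would fetch a newer Optim release, if one is available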

Best,
Weichi

On Wednesday, September 30, 2015 at 2:39:20 PM UTC-7, Ken B wrote:
>
> Strange, it works fine for me (see example below). Which version of Julia
> and Optim.jl are you on? Do you type `using Optim`, or are you just
> importing the `optimize` function of the package?
>
> julia> function func(x::Vector) # Rosenbrock
>            return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
>        end
> func (generic function with 1 method)
>
> julia> using Optim
>
> julia> optimize(func, [2.0, 3.0])
> Results of Optimization Algorithm
>  * Algorithm: Nelder-Mead
>  * Starting Point: [2.0,3.0]
>  * Minimum: [1.0000653515652895,1.0001323531697637]
>  * Value of Function at Minimum: 0.000000
>  * Iterations: 56
>  * Convergence: true
>    * |x - x'| < NaN: false
>    * |f(x) - f(x')| / |f(x)| < 1.0e-08: true
>    * |g(x)| < NaN: false
>    * Exceeded Maximum Number of Iterations: false
>  * Objective Function Calls: 110
>  * Gradient Call: 0
>
> julia> l, u = [-5.0, -5.0], [5.0, 5.0]
> ([-5.0,-5.0],[5.0,5.0])
>
> julia> df = DifferentiableFunction(func)
> Optim.DifferentiableFunction(func,g!,fg!)
>
> julia> fminbox(df, [2.0, 3.0], l, u)
> Results of Optimization Algorithm
>  * Algorithm: Conjugate Gradient
>  * Starting Point: [2.0,3.0]
>  * Minimum: [0.9999999925533738,0.9999999851053851]
>  * Value of Function at Minimum: -0.000000
>  * Iterations: 4
>  * Convergence: true
>    * |x - x'| < 2.2e-16: false
>    * |f(x) - f(x')| / |f(x)| < 1.5e-08: false
>    * |g(x)| < 1.5e-08: true
>    * Exceeded Maximum Number of Iterations: false
>  * Objective Function Calls: 20
>  * Gradient Call: 16
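
A side note on the `using Optim` vs. import question above, as a minimal sketch that is not part of the original mails and reuses `func`, `df`, `l`, `u` from the example: importing a single function brings only that name into scope, and everything else has to be qualified with `Optim.`.

import Optim.optimize                  # brings only `optimize` into scope
optimize(func, [2.0, 3.0])             # works
# fminbox(df, [2.0, 3.0], l, u)        # would fail: `fminbox` is not in scope
Optim.fminbox(df, [2.0, 3.0], l, u)    # a fully qualified call still works
# `using Optim` instead brings every exported name into scope:
# optimize, fminbox, DifferentiableFunction, ...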
>
> On Wednesday, 30 September 2015 01:32:37 UTC+2, Weichi Ding wrote:
>>
>> Hi,
>>
>> I'm new to the Optim package and am trying to find a minimum using 
>> bounding box constraints.
>> I wrote a function like this:
>>
>> function func(x::Vector)
>> ...
>> end
>>
>> It works fine using optimize():
>>
>> >optimize(func,[360e-12, 240e-12,15e-15])
>> Results of Optimization Algorithm
>>  * Algorithm: Nelder-Mead
>>  * Starting Point: [3.6e-10,2.4e-10,1.5e-14]
>>  * Minimum: 
>> [0.14043324412015543,-0.04701342334377595,0.000548836796380928]
>>  * Value of Function at Minimum: -160.000000
>>  * Iterations: 54
>>  * Convergence: true
>>    * |x - x'| < NaN: false
>>    * |f(x) - f(x')| / |f(x)| < 1.0e-08: true
>>    * |g(x)| < NaN: false
>>    * Exceeded Maximum Number of Iterations: false
>>  * Objective Function Calls: 103
>>  * Gradient Call: 0
>>
>> But when I try to use fminbox(), I get some error messages. Can you tell
>> me what I missed? Thanks.
>>
>> l=[1e-10,1e-10,1e-15]
>> u=[1000e-12,1000e-12,500e-15]
>> fminbox(func,[360e-12, 240e-12,15e-15],l,u)
>>
>> ERROR: no method func(Array{Float64,1},Array{Float64,1})
>>  in fminbox at /home/dingw/.julia/Optim/src/fminbox.jl:138
>>  in fminbox at /home/dingw/.julia/Optim/src/fminbox.jl:190
>>
>> >d1 = DifferentiableFunction(func)
>> DifferentiableFunction(func,g!,fg!)
>>
>> >fminbox(d1,[360e-12, 240e-12,15e-15],l,u)
>>
>> ERROR: no method 
>> fminbox(DifferentiableFunction,Array{Float64,1},Array{Float64,1},Array{Float64,1})
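
For comparison with the working example further up, here is a sketch of the same fminbox call with a hand-written gradient instead of the default finite differencing. It is only an illustration against the old Optim API shown in this thread; the `g!` helper and its `(x, storage)` argument order are assumptions, not something from the original mails.

using Optim

function func(x::Vector)   # Rosenbrock, as in the example above
    return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
end

# Hypothetical analytic gradient of the Rosenbrock function;
# it fills `storage` in place.
function g!(x::Vector, storage::Vector)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    storage[2] = 200.0 * (x[2] - x[1]^2)
end

l, u = [-5.0, -5.0], [5.0, 5.0]
df = DifferentiableFunction(func, g!)   # explicit gradient, no finite differences
fminbox(df, [2.0, 3.0], l, u)

The call pattern is the same as in the example; only the gradient is supplied explicitly.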
