After using the Optim package for quite a while, I'm now dabbling with 
NLopt.jl for the first time to solve a bounded multivariate optimization 
problem. For some reason, I can't seem to get NLopt to do anything, and 
while the problem is hard to recreate as a small working example, I 
thought I'd try my luck here with a bit of pseudo-code to see whether 
anyone has an idea of what could be going on. The problem I'm trying 
to solve is a minimum distance estimator: find the 64 parameters that 
minimize the weighted distance between the values of a function of those 
parameters and some observed values. The idea is:

observations = ...  # observed values, Array{Float64,2}
weight       = ...  # weight for each observation, Array{Float64,2}

function objective(parameters::Vector{Float64}, obs=observations, wgt=weight)
    values = f(parameters)              # matrix of the same size as observations
    diff = vec((obs .- values) .* wgt)  # weighted deviations, flattened to a vector
    return sum(abs2, diff)              # sum of squared weighted deviations
end

x_0 = ...  # initial values, Vector{Float64}
lb  = ...  # lower bounds, Vector{Float64}
ub  = ...  # upper bounds, Vector{Float64}

opt = Opt(:GN_ESCH, length(x_0))
min_objective!(opt, objective)
upper_bounds!(opt, ub)
lower_bounds!(opt, lb)
(optf, optx, flag) = optimize(opt, x_0)
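
In case the pseudo-code above is ambiguous, here is a tiny self-contained 
version of the same structure with made-up data; the shapes, values, and f 
below are placeholders for illustration, not my real model, and I've added 
a maxeval stopping rule that the sketch above omits:

using NLopt

observations = rand(10, 5)   # stand-in for the real observed values
weight       = ones(10, 5)   # stand-in for the real weights

# Placeholder model: maps the 64 parameters to a matrix the size of observations
f(parameters) = reshape(parameters[1:50], 10, 5)

function objective(parameters::Vector{Float64}, obs=observations, wgt=weight)
    values = f(parameters)              # model values for the current parameters
    diff = vec((obs .- values) .* wgt)  # weighted deviations, flattened
    return sum(abs2, diff)              # sum of squared weighted deviations
end

x_0 = zeros(64)
lb  = fill(-1.0, 64)
ub  = fill( 1.0, 64)

opt = Opt(:GN_ESCH, length(x_0))
min_objective!(opt, objective)
upper_bounds!(opt, ub)
lower_bounds!(opt, lb)
maxeval!(opt, 10_000)   # stopping rule, since a global algorithm otherwise has no reason to stop
(optf, optx, flag) = optimize(opt, x_0)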

Whichever method I specify, the result of this calculation is just the 
vector of initial values, x_0 (or a vector with values halfway between lb 
and ub). The problem seems to be that the value of the objective function 
is not correctly calculated inside the optimization routine, as the 
returned function value is always 0.0. I get close to reasonable results 
when I use the Optim package and just call Optim.optimize(objective, x_0, 
iterations = 5000), but I'd like to put bounds on the parameters, so a 
working NLopt version would be preferable.
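
For reference, the examples in the NLopt.jl README write the objective in 
a two-argument form that also receives a gradient vector (left untouched 
for derivative-free algorithms); a minimal sketch in that style, with a 
toy quadratic objective and made-up bounds in place of my actual problem, 
would be:

using NLopt

# Two-argument objective form from the NLopt.jl README: NLopt passes the
# current point x and a gradient vector grad to be filled in place.
# grad has length 0 when a derivative-free algorithm like GN_ESCH is used.
function myobj(x::Vector, grad::Vector)
    if length(grad) > 0
        grad[:] = 2.0 .* (x .- 0.5)   # gradient of the toy quadratic below
    end
    return sum(abs2, x .- 0.5)        # toy quadratic, stand-in for my objective
end

opt = Opt(:GN_ESCH, 2)
lower_bounds!(opt, [0.0, 0.0])
upper_bounds!(opt, [1.0, 1.0])
maxeval!(opt, 5_000)
min_objective!(opt, myobj)
(optf, optx, flag) = optimize(opt, [0.2, 0.8])

I mention this only because it is the form the README uses; whether the 
difference in signature matters for my problem is exactly the kind of 
thing I'm unsure about.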

The full problem is available in this repo 
<https://github.com/nilshg/psidJulia/blob/master/estimate.jl>.
