Hi Rob,

I tried your code on my machine (Debian GNU/Linux, x86-64), and it seems to work for me (unmodified, except that I commented out the call to nlopt_set_initial_step1). The output is:

  1000
found maximum at f = 1398.9 after 1759 evaluations
  0.00619       0
  0.01767       10000
  0.0047        0
  0.02252       10000
  0.0033        0
  0.00468       0
  0.01701       10000
  0.00925       0
  0.01865       10000
  0.00991       9999.92
  -0.00092      0
  -0.00394      0
  -0.0098       -10000
  -0.00886      -10000
  -0.00129      0
  -0.00979      -10000
  -0.02236      -10000
  -0.00212      0
  -0.00393      0
  -0.00882      -10000
face-value: 100000, net-delta: 2.18279e-11


Note, by the way, that because your objective and constraints involve absolute values, they are not differentiable. However, it is always possible to transform absolute values into additional constraints that are differentiable.

For example, if you have a constraint
        |x| + |y| <= 3
you can transform this into:
        x + y <= 3
        x - y <= 3
        -x + y <= 3
        -x - y <= 3
or alternatively into:
        t1 >= x
        t1 >= -x
        t2 >= y
        t2 >= -y
        t1 + t2 <= 3
where t1 and t2 are new dummy variables.
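
In case a concrete sketch helps (this is just an illustration, not your code: it assumes the standard NLopt C API, a two-variable problem, and made-up names like abs_sum_constraint and add_abs_constraints), the four linear constraints above could be added to an existing nlopt_opt like this:

#include <nlopt.h>

/* NLopt inequality constraints have the form fc(x) <= 0.  Each sign
   combination of |x| + |y| <= 3, i.e. (+/-)x + (+/-)y - 3 <= 0,
   becomes one linear constraint. */
typedef struct { double sx, sy; } sign_pattern;

static double abs_sum_constraint(unsigned n, const double *x,
                                 double *grad, void *data)
{
    const sign_pattern *s = (const sign_pattern *) data;
    if (grad) {   /* the gradient is constant: the constraint is linear */
        grad[0] = s->sx;
        grad[1] = s->sy;
    }
    return s->sx * x[0] + s->sy * x[1] - 3.0;
}

static sign_pattern patterns[4] = { {1,1}, {1,-1}, {-1,1}, {-1,-1} };

/* call this on your optimizer object after nlopt_create(...) */
static void add_abs_constraints(nlopt_opt opt)
{
    for (int i = 0; i < 4; ++i)
        nlopt_add_inequality_constraint(opt, abs_sum_constraint,
                                        &patterns[i], 1e-8);
}

Because each transformed constraint is linear, its gradient is constant, which is exactly the smoothness that derivative-based algorithms (and COBYLA's internal linear approximations) can exploit.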

As another example, if you are minimizing the objective
        x + |y|
you can instead minimize
        x + t
where t is a new dummy variable and you introduce the constraints
        t >= y
        t >= -y
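
Again just as a sketch (assuming a C setup, with the dummy variable t appended as a third optimization variable, and with SLSQP chosen purely for illustration), the reformulated problem might look like:

#include <nlopt.h>

/* variables: v[0] = x, v[1] = y, v[2] = t (the dummy variable) */

static double objective(unsigned n, const double *v, double *grad, void *data)
{
    if (grad) { grad[0] = 1; grad[1] = 0; grad[2] = 1; }
    return v[0] + v[2];      /* x + t stands in for x + |y| */
}

static double t_ge_y(unsigned n, const double *v, double *grad, void *data)
{
    if (grad) { grad[0] = 0; grad[1] = 1; grad[2] = -1; }
    return v[1] - v[2];      /* y - t <= 0, i.e. t >= y */
}

static double t_ge_minus_y(unsigned n, const double *v, double *grad, void *data)
{
    if (grad) { grad[0] = 0; grad[1] = -1; grad[2] = -1; }
    return -v[1] - v[2];     /* -y - t <= 0, i.e. t >= -y */
}

static nlopt_opt make_opt(void)
{
    nlopt_opt opt = nlopt_create(NLOPT_LD_SLSQP, 3);  /* x, y, and t */
    nlopt_set_min_objective(opt, objective, NULL);
    nlopt_add_inequality_constraint(opt, t_ge_y, NULL, 1e-8);
    nlopt_add_inequality_constraint(opt, t_ge_minus_y, NULL, 1e-8);
    return opt;
}

At the optimum the constraints and the minimization push t down onto |y|, so minimizing x + t is equivalent to minimizing x + |y|, but every function involved is now smooth.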

In fact, looking at your problem, it seems to me that if you do these transformations you actually get an LP (linear programming problem), which is convex and has all sorts of specialized algorithms available for it.

Even if you don't take advantage of the fact that you have an LP (e.g. because you want to add nonlinear terms later), you should strongly consider reformulating your problem in terms of a differentiable objective and differentiable constraints. COBYLA internally constructs an approximate derivative, so it will usually converge much more quickly if your problem is differentiable. Also, if it is differentiable, you can compute an analytical derivative and use algorithms like MMA and SLSQP that exploit this.

See the section on "Equivalent formulations" in the NLopt introduction.

Steven
