On Sun, Jun 20, 2010 at 01:26:04PM +0400, Kuznetsov Roman wrote:
> When I installed the optim package I found that it is not a good
> package at all.
> 1) Functions from optim have no description (so I don't know which
> method they use, e.g. d2_min).

But d2_min does have a description...

> 2) The list of functions indicates that the optim package is more
> like a dump of assorted functions.

This remark is not clear to me. Please explain.

> So I looked through the Internet and came across two algorithms:
> CONDOR (very large and written in C) and the Italian TRESNEI (which
> is very close to matlab optimization).

What do you mean by "close to matlab optimization"?

> Please look for more on TRESNEI at http://tresnei.de.unifi.it 
> They have a full description of their algorithm, and they seriously
> tested it and compared it with the lsqnonlin function. As a result,
> they found that TRESNEI is not worse than lsqnonlin.
> I've slightly changed its code so that it works with Octave:
> 1) For unconstrained optimization we can pass the "l" and "u"
> parameters as "[]".
> 2) There was one problem with the output; I fixed it.
> I attached a file which proves this function works, at least with
> Octave 3.2.2.
> In the results file you can see an easy example of how to use TRESNEI.
> 
> I hope you contact the TRESNEI authors and include it in Octave.
 
A few remarks:

- The hurdle for inclusion into core Octave is higher; I'd think
  Octave Forge is the appropriate place to consider inclusion.

- The license of TRESNEI seems not to be specified; the code seems to
  be in the public domain, but this is not explicitly stated. Though
  Octave Forge contains code under several licenses, I'd think we
  should only include GPL code in the optim package. So the authors
  would have to agree to put the code under this license.

- TRESNEI seems to be curve-fitting code, so lsqnonlin is not quite
  the appropriate function to compare against; lsqcurvefit is more
  appropriate.
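For readers unfamiliar with the distinction, the Matlab-style calling
conventions differ roughly as follows (a sketch for illustration only;
`model`, `xdata`, `ydata` and `p0` are hypothetical names, and these
Matlab functions may not be available in Octave):

```octave
## Sketch only: lsqnonlin minimizes the sum of squares of a residual
## function; lsqcurvefit fits a model function to given data.
# x = lsqnonlin (@(p) model (p, xdata) - ydata, p0);
# p = lsqcurvefit (@(p, xdata) model (p, xdata), p0, xdata, ydata);
```

Both solve the same least-squares problem here; lsqcurvefit merely
bakes the data/model split into its interface.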

- The optim package already includes leasqr for curve-fitting, which,
  according to my experience, compares favourably with
  lsqcurvefit. Also, it now features linear and non-linear constraints
  in least squares fitting, which neither lsqcurvefit nor TRESNEI
  does. leasqr also uses a variant of the Gauss-Newton algorithm, but
  with the Levenberg-Marquardt method rather than the trust region
  method. If we should want to implement the trust region method, I'd
  guess it would be better to extend leasqr or re-use its code, since
  that way leasqr's constraints feature can still be used.
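For comparison, fitting the same quadratic model used in the TRESNEI
example below could look roughly like this with leasqr (a sketch
assuming the optim package's leasqr (x, y, pin, F) interface, where F
is a model function of the form f (x, p); untested):

```octave
## Sketch, assuming the optim package's leasqr interface:
## fit y = p(1)*t^2 + p(2) to data generated with p = [1; 2].
pkg load optim
t = (-10:0.1:10)';
y = t .^ 2 + 2;                     # data from "true" parameters [1; 2]
F = @(t, p) p(1) .* t .^ 2 + p(2);  # model function f (t, p)
pin = [12; 3];                      # start point, as in the example
[fy, p] = leasqr (t, y, pin, F);    # p should come out close to [1; 2]
```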

- TRESNEI seems to have a feature beyond leasqr --- it seems to be
  usable as an algorithm for constraint satisfaction (without least
  squares fitting). This is interesting, but I don't know much about
  (mere) constraint-satisfaction algorithms. Note that sqp of core
  Octave can seemingly already be used as such an algorithm if one
  feeds it with a constant objective function.
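A minimal sketch of that idea, assuming core Octave's sqp (x0, phi, g)
interface (the constraint function and start point here are purely
illustrative):

```octave
## Constraint satisfaction via sqp with a constant objective:
## find x on the unit circle with x(1) = x(2).
phi = @(x) 0;                        # constant objective function
g = @(x) [x(1)^2 + x(2)^2 - 1; ...   # equality constraints g (x) = 0
          x(1) - x(2)];
x0 = [1; 0];                         # arbitrary start point
x = sqp (x0, phi, g)                 # should approach [1; 1] / sqrt (2)
```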

> PS: The CONDOR algorithm uses various methods and I suppose it's
> more intuitive. But I have no time to test it and implement it in
> Octave code. Maybe I'll do it in the future.

> ...

> # 1. Prepare data.
> t = [-10:0.1:10];
> f = @(x) x .^ 2 + 2;                  # Real function
> X = f (t);                            # Data from the real function
> ff = @(a) a(1) .* t .^ 2 + a(2);      # So we should get "a" equal to [1, 2]
> fun = @(a) (X .- ff (a))(:);          # Residual function (column vector)
> x = [12; 3];                          # Start point
> e_i = [numel(fun(x)), 0];             # No inequalities, so the second parameter is zero
> options = struct ("jacobian", "off"); # The Jacobian should be computed numerically
> # 2. Calculate
> TRESNEI (x, e_i, fun, [], [], options)
> 
> # 3. Results
> 
> 
> # ******** Problem data *****************************
> # Problem    =  fun
> # variable dimension =             2
> # number of constraints =        201
> # number of equalities =         201
> # number of inequalities =         0
> # number of fixed variables =      0
> 
> 
> # ******** Iteration history 
> #  it    ||F||_2      first-order   trust-region   trust-region   norm of    step       t
> #                     optimality    radius         solution       step       direction
> #   0    7.0546e+03   4.5181e+06    1.00e+00
> #   1    6.4141e+03   4.1079e+06    1.00e+00       C              1.00e+00   TR         0.00e+00
> #   2    5.1332e+03   3.2876e+06    2.00e+00       C              2.00e+00   TR         0.00e+00
> #   3    2.5714e+03   1.6469e+06    4.00e+00       C              4.00e+00   TR         0.00e+00
> #   4    1.5801e-05   1.0116e-02    8.00e+00       N              4.10e+00   TR         0.00e+00
> #   5    1.0608e-13   6.5513e-11    8.20e+00       N              5.03e-08   TR         0.00e+00
> 
> 
> # ******** Final output
> # Successful Termination. Nonlinear Residual Condition Satisfied.  Output flag = 0
>  
> # Number of iterations performed:                 5
> # Number of function evaluations (no Jacobian):   6
>  
> # 2-Norm of the residual F:    1.06075e-13
> # First-order optimality:      6.55130e-11
> # *************************************************
> # ans =
> 
> #    1.0000
> #    2.0000
> 
> # So you see it works. If we don't need all this table then let options be like

Agreed for this example.

Olaf
