Like you said, least-squares optimization is a specific (restricted)
type of general convex optimization problem. Since it's a restricted
problem class, it can potentially be solved more efficiently than the
general convex optimization problem. The advantage in this specific
case is that the Hessian can be well approximated using only the
Jacobian of the residual vector function, so second derivatives can be
avoided, unlike in the general optimization problem.
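
To make that concrete, here is a rough sketch of the Gauss-Newton idea
(plain Java, not the CM API; the class and method names are made up
for illustration). For f(x) = 0.5 * ||r(x)||^2 the gradient is J^T r,
and the exact Hessian is J^T J plus a term involving the second
derivatives of r; dropping that term leaves an approximation built
from first derivatives only:

    // Sketch of the Gauss-Newton Hessian approximation, assuming a
    // residual vector r(x) of length m with m-by-n Jacobian J.
    public final class GaussNewtonSketch {

        /** Builds J^T J, the Gauss-Newton approximation of the Hessian. */
        static double[][] approxHessian(double[][] j) {
            int m = j.length;      // number of residuals
            int n = j[0].length;   // number of parameters
            double[][] jtj = new double[n][n];
            for (int k = 0; k < m; k++) {
                for (int a = 0; a < n; a++) {
                    for (int b = 0; b < n; b++) {
                        jtj[a][b] += j[k][a] * j[k][b];
                    }
                }
            }
            return jtj;
        }

        /** Builds J^T r, the gradient of 0.5 * ||r||^2. */
        static double[] gradient(double[][] j, double[] r) {
            int n = j[0].length;
            double[] g = new double[n];
            for (int k = 0; k < j.length; k++) {
                for (int a = 0; a < n; a++) {
                    g[a] += j[k][a] * r[k];
                }
            }
            return g;
        }

        // A Gauss-Newton step then solves (J^T J) * step = -(J^T r),
        // so the solver never needs second derivatives of r.
    }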

Separating the least-squares problems out of the optimization package
and into the fitting package was Gilles' idea. I raised objections,
but it is still happening. Numerically it doesn't change anything, but
it does make things more complicated, since there are lots of
different subclasses of the general optimization problem. Are they
going to be separated as well?

-Konstantin

On Mon, Aug 19, 2013 at 11:29 AM, Ajo Fod <ajo....@gmail.com> wrote:
> The idea of a vector optimizer is very strange.
>
> Optimization is usually about finding the min/max of a scalar function
> subject to a vector of constraints. I can't think of an exception.
>
> I recently reviewed JOptimizer, JMSL and IPOpt. None of them mention
> "vector optimization". nonlinear.vector in CM is a norm minimization
> problem which is a subclass of nonlinear.scalar.gradient.
>
> Cheers,
> -Ajo.
