"Dener, Alp via petsc-users" writes:
> About Levenberg-Marquardt: a user started a branch to eventually contribute
> an LM solver, but I have not heard any updates on it since the end of April.
> For least-squares type problems, you can try the regularized Gauss-Newton
> solver (-tao_type brgn).
Hi Zak,
Gauss-Newton finds the least-squares solution of overdetermined systems, e.g.
nonlinear regression. It minimizes the squared L2-norm of a nonlinear residual,
||r(x)||_2^2, where the Jacobian J = dr/dx is rectangular with full column rank.
Since this J cannot be inverted directly, Gauss-Newton uses the normal
equations, solving (J^T J) p = -J^T r for the step p.
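To make the normal-equations step concrete, here is a minimal NumPy sketch of a Gauss-Newton iteration on a toy exponential-fit problem (the model, data, and starting point are illustrative, not from this thread):

```python
import numpy as np

# Toy overdetermined problem: fit y = x0 * exp(x1 * t) to 20 data points
# with 2 unknowns, so J = dr/dx is 20x2 (rectangular, full column rank).
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * t)          # synthetic data, exact for x = [2.0, 0.5]

def residual(x):
    return x[0] * np.exp(x[1] * t) - y

def jacobian(x):
    e = np.exp(x[1] * t)
    return np.column_stack([e, x[0] * t * e])

x = np.array([1.8, 0.4])           # starting guess near the solution
for _ in range(15):
    r, J = residual(x), jacobian(x)
    # Gauss-Newton step: solve the normal equations (J^T J) p = -J^T r.
    p = np.linalg.solve(J.T @ J, -J.T @ r)
    x = x + p

print(x)                           # converges to approximately [2.0, 0.5]
```

Note the full step is taken here only because the starting guess is close; a practical solver wraps this step in a line search or trust region.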
Hi Alp,
Thanks for the help. Quasi-Newton seems promising - the Tao solver
eventually converges, sometimes after hundreds or even thousands of
iterations, with each iteration completing very quickly since the Hessian is
never evaluated. I have only tried this with the problem set up as a
general minimization problem.
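The behavior described above - many cheap iterations, no Hessian evaluations - can be sketched with a textbook BFGS inverse update built purely from gradient differences. This is an illustrative NumPy toy on a convex quadratic, not TAO's quasi-Newton implementation:

```python
import numpy as np

# Convex quadratic f(x) = 0.5 x^T A x - b^T x; its gradient is A x - b.
A = np.array([[1.0, 0.2], [0.2, 0.8]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

x = np.zeros(2)
H = np.eye(2)                      # inverse-Hessian approximation, built
g = grad(x)                        # from gradients only -- no Hessian calls
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:
        break
    s = -H @ g                     # quasi-Newton step (unit step length,
    x_new = x + s                  # fine for this mild quadratic)
    g_new = grad(x_new)
    ydiff = g_new - g
    rho = 1.0 / (ydiff @ s)
    I = np.eye(2)
    # BFGS inverse update: H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T
    H = (I - rho * np.outer(s, ydiff)) @ H @ (I - rho * np.outer(ydiff, s)) \
        + rho * np.outer(s, s)
    x, g = x_new, g_new

print(np.allclose(A @ x, b))       # True: converged to the exact minimizer
```

Each iteration costs only a gradient evaluation and a few matrix-vector products, which is why individual quasi-Newton iterations are cheap even when many are needed.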
Hi Zak,
You got it right: the TaoBRGNGetSubsolver -> TaoGetKSP workflow retrieves the
correct KSP object.
BRGN is not a stand-alone solver. It’s a wrapper that combines the
user-provided residual and Jacobian callbacks to assemble the gradient and
Hessian under the Gauss-Newton approximation.
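What that assembly amounts to can be shown outside PETSc: for f(x) = ½||r(x)||², the gradient is Jᵀr and the Gauss-Newton Hessian is JᵀJ. A small NumPy check with an illustrative residual (names and problem are made up for the example) confirms the assembled gradient against finite differences:

```python
import numpy as np

# For f(x) = 0.5 * ||r(x)||^2 the exact gradient is J(x)^T r(x); the
# Gauss-Newton Hessian drops the second-order term and keeps J^T J.
def residual(x):
    return np.array([x[0]**2 - x[1], x[0] + x[1] - 2.0, np.sin(x[0])])

def jacobian(x):
    return np.array([[2.0 * x[0], -1.0],
                     [1.0,         1.0],
                     [np.cos(x[0]), 0.0]])

x = np.array([0.7, 1.1])
r, J = residual(x), jacobian(x)
g_assembled = J.T @ r              # gradient assembled from the callbacks
H_gn = J.T @ J                     # Gauss-Newton Hessian approximation

# Central-difference check of the assembled gradient.
f = lambda z: 0.5 * residual(z) @ residual(z)
eps = 1e-6
g_fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                 for e in np.eye(2)])
print(np.allclose(g_assembled, g_fd, atol=1e-6))   # True
```

This is why BRGN only needs residual and Jacobian callbacks from the user: everything the inner solver consumes is derived from them.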
Hello,
STCG is being used to compute a search direction by approximately applying the
inverse of the objective's Hessian to the gradient. The Hessian has to be
positive definite for this search direction to be a valid descent direction. To
enforce this, STCG terminates the KSP solve when it encounters a direction of
negative curvature.
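The truncation logic can be sketched as textbook Steihaug-Toint CG (an illustrative NumPy version, not PETSc's implementation): CG on the Newton system stops early when it detects negative curvature or hits the trust-region boundary, so the returned step is always a descent direction even for an indefinite Hessian.

```python
import numpy as np

def stcg(H, g, delta, tol=1e-10, maxit=50):
    """Approximately solve H p = -g inside a trust region of radius delta,
    terminating early on negative curvature (indefinite H is safe)."""
    p = np.zeros_like(g)
    r = -g.copy()                  # residual of H p = -g at p = 0
    d = r.copy()                   # CG search direction
    for _ in range(maxit):
        if np.linalg.norm(r) < tol:
            return p
        Hd = H @ d
        curv = d @ Hd
        if curv <= 0.0:
            # Negative curvature: walk to the trust-region boundary along d.
            return p + _boundary_step(p, d, delta) * d
        alpha = (r @ r) / curv
        if np.linalg.norm(p + alpha * d) >= delta:
            return p + _boundary_step(p, d, delta) * d
        p = p + alpha * d
        r_new = r - alpha * Hd
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
    return p

def _boundary_step(p, d, delta):
    # Positive root tau of ||p + tau*d|| = delta.
    a, b, c = d @ d, 2.0 * (p @ d), p @ p - delta**2
    return (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

# Indefinite Hessian: plain CG would break down; STCG still returns a step.
H = np.diag([1.0, -2.0])
g = np.array([1.0, 1.0])
p = stcg(H, g, delta=2.0)
print(g @ p < 0)                   # True: the truncated step still descends
```

When the Hessian is positive definite and the Newton step fits inside the radius, this reduces to plain CG and returns the Newton step itself.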