Hi Ellen,

Per Jed’s suggestion, seeing the monitor and ls_monitor outputs would certainly 
be helpful.

The line search for CG (and other Tao algorithms) has safeguard steps for 
failures. When the line search fails to determine a valid step length along the 
computed CG direction, the search direction falls back to gradient descent and 
a second line search is attempted. If the gradient descent step succeeds, the 
CG updates restart from that point (completely discarding previously 
accumulated information). A line search failure is reported to the user only if 
this safeguard also fails to produce a viable step length, which suggests that 
the gradient computed at that point may be incorrect or carry significant 
numerical error.

If you can afford a slow run for debugging, you can use “-tao_test_gradient” to 
check your gradient against a finite-difference approximation at every 
iteration throughout the run. If you’re confident that the gradient is 
accurate, I would recommend testing with “-tao_bncg_type gd” for a pure 
gradient descent run, and also trying “-tao_type bqnls” for the quasi-Newton 
method (it only requires the gradient, no Hessian).
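For reference, those runs might look roughly like the following on the command 
line (a sketch; “./myapp” is a placeholder for your own executable, and I’ve 
included the monitor flags Jed asked about):

```shell
# Slow debugging run: compare the analytic gradient against a
# finite-difference approximation at every iteration.
./myapp -tao_type bncg -tao_test_gradient -tao_monitor -tao_ls_monitor

# Pure gradient descent run within the CG solver.
./myapp -tao_type bncg -tao_bncg_type gd -tao_monitor -tao_ls_monitor

# Quasi-Newton method (needs only the gradient, no Hessian).
./myapp -tao_type bqnls -tao_monitor -tao_ls_monitor -tao_view
```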

—
Alp Dener
Postdoctoral Researcher
Argonne National Laboratory
https://www.anl.gov/profile/alp-dener


On February 26, 2020 at 11:44:15 AM, Jed Brown ([email protected]) wrote:

Could you share output for your current configuration with -tao_monitor 
-tao_ls_monitor -tao_view?

"Ellen M. Price" <[email protected]> writes:

> Hello PETSc users!
>
> I am using Tao for an unconstrained minimization problem. I have found
> that CG works better than the other types for this application. After
> about 85 iterations, I get an error about line search failure. I'm not
> clear on what this means, or how I could mitigate the problem, and
> neither the manual nor FAQ give any guidance. Can anyone suggest things
> I could try to help the method converge? I have function and gradient
> info, but no Hessian.
>
> Thanks,
> Ellen Price
