If you have a problem that's close to a variational minimization, you can start
with this:

-snes_type qn -snes_qn_scale_type jacobian -snes_linesearch_type cp

That's L-BFGS with Jacobian scaling, a Powell update, and a critical-point line
search. Such methods are discussed in some nonlinear finite element books, such
as Wriggers or Bathe.
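As a rough illustration of the method these options select (this is a generic NumPy sketch, not PETSc's implementation: it takes unit steps instead of a real line search, and uses the simple gamma = s.y / y.y initial scaling rather than Jacobian scaling), the L-BFGS two-loop recursion looks like:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop L-BFGS recursion: apply the inverse Hessian approximation
    built from stored steps s_k = x_{k+1} - x_k and gradient differences
    y_k = g_{k+1} - g_k to the current gradient."""
    q = grad.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q = q - a * y
    # Simple initial scaling; a scale_type option picks something smarter.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (a - beta) * s
    return r  # approximates H^{-1} applied to grad

# Demo: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
s_list, y_list = [], []
for _ in range(200):
    d = lbfgs_direction(grad(x), s_list, y_list)
    x_new = x - d                       # unit step; a real solver line-searches
    s_list.append(x_new - x)
    y_list.append(grad(x_new) - grad(x))
    if len(s_list) > 5:                 # limited memory: keep the last 5 pairs
        s_list.pop(0)
        y_list.pop(0)
    x = x_new
    if np.linalg.norm(grad(x)) < 1e-8:
        break

print(np.round(x, 6))  # should approach the minimizer A^{-1} b = [1, 0.1]
```

The point of the limited memory is that only a few (s, y) pairs are stored, so applying the approximate inverse Jacobian costs a handful of dot products per iteration rather than a factorization.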

What sort of problems are you solving?
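For concreteness, on a PETSc example such as the snes ex48.c mentioned below, these options (and the Jacobian-lagging alternative discussed later in the thread) would be passed on the command line. The executable name and the -snes_monitor flag here are just illustrative:

```shell
# Quasi-Newton with Jacobian scaling and critical-point line search:
./ex48 -snes_type qn -snes_qn_scale_type jacobian -snes_linesearch_type cp \
       -snes_monitor

# Alternative from later in the thread: lag the Jacobian, so it is
# assembled at the next Newton step and then reused (-2):
./ex48 -snes_lag_jacobian -2 -snes_monitor
```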

"Tang, Qi" <[email protected]> writes:

> This is very helpful. JFNK + QN in that paper sounds really promising.
>
> Jed, is there a flag I can quickly try to recover some of the performance 
> reported in that paper (snes ex48.c)? Thanks.
>
> Qi
>
>
>
>
> On May 3, 2022, at 7:27 AM, Matthew Knepley <[email protected]> wrote:
>
> 
> On Tue, May 3, 2022 at 9:08 AM Tang, Qi <[email protected]> wrote:
> Pierre and Matt,
>
> Thanks a lot for the suggestion. It looks like lag Jacobian is exactly what I 
> need. We will try that.
>
> I always thought ngmres was a fancy version of Anderson. Is there any 
> reference or example related to what you said, in which someone actually 
> implemented an approximate Jacobian through ngmres? This sounds very 
> interesting.
>
> Jed and Peter do it here:
>
> @inproceedings{brown2013quasinewton,
>   author    = {Jed Brown and Peter Brune},
>   title     = {Low-rank quasi-{N}ewton updates for robust {J}acobian lagging 
> in {N}ewton-type methods},
>   year      = {2013},
>   booktitle = {International Conference on Mathematics and Computational 
> Methods Applied to Nuclear Science and Engineering},
>   pages     = {2554--2565},
>   petsc_uses={KSP},
> }
>   Thanks,
>
>     Matt
>
> Qi
>
>
> On May 3, 2022, at 4:51 AM, Matthew Knepley <[email protected]> wrote:
>
> 
> On Tue, May 3, 2022 at 2:58 AM Pierre Seize <[email protected]> wrote:
> Hi,
>
> If I may, is this what you want?
>
> https://petsc.org/main/docs/manualpages/SNES/SNESSetLagJacobian.html
>
> Yes, this is a good suggestion.
>
> Also, you could implement an approximation to the Jacobian.
>
> You could then improve it at each iteration using a secant update. This is 
> what the Generalized Broyden methods do. We call them NGMRES.
>
>   Thanks,
>
>     Matt
>
> Pierre
>
> On 03/05/2022 06:21, Tang, Qi wrote:
>> Hi,
>> Our code uses FD coloring to compute the Jacobian. The log file indicates 
>> most of the time is spent evaluating the residual (2600 times in one Newton 
>> solve), while it only needs 3 nonlinear iterations and 6 total linear 
>> iterations thanks to the fieldsplit pc.
>>
>> As a temporary solution, is it possible to evaluate the Jacobian only once 
>> per Newton solve? This should work well, based on my other experience, if 
>> the pc is very efficient. But I cannot find such a flag.
>>
>> Is there any other solution, other than implementing the analytical Jacobian?
>>
>> Thanks,
>> Qi
>
>
> --
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
