Pierre and Matt,

Thanks a lot for the suggestions. It looks like lagging the Jacobian is exactly what I need. We will try that.
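For reference, here is roughly what I have in mind (just a sketch, assuming an already configured SNES called snes that uses the FD-coloring Jacobian, and the default non-persistent lag behavior):

#include <petscsnes.h>

/* Sketch: assemble the (FD-coloring) Jacobian only once per nonlinear solve
   by lagging it.  "snes" is assumed to be fully set up elsewhere. */
PetscErrorCode ConfigureJacobianLagging(SNES snes)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Rebuild the Jacobian only every 100 Newton iterations.  The Jacobian is
     always built in the first iteration of a solve and, by default, the lag
     does not persist across solves, so any lag larger than the expected
     iteration count (3 here) gives one Jacobian build per SNESSolve(). */
  ierr = SNESSetLagJacobian(snes, 100);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The same thing can be requested from the command line with -snes_lag_jacobian 100 (and -snes_lag_jacobian_persists if the lagged Jacobian should also be kept across solves).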
I always thought NGMRES is a fancy version of Anderson. Is there any reference or example of what you describe, where one actually builds an approximate Jacobian through NGMRES? This sounds very interesting.

Qi

On May 3, 2022, at 4:51 AM, Matthew Knepley <[email protected]> wrote:

> On Tue, May 3, 2022 at 2:58 AM Pierre Seize <[email protected]> wrote:
>
>> Hi,
>>
>> If I may, is this what you want?
>> https://petsc.org/main/docs/manualpages/SNES/SNESSetLagJacobian.html
>
> Yes, this is a good suggestion. Also, you could implement an approximation to the
> Jacobian. You could then improve it at each iteration using a secant update. This
> is what the Generalized Broyden methods do. We call them NGMRES.
>
>   Thanks,
>
>      Matt
>
>> Pierre
>>
>> On 03/05/2022 06:21, Tang, Qi wrote:
>>> Hi,
>>> Our code uses FDcoloring to compute the Jacobian. The log file indicates that
>>> most of the time is spent evaluating the residual (2600 times in one Newton
>>> solve), while it needs only 3 nonlinear iterations and 6 total linear
>>> iterations thanks to the fieldsplit PC.
>>>
>>> As a temporary solution, is it possible to evaluate the Jacobian only once per
>>> Newton solve? Based on my other experience, this should work well if the PC is
>>> very efficient, but I cannot find such a flag.
>>>
>>> Is there any other solution, other than implementing the analytical Jacobian?
>>>
>>> Thanks,
>>> Qi
>
> --
> What most experimenters take for granted before they begin their experiments is
> infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
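Related to the NGMRES / Generalized Broyden discussion above: the secant-update idea can also be selected directly through PETSc's quasi-Newton solver. A rough sketch follows (my reading only, not necessarily what Matt meant; it assumes an SNES named snes whose residual is already set, and seeds the secant updates with the assembled Jacobian):

#include <petscsnes.h>

/* Sketch: select a Broyden (secant-update) quasi-Newton solver and seed it
   with the assembled Jacobian, so the Jacobian is approximated once and then
   improved by secant updates instead of being reassembled every iteration. */
PetscErrorCode UseSecantUpdates(SNES snes)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = SNESSetType(snes, SNESQN);CHKERRQ(ierr);                        /* quasi-Newton */
  ierr = SNESQNSetType(snes, SNES_QN_BROYDEN);CHKERRQ(ierr);             /* Broyden secant update */
  ierr = SNESQNSetScaleType(snes, SNES_QN_SCALE_JACOBIAN);CHKERRQ(ierr); /* seed with the assembled Jacobian/PC */
  PetscFunctionReturn(0);
}

The command-line equivalents are -snes_type qn -snes_qn_type broyden -snes_qn_scale_type jacobian; the NGMRES solver Matt mentions is selected with -snes_type ngmres.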
