Re: [petsc-users] subprocess (block) tolerance.

2018-10-04 Thread Smith, Barry F.
> On Oct 4, 2018, at 6:09 PM, HeeHo Park wrote: Barry and Jed, Thank you for your answers. I think I need to learn more about domain decomposition as I am a bit confused. Is it true that we are using BiCGstab here to solve the system of equations, using Additive Schwarz as

Re: [petsc-users] subprocess (block) tolerance.

2018-10-04 Thread HeeHo Park
Barry and Jed, Thank you for your answers. I think I need to learn more about domain decomposition as I am a bit confused. Is it true that we are using BiCGstab here to solve the system of equations, using Additive Schwarz as a domain decomposition preconditioner, and that preconditioner matrix
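
For readers following the thread, below is a minimal, self-contained sketch (not PFLOTRAN itself; the executable name, problem size, and the toy 1-D Laplacian are made up) of the nested solver layout being discussed: BiCGStab as the outer Krylov method with additive Schwarz (PCASM) as the preconditioner, one subdomain block per MPI process. It assumes a recent PETSc (PetscCall, PETSC_SUCCESS).

    /* Sketch of the solver layout under discussion: outer BiCGStab,
       additive Schwarz preconditioner, one subdomain block per process.
       Build against PETSc and run, e.g.:
         mpiexec -n 2 ./asm_sketch -ksp_view -sub_ksp_type gmres -sub_ksp_rtol 1e-2 */
    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat      A;
      Vec      x, b;
      KSP      ksp;
      PC       pc;
      PetscInt i, rstart, rend, n = 100;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

      /* Assemble a toy 1-D Laplacian so the example runs on its own */
      PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
      PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
      PetscCall(MatSetFromOptions(A));
      PetscCall(MatSetUp(A));
      PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
      for (i = rstart; i < rend; i++) {
        PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
        if (i > 0)     PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
        if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
      }
      PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
      PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
      PetscCall(MatCreateVecs(A, &x, &b));
      PetscCall(VecSet(b, 1.0));

      PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
      PetscCall(KSPSetOperators(ksp, A, A));
      PetscCall(KSPSetType(ksp, KSPBCGS)); /* outer Krylov method: BiCGStab */
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(PCSetType(pc, PCASM));     /* additive Schwarz domain decomposition */
      PetscCall(KSPSetFromOptions(ksp));   /* picks up -sub_ksp_type, -sub_ksp_rtol, ... */
      PetscCall(KSPSolve(ksp, b, x));

      PetscCall(KSPDestroy(&ksp));
      PetscCall(VecDestroy(&x));
      PetscCall(VecDestroy(&b));
      PetscCall(MatDestroy(&A));
      PetscCall(PetscFinalize());
      return 0;
    }

Running with -ksp_view shows the nesting: the outer KSP (bcgs), the ASM PC, and a subdomain KSP/PC pair on each process, where the subdomain KSP is typically preonly by default.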

Re: [petsc-users] subprocess (block) tolerance.

2018-10-04 Thread Smith, Barry F.
Since the solver output shows "KSP Object: (flow_sub_) 1 MPI processes, type: preonly", only a single iteration of the inner solver is used, so the numbers in red are not used. You could do something like -flow_ksp_type fgmres -flow_sub_ksp_type gmres -flow_sub_ksp_rtol 1.e-2, but it wouldn't
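
At the API level, the options Barry suggests amount to roughly the fragment below. This is a hedged, illustrative sketch: the function name is made up, and the KSP is assumed to already have its operators and a PCASM preconditioner set (as in the sketch earlier in the thread) and a recent PETSc.

    /* Roughly what -flow_ksp_type fgmres -flow_sub_ksp_type gmres
       -flow_sub_ksp_rtol 1.e-2 do programmatically (illustrative only). */
    #include <petscksp.h>

    static PetscErrorCode UseInexactSubdomainSolves(KSP ksp)
    {
      PC       pc;
      KSP     *subksp;
      PetscInt i, nlocal;

      PetscFunctionBeginUser;
      /* Flexible GMRES outside, because an inner iterative solve makes the
         preconditioner change from one outer iteration to the next. */
      PetscCall(KSPSetType(ksp, KSPFGMRES));
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(KSPSetUp(ksp)); /* subdomain KSPs are created during setup */
      PetscCall(PCASMGetSubKSP(pc, &nlocal, NULL, &subksp));
      for (i = 0; i < nlocal; i++) {
        PetscCall(KSPSetType(subksp[i], KSPGMRES)); /* iterative subdomain solver */
        PetscCall(KSPSetTolerances(subksp[i], 1.e-2, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }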

Re: [petsc-users] subprocess (block) tolerance.

2018-10-04 Thread Jed Brown
The subdomain KSP (flow_sub_) has type "preonly", so it always does exactly one iteration. If you were to use an iterative subdomain solver (e.g., -flow_sub_ksp_type gmres) then those tolerances would be used. HeeHo Park writes: Hi, I'm running PFLOTRAN and in PFLOTRAN, we have flow_ and
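
To confirm the point at runtime without reading through the full -ksp_view output, one could query the subdomain KSP directly. The fragment below is a sketch under the same assumptions as above (hypothetical function name, operators and PCASM already set, recent PETSc).

    /* Print the first local subdomain KSP type; "preonly" means exactly one
       application of the subdomain PC, so the sub rtol/atol are never consulted. */
    #include <petscksp.h>

    static PetscErrorCode ReportSubdomainSolver(KSP ksp)
    {
      PC       pc;
      KSP     *subksp;
      PetscInt nlocal;
      KSPType  subtype;

      PetscFunctionBeginUser;
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(KSPSetUp(ksp)); /* PCASMGetSubKSP is valid only after setup */
      PetscCall(PCASMGetSubKSP(pc, &nlocal, NULL, &subksp));
      if (nlocal > 0) {
        PetscCall(KSPGetType(subksp[0], &subtype));
        PetscCall(PetscPrintf(PETSC_COMM_SELF, "subdomain KSP type: %s\n", subtype));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }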