On Mon, Jan 18, 2016 at 8:29 AM, Hoang Giang Bui <[email protected]> wrote:
> On Thu, Jan 14, 2016 at 8:08 PM, Barry Smith <[email protected]> wrote:
>>
>> > On Jan 14, 2016, at 12:57 PM, Jed Brown <[email protected]> wrote:
>> >
>> > Hoang Giang Bui <[email protected]> writes:
>> >> One more question I would like to ask, which is more about the
>> >> performance of the solver: if I have a coupled problem where the point
>> >> block is [u_x u_y u_z p], and the entries of the p block in the
>> >> stiffness matrix are on a much smaller scale than those of u (p ~ 1e-6,
>> >> u ~ 1e+8), does AMG with hypre in PETSc still scale?
>> >
>> > You should scale the model (as Barry says). But the names of your
>> > variables suggest that the system is a saddle point problem, in which
>> > case there's a good chance AMG won't work at all. For example,
>> > BoomerAMG produces a singular preconditioner in similar contexts, such
>> > that the preconditioned residual drops smoothly while the true residual
>> > stagnates (the equations are not solved at all). So be very careful if
>> > you think it's "working".
>
> Using block size 4 with the scaling, the hypre AMG does not converge, so
> that is consistent with what you describe.
>
>> The PCFIELDSPLIT preconditioner is designed to help solve saddle point
>> problems.
>
> Does PCFIELDSPLIT support variable block size? For example, with a P2/P1
> discretization the number of nodes carrying [u_x u_y u_z] differs from the
> number of nodes carrying p, so PCFieldSplitSetBlockSize would not be
> correct in that case.

You misunderstand the blocking. You would put ALL velocities (P2) in one
block and ALL pressures (P1) in another. The PCFieldSplitSetBlockSize()
call is for co-located discretizations, which P2/P1 is not.

   Matt
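A minimal sketch of the setup Matt describes, assuming the application has
already built two index sets over the global dof numbering: isU holding ALL
P2 velocity dofs and isP holding ALL P1 pressure dofs. The index sets, the
split names, and the helper name are illustrative, not from the thread:

#include <petscksp.h>

/* Attach a fieldsplit PC to the KSP using explicit index sets, so the
   two fields need not be co-located or share a node count. */
PetscErrorCode SetupSaddlePointPC(KSP ksp, IS isU, IS isP)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
  /* Unlike PCFieldSplitSetBlockSize(), PCFieldSplitSetIS() takes an
     arbitrary index set per split: all velocities in one, all pressures
     in the other. */
  ierr = PCFieldSplitSetIS(pc, "u", isU);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "p", isP);CHKERRQ(ierr);
  /* A Schur-complement split is the usual choice for saddle points. */
  ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Once the index sets are attached, the same split can also be selected at
run time with -pc_type fieldsplit -pc_fieldsplit_type schur.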
> Giang

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener