Re: [petsc-users] Question about Set-up of Full MG and its Output

2016-12-07 Thread Barry Smith
Frank, There is no "default" interpolation for PCMG. It is always defined depending on how you set up the solver. If you use KSP with a DM then it uses calls to the DM to generate the interpolation (for example with DMDA it uses either piecewise bi/trilinear interpolation or piecewise
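
A minimal sketch of this DM-driven setup (not code from the thread): with a DMDA attached to the KSP and -pc_type mg given on the command line, PCMG obtains its coarser levels and the interpolation operators from the DM. The operators would still have to be supplied, e.g. with KSPSetComputeOperators(), before solving.

#include <petscksp.h>
#include <petscdmda.h>

int main(int argc, char **argv)
{
  KSP            ksp;
  DM             da;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* 2-D structured grid; with -pc_type mg the coarser levels are obtained by
     coarsening this DM and the interpolation between levels is generated by
     the DM itself (DMCreateInterpolation) */
  ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 17, 17, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetDM(ksp, da);CHKERRQ(ierr);      /* PCMG asks this DM for interpolation */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* e.g. -pc_type mg -pc_mg_levels 3 */
  /* ... attach operators (KSPSetComputeOperators) and call KSPSolve here ... */

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}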

Re: [petsc-users] Question about Set-up of Full MG and its Output

2016-12-07 Thread frank
Hello, Thank you. Now I am able to see the trace of MG. I still have a question about the interpolation. I want to get the matrix of the default interpolation method and print it to the terminal. The code is as follows: (the KSP is already set by PETSc options)
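
The code itself is cut off in the archive; a sketch of one way to do this, assuming the KSP has already been configured with -pc_type mg (the helper name is made up): pull the interpolation matrix for a given level out of the PCMG and view it.

#include <petscksp.h>

/* Print the PCMG interpolation from level l-1 to level l on stdout.
   Valid for level = 1 .. (number of levels - 1). */
static PetscErrorCode ViewMGInterpolation(KSP ksp, PetscInt level)
{
  PC             pc;
  Mat            P;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);                         /* make sure the MG hierarchy exists */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCMGGetInterpolation(pc, level, &P);CHKERRQ(ierr);
  ierr = MatView(P, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}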

Re: [petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Eric Chamberland
Hi Nicolas, for us the solution has been to "manually" create a MatNest with, i.e., block A00 containing only u-u coupling and block A11 containing p-p coupling. Thus, we are able to assign a block size of 3 for A00 and a block size of 1 for A11. The other thing we did is to be able to number
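
A sketch of the kind of setup Eric describes (names and sizes are placeholders, not his actual code): two AIJ diagonal blocks with different block sizes, combined into a MatNest. The off-diagonal blocks are passed as NULL here, which MatNest treats as zero blocks.

#include <petscmat.h>

/* nu, np: global numbers of displacement and pressure unknowns */
static PetscErrorCode CreatePoroNest(MPI_Comm comm, PetscInt nu, PetscInt np, Mat *A)
{
  Mat            A00, A11, subs[4];
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreate(comm, &A00);CHKERRQ(ierr);
  ierr = MatSetSizes(A00, PETSC_DECIDE, PETSC_DECIDE, nu, nu);CHKERRQ(ierr);
  ierr = MatSetType(A00, MATAIJ);CHKERRQ(ierr);
  ierr = MatSetBlockSize(A00, 3);CHKERRQ(ierr);   /* u-u coupling: 3 displacement DOFs per node */
  ierr = MatSetUp(A00);CHKERRQ(ierr);

  ierr = MatCreate(comm, &A11);CHKERRQ(ierr);
  ierr = MatSetSizes(A11, PETSC_DECIDE, PETSC_DECIDE, np, np);CHKERRQ(ierr);
  ierr = MatSetType(A11, MATAIJ);CHKERRQ(ierr);
  ierr = MatSetBlockSize(A11, 1);CHKERRQ(ierr);   /* p-p coupling: 1 pressure DOF per node */
  ierr = MatSetUp(A11);CHKERRQ(ierr);

  /* ... insert and assemble the entries of A00 and A11 here ... */

  subs[0] = A00; subs[1] = NULL; subs[2] = NULL; subs[3] = A11;
  ierr = MatCreateNest(comm, 2, NULL, 2, NULL, subs, A);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}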

Re: [petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Karin
Thank you all. These are the answers I was looking for! Best regards, Nicolas 2016-12-07 15:08 GMT+01:00 Mark Adams : > Note, for best performance with ML and GAMG you want to give it the near > kernel for the 00 block. These are the 6 "rigid body modes" or zero energy > modes.

Re: [petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Mark Adams
Note, for best performance with ML and GAMG you want to give it the near kernel for the 00 block. These are the 6 "rigid body modes" or zero energy modes. PETSc provides some tools to do that (e.g., MatNullSpaceCreateRigidBody). On Wed, Dec 7, 2016 at 4:45 PM, Lawrence Mitchell <
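
A sketch of that, under the assumption that A00 is the assembled displacement block and coords is a Vec with block size 3 holding the nodal coordinates of the displacement unknowns (both names are placeholders):

#include <petscmat.h>

static PetscErrorCode AttachRigidBodyModes(Mat A00, Vec coords)
{
  MatNullSpace   nearnull;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr); /* 6 rigid body modes in 3-D */
  ierr = MatSetNearNullSpace(A00, nearnull);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);                 /* the matrix keeps a reference */
  PetscFunctionReturn(0);
}

ML and GAMG then pick the near null space up from the matrix when building their coarse spaces.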

Re: [petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Lawrence Mitchell
On 07/12/16 13:43, Karin wrote: > Thanks Barry. > I must emphasize that my unknowns are not numbered in a regular way: > I am using a P2-P1 finite element and the middle nodes do not carry a > pressure DOF. So the global numbering is somewhat like: >

Re: [petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Karin
Thanks Barry. I must emphasize that my unknowns are not numbered in a regular way: I am using a P2-P1 finite element and the middle nodes do not carry a pressure DOF. So the global numbering is somewhat like:
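
For such an irregular numbering, one option (a sketch, not necessarily what was done in this thread) is to describe the two fields explicitly with index sets instead of relying on a uniform block size; isU and isP are placeholder ISs listing the global displacement and pressure DOF indices:

#include <petscksp.h>

static PetscErrorCode SetupSplits(KSP ksp, IS isU, IS isP)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "u", isU);CHKERRQ(ierr);  /* displacement unknowns */
  ierr = PCFieldSplitSetIS(pc, "p", isP);CHKERRQ(ierr);  /* pressure unknowns */
  PetscFunctionReturn(0);
}

With the splits named this way, the sub-solver options use the prefixes -fieldsplit_u_ and -fieldsplit_p_, e.g. -fieldsplit_u_pc_type gamg.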

Re: [petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Barry Smith
> On Dec 7, 2016, at 7:06 AM, Karin wrote: > > Dear PETSc gurus, > > I am using FieldSplit to solve a poro-mechanics problem. Thus, I am dealing > with 3 displacement DOFs and 1 pressure DOF. > In order to precondition the 00 block (aka the displacement block), I am >

Re: [petsc-users] Way to remove zero entries from an assembled matrix

2016-12-07 Thread Barry Smith
I don't think it is the zero entries. Please run both of the first two versions below with the options -ksp_view_mat binary -ksp_view_rhs binary and in each case email the resulting file, called binaryoutput, to petsc-ma...@mcs.anl.gov. Barry > On Dec 7, 2016, at 2:49 AM, Leidy Catherine
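
For reference, those options write the matrix and right-hand side into the PETSc binary file binaryoutput; a sketch of how such a file can be read back for inspection (assuming the matrix is stored in the file before the vector):

#include <petscmat.h>
#include <petscviewer.h>

static PetscErrorCode LoadBinaryOutput(MPI_Comm comm, Mat *A, Vec *b)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscViewerBinaryOpen(comm, "binaryoutput", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(comm, A);CHKERRQ(ierr);
  ierr = MatLoad(*A, viewer);CHKERRQ(ierr);
  ierr = VecCreate(comm, b);CHKERRQ(ierr);
  ierr = VecLoad(*b, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}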

[petsc-users] FieldSplit, multigrid and blocksize

2016-12-07 Thread Karin
Dear PETSc gurus, I am using FieldSplit to solve a poro-mechanics problem. Thus, I am dealing with 3 displacement DOFs and 1 pressure DOF. In order to precondition the 00 block (aka the displacement block), I am using a multigrid method (ml or gamg). Nevertheless, I have the feeling that the
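
A sketch of one way to put algebraic multigrid on the displacement block once the FieldSplit PC is set up (equivalent to -fieldsplit_0_pc_type gamg with the default split names, and assuming split 0 is the displacement block; ml would work the same way if PETSc is built with it):

#include <petscksp.h>

/* Assumes the operators and the splits have already been given to pc. */
static PetscErrorCode UseAMGOnDisplacementBlock(PC pc)
{
  KSP           *subksp;
  PC             subpc;
  PetscInt       nsplits;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PCSetUp(pc);CHKERRQ(ierr);                          /* the splits must exist first */
  ierr = PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);CHKERRQ(ierr);
  ierr = KSPGetPC(subksp[0], &subpc);CHKERRQ(ierr);          /* split 0: displacements */
  ierr = PCSetType(subpc, PCGAMG);CHKERRQ(ierr);
  ierr = PetscFree(subksp);CHKERRQ(ierr);                    /* caller frees the returned array */
  PetscFunctionReturn(0);
}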

Re: [petsc-users] Way to remove zero entries from an assembled matrix

2016-12-07 Thread "Leidy Catherine Ramirez Villalba"
Hi Barry, Thanks for your reply! I must say that I still do not master the different solvers and options, so my problem might be due to a wrong formulation for the solver, or the final state of the matrix may not be good, even though I verify that it is assembled and the output looks quite OK.