Re: [petsc-users] How to have a local copy (sequential) of a parallel matrix

2016-07-06 Thread ehsan sadrfaridpour
Thanks all, and sorry for the many questions. Following your advice, I no longer create the local matrix myself, and the problem seems to be solved. In other words, it seems I shouldn't create the local matrix at all. This is my final working code: Mat *m_WA_nt_local; IS set; if (rank == 0){
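A minimal sketch of one common way to gather a sequential copy of a parallel matrix onto rank 0, assuming the matrix is already assembled (the function and variable names here are illustrative, not the poster's code; in the PETSc 3.7 series the routine is MatGetSubMatrices(), renamed MatCreateSubMatrices() in later releases):

    #include <petscmat.h>

    /* Sketch: gather a sequential copy of parallel matrix A onto rank 0.
       The call is collective, so every rank must participate even though
       only rank 0 receives the copy. */
    PetscErrorCode GatherMatrixOnRankZero(Mat A, Mat **A_seq)
    {
      IS          rows, cols;
      PetscInt    M, N;
      PetscMPIInt rank;

      MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
      MatGetSize(A, &M, &N);
      if (rank == 0) {
        /* rank 0 requests every row and column as one sequential submatrix */
        ISCreateStride(PETSC_COMM_SELF, M, 0, 1, &rows);
        ISCreateStride(PETSC_COMM_SELF, N, 0, 1, &cols);
        MatGetSubMatrices(A, 1, &rows, &cols, MAT_INITIAL_MATRIX, A_seq);
        ISDestroy(&rows);
        ISDestroy(&cols);
      } else {
        /* all other ranks request zero submatrices */
        MatGetSubMatrices(A, 0, NULL, NULL, MAT_INITIAL_MATRIX, A_seq);
      }
      /* on rank 0, (*A_seq)[0] now holds the full matrix in sequential format */
      return 0;
    }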

[petsc-users] What block size means in amg aggregation type

2016-07-06 Thread Eduardo Jourdan
Hi, I am fairly new to algebraic multigrid methods. I tried to figure this out on my own but I'm not sure about it. How does the block size (bs) of a blocked matrix affect the AMG AGG? I mean, if bs = 4, then in the coarsening phase and setup, are blocks of 4x4 matrix elements considered to remain in
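A sketch of the setup being asked about, assuming 4 degrees of freedom per grid point (the ksp/pc names are illustrative): GAMG reads the matrix block size to decide how many dofs belong to one node, so the coarsening graph is built over nodes rather than over individual dofs.

    /* sketch: tell GAMG the matrix has 4 dofs per node (bs = 4);
       MatSetBlockSize() should be called before preallocation/assembly */
    Mat A;   /* the system matrix, set up elsewhere */
    KSP ksp; /* the solver, set up elsewhere */
    PC  pc;

    MatSetBlockSize(A, 4);  /* aggregation then groups each node's 4 dofs */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCGAMG);  /* -pc_type gamg; smoothed aggregation is -pc_gamg_type agg */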

Re: [petsc-users] Duplicate cells when exporting a distributed dmplex

2016-07-06 Thread Matthew Knepley
On Tue, Jul 5, 2016 at 4:17 AM, Morten Nobel-Jørgensen wrote: > Hi all, > > I hope someone can help me with the following: > > I’m having some problems when exporting a distributed DMPlex – the cells > (+cell types) seem to be duplicated. > > When I’m running the code on a
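For context, a sketch of one common way to write a distributed DMPlex to a VTK file (assuming dm is the distributed mesh; the filename is illustrative):

    /* sketch: export a (possibly distributed) DMPlex via the VTK viewer */
    DM          dm;      /* the distributed mesh, set up elsewhere */
    PetscViewer viewer;

    PetscViewerVTKOpen(PETSC_COMM_WORLD, "mesh.vtk", FILE_MODE_WRITE, &viewer);
    DMView(dm, viewer);
    PetscViewerDestroy(&viewer);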

Re: [petsc-users] (edit GAMG) petsc 3.7.2 memory usage is much higher when compared to 3.6.1

2016-07-06 Thread Barry Smith
Hassan, My statement "This memory usage increase is not expected." holds only for the fgmres/bjacobi case. Mark continues to make refinements to the GAMG code that could easily result in more or less memory being used, so I am not surprised by the numbers you report below. Do not bother

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-06 Thread Barry Smith
> On Jul 6, 2016, at 4:19 PM, frank wrote: > > Hi Barry, > > Thank you for your advice. > I tried three tests. In the 1st test, the grid is 3072*256*768 and the process > mesh is 96*8*24. > The linear solver is 'cg', the preconditioner is 'mg', and 'telescope' is used > as the

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-06 Thread frank
Hi Barry, Thank you for your advice. I tried three tests. In the 1st test, the grid is 3072*256*768 and the process mesh is 96*8*24. The linear solver is 'cg', the preconditioner is 'mg', and 'telescope' is used as the preconditioner on the coarse mesh. The system gives me the "Out of Memory"
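For reference, one plausible shape of the option set being described, where telescope gathers the coarse-grid problem onto fewer ranks and solves it with another multigrid cycle (the level count and reduction factor here are illustrative, not the poster's actual values):

    -ksp_type cg -pc_type mg -pc_mg_levels 4
    -mg_coarse_pc_type telescope
    -mg_coarse_pc_telescope_reduction_factor 64
    -mg_coarse_telescope_pc_type mg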

Re: [petsc-users] Transient poisson example in petsc

2016-07-06 Thread Justin Chang
Julian, I hand-wrote my own time-stepping scheme (backward Euler) for SNES ex12.c because I had to apply TAO's convex optimization solvers at every time level. I am sure Matt or one of the other PETSc developers can tell you how to make today's SNES ex12.c transient. Thanks, Justin On
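As a rough illustration of what such a hand-written scheme looks like (a sketch only; u_old, dt, and nsteps are hypothetical stand-ins, not ex12.c's actual code): at each step the SNES solves G(u) = (u - u_old)/dt - f(u) = 0 for u = u^{n+1}.

    /* sketch of a hand-rolled backward-Euler loop around SNES; the residual
       callback registered with SNESSetFunction must evaluate
       G(u) = (u - u_old)/dt - f(u), reading u_old and dt from its context */
    SNES     snes;   /* nonlinear solver, set up elsewhere */
    Vec      u;      /* solution vector, holds u^0 initially */
    Vec      u_old;
    PetscInt step, nsteps = 100;

    VecDuplicate(u, &u_old);
    for (step = 0; step < nsteps; ++step) {
      VecCopy(u, u_old);        /* freeze u^n for this step's residual */
      SNESSolve(snes, NULL, u); /* implicit solve: u becomes u^{n+1} */
    }
    VecDestroy(&u_old);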

Re: [petsc-users] snes true and preconditioned residuals for left npc

2016-07-06 Thread Barry Smith
> On Jul 6, 2016, at 12:17 PM, Xiangdong wrote: > > Hello everyone, > > I am using snes_type aspin, which is actually newtonls + npc (nasm). After > each Newton iteration, if I call SNESGetFunction, the preconditioned residual > is obtained. However, if I use
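One way to recover the unpreconditioned residual regardless of the nonlinear preconditioner is to re-evaluate one's own residual callback on the current iterate from inside a monitor (a sketch; FormFunction stands for the user-supplied residual, and the monitor is registered with SNESMonitorSet):

    /* sketch: monitor that re-evaluates the user's own residual F(x) on the
       current iterate, independent of any nonlinear preconditioner */
    PetscErrorCode TrueResidualMonitor(SNES snes, PetscInt it, PetscReal rnorm, void *ctx)
    {
      Vec       x, r;
      PetscReal norm;

      SNESGetSolution(snes, &x);
      VecDuplicate(x, &r);
      FormFunction(snes, x, r, ctx);  /* user-supplied residual, not the NPC's */
      VecNorm(r, NORM_2, &norm);
      PetscPrintf(PETSC_COMM_WORLD, "iter %d: true ||F(x)|| = %g\n", (int)it, (double)norm);
      VecDestroy(&r);
      return 0;
    }
    /* registered with: SNESMonitorSet(snes, TrueResidualMonitor, NULL, NULL); */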