Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Jed Brown
Please always use reply-all so that your messages go to the list. This is standard mailing list etiquette. It is important to preserve threading for people who find this discussion later and so that we do not waste our time re-answering the same questions that have already been answered in ...

Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Lawrence Mitchell
On 29 Jul 2014, at 13:37, Jed Brown j...@jedbrown.org wrote: Please always use reply-all so that your messages go to the list. Sorry, fat-fingered the buttons. Lawrence Mitchell lawrence.mitch...@imperial.ac.uk writes: On 28 Jul 2014, at 23:27, Jed Brown j...@jedbrown.org wrote: ...

Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Jed Brown
Lawrence Mitchell lawrence.mitch...@imperial.ac.uk writes: So my coarse space is spanned by the fine one, so I copy coarse dofs to the corresponding fine ones and then linearly interpolate to get the coefficient value at the missing fine dofs. Good, and is restriction the transpose? Some ...
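For context: PETSc's PCMG defaults to exactly this pairing; if only the interpolation is supplied, its transpose is applied for restriction. A minimal sketch of a two-level setup, assuming ksp, the interpolation matrix P, and ierr already exist; illustrative only, not code from this thread:

    #include <petscksp.h>

    PC pc;
    ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
    ierr = PCSetType(pc, PCMG); CHKERRQ(ierr);
    ierr = PCMGSetLevels(pc, 2, NULL); CHKERRQ(ierr);
    /* Level 1 is the fine level; P maps coarse -> fine. */
    ierr = PCMGSetInterpolation(pc, 1, P); CHKERRQ(ierr);
    /* With no PCMGSetRestriction() call, PCMG applies P^T for
       restriction, keeping R = P^T consistent automatically. */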

Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Lawrence Mitchell
On 29/07/14 14:35, Jed Brown wrote: Lawrence Mitchell lawrence.mitch...@imperial.ac.uk writes: So my coarse space is spanned by the fine one, so I copy coarse dofs to the corresponding fine ones and then linearly interpolate to get the coefficient value at the missing fine dofs. Good, ...

Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Jed Brown
Lawrence Mitchell lawrence.mitch...@imperial.ac.uk writes: No, I'm L2-projecting (with mass-lumping) for the restriction. So if I weren't lumping, I think this is the dual of the prolongation. A true L2 projection is a dense operation (involves the inverse mass matrix). But here, we're ...
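To spell out the point with the standard finite-element identities (not quoted from the message): with prolongation P and fine/coarse mass matrices M_f, M_c, the true L2 restriction and its mass-lumped approximation are

    \[
      R_{L^2} = M_c^{-1} P^{\mathsf T} M_f,
      \qquad
      R_{\mathrm{lumped}} = \operatorname{diag}(M_c \mathbf{1})^{-1} P^{\mathsf T} M_f .
    \]

Since M_c^{-1} is dense, R_{L^2} is dense; lumping replaces M_c by its row sums, restoring sparsity at the cost of exactness.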

[petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Que Cat
Dear Petsc-Users, I called SNESVISetVariableBounds (http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESVISetVariableBounds.html#SNESVISetVariableBounds) and received the following error: [0]PETSC ERROR: Must call DMShellSetGlobalVector() or DMShellSetCreateGlobalVector() ...
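For reference, the documented calling sequence is short. A hedged sketch with illustrative bound values and vector names, assuming the solution vector x and ierr exist (the SNES must also be a VI type such as SNESVINEWTONRSLS):

    #include <petscsnes.h>

    Vec xl, xu;   /* bounds must share the solution's parallel layout */
    ierr = VecDuplicate(x, &xl); CHKERRQ(ierr);
    ierr = VecDuplicate(x, &xu); CHKERRQ(ierr);
    ierr = VecSet(xl, 0.0); CHKERRQ(ierr);            /* lower bounds */
    ierr = VecSet(xu, PETSC_INFINITY); CHKERRQ(ierr); /* upper bounds */
    ierr = SNESSetType(snes, SNESVINEWTONRSLS); CHKERRQ(ierr);
    ierr = SNESVISetVariableBounds(snes, xl, xu); CHKERRQ(ierr);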

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Jed Brown
Que Cat quecat...@gmail.com writes: Dear Petsc-Users, I called SNESVISetVariableBounds (http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESVISetVariableBounds.html#SNESVISetVariableBounds) and received the following error: [0]PETSC ERROR: Must call ...

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Que Cat
Hi Jed, I removed only one line from the error message, which contained unnecessary information like the directory path, program name, and processor type. In this problem, I created the vector using VecCreateMPI, not with a DM. Does that cause such a problem? I have an unstructured grid and cannot use DM ...

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Jed Brown
Que Cat quecat...@gmail.com writes: Hi Jed, I removed only one line from the error message, which contained unnecessary information like the directory path, program name, and processor type. In this problem, I created the vector using VecCreateMPI, not with a DM. Does that cause such a problem? I have ...
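The error text names its own remedy; whether it applies here depends on details not yet in the thread. As a hedged sketch of that remedy only, valid if the DM attached to the SNES really is the fallback DMShell:

    #include <petscdmshell.h>

    /* Hand the shell a template so it can create global vectors;
       x stands in for the solution Vec made with VecCreateMPI(). */
    DM dm;
    ierr = SNESGetDM(snes, &dm); CHKERRQ(ierr);
    ierr = DMShellSetGlobalVector(dm, x); CHKERRQ(ierr);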

Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Lawrence Mitchell
On 29 Jul 2014, at 16:58, Jed Brown j...@jedbrown.org wrote: Lawrence Mitchell lawrence.mitch...@imperial.ac.uk writes: No, I'm L2-projecting (with mass-lumping) for the restriction. So if I weren't lumping, I think this is the dual of the prolongation. A true L2 projection is a dense ...

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Que Cat
I also used a ghosted vector (VecCreateGhost). Does it have any relation to DMGetGlobalVector()? Que On Tue, Jul 29, 2014 at 12:42 PM, Que Cat quecat...@gmail.com wrote: Yes, I don't use DM at all, and there is no information about who is calling DMGetGlobalVector(). When I use snesnewtonls ...
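On the ghosted-vector question: globally, a ghosted Vec behaves like an ordinary parallel Vec; the ghost entries exist only in the local form. A minimal sketch of the constructor, with sizes and indices made up for illustration:

    #include <petscvec.h>

    Vec      x;
    PetscInt ghosts[] = {0, 5};    /* global indices owned by other ranks */
    ierr = VecCreateGhost(PETSC_COMM_WORLD,
                          4,             /* locally owned entries   */
                          PETSC_DECIDE,  /* let PETSc sum the total */
                          2, ghosts, &x); CHKERRQ(ierr);
    /* Ghost values are reachable only via VecGhostGetLocalForm();
       SNES and DM routines see a normal MPI Vec. */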

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Matthew Knepley
On Tue, Jul 29, 2014 at 1:19 PM, Que Cat quecat...@gmail.com wrote: I also used a ghosted vector (VecCreateGhost). Does it have any relation to DMGetGlobalVector()? We clearly have the problem that a full stack is not being printed. I am open to the idea that it is a problem in our ...

Re: [petsc-users] Poor multigrid convergence in parallel

2014-07-29 Thread Jed Brown
Lawrence Mitchell lawrence.mitch...@imperial.ac.uk writes: So my approach was to transfer using projection and then use Riesz representation to get the residual from the dual space back into the primal space, so I can apply the operator at the next level. Is there an obvious reason why this ...
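For readers following along, the L2 Riesz map Lawrence describes is the standard one (general definition, not quoted from the thread): the primal representative u of a dual-space residual r solves a mass-matrix system,

    \[
      M u = r
      \quad\Longleftrightarrow\quad
      (u, v)_{L^2} = \langle r, v \rangle \quad \forall v ,
    \]

so applying the exact map again involves M^{-1}, with the same density/lumping trade-off as above.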

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Abhyankar, Shrirang G.
This error is also seen in src/snes/examples/tutorials/ex53.c. [Shri@Shrirangs-MacBook-Pro tutorials (master)]$ ./ex53 Vec Object: 1 MPI processes type: seq 1 1 1 1 1 -1 -1 0 -1 -1 [0]PETSC ERROR: --------------------- Error Message --------------------- ...

Re: [petsc-users] Error with SNESVISetVariableBounds

2014-07-29 Thread Que Cat
I received the same error message for example 53 on my machine as well. ./ex53 Vec Object: 1 MPI processes type: seq 1 1 1 1 1 -1 -1 0 -1 -1 [0]PETSC ERROR: --------------------- Error Message --------------------- [0]PETSC ERROR: [0]PETSC ERROR: Must ...