Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-27 Thread Barry Smith
> On Apr 27, 2017, at 12:43 PM, neok m4700 wrote: > > Hi Matthew, > > Thank you for the clarification, however, it is unclear why there is an > additional unknown in the case of periodic bcs. > > Please see attached to this email what I'd like to achieve, the number of

Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-27 Thread neok m4700
Hi Matthew, Thank you for the clarification; however, it is unclear why there is an additional unknown in the case of periodic BCs. Please see the attachment to this email for what I'd like to achieve: the number of unknowns does not change when switching to the periodic case, e.g. for a Laplace operator.

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Jed Brown
Mark Adams writes: > On Wed, Apr 26, 2017 at 7:30 PM, Barry Smith wrote: > >> >> Yes, you asked for LU so it used LU! >> >>Of course for smaller coarse grids and large numbers of processes this >> is very inefficient. >> >>The default behavior for

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Barry Smith
> On Apr 27, 2017, at 12:59 AM, Garth N. Wells wrote: > > On 27 April 2017 at 00:30, Barry Smith wrote: >> >> Yes, you asked for LU so it used LU! >> >> Of course for smaller coarse grids and large numbers of processes this is >> very inefficient. >>

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Barry Smith
> On Apr 27, 2017, at 8:27 AM, Garth N. Wells wrote: > > On 27 April 2017 at 13:45, Mark Adams wrote: >> Barry, we seem to get an error when you explicitly set this. >> >> Garth, Maybe to set the default explicitly you need to use pc_type asm >> -sub_pc_type

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Barry Smith
> On Apr 27, 2017, at 7:45 AM, Mark Adams wrote: > > Barry, we seem to get an error when you explicitly set this. Of course you get an error: you are asking PETSc to do a parallel LU; PETSc does NOT have a parallel LU, as you well know. How could you possibly think this
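For reference (not stated verbatim in the preview above): PETSc's built-in LU factorization is sequential, so an exact parallel solve on a distributed coarse grid requires an external package such as SuperLU_DIST or MUMPS, and PETSc must have been configured with that package. A sketch of the coarse-level options, assuming the standard mg_coarse_ prefix; note that the package-selection option was spelled -pc_factor_mat_solver_package in releases of this era and -pc_factor_mat_solver_type in later ones:

  -mg_coarse_ksp_type preonly
  -mg_coarse_pc_type lu
  -mg_coarse_pc_factor_mat_solver_package superlu_dist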

Re: [petsc-users] strange convergence

2017-04-27 Thread Barry Smith
Run again using LU on both blocks to see what happens. > On Apr 27, 2017, at 2:14 AM, Hoang Giang Bui wrote: > > I have changed the way to tie the nonconforming mesh. It seems the matrix now > is better > > with -pc_type lu the output is > 0 KSP preconditioned resid

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Mark Adams
> > > 1) Can we motivate why you would ever want a parallel coarse grid? I > cannot think of a reason. > > AMG coarsening is not 100% reliable in many respects; on complex domains with many levels it can eventually fail, and stopping coarsening prematurely can be a stopgap measure.

Re: [petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-27 Thread Matthew Knepley
On Thu, Apr 27, 2017 at 3:46 AM, neok m4700 wrote: > Hi, > > I am trying to change my problem to using periodic boundary conditions. > > However, when I use DMDASetUniformCoordinates on the DA, the spacing > changes. > > This is due to an additional point e.g. in

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Matthew Knepley
On Thu, Apr 27, 2017 at 9:07 AM, Mark Adams wrote: > > >> Does the matrix operator(s) associated with the ksp have an options >> prefix? >> >> > I don't think so. run with -help to check. > > >> >> >> >> >> >> If I get GAMG to use more than one process for the coarse grid (a

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Mark Adams
> > Does the matrix operator(s) associated with the ksp have an options prefix? > > I don't think so. Run with -help to check. > >> > >> > >> If I get GAMG to use more than one process for the coarse grid (a GAMG > >> setting), can I get a parallel LU (exact) solver to solve it using > >> only
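Besides running with -help, the prefixes can also be queried programmatically. A minimal sketch, assuming the application already holds the KSP in question (the helper name is illustrative, not from the thread):

  #include <petscksp.h>

  /* Print the options prefixes (if any) attached to a KSP and to its operator. */
  static PetscErrorCode PrintPrefixes(KSP ksp)
  {
    const char     *kprefix = NULL, *mprefix = NULL;
    Mat             A;
    PetscErrorCode  ierr;

    ierr = KSPGetOptionsPrefix(ksp, &kprefix);CHKERRQ(ierr);   /* prefix on the KSP itself */
    ierr = KSPGetOperators(ksp, &A, NULL);CHKERRQ(ierr);       /* matrix attached to the KSP */
    ierr = MatGetOptionsPrefix(A, &mprefix);CHKERRQ(ierr);     /* prefix on the matrix, if any */
    ierr = PetscPrintf(PetscObjectComm((PetscObject)ksp), "ksp prefix: %s, mat prefix: %s\n",
                       kprefix ? kprefix : "(none)", mprefix ? mprefix : "(none)");CHKERRQ(ierr);
    return 0;
  }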

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Garth N. Wells
On 27 April 2017 at 13:45, Mark Adams wrote: > Barry, we seem to get an error when you explicitly set this. > > Garth, Maybe to set the default explicitly you need to use pc_type asm > -sub_pc_type lu. That is the true default. > > More below but this is the error message: > >

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Mark Adams
Barry, we seem to get an error when you explicitly set this. Garth, maybe to set the default explicitly you need to use -pc_type asm -sub_pc_type lu. That is the true default. More below, but this is the error message: 17:46 knepley/feature-plasma-example *=
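For readers following along: coarse-level solver options in a PCMG/GAMG hierarchy pick up the mg_coarse_ prefix, and ASM sub-solver options additionally pick up sub_, so the explicit command-line form of the combination mentioned above would look roughly like the sketch below. Whether ASM or block Jacobi is GAMG's actual coarse-grid default depends on the PETSc version, so treat this as illustrative rather than definitive:

  -mg_coarse_pc_type asm
  -mg_coarse_sub_pc_type lu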

Re: [petsc-users] Multigrid coarse grid solver

2017-04-27 Thread Mark Adams
On Wed, Apr 26, 2017 at 7:30 PM, Barry Smith wrote: > > Yes, you asked for LU so it used LU! > >Of course for smaller coarse grids and large numbers of processes this > is very inefficient. > >The default behavior for GAMG is probably what you want. In that case >

[petsc-users] explanations on DM_BOUNDARY_PERIODIC

2017-04-27 Thread neok m4700
Hi, I am trying to change my problem to use periodic boundary conditions. However, when I use DMDASetUniformCoordinates on the DA, the spacing changes. This is due to an additional point, e.g. in dm/impls/da/gr1.c:

  else if (dim == 2) {
    if (bx == DM_BOUNDARY_PERIODIC) hx = (xmax-xmin)/(M);
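For context, the spacing rule quoted above can be reproduced with a minimal standalone sketch (not taken from the thread). With DM_BOUNDARY_PERIODIC the M points cover the half-open interval [xmin, xmax), so DMDASetUniformCoordinates uses hx = (xmax-xmin)/M; with DM_BOUNDARY_NONE both endpoints are stored and hx = (xmax-xmin)/(M-1). The number of stored grid points, and hence of unknowns for an operator built on the DA, is M in both cases.

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM             da;
    PetscInt       M = 8, N = 8;   /* global grid points in x and y */
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

    /* Doubly periodic 2D DA: coordinates cover [0,1) x [0,1) with spacing
       1/M and 1/N; with DM_BOUNDARY_NONE the same call gives 1/(M-1), 1/(N-1). */
    ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
                        DMDA_STENCIL_STAR, M, N, PETSC_DECIDE, PETSC_DECIDE,
                        1, 1, NULL, NULL, &da);CHKERRQ(ierr);
    ierr = DMSetFromOptions(da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);
    ierr = DMDASetUniformCoordinates(da, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0);CHKERRQ(ierr);

    ierr = DMDestroy(&da);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }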

Re: [petsc-users] strange convergence

2017-04-27 Thread Hoang Giang Bui
I have changed the way to tie the nonconforming mesh. It seems the matrix is now better. With -pc_type lu the output is:

  0 KSP preconditioned resid norm 3.308678584240e-01 true resid norm 9.006493082896e+06 ||r(i)||/||b|| 1.e+00
  1 KSP preconditioned resid norm 2.004313395301e-12