Re: [petsc-users] DMPlexDistributeField

2019-07-10 Thread Matthew Knepley via petsc-users
Crap! The wrong thing got pushed.

  Thanks,

     Matt

On Wed, Jul 10, 2019 at 9:49 PM Adrian Croucher wrote:
> Probably don't want the two extra PetscSFView() calls (lines 1687, 1688)
> though - presumably they were just for temporary debugging?
>
> - Adrian
>
> On 11/07/19 2:18 PM, Adrian
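For context: PetscSFView() prints a star forest's communication graph to a viewer, so stray calls like the two noted above are a common sign of leftover debugging output. A minimal sketch of such a debugging call (the wrapper function name is illustrative, not from the PR):

  #include <petscsf.h>

  /* Dump a star forest's communication graph to stdout while debugging.
     Calls like this are handy during development but are usually removed
     before a branch is merged. */
  PetscErrorCode DumpSF(PetscSF sf)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }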

Re: [petsc-users] DMPlexDistributeField

2019-07-10 Thread Adrian Croucher via petsc-users
Probably don't want the two extra PetscSFView() calls (lines 1687, 1688)
though - presumably they were just for temporary debugging?

- Adrian

On 11/07/19 2:18 PM, Adrian Croucher wrote:
> On 10/07/19 3:40 PM, Matthew Knepley wrote:
>> Sorry this took so long. I put in a PR for this:

Re: [petsc-users] DMPlexDistributeField

2019-07-10 Thread Adrian Croucher via petsc-users
On 10/07/19 3:40 PM, Matthew Knepley wrote:
> Sorry this took so long. I put in a PR for this:
> https://bitbucket.org/petsc/petsc/pull-requests/1858/knepley-fix-plex-distribute-overlap/diff
> I think it fixes your problem.

Thanks very much Matt, that does fix it. Good stuff!

- Adrian

-- Dr
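For reference, DMPlexDistributeField() migrates a field (a PetscSection/Vec pair) along the point SF produced when the mesh itself is distributed. A hedged sketch of the typical call sequence, with illustrative variable names and the section setup elided:

  #include <petscdmplex.h>

  /* Sketch: redistribute a mesh and carry a field along with it.
     Names are illustrative; origSection/origField describe the layout
     and values of the field on the original (serial) mesh. */
  PetscErrorCode DistributeWithField(DM dm, PetscSection origSection, Vec origField)
  {
    DM             dmDist;
    PetscSF        migrationSF;
    PetscSection   newSection;
    Vec            newField;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    /* Distribute the mesh; migrationSF maps original points to new owners */
    ierr = DMPlexDistribute(dm, 0, &migrationSF, &dmDist);CHKERRQ(ierr);
    if (dmDist) {
      /* Create the section and vector that will receive the migrated field */
      ierr = PetscSectionCreate(PetscObjectComm((PetscObject)dmDist), &newSection);CHKERRQ(ierr);
      ierr = VecCreate(PetscObjectComm((PetscObject)dmDist), &newField);CHKERRQ(ierr);
      /* Move the field data along the same SF used to move the mesh */
      ierr = DMPlexDistributeField(dmDist, migrationSF, origSection, origField,
                                   newSection, newField);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }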

Re: [petsc-users] Making the convergence faster

2019-07-10 Thread Jed Brown via petsc-users
This is typical for weak preconditioners. Have you tried -pc_type gamg or -pc_type mg (algebraic and geometric multigrid, respectively)? On a structured grid with smooth coefficients, geometric multigrid is possible with low setup cost and convergence in a few iterations independent of problem size.
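A minimal sketch of trying these preconditioners, assuming a standard KSP solve (the function name is illustrative; A, b, x are assumed to be assembled elsewhere). KSPSetFromOptions() makes -pc_type gamg or -pc_type mg selectable at run time, and adding -ksp_monitor lets you compare iteration counts between preconditioners:

  #include <petscksp.h>

  /* Sketch: solve Ax = b with a runtime-selectable preconditioner.
     Run with -pc_type gamg (algebraic) or -pc_type mg (geometric). */
  PetscErrorCode SolveWithMG(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* honors -pc_type gamg|mg, -ksp_monitor, ... */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }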