Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Matthew Knepley via petsc-users
On Thu, Oct 10, 2019 at 8:31 AM Dave May wrote:
> On Thu, 10 Oct 2019 at 13:21, Matthew Knepley via petsc-users <petsc-users@mcs.anl.gov> wrote:
>> On Wed, Oct 9, 2019 at 5:10 PM Danyang Su via petsc-users <petsc-users@mcs.anl.gov> wrote:
>>> Dear All,
>>>
>>> I have a question

Re: [petsc-users] MAT_NEW_NONZERO_LOCATION_ERR

2019-10-10 Thread Thibaut Appel via petsc-users
Hi Hong, Thank you, that was unclear to me; now I understand its purpose! Thibaut
On 08/10/2019 16:18, Zhang, Hong wrote:
Thibaut: Sorry, I did not explain it clearly. You call MatSetOption(A,MAT_NEW_NONZERO_LOCATION_ERR,PETSC_TRUE); AFTER the matrix is assembled. Then no new nonzero locations are allowed
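A minimal standalone sketch of the pattern Hong describes (this example is not from the thread; the matrix size and values are placeholders):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscInt       i, n = 4;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 1, NULL, 0, NULL, &A);CHKERRQ(ierr);

  /* First assembly: inserting the diagonal defines the nonzero pattern */
  for (i = 0; i < n; i++) {
    ierr = MatSetValue(A, i, i, 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Set the option AFTER assembly: from now on, any MatSetValues call that
     targets a location outside the existing pattern raises an error instead
     of silently allocating a new nonzero */
  ierr = MatSetOption(A, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_TRUE);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}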

Re: [petsc-users] Block preconditioning for 3d problem

2019-10-10 Thread Mark Adams via petsc-users
I can think of a few sources of coupling in the solver: 1) the line search, 2) the Krylov method, and 3) the residual test (scaling issues). You could turn the line search off and use Richardson (with a fixed number of iterations) or exact solves, as Jed suggested. As far as scaling goes, can you use the same NL problem
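One reading of that suggestion, assuming it refers to the inner KSP, is sketched below (the helper name and the fixed_its parameter are illustrative, not code from the thread):

#include <petscsnes.h>

/* Illustrative helper: disable the SNES line search and make the inner KSP a
   Richardson iteration that runs a fixed number of steps with no norm-based
   convergence test. */
static PetscErrorCode ConfigureUncoupledSolve(SNES snes, PetscInt fixed_its)
{
  SNESLineSearch ls;
  KSP            ksp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = SNESGetLineSearch(snes, &ls);CHKERRQ(ierr);
  ierr = SNESLineSearchSetType(ls, SNESLINESEARCHBASIC);CHKERRQ(ierr); /* full Newton step, no line search */
  ierr = SNESGetKSP(snes, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPRICHARDSON);CHKERRQ(ierr);
  ierr = KSPSetNormType(ksp, KSP_NORM_NONE);CHKERRQ(ierr);             /* no norms, so no convergence test */
  ierr = KSPSetTolerances(ksp, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT, fixed_its);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The equivalent runtime options would be roughly -snes_linesearch_type basic -ksp_type richardson -ksp_norm_type none -ksp_max_it <n>.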

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Mark Adams via petsc-users
A related question: what is the state of having something like a distributed DMPlexCreateFromCellList method? Maybe your H5 efforts would work. My bone modeling code is old and a pain, but the app's specialized serial mesh generator could write an H5 file instead of the current FEAP file. Then

Re: [petsc-users] Question -with-64-bit-blas-indices

2019-10-10 Thread Smith, Barry F. via petsc-users
> On Oct 10, 2019, at 12:38 PM, Randall Mackie via petsc-users wrote:
>
> With the release of PETSc 3.12, there is a new option mentioned in the list of changes:
>
> Added --with-64-bit-blas-indices that will switch to 64 bit indices when using MKL libraries for BLAS/LAPACK and build
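For reference, a configure invocation using that option might look roughly like the line below; pointing --with-blaslapack-dir at $MKLROOT is an assumption for illustration, not something stated in the thread:

  ./configure --with-blaslapack-dir=$MKLROOT --with-64-bit-blas-indices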

Re: [petsc-users] Block preconditioning for 3d problem

2019-10-10 Thread Jed Brown via petsc-users
Dave Lee via petsc-users writes:
> Hi PETSc,
>
> I have a nonlinear 3D problem for a set of uncoupled 2D slabs (which I ultimately want to couple once this problem is solved).
>
> When I solve the inner linear problem for each of these 2D slabs individually (using KSPGMRES) the convergence

Re: [petsc-users] Block preconditioning for 3d problem

2019-10-10 Thread Dave Lee via petsc-users
Hi Jed and Mark, thanks for your helpful comments. Yes, the nonlinear outer problem is uncoupled between the slabs; it is only the linear inner problem where they are coupled. I've tried to make the slab DOFs close in memory, and I've also tried using a tight tolerance on the outer KSP (1.0e-20), but

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Matthew Knepley via petsc-users
On Thu, Oct 10, 2019 at 7:53 PM Danyang Su wrote:
> On 2019-10-10 4:28 p.m., Matthew Knepley wrote:
>> On Thu, Oct 10, 2019 at 4:26 PM Danyang Su wrote:
>>> Hi All,
>>>
>>> Your guess is right. The memory problem occurs after DMPlexCreateFromCellList and DMPlexDistribute. The mesh related
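For a scaling test like this, one way to bracket the suspect calls is to query PETSc's memory counter before and after distribution; a minimal sketch follows (the helper and its name are illustrative, not code from the thread):

#include <petscdmplex.h>

/* Illustrative helper: report per-rank resident memory before and after
   DMPlexDistribute, to see where the usage grows in a scaling run. */
static PetscErrorCode DistributeAndReport(DM dm, PetscInt overlap, DM *dmDist)
{
  PetscLogDouble before, after;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscMemoryGetCurrentUsage(&before);CHKERRQ(ierr);
  ierr = DMPlexDistribute(dm, overlap, NULL, dmDist);CHKERRQ(ierr);
  ierr = PetscMemoryGetCurrentUsage(&after);CHKERRQ(ierr);
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "memory before/after distribute: %g / %g bytes\n", before, after);CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}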

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Mark Adams via petsc-users
Now that I think about it, the partitioning and distribution can be done with the existing API, I would assume, as is done with matrices. I'm still wondering what the H5 format is. I assume that it is not built for a hardwired number of processes to read in parallel, and that the parallel read is

Re: [petsc-users] Block preconditioning for 3d problem

2019-10-10 Thread Jed Brown via petsc-users
Why are your slabs decoupled at present? (Have you done a transform in the vertical?) Is the linear convergence significantly different when you include the multiple layers?

Dave Lee writes:
> Hi Jed and Mark,
>
> thanks for your helpful comments. Yes, the nonlinear outer problem is

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Danyang Su via petsc-users
Hi Matt,

My previous test terminated after calling subroutine A, as shown below.

>> In Subroutine A
   call DMPlexDistribute(dmda_flow%da,stencil_width,    &
                         PETSC_NULL_SF,distributedMesh,ierr)
   CHKERRQ(ierr)
   if (distributedMesh /=

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Matthew Knepley via petsc-users
On Thu, Oct 10, 2019 at 9:00 PM Danyang Su wrote:
> Labels should be destroyed with the DM. Just make a small code that does nothing but distribute the mesh and end. If you run with -malloc_test you should see if everything is destroyed properly.
>
> Thanks,
>
> Matt
>
> Attached is
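A minimal sketch of such a "distribute the mesh and end" test, to be run with -malloc_test, is below. A box mesh stands in for the real cell list, and the DMPlexCreateBoxMesh signature assumed here is the one from roughly the PETSc 3.12 era, so check it against your version:

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscInt       faces[3] = {8, 8, 8};
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* Stand-in mesh; the real test would use DMPlexCreateFromCellList */
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_TRUE, faces, NULL, NULL, NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {               /* on more than one rank, keep the distributed mesh */
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }
  /* Destroy everything and end; -malloc_test reports anything left over */
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}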