Re: [petsc-users] Iterative solver behavior with increasing number of mpi

2019-04-17 Thread Mark Adams via petsc-users
GAMG is almost algorithmically invariant, but the graph coarsening is neither invariant nor deterministic. You should not see much difference in iteration count, but a little decay is expected. On Wed, Apr 17, 2019 at 12:36 PM Matthew Knepley via petsc-users <petsc-users@mcs.anl.gov> wrote: > On Wed,
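
As a concrete way to compare runs at different process counts, here is a minimal, self-contained sketch (not from the thread; the 1D Laplacian and the size n = 100 are placeholders) that solves one system and prints the iteration count. Running the same binary with, e.g., -pc_type gamg -ksp_monitor on 1 and on 8 ranks makes the (small) variation in iteration count visible.

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      KSP            ksp;
      Mat            A;
      Vec            x, b;
      PetscInt       i, n = 100, Istart, Iend, its;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      /* Assemble a 1D Laplacian just to have something SPD to solve */
      ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
      ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
      ierr = MatSetFromOptions(A);CHKERRQ(ierr);
      ierr = MatSetUp(A);CHKERRQ(ierr);
      ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
      for (i = Istart; i < Iend; i++) {
        if (i > 0)   {ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
        if (i < n-1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
        ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
      }
      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
      ierr = VecSet(b, 1.0);CHKERRQ(ierr);

      /* Solver configured from the command line, e.g. -pc_type gamg -ksp_monitor */
      ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
      ierr = KSPGetIterationNumber(ksp, &its);CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_WORLD, "iterations: %D\n", its);CHKERRQ(ierr);

      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      ierr = VecDestroy(&x);CHKERRQ(ierr);
      ierr = VecDestroy(&b);CHKERRQ(ierr);
      ierr = MatDestroy(&A);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }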

Re: [petsc-users] Using -malloc_dump to examine memory leak

2019-04-17 Thread Smith, Barry F. via petsc-users
Please remember to reply-to-all in your emails; otherwise the mail comes only to me and I may not have the full information needed to respond to your queries correctly. > On Apr 16, 2019, at 11:56 PM, Yuyun Yang wrote: > > So using -objects_dump showed nothing below the line: > The following
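
For reference, a minimal sketch (not from the thread) of the kind of leak these options report: a PETSc object created but never destroyed before PetscFinalize(). Running with -malloc_dump prints the unfreed allocations after finalize, and -objects_dump lists the objects that are still alive.

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec            v;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = VecCreateSeq(PETSC_COMM_SELF, 10, &v);CHKERRQ(ierr);
      /* ... use v ... */
      /* Missing VecDestroy(&v) here: -malloc_dump / -objects_dump will
         report the vector's memory / the Vec object after PetscFinalize() */
      ierr = PetscFinalize();
      return ierr;
    }

Adding the VecDestroy() call before PetscFinalize() makes the report go away.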

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-17 Thread Balay, Satish via petsc-users
On Wed, 17 Apr 2019, Satish Balay wrote: > It's not ideal - but having local changes in our spack clones (change > git url, add appropriate version lines to branches that one is > working on) is possible [for a group working in this mode]. [balay@pj03 petsc]$ pwd /home/balay/petsc [balay@pj03

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-17 Thread Balay, Satish via petsc-users
On Wed, 17 Apr 2019, Smith, Barry F. via petsc-users wrote: > This is fine for "hacking" on PETSc but worthless for any other package. > Here is my concern, when someone > realizes there is a problem with a package they are using through a package > manager they think, crud I have to > > 1)

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-17 Thread Smith, Barry F. via petsc-users
> On Apr 17, 2019, at 12:56 AM, Jed Brown wrote: > > "Smith, Barry F. via petsc-users" writes: > >> So it sounds like spack is still mostly a "package manager" where people >> use "static" packages and don't hack the package's code. This is not >> unreasonable, no other package manager

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-17 Thread Smith, Barry F. via petsc-users
> On Apr 17, 2019, at 1:35 AM, Balay, Satish wrote: > > On Wed, 17 Apr 2019, Smith, Barry F. via petsc-users wrote: > >> This is fine for "hacking" on PETSc but worthless for any other package. >> Here is my concern, when someone >> realizes there is a problem with a package they are

Re: [petsc-users] Error with VecDestroy_MPIFFTW+0x61

2019-04-17 Thread Sajid Ali via petsc-users
Hi Matt/Barry, I've implemented this for a 1D complex MPI Vec and tested it. Here is the modified source file -> https://bitbucket.org/sajid__ali/petsc/src/86fb19b57a7c4f8f42644e5160d2afbdc5e03639/src/mat/impls/fft/fftw/fftw.c Function definitions at
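
For context, a hedged usage sketch of the code path under discussion (the dimension 128 is a placeholder, and a complex-scalar PETSc build with FFTW, e.g. --with-scalar-type=complex --download-fftw, is assumed): the vectors obtained from MatCreateVecsFFTW() are backed by FFTW-allocated storage, and destroying them goes through VecDestroy_MPIFFTW().

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat            F;
      Vec            x, y, z;
      PetscInt       dim[1] = {128};
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = MatCreateFFT(PETSC_COMM_WORLD, 1, dim, MATFFTW, &F);CHKERRQ(ierr);
      ierr = MatCreateVecsFFTW(F, &x, &y, &z);CHKERRQ(ierr);
      /* ... fill x and apply the forward transform with MatMult(F, x, y) ... */
      ierr = VecDestroy(&x);CHKERRQ(ierr);   /* exercises VecDestroy_MPIFFTW */
      ierr = VecDestroy(&y);CHKERRQ(ierr);
      ierr = VecDestroy(&z);CHKERRQ(ierr);
      ierr = MatDestroy(&F);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }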

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-17 Thread Jed Brown via petsc-users
"Smith, Barry F." writes: > This is fine for "hacking" on PETSc but worthless for any other package. > Here is my concern, when someone > realizes there is a problem with a package they are using through a package > manager they think, crud I have to > > 1) find the git repository for this

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-17 Thread Smith, Barry F. via petsc-users
> On Apr 17, 2019, at 6:49 AM, Matthew Knepley wrote: > > On Wed, Apr 17, 2019 at 2:40 AM Smith, Barry F. via petsc-users > wrote: > > > > On Apr 17, 2019, at 1:35 AM, Balay, Satish wrote: > > > > On Wed, 17 Apr 2019, Smith, Barry F. via petsc-users wrote: > > > >> This is fine for

Re: [petsc-users] Iterative solver behavior with increasing number of mpi

2019-04-17 Thread Balay, Satish via petsc-users
Yes - the default preconditioner is block Jacobi, with one block on each processor. So when run on 1 proc vs. 8 procs the preconditioner is different (1 block for bjacobi vs. 8 blocks for bjacobi), hence the difference in convergence. Satish On Wed, 17 Apr 2019, Marian Greg via petsc-users
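
A small hedged fragment (assumes ksp has already been created and given its operators, as in any standard KSP example) showing one way to keep the block decomposition fixed across process counts, so that 1-rank and 8-rank runs use the same block-Jacobi preconditioner, provided the block boundaries line up with the parallel row distribution.

    /* Inside a function where ksp (KSP) is in scope */
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
    /* Fix the global number of blocks; same effect as -pc_bjacobi_blocks 8 */
    ierr = PCBJacobiSetTotalBlocks(pc, 8, NULL);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

Alternatively, -pc_type jacobi (point Jacobi) is exactly independent of the number of ranks, at the cost of being a weaker preconditioner.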