Re: [petsc-users] PETSc tutorial example generating unexpected result

2019-01-14 Thread Mark Adams via petsc-users
This code puts the error in x: call VecAXPY(x,neg_one,u,ierr). I suspect you printed these numbers after this statement. On Mon, Jan 14, 2019 at 8:10 PM Maahi Talukder via petsc-users <petsc-users@mcs.anl.gov> wrote: > Hello all, > > I compiled and ran the example *ex2f.F90* located in >
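
For reference, a minimal C analogue of the pattern being discussed (the example in the thread is Fortran; the vector sizes and values below are made up purely for illustration):

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec            x, u;
      PetscReal      errnorm;
      PetscScalar    neg_one = -1.0;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      ierr = VecCreateSeq(PETSC_COMM_SELF, 10, &x);CHKERRQ(ierr);
      ierr = VecDuplicate(x, &u);CHKERRQ(ierr);
      ierr = VecSet(u, 1.0);CHKERRQ(ierr);   /* stand-in for the exact solution */
      ierr = VecSet(x, 1.0);CHKERRQ(ierr);   /* stand-in for the computed solution */

      /* Same idea as the quoted ex2f.F90 line: after this call x holds the
         error x - u, not the solution, so printing x now shows the error. */
      ierr = VecAXPY(x, neg_one, u);CHKERRQ(ierr);
      ierr = VecNorm(x, NORM_2, &errnorm);CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_SELF, "Norm of error %g\n", (double)errnorm);CHKERRQ(ierr);

      ierr = VecDestroy(&x);CHKERRQ(ierr);
      ierr = VecDestroy(&u);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }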

Re: [petsc-users] MPI Iterative solver crash on HPC

2019-01-14 Thread Zhang, Hong via petsc-users
Fande: According to this PR https://bitbucket.org/petsc/petsc/pull-requests/1061/a_selinger-feature-faster-scalable/diff Should we set the scalable algorithm as default? Sure, we can. But I feel we need to do more tests to compare the scalable and non-scalable algorithms. In theory, for small to
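
For context, a sketch of how a user can ask for the memory-scalable variants at runtime without waiting for a default change. The option names below follow the MPIAIJ matrix-product implementations of that era and are an assumption; check -help output for the exact names in the PETSc version in use. The same effect is available from the command line (e.g. -matmatmult_via scalable):

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      /* Request the memory-scalable algorithms before matrices are created. */
      ierr = PetscOptionsSetValue(NULL, "-matmatmult_via", "scalable");CHKERRQ(ierr);
      ierr = PetscOptionsSetValue(NULL, "-mattransposematmult_via", "scalable");CHKERRQ(ierr);
      /* ... assemble matrices and call MatMatMult()/MatTransposeMatMult() as usual ... */
      ierr = PetscFinalize();
      return ierr;
    }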

Re: [petsc-users] PETSC for singular system

2019-01-14 Thread Smith, Barry F. via petsc-users
> On Jan 14, 2019, at 7:26 AM, Matthew Knepley via petsc-users wrote: > On Mon, Jan 14, 2019 at 7:55 AM Yaxiong Chen wrote: > So must I figure out the index of the zero columns and rows to get the null space first, and then remove it to generate the Cholesky or LU preconditioner? Is

Re: [petsc-users] MPI Iterative solver crash on HPC

2019-01-14 Thread Fande Kong via petsc-users
Hi Hong, According to this PR https://bitbucket.org/petsc/petsc/pull-requests/1061/a_selinger-feature-faster-scalable/diff Should we set the scalable algorithm as default? Thanks, Fande Kong. On Fri, Jan 11, 2019 at 10:34 AM Zhang, Hong via petsc-users <petsc-users@mcs.anl.gov> wrote: > Add

Re: [petsc-users] MPI Iterative solver crash on HPC

2019-01-14 Thread Zhang, Hong via petsc-users
This time, it crashes at [6]PETSC ERROR: #1 MatTransposeMatMultSymbolic_MPIAIJ_MPIAIJ() line 1989 in /lustre/home/vef002/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c ierr = PetscMalloc1(bi[pn]+1,); which allocates the local portion of B^T*A. You may also try to increase the number of cores to reduce
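
A sketch of the operation the traceback points at, with A and B standing in for the user's assembled MPIAIJ matrices (names assumed, not taken from the user's code). The symbolic phase inside this call is what builds and allocates the sparsity pattern of the local part of C = B^T*A, which is where the PetscMalloc1() shown above fails when that pattern is very large:

    #include <petscmat.h>

    /* Hypothetical helper: form C = B^T * A for MPIAIJ matrices. */
    static PetscErrorCode FormBtA(Mat B, Mat A, Mat *C)
    {
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatTransposeMatMult(B, A, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }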

Re: [petsc-users] MPI Iterative solver crash on HPC

2019-01-14 Thread Mark Adams via petsc-users
The memory requested is an insane number. You may need to use 64-bit integers. On Mon, Jan 14, 2019 at 8:06 AM Sal Am via petsc-users <petsc-users@mcs.anl.gov> wrote: > I ran it by: mpiexec -n 8 valgrind --tool=memcheck -q --num-callers=20 --log-file=valgrind.log-osa.%p ./solveCSys -malloc
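
A small sketch for checking whether the PETSc build in use already has 64-bit indices. A build configured with --with-64-bit-indices defines PETSC_USE_64BIT_INDICES and makes PetscInt 64 bits wide, which is what the reply above is suggesting for index and size values this large:

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    #if defined(PETSC_USE_64BIT_INDICES)
      ierr = PetscPrintf(PETSC_COMM_WORLD, "PetscInt is %d bytes (64-bit indices)\n", (int)sizeof(PetscInt));CHKERRQ(ierr);
    #else
      ierr = PetscPrintf(PETSC_COMM_WORLD, "PetscInt is %d bytes (32-bit indices)\n", (int)sizeof(PetscInt));CHKERRQ(ierr);
    #endif
      ierr = PetscFinalize();
      return ierr;
    }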

Re: [petsc-users] PETSC for singular system

2019-01-14 Thread Matthew Knepley via petsc-users
On Mon, Jan 14, 2019 at 7:55 AM Yaxiong Chen wrote: > So must I figure out the index of the zero columns and rows to get the null space first, and then remove it to generate the Cholesky or LU preconditioner? In this case, should the nontrivial null space be (1,0,0,0)? No. 1) If you have a
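
Since the reply is truncated here, the following is only a sketch of the usual PETSc mechanism for singular systems: attach the known null space to the operator rather than deleting the zero row and column by hand, so the Krylov solver can project it out. A constant null space is assumed purely for illustration; a (1,0,0,0)-style null space like the one asked about would instead be built by passing an explicit vector to MatNullSpaceCreate():

    #include <petscksp.h>

    /* Hypothetical helper: attach a constant null space to the operator A. */
    static PetscErrorCode AttachConstantNullSpace(Mat A)
    {
      MatNullSpace   nsp;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_TRUE, 0, NULL, &nsp);CHKERRQ(ierr);
      ierr = MatSetNullSpace(A, nsp);CHKERRQ(ierr);
      ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }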