Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Zhang, Hong
It is not a problem with MatLoad being called twice. The file has one matrix, but it is loaded twice. Replacing the PC with a KSP, the code runs fine. The error occurs when PCSetUp_LU() is called with SAME_NONZERO_PATTERN. I'll look into it further later. Hong From: Zhang, Hong

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Zhang, Hong
I am investigating it. The file has two matrices. The code takes the following steps: PCCreate(PETSC_COMM_WORLD, &pc); MatCreate(PETSC_COMM_WORLD, &A); MatLoad(A,fd); PCSetOperators(pc,A,A); PCSetUp(pc); MatCreate(PETSC_COMM_WORLD, &A); MatLoad(A,fd); PCSetOperators(pc,A,A); PCSetUp(pc); //crash here
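The call sequence above can be fleshed out as a minimal sketch. The viewer setup, the PCLU choice, and the file name are assumptions for illustration; only the create/load/setup ordering is taken from the message:

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  PC          pc;
  PetscViewer fd;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* File name is a placeholder; the file holds two matrices back to back. */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "small", FILE_MODE_READ, &fd);

  PCCreate(PETSC_COMM_WORLD, &pc);
  PCSetType(pc, PCLU);            /* assumption: LU, where the crash was seen */

  MatCreate(PETSC_COMM_WORLD, &A);
  MatLoad(A, fd);                 /* first matrix */
  PCSetOperators(pc, A, A);
  PCSetUp(pc);

  /* Second MatCreate overwrites the handle without MatDestroy(&A),
     so the first Mat is leaked; the second PCSetUp() is where the
     reported crash occurs. */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatLoad(A, fd);                 /* second matrix */
  PCSetOperators(pc, A, A);
  PCSetUp(pc);                    /* crash here */

  MatDestroy(&A);
  PCDestroy(&pc);
  PetscViewerDestroy(&fd);
  PetscFinalize();
  return 0;
}
```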

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Satish Balay
On Fri, 21 Oct 2016, Barry Smith wrote: > > valgrind first balay@asterix /home/balay/download-pine/x/superlu_dist_test $ mpiexec -n 2 $VG ./ex16 -f ~/datafiles/matrices/small First MatLoad! Mat Object: 2 MPI processes type: mpiaij row 0: (0, 4.) (1, -1.) (6, -1.) row 1: (0, -1.) (1,

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Barry Smith
valgrind first > On Oct 21, 2016, at 6:33 PM, Satish Balay wrote: > > On Fri, 21 Oct 2016, Barry Smith wrote: > >> >>> On Oct 21, 2016, at 5:16 PM, Satish Balay wrote: >>> >>> The issue with this test code is - using MatLoad() twice [with the >>>

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Satish Balay
On Fri, 21 Oct 2016, Barry Smith wrote: > > > On Oct 21, 2016, at 5:16 PM, Satish Balay wrote: > > > > The issue with this test code is - using MatLoad() twice [with the > > same object - without destroying it]. Not sure if that's supposed to > > work.. > >If the file

Re: [petsc-users] matrix preallocation

2016-10-21 Thread Barry Smith
We don't currently have a MatReset() (corresponding to PCReset() etc.) but I think it is the right thing for you in this situation. A shallow MatReset() would destroy all the matrix data structures but not the layout information (likely you want this one), while a deep reset would even get

Re: [petsc-users] matrix preallocation

2016-10-21 Thread Jed Brown
"Kong, Fande" writes: > Hi, > > For mechanics problems, the contact surface changes during each nonlinear > iteration. Therefore, the sparsity of the matrix also changes during each > nonlinear iteration. We know the preallocation is important for performance. > > My question

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Barry Smith
> On Oct 21, 2016, at 5:16 PM, Satish Balay wrote: > > The issue with this test code is - using MatLoad() twice [with the > same object - without destroying it]. Not sure if that's supposed to > work.. If the file has two matrices in it, then yes, a second call to MatLoad()

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Satish Balay
The issue with this test code is - using MatLoad() twice [with the same object - without destroying it]. Not sure if that's supposed to work.. Satish On Fri, 21 Oct 2016, Hong wrote: > I can reproduce the error on a linux machine with petsc-maint. It crashes > at 2nd solve, on both processors:

[petsc-users] matrix preallocation

2016-10-21 Thread Kong, Fande
Hi, For mechanics problems, the contact surface changes during each nonlinear iteration. Therefore, the sparsity of the matrix also changes during each nonlinear iteration. We know the preallocation is important for performance. My question is: is it possible to re-allocate memory during each
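Since PETSc has no MatReset() (as noted later in the thread), the usual workaround when the sparsity pattern changes is to destroy the matrix and recreate it with fresh preallocation each time. A minimal sketch; the helper name, sizes, and nnz arrays are assumptions, not from the thread:

```c
#include <petscmat.h>

/* Hypothetical helper: rebuild the matrix with new preallocation when the
   contact surface (and hence the sparsity pattern) has changed. */
PetscErrorCode RebuildMatrix(Mat *A, PetscInt n,
                             const PetscInt dnnz[], const PetscInt onnz[])
{
  if (*A) MatDestroy(A);                     /* drop the old pattern entirely */
  MatCreate(PETSC_COMM_WORLD, A);
  MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(*A);
  /* Only the call matching the actual matrix type takes effect. */
  MatSeqAIJSetPreallocation(*A, 0, dnnz);
  MatMPIAIJSetPreallocation(*A, 0, dnnz, 0, onnz);
  return 0;
}
```

The per-iteration cost is one destroy/create cycle plus recomputing the nnz counts, which is usually far cheaper than assembling into an unpreallocated matrix.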

Re: [petsc-users] Column #j is wrong in parallel from message "Inserting a new nonzero (i, j) into matrix"

2016-10-21 Thread Dave May
On 21 October 2016 at 18:55, Eric Chamberland < eric.chamberl...@giref.ulaval.ca> wrote: > Hi, > > I am on a new issue with a message: > [1]PETSC ERROR: - Error Message > -- > [1]PETSC ERROR: Argument out of range >

Re: [petsc-users] Column #j is wrong in parallel from message "Inserting a new nonzero (i, j) into matrix"

2016-10-21 Thread Eric Chamberland
Hi, I am on a new issue with a message: [1]PETSC ERROR: - Error Message -- [1]PETSC ERROR: Argument out of range [1]PETSC ERROR: New nonzero at (374328,1227) caused a malloc Use MatSetOption(A,
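For reference, the option the truncated error message points at is PETSc's standard check for nonzeros outside the preallocated pattern. A minimal sketch of turning it off; the sizes and inserted entries here are illustrative assumptions:

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscInt    row = 0, cols[2] = {0, 5};
  PetscScalar vals[2] = {1.0, 2.0};

  PetscInitialize(&argc, &argv, NULL, NULL);
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 8, 8);
  MatSetFromOptions(A);
  /* Deliberately tight preallocation: one nonzero per row. */
  MatSeqAIJSetPreallocation(A, 1, NULL);
  MatMPIAIJSetPreallocation(A, 1, NULL, 0, NULL);
  /* Without this, inserting outside the preallocated pattern raises
     "Argument out of range: New nonzero at (i,j) caused a malloc". */
  MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);

  MatSetValues(A, 1, &row, 2, cols, vals, INSERT_VALUES); /* 2nd entry mallocs */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}
```

Note this only silences the check; when, as in this thread, the reported column looks wrong, the insertion indices themselves should be inspected first.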

Re: [petsc-users] Looking for a quick example of a symmetric KKT system

2016-10-21 Thread Jed Brown
Why doesn't a Stokes problem fulfill your needs? Patrick Sanan writes: > Yes, but AFAIK that example produces a 2x2 system - I was hoping for > something with a variable problem size, ideally with some sort of > physics motivating the underlying optimization problem. >

Re: [petsc-users] Looking for a quick example of a symmetric KKT system

2016-10-21 Thread Patrick Sanan
Yes, but AFAIK that example produces a 2x2 system - I was hoping for something with a variable problem size, ideally with some sort of physics motivating the underlying optimization problem. On Fri, Oct 21, 2016 at 7:23 PM, Justin Chang wrote: > Something like this? > >

Re: [petsc-users] Looking for a quick example of a symmetric KKT system

2016-10-21 Thread Justin Chang
Something like this? http://www.mcs.anl.gov/petsc/petsc-current/src/tao/constrained/examples/tutorials/toy.c.html On Friday, October 21, 2016, Patrick Sanan wrote: > Are there any examples already in PETSc or TAO that assemble such a > system (which could thus be

[petsc-users] Looking for a quick example of a symmetric KKT system

2016-10-21 Thread Patrick Sanan
Are there any examples already in PETSc or TAO that assemble such a system (which could thus be dumped)? SNES example ex73f90t assembles a non-symmetric KKT system.
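For concreteness, the symmetric KKT structure being asked about is the standard saddle-point form of an equality-constrained quadratic program (standard background, not from the thread):

```latex
\min_x \; \tfrac{1}{2} x^T Q x + c^T x
\quad \text{s.t.} \quad A x = b
\qquad \Longrightarrow \qquad
\begin{bmatrix} Q & A^T \\ A & 0 \end{bmatrix}
\begin{bmatrix} x \\ \lambda \end{bmatrix}
=
\begin{bmatrix} -c \\ b \end{bmatrix}
```

The system is symmetric but indefinite. A Stokes problem has exactly this block structure, with $Q$ a vector Laplacian and $A$ the divergence operator, which is presumably why Stokes is suggested as a substitute later in the thread's replies.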

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-21 Thread Hong
I can reproduce the error on a linux machine with petsc-maint. It crashes at 2nd solve, on both processors: Program received signal SIGSEGV, Segmentation fault. 0x7f051dc835bd in pdgsequ (A=0x1563910, r=0x176dfe0, c=0x178f7f0, rowcnd=0x7fffcb8dab30, colcnd=0x7fffcb8dab38,

[petsc-users] How to scatter values

2016-10-21 Thread 丁老师
Dear professor: I partitioned my 2D Cartesian grid over 4 rows * 4 cols of CPUs:

12 13 14 15
 8  9 10 11
 4  5  6  7
 0  1  2  3

Now I need to scatter the values belonging to cpu 5 to every cpu along x

Re: [petsc-users] PetscFE questions

2016-10-21 Thread Matthew Knepley
On Fri, Oct 21, 2016 at 2:26 AM, Julian Andrej wrote: > On Thu, Oct 20, 2016 at 5:18 PM, Matthew Knepley > wrote: > > On Thu, Oct 20, 2016 at 9:42 AM, Julian Andrej > wrote: > >> > >> Thanks for the suggestion. I guess DMCreateSubDM

Re: [petsc-users] PetscFE questions

2016-10-21 Thread Julian Andrej
Yeah, thanks for pointing out my mistake. Next time I'm going to think one more time before writing ;) On Fri, Oct 21, 2016 at 12:17 PM, Lawrence Mitchell wrote: > >> On 21 Oct 2016, at 08:26, Julian Andrej wrote: >> >> On Thu, Oct 20, 2016

Re: [petsc-users] PetscFE questions

2016-10-21 Thread Lawrence Mitchell
> On 21 Oct 2016, at 08:26, Julian Andrej wrote: > > On Thu, Oct 20, 2016 at 5:18 PM, Matthew Knepley wrote: >> On Thu, Oct 20, 2016 at 9:42 AM, Julian Andrej wrote: >>> >>> Thanks for the suggestion. I guess DMCreateSubDM can

Re: [petsc-users] PetscFE questions

2016-10-21 Thread Julian Andrej
On Thu, Oct 20, 2016 at 5:18 PM, Matthew Knepley wrote: > On Thu, Oct 20, 2016 at 9:42 AM, Julian Andrej wrote: >> >> Thanks for the suggestion. I guess DMCreateSubDM can work, but is >> cumbersome to handle for the normal solution process since the mass