[petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices

2014-03-21 Thread 吕超




Yours faithfully:

 My last e-mail had some typographical errors; sorry about that.

 The program src/ksp/ksp/examples/tutorials/ex3.c.html solves the Laplacian on the unit square with bilinear elements.

 After preallocating with

 ierr = MatMPIAIJSetPreallocation(A,9,NULL,5,NULL);CHKERRQ(ierr); /* More than necessary */

 the commands mpiexec -n 2 ./ex3 and mpiexec -n 3 ./ex3 give "Norm of error 2.22327e-06 Iterations 6" and "Norm of error 3.12849e-07 Iterations 8", respectively. Both results are good!

 However, if I run mpiexec -n 4 ./ex3 (or with 5, 6, 7, ... processes), the error [2]PETSC ERROR: New nonzero at (4,29) caused a malloc! appears (the position (4,29) is for 4 processes; other positions appear for other process counts). To me this error is hard to believe: first, the preallocation is more than necessary, so how can a new malloc appear? Second, the vertex with global number 4 has no neighboring vertex with global number 29! This error has tortured me for a long time.
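
 For reference, here is a rough sketch of how this can be investigated (it is not part of ex3.c; it assumes the matrix A is created and assembled as in the example): MatSetOption can relax the new-nonzero error while debugging, and MatGetInfo reports how many mallocs occurred during assembly.

 MatInfo        info;
 PetscMPIInt    rank;
 PetscErrorCode ierr;

 ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);

 /* Debugging only: let new nonzeros trigger (slow) reallocation instead of an error */
 ierr = MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_FALSE);CHKERRQ(ierr);

 /* ... MatSetValues() / MatAssemblyBegin() / MatAssemblyEnd() as in ex3.c ... */

 /* Report how many extra allocations this process needed during assembly */
 ierr = MatGetInfo(A,MAT_LOCAL,&info);CHKERRQ(ierr);
 ierr = PetscPrintf(PETSC_COMM_SELF,"[%d] mallocs during assembly: %g\n",rank,(double)info.mallocs);CHKERRQ(ierr);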

 This error may seem minor; however, my recent 3D finite element code cannot be run on more processes because of the same new-nonzero malloc error! That is why I want to run ex3.c on 4 or more processes.

 Thank you for all your previous assistance, and I hope you have a good life!

Yours sincerely,

LV CHAO

2014/3/21





Re: [petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices

2014-03-21 Thread Barry Smith

  Thank you for reporting this. It was our error. In fact 4 is not enough under 
certain circumstances; consider the case where each process owns only a single 
degree of freedom (vertex): that vertex is then coupled to 8 other vertices, ALL 
on other processes. Thus we really need to use 8 instead of 4 as the maximum 
number of off-process couplings.

  I have fixed this in master so it now runs on any number of processes.
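
   Presumably the change amounts to raising the off-diagonal preallocation bound, along these lines (a sketch only, not the actual commit; A is the stiffness matrix created as in ex3.c):

   /* Worst case for bilinear quads: at most 9 nonzeros per row, and in a
      pathological partition all 8 neighbour columns may be owned by other
      processes, so allow 8 off-diagonal entries per row.                  */
   ierr = MatMPIAIJSetPreallocation(A,9,NULL,8,NULL);CHKERRQ(ierr);
   ierr = MatSeqAIJSetPreallocation(A,9,NULL);CHKERRQ(ierr);  /* uniprocess run */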

   Barry

On Mar 21, 2014, at 9:11 AM, 吕超 luc...@mail.iggcas.ac.cn wrote:

 
 
 Yours faithfully:
 
  My last e-mail had some typographical errors; sorry about that.
 
  The program src/ksp/ksp/examples/tutorials/ex3.c.html solves the Laplacian on the unit square with bilinear elements.
 
  After preallocating with
 
  ierr = MatMPIAIJSetPreallocation(A,9,NULL,5,NULL);CHKERRQ(ierr); /* More than necessary */
 
  the commands mpiexec -n 2 ./ex3 and mpiexec -n 3 ./ex3 give "Norm of error 2.22327e-06 Iterations 6" and "Norm of error 3.12849e-07 Iterations 8", respectively. Both results are good!
 
  However, if I run mpiexec -n 4 ./ex3 (or with 5, 6, 7, ... processes), the error [2]PETSC ERROR: New nonzero at (4,29) caused a malloc! appears (the position (4,29) is for 4 processes; other positions appear for other process counts). To me this error is hard to believe: first, the preallocation is more than necessary, so how can a new malloc appear? Second, the vertex with global number 4 has no neighboring vertex with global number 29! This error has tortured me for a long time.
 
  This error may seem minor; however, my recent 3D finite element code cannot be run on more processes because of the same new-nonzero malloc error! That is why I want to run ex3.c on 4 or more processes.
 
  Thank you for all your previous assistance, and I hope you have a good life!
 
 Yours sincerely,
 
 LV CHAO
 
 2014/3/21
 
 
 
 



Re: [petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices

2014-03-21 Thread Jed Brown
Barry Smith bsm...@mcs.anl.gov writes:

   Thank you for reporting this. It was our error. In fact 4 is not
   enough under certain circumstances; consider the case where each
   process owns only a single degree of freedom (vertex): that vertex
   is then coupled to 8 other vertices, ALL on other processes. Thus
   we really need to use 8 instead of 4 as the maximum number of
   off-process couplings.

Note that _your_ code should generally not have this problem because you
should use a non-pathological partition.
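
In practice that means computing exact per-row counts from the mesh and the
actual ownership ranges instead of a uniform worst case. A rough sketch for the
vertex grid of ex3.c (assuming n = m+1 vertices per side numbered row-major,
contiguous row ownership, and a recent PETSc; the variable names are
illustrative only):

  Mat            A;
  PetscInt       n = m + 1;                 /* vertices per side (m elements per side) */
  PetscInt       N = n*n, nlocal = PETSC_DECIDE;
  PetscInt       rstart, rend, row, di, dj, *d_nnz, *o_nnz;
  PetscErrorCode ierr;

  /* Determine this process's contiguous block of rows (PETSc's default split) */
  ierr = PetscSplitOwnership(PETSC_COMM_WORLD,&nlocal,&N);CHKERRQ(ierr);
  ierr = MPI_Scan(&nlocal,&rend,1,MPIU_INT,MPI_SUM,PETSC_COMM_WORLD);CHKERRQ(ierr);
  rstart = rend - nlocal;

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,nlocal,nlocal,N,N);CHKERRQ(ierr);
  ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);

  ierr = PetscMalloc1(nlocal,&d_nnz);CHKERRQ(ierr);
  ierr = PetscMalloc1(nlocal,&o_nnz);CHKERRQ(ierr);

  /* Each vertex couples to itself and its <= 8 grid neighbours; classify each
     neighbour column as diagonal (locally owned) or off-diagonal              */
  for (row = rstart; row < rend; row++) {
    PetscInt i = row / n, j = row % n, d = 0, o = 0;
    for (di = -1; di <= 1; di++) {
      for (dj = -1; dj <= 1; dj++) {
        PetscInt ii = i + di, jj = j + dj, col;
        if (ii < 0 || ii >= n || jj < 0 || jj >= n) continue;
        col = ii*n + jj;
        if (col >= rstart && col < rend) d++; else o++;
      }
    }
    d_nnz[row-rstart] = d;
    o_nnz[row-rstart] = o;
  }

  ierr = MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz);CHKERRQ(ierr);
  ierr = PetscFree(d_nnz);CHKERRQ(ierr);
  ierr = PetscFree(o_nnz);CHKERRQ(ierr);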




[petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices

2014-03-20 Thread 吕超
Yours faithfully:

 The program src/ksp/ksp/examples/tutorials/ex3.c.html solves the Laplacian on the unit square with bilinear elements.

 After preallocating with

 ierr = MatMPIAIJSetPreallocation(A,9,NULL,5,NULL);CHKERRQ(ierr); /* More than necessary */

 the commands mpiexec -n 2 ./ex3 and mpiexec -n 3 ./ex3 give "Norm of error 2.22327e-06 Iterations 6" and "Norm of error 3.12849e-07 Iterations 8", respectively. Both results are good!

 However, if I run mpiexec -n 4 ./ex3 (or with 5, 6, 7, ... processes), the error [2]PETSC ERROR: New nonzero at (4,29) caused a malloc! appears (the position (4,29) is for 4 processes; other positions appear for other process counts). To me this error is hard to believe: first, the preallocation is more than necessary, so how can a new malloc appear? Second, the vertex with global number 4 has no neighboring vertex with global number 29! This error has tortured me for a long time.

 This error may seem minor; however, my recent 3D finite element code cannot be run on more processes because of the same new-nonzero malloc error! That is why I want to run ex3.c on 4 or more processes.

 Thank you for all your previous assistance, and I hope you have a good life!

Yours sincerely,

LV CHAO

2014/3/21