Re: [petsc-users] Parallelize in the y direction

2021-08-20 Thread Barry Smith
Trying to solve many "one-dimensional" problems, each in parallel on a different subset of ranks, will be a massive pain to do explicitly. I recommend just forming a single matrix for all these systems and solving it with KSPSolve and block Jacobi preconditioning or even a parallel direct
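A minimal petsc4py sketch of that idea (the sizes n_sys and n_pts and the tridiagonal entries are placeholders, not anything from the thread): stack all the one-dimensional systems into one block-diagonal matrix and hand it to KSP with block Jacobi, whose per-process sub-solves default to ILU.

```python
from petsc4py import PETSc

n_sys, n_pts = 8, 100          # number of 1-D problems and points per problem (illustrative)
N = n_sys * n_pts
A = PETSc.Mat().createAIJ([N, N], nnz=3, comm=PETSc.COMM_WORLD)
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A[i, i] = 2.0
    if i % n_pts != 0:          # no coupling across the boundary between two 1-D systems
        A[i, i - 1] = -1.0
    if (i + 1) % n_pts != 0:
        A[i, i + 1] = -1.0
A.assemble()

b = A.createVecLeft(); b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.getPC().setType('bjacobi')  # block Jacobi; each process block solved with its own sub-KSP
ksp.setFromOptions()
ksp.solve(b, x)
```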

Re: [petsc-users] Reaching the limit on the number of communicators with Spectrum MPI

2021-08-20 Thread Junchao Zhang
Feimi, I'm able to reproduce the problem. I will have a look. Thanks a lot for the example. --Junchao Zhang On Fri, Aug 20, 2021 at 2:02 PM Feimi Yu wrote: > Sorry, I forgot to destroy the matrix after the loop, but anyway, the > in-loop preconditioners are destroyed. Updated the code here

[petsc-users] Euclid or BoomerAMG vs ILU: questions.

2021-08-20 Thread Ed Bueler
Viktor -- As a basic comment, note that ILU can be used in parallel, namely on each processor block, by either non-overlapping domain decomposition: -pc_type bjacobi -sub_pc_type ilu, or with overlap: -pc_type asm -sub_pc_type ilu. See the discussion of block Jacobi and ASM at
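For completeness, the same two configurations can be set programmatically through the options database; this petsc4py fragment is only a sketch, with the operator assumed to be assembled elsewhere.

```python
from petsc4py import PETSc

opts = PETSc.Options()
# Non-overlapping blocks, ILU on each processor block:
opts['pc_type'] = 'bjacobi'
opts['sub_pc_type'] = 'ilu'
# ... or overlapping blocks (additive Schwarz):
# opts['pc_type'] = 'asm'
# opts['sub_pc_type'] = 'ilu'

ksp = PETSc.KSP().create()
# ksp.setOperators(A)        # A assembled elsewhere
ksp.setFromOptions()         # picks up the pc_type / sub_pc_type options above
```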

Re: [petsc-users] Reaching the limit on the number of communicators with Spectrum MPI

2021-08-20 Thread Feimi Yu
Sorry, I forgot to destroy the matrix after the loop, but anyway, the in-loop preconditioners are destroyed. Updated the code here and the google drive. Feimi On 8/20/21 2:54 PM, Feimi Yu wrote: Hi Barry and Junchao, Actually I did a simple MPI "dup and free" test before with Spectrum

Re: [petsc-users] Reaching the limit on the number of communicators with Spectrum MPI

2021-08-20 Thread Feimi Yu
Hi Barry and Junchao, Actually I did a simple MPI "dup and free" test before with Spectrum MPI, but that one did not have any problem. I'm not a PETSc programmer as I mainly use deal.ii's PETSc wrappers, but I managed to write a minimal program based on petsc/src/mat/tests/ex98.c to reproduce
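For reference, a dup-and-free loop of the kind described, written here with mpi4py rather than against the C test (petsc/src/mat/tests/ex98.c) the thread is based on; the iteration count is arbitrary.

```python
from mpi4py import MPI

# Repeatedly duplicate and free a communicator. If the Free() is removed,
# the duplicated communicators are leaked and an MPI implementation with a
# small communicator limit will eventually abort.
for _ in range(5000):
    comm = MPI.COMM_WORLD.Dup()
    comm.Free()
```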

Re: [petsc-users] Euclid or BoomerAMG vs ILU: questions.

2021-08-20 Thread Sanjay Govindjee
Mark's suggestion will definitely help a lot. Remove the displacement BC equations, or include them in the matrix by zeroing out the rows and putting a 1 on the diagonal. The Lagrange multipliers will cause grief. On 8/20/21 11:21 AM, Mark Adams wrote: Constraints are a pain with
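A small petsc4py sketch of the second option (the toy tridiagonal matrix and the bc_rows indices are placeholders standing in for the constrained displacement dofs): zero the constrained rows, put 1 on the diagonal, and fix up the right-hand side in the same call.

```python
from petsc4py import PETSc

n = 10
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rs, re = A.getOwnershipRange()
for i in range(rs, re):
    A[i, i] = 2.0
    if i > 0:     A[i, i - 1] = -1.0
    if i < n - 1: A[i, i + 1] = -1.0
A.assemble()
b = A.createVecLeft(); b.set(1.0)

bc_rows = [0, n - 1]                 # illustrative boundary dofs with prescribed (zero) displacement
x = A.createVecRight()
x.setValues(bc_rows, [0.0, 0.0])     # prescribed values, used to correct b consistently
x.assemble()
A.zeroRows(bc_rows, diag=1.0, x=x, b=b)   # rows zeroed, 1 placed on the diagonal, b adjusted
```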

Re: [petsc-users] Euclid or BoomerAMG vs ILU: questions.

2021-08-20 Thread Mark Adams
Constraints are a pain with scalable/iterative solvers. If you order the constraints last then ILU should work as well as it can work, but AMG gets confused by the constraint equations. You could look at PETSc's Stokes solvers, but it would be best if you could remove the constrained equations
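One hedged illustration of the Stokes-style route mentioned above, using PCFIELDSPLIT to split off the Lagrange-multiplier (zero-diagonal) rows: the option names are standard PETSc options, but the combination is only a sketch, and the operator is assumed to be assembled elsewhere with the constraints ordered last.

```python
from petsc4py import PETSc

opts = PETSc.Options()
opts['pc_type'] = 'fieldsplit'
opts['pc_fieldsplit_type'] = 'schur'
opts['pc_fieldsplit_detect_saddle_point'] = ''   # find the zero-diagonal constraint rows automatically
opts['fieldsplit_0_pc_type'] = 'gamg'            # AMG on the displacement block only

ksp = PETSc.KSP().create()
# ksp.setOperators(A)        # A assembled elsewhere
ksp.setFromOptions()
```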

Re: [petsc-users] Improving efficiency of SLEPc usage

2021-08-20 Thread Jose E. Roman
Maybe too much fill-in during factorization. Try using an external linear solver such as MUMPS, as explained in section 3.4.1 of SLEPc's users manual. Jose > On 20 Aug 2021, at 16:12, Matthew Knepley wrote: > > On Fri, Aug 20, 2021 at 6:55 AM dazza simplythebest > wrote: > Dear Jose,
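A slepc4py sketch of that suggestion, assuming a shift-and-invert setup and a PETSc build configured with MUMPS (--download-mumps); the toy tridiagonal operator merely stands in for the generalized problem discussed in the thread.

```python
from petsc4py import PETSc
from slepc4py import SLEPc

n = 200
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rs, re = A.getOwnershipRange()
for i in range(rs, re):
    A[i, i] = 2.0
    if i > 0:     A[i, i - 1] = -1.0
    if i < n - 1: A[i, i + 1] = -1.0
A.assemble()

eps = SLEPc.EPS().create()
eps.setOperators(A)                                   # generalized problems take (A, B)
eps.setWhichEigenpairs(SLEPc.EPS.Which.TARGET_MAGNITUDE)
eps.setTarget(0.0)
st = eps.getST()
st.setType(SLEPc.ST.Type.SINVERT)                     # shift-and-invert requires a factorization
ksp = st.getKSP()
ksp.setType('preonly')
pc = ksp.getPC()
pc.setType('lu')
pc.setFactorSolverType('mumps')                       # hand the factorization to MUMPS
eps.setFromOptions()
eps.solve()
```

The same effect is available from the command line with -st_ksp_type preonly -st_pc_type lu -st_pc_factor_mat_solver_type mumps.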

[petsc-users] Using Elemental with petsc4py to solve AX = B in parallel

2021-08-20 Thread Guangpu Zhu
Dear Sir/Madam, I am trying to use petsc4py to solve AX = B in parallel, where A is a large dense matrix. The Elemental package in petsc4py is very suitable for dense matrices, but I can't find any example or learning material about it on the PETSc website or other websites.
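In the absence of a worked Elemental example, here is a hedged petsc4py sketch that assembles a dense matrix and solves one right-hand side with a direct factorization; sizes and random values are placeholders. On a single rank this uses PETSc's built-in dense LU; with several ranks and a PETSc built with --download-elemental, the intent would be to use the 'elemental' matrix type (e.g. -mat_type elemental) so assembly and factorization run through Elemental, though the Elemental-specific assembly calls should be checked against the MATELEMENTAL documentation.

```python
import numpy as np
from petsc4py import PETSc

n = 50
A = PETSc.Mat().createDense([n, n], comm=PETSc.COMM_WORLD)
A.setUp()
rs, re = A.getOwnershipRange()
cols = np.arange(n, dtype=PETSc.IntType)
for i in range(rs, re):                      # fill the locally owned rows
    A.setValues(i, cols, np.random.rand(n))
A.assemble()

b = A.createVecLeft(); b.setRandom()
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('preonly')
ksp.getPC().setType('lu')                    # dense LU; an Elemental-type matrix does this step in parallel
ksp.setFromOptions()
ksp.solve(b, x)
```

For the multiple right-hand sides in AX = B, one simple option is to loop over the columns of B with ksp.solve(); whether a multi-RHS interface is exposed in the petsc4py version in use should be checked separately.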

Re: [petsc-users] Parallelize in the y direction

2021-08-20 Thread Matthew Knepley
On Fri, Aug 20, 2021 at 7:53 AM Joauma Marichal <joauma.maric...@uclouvain.be> wrote: > Dear Sir or Madam, > > I am looking for advice regarding some of PETSc's functionalities. I am > currently using PETSc to solve the Navier-Stokes equations on a 3D mesh > decomposed over several processors.

Re: [petsc-users] Improving efficiency of SLEPc usage

2021-08-20 Thread Matthew Knepley
On Fri, Aug 20, 2021 at 6:55 AM dazza simplythebest wrote: > Dear Jose, > Many thanks for your response, I have been investigating this issue > with a few more calculations > today, hence the slightly delayed response. > > The problem is actually derived from a fluid dynamics problem, so to

Re: [petsc-users] Reaching the limit on the number of communicators with Spectrum MPI

2021-08-20 Thread Junchao Zhang
Feimi, if it is easy to reproduce, could you give instructions on how to reproduce it? PS: Spectrum MPI is based on OpenMPI. I don't understand why it has the problem but OpenMPI does not. It could be a bug in PETSc or in the user's code. For reference counting on MPI_Comm, we already have petsc

[petsc-users] Parallelize in the y direction

2021-08-20 Thread Joauma Marichal
Dear Sir or Madam, I am looking for advice regarding some of PETSc's functionalities. I am currently using PETSc to solve the Navier-Stokes equations on a 3D mesh decomposed over several processors. However, until now, the processors are distributed along the x and z directions but not along
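As a hedged sketch of what a three-way decomposition could look like in petsc4py (grid sizes are placeholders): passing PETSc.DECIDE for all three proc_sizes lets the DMDA split the ranks along x, y and z, or specific counts can be pinned per direction.

```python
from petsc4py import PETSc

nx, ny, nz = 64, 64, 64
da = PETSc.DMDA().create(dim=3,
                         sizes=(nx, ny, nz),
                         proc_sizes=(PETSc.DECIDE, PETSc.DECIDE, PETSc.DECIDE),  # distribute in x, y and z
                         dof=1,
                         stencil_width=1,
                         comm=PETSc.COMM_WORLD)
(xs, xe), (ys, ye), (zs, ze) = da.getRanges()   # locally owned index ranges on this rank
```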

[petsc-users] Euclid or BoomerAMG vs ILU: questions.

2021-08-20 Thread Наздрачёв Виктор
Hello, dear PETSc team! I have a 3D elasticity problem with heterogeneous properties. There is an unstructured grid with aspect ratios varying from 4 to 25. Dirichlet BCs (zero displacements at the bottom) are imposed via linear constraint equations using Lagrange multipliers. Also, Neumann (traction)

Re: [petsc-users] Improving efficiency of SLEPc usage

2021-08-20 Thread dazza simplythebest
Dear Jose, Many thanks for your response. I have been investigating this issue with a few more calculations today, hence the slightly delayed response. The problem is actually derived from a fluid dynamics problem, so to allow an easier exploration of things I first downsized the resolution