Re: [petsc-users] Capture

2023-12-14 Thread Matthew Knepley
On Thu, Dec 14, 2023 at 1:27 AM 291--- via petsc-users <petsc-users@mcs.anl.gov> wrote: > Dear SLEPc Developers, > > I am a student from Tongji University. Recently I have been trying to write a C++ > program for matrix solving, which requires importing the PETSc library that > you have developed.
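For context, a minimal sketch of a program that links against PETSc (not code from this thread; the pkg-config setup is an assumption that depends on how PETSc was installed):

    #include <petsc.h>  /* compiles as C or C++ */

    int main(int argc, char **argv)
    {
      /* Initialize PETSc (and MPI) before any other PETSc call */
      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PetscCall(PetscPrintf(PETSC_COMM_WORLD, "PETSc is linked and initialized.\n"));
      PetscCall(PetscFinalize());
      return 0;
    }

A typical build, assuming PKG_CONFIG_PATH contains $PETSC_DIR/$PETSC_ARCH/lib/pkgconfig:

    mpicxx main.cpp $(pkg-config --cflags --libs PETSc) -o main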

Re: [petsc-users] PETSc 3.14 to PETSc 3.20: Different (slower) convergence for classical AMG (sequential and especially in parallel)

2023-12-14 Thread LEDAC Pierre
Hello Mark, Thanks for your answer. Indeed, I didn't see the information that classical AMG was not really supported:

    -solver2_pc_gamg_type : Type of AMG method (only 'agg' supported and useful) (one of) classical geo agg (PCGAMGSetType)

We switched very recently from GAMG("agg") to
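For reference, a hedged sketch of the options involved (the -solver2_ prefix follows the poster's setup; the threshold value is illustrative, not from the thread):

    -solver2_pc_type gamg
    -solver2_pc_gamg_type agg        # smoothed aggregation, the supported path
    -solver2_pc_gamg_threshold 0.01  # drop tolerance used when coarsening the graph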

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-14 Thread Pierre Jolivet
> On 14 Dec 2023, at 8:02 PM, Sreeram R Venkat wrote: > > Hello Pierre, > > Thank you for your reply. I tried out the HPDDM CG as you said, and it seems > to be doing the batched solves, but the KSP is not converging due to a NaN or > Inf being generated. I also noticed there are a lot of
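For context, a minimal sketch of a batched solve with KSPHPDDM (not the poster's code; n and nrhs are placeholders), run with e.g. -ksp_type hpddm -ksp_hpddm_type cg:

    Mat B, X; /* dense matrices: one column per right-hand side */
    PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, nrhs, NULL, &B));
    PetscCall(MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X));
    /* ... fill the columns of B ... */
    PetscCall(KSPMatSolve(ksp, B, X)); /* one batched solve instead of nrhs KSPSolve() calls */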

Re: [petsc-users] PETSc 3.14 to PETSc 3.20: Different (slower) convergence for classical AMG (sequential and especially in parallel)

2023-12-14 Thread Mark Adams
On Thu, Dec 14, 2023 at 10:15 AM LEDAC Pierre wrote: > Hello Mark, > > Thanks for your answer. Indeed, I didn't see the information that classical AMG was not really supported: > > -solver2_pc_gamg_type : Type of AMG method (only 'agg' supported and useful) (one of) classical geo agg
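The equivalent selection in code, as a hedged sketch (assumes an existing KSP named ksp):

    PC pc;
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCGAMG));
    PetscCall(PCGAMGSetType(pc, PCGAMGAGG)); /* smoothed aggregation, i.e. -pc_gamg_type agg */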

[petsc-users] Call to DMSetMatrixPreallocateSkip not changing allocation behavior

2023-12-14 Thread Fackler, Philip via petsc-users
I'm using the following sequence of functions related to the Jacobian matrix: DMDACreate1d(..., &da); DMSetFromOptions(da); DMSetUp(da); DMSetMatType(da, MATAIJKOKKOS); DMSetMatrixPreallocateSkip(da, PETSC_TRUE); Mat J; DMCreateMatrix(da, &J); MatSetPreallocationCOO(J, ...); I recently added the call
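A hedged sketch of that sequence with the elided arguments filled in (the grid size M, dof count, and COO arrays are placeholders, not the poster's values):

    DM  da;
    Mat J;
    PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, M, dof, 1, NULL, &da));
    PetscCall(DMSetFromOptions(da));
    PetscCall(DMSetUp(da));
    PetscCall(DMSetMatType(da, MATAIJKOKKOS));
    PetscCall(DMSetMatrixPreallocateSkip(da, PETSC_TRUE)); /* ask DMCreateMatrix not to preallocate */
    PetscCall(DMCreateMatrix(da, &J));
    PetscCall(MatSetPreallocationCOO(J, ncoo, coo_i, coo_j)); /* the COO pattern supplies the allocation */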

Re: [petsc-users] Call to DMSetMatrixPreallocateSkip not changing allocation behavior

2023-12-14 Thread Jed Brown
17 GB for a 1D DMDA, wow. :-) Could you try applying this diff to make it work for DMDA (it's currently handled by DMPlex)?

    diff --git i/src/dm/impls/da/fdda.c w/src/dm/impls/da/fdda.c
    index cad4d926504..bd2a3bda635 100644
    --- i/src/dm/impls/da/fdda.c
    +++ w/src/dm/impls/da/fdda.c
    @@ -675,19
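Applying such a patch would typically look like this (assuming it is saved as fdda.patch in the PETSc source tree; the file name is illustrative):

    cd $PETSC_DIR
    git apply fdda.patch
    make all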

Re: [petsc-users] Call to DMSetMatrixPreallocateSkip not changing allocation behavior

2023-12-14 Thread Matthew Knepley
On Thu, Dec 14, 2023 at 2:06 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote: > I'm using the following sequence of functions related to the Jacobian > matrix: > > DMDACreate1d(..., &da); > DMSetFromOptions(da); > DMSetUp(da); > DMSetMatType(da, MATAIJKOKKOS); >

Re: [petsc-users] Call to DMSetMatrixPreallocateSkip not changing allocation behavior

2023-12-14 Thread Jed Brown
I had a one-character typo in the diff above. This MR to release should work now. https://gitlab.com/petsc/petsc/-/merge_requests/7120 Jed Brown writes: > 17 GB for a 1D DMDA, wow. :-) > > Could you try applying this diff to make it work for DMDA (it's currently > handled by DMPlex)? > >

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-14 Thread Sreeram R Venkat
Thanks, I will try to create a minimal reproducible example. This may take me some time though, as I need to figure out how to extract only the relevant parts (the full program this solve is used in is getting quite complex). I'll also try out some of the BoomerAMG options to see if that helps.
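For reference, a few commonly tuned BoomerAMG knobs (illustrative values, not recommendations from this thread):

    -pc_type hypre
    -pc_hypre_type boomeramg
    -pc_hypre_boomeramg_strong_threshold 0.7
    -pc_hypre_boomeramg_coarsen_type PMIS
    -pc_hypre_boomeramg_interp_type ext+i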

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-14 Thread Pierre Jolivet
> On 14 Dec 2023, at 11:45 PM, Sreeram R Venkat wrote: > > Thanks, I will try to create a minimal reproducible example. This may take me > some time though, as I need to figure out how to extract only the relevant > parts (the full program this solve is used in is getting quite complex). You

[petsc-users] PETSc configuration to solve the Poisson equation on a 2D Cartesian grid of points with NVIDIA GPUs (CUDA)

2023-12-14 Thread Vittorio Sciortino
Dear PETSc developers, My name is Vittorio Sciortino; I am a PhD student in Italy and I am really curious about the applications and possibilities of your library. I would like to ask you two questions about PETSc. My case study consists of the development of a 2D electrostatic Particle In Cell
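For context, a hedged sketch of the GPU-enabled workflow (the configure flag and solver options are generic CUDA usage, not advice specific to this thread):

    # build PETSc with CUDA support
    ./configure --with-cuda

    # run a DM-based solver with GPU vector/matrix types
    mpiexec -n 2 ./app -dm_vec_type cuda -dm_mat_type aijcusparse -ksp_type cg -pc_type gamg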