On Thu, Dec 14, 2023 at 1:27 AM 291--- via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Dear SLEPc Developers,
>
> I am a student from Tongji University. Recently I have been trying to write a
> C++ program for solving matrices, which requires the PETSc library that
> you have developed.
Hello Mark,
Thanks for your answer. Indeed, I didn't see the information that classical AMG
was not really supported:
-solver2_pc_gamg_type : Type of AMG method (only 'agg' supported and useful) (one of) classical geo agg (PCGAMGSetType)
We switched very recently from GAMG("agg") to
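Based on that help text, selecting the supported aggregation variant on the command line would look like the fragment below (a hedged sketch: the `-solver2_` prefix is taken from the option quoted above, and the `-solver2_pc_type gamg` line assumes the preconditioner is also selected via options):

```shell
# Select GAMG and its smoothed-aggregation variant, the only one the
# help text above marks as "supported and useful".
-solver2_pc_type gamg
-solver2_pc_gamg_type agg
```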
> On 14 Dec 2023, at 8:02 PM, Sreeram R Venkat wrote:
>
> Hello Pierre,
>
> Thank you for your reply. I tried out the HPDDM CG as you said, and it seems
> to be doing the batched solves, but the KSP is not converging due to a NaN or
> Inf being generated. I also noticed there are a lot of
On Thu, Dec 14, 2023 at 10:15 AM LEDAC Pierre wrote:
> Hello Mark,
>
>
> Thanks for your answer. Indeed, I didn't see the information that
> classical AMG was not really supported:
>
>
> -solver2_pc_gamg_type : Type of AMG method (only 'agg' supported and useful) (one of) classical geo agg
I'm using the following sequence of functions related to the Jacobian matrix:
DMDACreate1d(..., );
DMSetFromOptions(da);
DMSetUp(da);
DMSetMatType(da, MATAIJKOKKOS);
DMSetMatrixPreallocateSkip(da, PETSC_TRUE);
Mat J;
DMCreateMatrix(da, &J);
MatSetPreallocationCOO(J, ...);
I recently added the call
17 GB for a 1D DMDA, wow. :-)
Could you try applying this diff to make it work for DMDA (it's currently
handled by DMPlex)?
diff --git i/src/dm/impls/da/fdda.c w/src/dm/impls/da/fdda.c
index cad4d926504..bd2a3bda635 100644
--- i/src/dm/impls/da/fdda.c
+++ w/src/dm/impls/da/fdda.c
@@ -675,19
On Thu, Dec 14, 2023 at 2:06 PM Fackler, Philip via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> I'm using the following sequence of functions related to the Jacobian
> matrix:
>
> DMDACreate1d(...);
> DMSetFromOptions(da);
> DMSetUp(da);
> DMSetMatType(da, MATAIJKOKKOS);
>
I had a one-character typo in the diff above. This MR to release should work
now.
https://gitlab.com/petsc/petsc/-/merge_requests/7120
Jed Brown writes:
> 17 GB for a 1D DMDA, wow. :-)
>
> Could you try applying this diff to make it work for DMDA (it's currently
> handled by DMPlex)?
>
>
Thanks, I will try to create a minimal reproducible example. This may take
me some time though, as I need to figure out how to extract only the
relevant parts (the full program this solve is used in is getting quite
complex).
I'll also try out some of the BoomerAMG options to see if that helps.
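For reference, commonly tuned BoomerAMG options look like the fragment below (a hedged sketch: the specific values are illustrative, not recommendations from this thread):

```shell
# Use hypre's BoomerAMG as the preconditioner
-pc_type hypre
-pc_hypre_type boomeramg
# Frequently tuned knobs (values shown are only examples)
-pc_hypre_boomeramg_strong_threshold 0.7
-pc_hypre_boomeramg_coarsen_type HMIS
-pc_hypre_boomeramg_interp_type ext+i
```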
> On 14 Dec 2023, at 11:45 PM, Sreeram R Venkat wrote:
>
> Thanks, I will try to create a minimal reproducible example. This may take me
> some time though, as I need to figure out how to extract only the relevant
> parts (the full program this solve is used in is getting quite complex).
You
Dear PETSc developers,
My name is Vittorio Sciortion, I am a PhD student in Italy, and I am
really curious about the applications and possibilities of your
library. I would like to ask you two questions about PETSc.
My case study consists of the development of a 2D electrostatic Particle
In Cell