First, you don't want a DMShell. Just use DMSwarm.
See src/dm/tutorials/ex20.c
You can run this test with
> cd src/dm/tutorials
> make ex20
> ./ex20
or
> ./ex20 -mode 1
See the end of ex20.c for the list of these (optional) arguments.
Now change that code to one periodic direction and test.
This could be a
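The periodic change suggested above might look like the following (a minimal sketch, not taken from ex20.c; the grid sizes, dof, and stencil width are placeholder assumptions):

```c
/* Sketch: a 3d DMDA with one periodic direction, as suggested above.
 * Grid sizes, dof, and stencil width are placeholder assumptions. */
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM da;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* x direction periodic; y and z remain non-periodic */
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX,
                         8, 8, 8,                                  /* global grid size */
                         PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, /* procs per dir */
                         1,                                        /* dof per node */
                         0,                                        /* stencil width 0: no ghost cells */
                         NULL, NULL, NULL, &da));
  PetscCall(DMSetUp(da));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

Only the DM_BOUNDARY_PERIODIC argument changes relative to a fully closed box; the rest of the swarm setup stays the same.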
Hello,
I am running some matrix computations on 64 nodes, using 640 MPI tasks.
And I wanted to check the memory usage with the -memory_view flag.
I get the following output:
Summary of Memory Usage in PETSc
Maximum (over computational time) process memory: total 3.1056e+11 max
> On Apr 12, 2023, at 4:08 PM, Jorti, Zakariae via petsc-users wrote:
Hello Barry,
I appreciate the clarification.
I tried to check the memory usage with seff and I got the following results:
seff 7274633
Job ID: 7274633
Cluster: perlmutter
User/Group: zjorti/zjorti
State: COMPLETED (exit code 0)
Nodes: 64
Cores per node: 256
CPU Utilized: 3-13:19:59
CPU
No idea.
You can allocate a few vectors of known size, fill them with some value using
VecSet(x,2.0), and check the various memory reports to see what they report.
Barry
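A sketch of the experiment Barry describes (the vector length here is an arbitrary assumption, chosen so the expected allocation is easy to predict):

```c
/* Sketch of the suggestion above: allocate a vector of known size,
 * fill it with VecSet(x,2.0), and query PETSc's memory counter.
 * The vector length (10^6) is an arbitrary assumption. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            x;
  PetscLogDouble mem;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 1000000, &x));
  PetscCall(VecSet(x, 2.0));                   /* 10^6 doubles ~ 8 MB across ranks */
  PetscCall(PetscMemoryGetCurrentUsage(&mem)); /* this process's resident memory */
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Process memory: %g bytes\n", (double)mem));
  PetscCall(VecDestroy(&x));
  PetscCall(PetscFinalize());                  /* run with -memory_view to compare */
  return 0;
}
```

Running this with -memory_view (and, on a batch system, comparing against seff) lets you see which of the reports tracks the known 8 MB allocation.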
> On Apr 12, 2023, at 6:21 PM, Jorti, Zakariae wrote:
Hello,
I am using the PETSc DMSwarm library for some Lagrangian particle tracking. Until
now, I was working in a closed box and was therefore initializing my swarm
object as:
// Create a staggered DMDA without ghost cells (for DMSwarm to work)
DMDACreate3d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,
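The poster's call is cut off above; for context, a typical DMSwarm-on-DMDA setup for particle tracking might look like this (a sketch assuming PIC mode, not the poster's actual code):

```c
/* Sketch (not the poster's code, which is cut off above): a typical
 * DMSwarm attached to a closed-box (non-periodic) DMDA. */
#include <petscdmda.h>
#include <petscdmswarm.h>

int main(int argc, char **argv)
{
  DM da, swarm;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Closed box: no periodic directions, no ghost cells */
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, 8, 8, 8,
                         PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                         1, 0, NULL, NULL, NULL, &da));
  PetscCall(DMSetUp(da));

  PetscCall(DMCreate(PETSC_COMM_WORLD, &swarm));
  PetscCall(DMSetType(swarm, DMSWARM));
  PetscCall(DMSwarmSetType(swarm, DMSWARM_PIC)); /* particle-in-cell mode */
  PetscCall(DMSwarmSetCellDM(swarm, da));        /* background cell DM */

  PetscCall(DMDestroy(&swarm));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

Switching one of the DM_BOUNDARY_NONE arguments to DM_BOUNDARY_PERIODIC is the change discussed at the top of this thread.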