On Sun, Apr 9, 2017 at 6:04 AM, Mark Adams wrote:
> You seem to have two levels here and 3M eqs on the fine grid and 37 on
> the coarse grid. I don't understand that.
>
> You are also calling the AMG setup a lot, but not spending much time
> in it. Try running with -info and
Hi Mark,
Thanks for your reply.
On Wed, Apr 12, 2017 at 9:16 AM, Mark Adams wrote:
> The problem comes from setting the number of MG levels (-pc_mg_levels 2).
> Not your fault, it looks like the GAMG logic is faulty, in your version at
> least.
>
What I want is that GAMG
Thanks, Gaetan, for your suggestions and for running the code…Now I know for
sure there is something wrong with my installation!
Cheers
Rodrigo
From: Gaetan Kenway [mailto:gaet...@gmail.com]
Sent: Wednesday, April 12, 2017 12:20 PM
To: Rodrigo Felicio
Cc: petsc-users@mcs.anl.gov
Subject: Re:
Maybe try doing
pComm = MPI.COMM_WORLD instead of PETSc.COMM_WORLD. I know it shouldn't
matter, but it's worth a shot. Also, then you won't need the tompi4py() call, I
guess.
Gaetan
On Wed, Apr 12, 2017 at 10:17 AM, Gaetan Kenway wrote:
> Hi Rodrigo
>
> I just ran your example on
Hi Rodrigo
I just ran your example on NASA's Pleiades system. Here's what I got:
PBS r459i4n11:~> time mpiexec -n 5 python3.5 another_split_ex.py
number of subcomms = 2.5
petsc rank=2, petsc size=5
sub rank 1/3, color:0
petsc rank=4, petsc size=5
sub rank 2/3, color:0
petsc rank=0, petsc size=5
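The "number of subcomms = 2.5" line above suggests the script computes the subcommunicator count with Python 3's true division (5 / 2 == 2.5). A minimal sketch of the intended grouping arithmetic (function names are hypothetical, not from the attached script):

```python
def n_subcomms(size, ranks_per_comm):
    # Python 3 true division would give 5 / 2 == 2.5, matching the
    # "number of subcomms = 2.5" output above; ceiling integer division
    # yields the intended whole number of groups.
    return -(-size // ranks_per_comm)

def split_color(rank, ranks_per_comm):
    # Color passed to Comm.Split(): groups `ranks_per_comm` consecutive
    # ranks into one subcommunicator.
    return rank // ranks_per_comm
```

For 5 ranks with 2 ranks per group this gives 3 subcommunicators, with ranks 0-1 in color 0, ranks 2-3 in color 1, and rank 4 alone in color 2.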
Going over my older codes, I found that I had already tried the approach of
splitting PETSc.COMM_WORLD, but whenever I try to create a matrix using a
subcommunicator, the program fails. For example, executing the Python code
attached to this message, I get the following output:
One other quick note:
Sometimes it appears that mpi4py and petsc4py do not always play nicely
together. I think you want to do the petsc4py import first and then the
mpi4py import. Then you can split MPI.COMM_WORLD all you want and
create petsc4py objects on the subcommunicators. Or if that doesn't
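A minimal sketch of that ordering (assuming petsc4py and mpi4py are installed; the group size of 2 and the matrix sizes are arbitrary choices for illustration):

```python
import sys

def subgroup_color(rank, ranks_per_comm=2):
    # Groups `ranks_per_comm` consecutive ranks into one subcommunicator.
    return rank // ranks_per_comm

try:
    # Import and initialize petsc4py first, then import mpi4py,
    # per the ordering advice above.
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc
    from mpi4py import MPI

    world = MPI.COMM_WORLD
    color = subgroup_color(world.rank)
    sub = world.Split(color, world.rank)

    # Create a small PETSc matrix on the subcommunicator; petsc4py
    # accepts an mpi4py communicator here in reasonably recent versions.
    A = PETSc.Mat().create(comm=sub)
    A.setSizes((10, 10))
    A.setFromOptions()
    A.setUp()
    print("rank %d: matrix created on color %d" % (world.rank, color))
except ImportError:
    # petsc4py/mpi4py not available; nothing to demonstrate.
    pass
```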
Hello,
I have problems determining the orientation of the face normals of a DMPlex.
I create a DMPlex, for example with DMPlexCreateHexBoxMesh().
Next, I get the face normals using DMPlexComputeGeometryFVM(DM dm, Vec
*cellgeom, Vec *facegeom). facegeom gives the correct normals, but I don't
know
The problem comes from setting the number of MG levels (-pc_mg_levels 2).
Not your fault, it looks like the GAMG logic is faulty, in your version at
least.
GAMG will force the coarsest grid to one processor by default, in newer
versions. You can override the default with:
Thanks Jed and Gaetan.
I will try that approach of splitting PETSc.COMM_WORLD, but I still need to
load mpi4py (probably after PETSc), because PETSc.Comm is very limited, i.e.,
it does not have the split function, for example. My goal is to be able to set
different matrices and vectors for
pp. 174 and 183 in the current User Manual describe the option -log_summary.
Running code with this option yields:
WARNING: -log_summary is being deprecated; switch to -log_view
By the way, both forms of the option are missing from the Index.
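For example (the executable name and rank count here are hypothetical):

```shell
# -log_summary still runs but prints the deprecation warning;
# -log_view is the replacement and produces the same performance summary.
mpiexec -n 4 ./ex2 -log_view
```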