Dear All,
I am new to both PETSc (petsc4py) and MPI. I am prototyping an application
that acts on data belonging to thousands of different spatial locations, so I
decided to use mpi4py to parallelize my computations over that set of points.
At each point, however, I have to solve linear
Thanks, Barry and Matt, for your prompt responses. I really appreciated them.
Sorry, I forgot to mention that I am using PETSc 3.6.1, petsc4py 3.6.0, and
mpi4py 2.0.0 (also tested with mpi4py 1.2.2).
From running env, I get:
Hello all,
Sorry for the newbie question, but is there a way of making petsc4py work with
an MPI group or subcommunicator? I saw a solution posted back in 2010
(http://lists.mcs.anl.gov/pipermail/petsc-users/2010-May/006382.html), but it
does not work for me. Indeed, if I try to use
it only happens if trying to initialize PETSc with
petsc4py.init(comm=newcomm)?
Anyway, thanks Jed, really appreciate the help.
Cheers
Rodrigo
-----Original Message-----
From: Jed Brown [mailto:j...@jedbrown.org]
Sent: Tuesday, April 11, 2017 11:36 AM
To: Rodrigo Felicio; petsc-users@mcs.anl.gov
Cc
Thanks Jed and Gaetan.
I will try that approach of splitting PETSc.COMM_WORLD, but I still need to
load mpi4py (probably after PETSc), because PETSc.Comm is very limited, i.e.,
it does not have the split function, for example. My goal is to be able to set
different matrices and vectors for
Going over my older codes, I found that I had already tried the approach of
splitting PETSc.COMM_WORLD, but whenever I try to create a matrix using a
subcommunicator, the program fails. For example, executing the following Python
code attached to this message, I get the following output
time
Thanks, Gaetan, for your suggestions and for running the code… Now I know for
sure there is something wrong with my installation!
Cheers
Rodrigo
From: Gaetan Kenway [mailto:gaet...@gmail.com]
Sent: Wednesday, April 12, 2017 12:20 PM
To: Rodrigo Felicio
Cc: petsc-users@mcs.anl.gov
Subject: Re
I thought I had tried that as well without success before, but this time it
worked, despite some persistent error messages related to PMI_finalize:
time mpirun -n 1 python dyn_mem_ex.py
proc 2 of 4
proc 3 of 4
proc 1 of 4
proc 0 of 4
proc 1 of 4, Adim=[10]
proc 2 of 4, Adim=[10]
proc 0 of 4,
Rodrigo
From: petsc-users-boun...@mcs.anl.gov [petsc-users-boun...@mcs.anl.gov] on
behalf of Rodrigo Felicio [rodrigo.feli...@iongeo.com]
Sent: Wednesday, March 01, 2017 2:31 PM
To: Barry Smith
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] petsc4py
for me I am now seeking to solve my problem by
splitting the COMM_WORLD instead, which I believe is more in line with your
suggestion.
Kind regards
Rodrigo
From: Matthew Knepley [mailto:knep...@gmail.com]
Sent: Thursday, March 02, 2017 8:11 AM
To: Rodrigo Felicio
Cc: Barry Smith; petsc-users