rs in one
application because each Hypre solver owns its own communicator. Is there no
way to have all AMG solvers share the same HYPRE-side communicator, just like
what we are doing for PETSc objects?
Fande,
>
> Barry
>
>
> > On Apr 3, 2018, at 11:21 AM, Kong, Fande <fande.k...@inl.gov> wrote:
I think we could add an inner comm for each external package. If the same comm
is passed in again, we just retrieve the same communicator instead of calling
MPI_Comm_dup() for that external package (at least the HYPRE team claimed this
will be fine). I did not see any issue with this idea so far.
> On Apr 3, 2018, at 10:56 PM, Kong, Fande <fande.k...@inl.gov> wrote:
>
> I think we could add an inner comm for each external package. If the same
> comm is passed in again, we just retrieve the same communicator instead of
> calling MPI_Comm_dup() for that external package (at least the HYPRE team
> claimed this will be fine). I did not see any issue with this idea so far.
Hi All,
~/projects/slepc]> PETSC_ARCH=arch-darwin-c-debug-master ./configure
Checking environment...
Traceback (most recent call last):
  File "./configure", line 10, in <module>
    execfile(os.path.join(os.path.dirname(__file__), 'config', 'configure.py'))
  File "./config/configure.py", line 206, in <module>