> On Apr 3, 2018, at 11:59 AM, Balay, Satish <ba...@mcs.anl.gov> wrote:
> 
> On Tue, 3 Apr 2018, Smith, Barry F. wrote:
> 
>>   Note that PETSc does one MPI_Comm_dup() for each hypre matrix. Internally, 
>> hypre does at least one MPI_Comm_create() per BoomerAMG solver. So even if 
>> PETSc did not do the MPI_Comm_dup(), you would still be limited by hypre's 
>> MPI_Comm_create().
>> 
>>    I will compose an email to hypre cc:ing everyone to get information from 
>> them.
> 
> Actually, I don't see any calls to MPI_Comm_dup() in the hypre sources [there 
> are stubs for it for non-MPI builds].
> 
> There was that call to MPI_Comm_create() in the stack trace [via 
> hypre_BoomerAMGSetup].

   This is what I said: MPI_Comm_create() is called for each solver, so each 
solver consumes a communicator slot.

   Barry

> 
> Satish
