On Thu, 26 Apr 2012, Barry Smith wrote:

>   We've never tested this configuration, there may be some flags that need to 
> be changed to get the build going; and we don't know if the build will work. 
> But we can get started on helping you test it with a complete error message 
> to petsc-maint

Ok - we don't support this configuration because both PETSc and hypre
internally use MPIUNI-type code (their own serial single-process MPI
stubs). When you combine PETSc built with PETSc's MPIUNI and hypre
built with hypre's MPIUNI, the two stub implementations conflict with
each other - and things break.

[this is similar to mixing MPICH-compiled code with Open MPI-compiled
code]

So we'll always require a proper (common) MPI for using hypre from
PETSc.
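A minimal sketch of how such a build can be configured, assuming
PETSc's standard --download-* configure options (which fetch and build
a common MPI and hypre against it, so both libraries share the same
real MPI rather than their own stubs):

```shell
# Sketch: build PETSc with a real MPI and hypre together.
# --download-mpich builds MPICH, and --download-hypre then builds
# hypre against that same MPI, avoiding the MPIUNI conflict.
./configure --download-mpich --download-hypre
make all
```

If a system MPI is already installed, pointing configure at its
compiler wrappers (e.g. --with-cc=mpicc) instead of --download-mpich
accomplishes the same thing: one common MPI underneath both libraries.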

Satish
