Compiling with MPI Uni and running some code gives:

~/Src/petsc/src/dm/impls/plex/examples/tutorials/dmcircuit  
shri/projects-dmcircuit $ ./PF -pfdata case22996.m

[0]PETSC ERROR: PetscCommBuildTwoSided_Allreduce() line 166 in 
/Users/barrysmith/Src/PETSc/src/sys/utils/mpits.c
[0]PETSC ERROR: PetscCommBuildTwoSided() line 238 in 
/Users/barrysmith/Src/PETSc/src/sys/utils/mpits.c
[0]PETSC ERROR: PetscSFSetUp_Basic() line 332 in 
/Users/barrysmith/Src/PETSc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: PetscSFSetUp() line 191 in 
/Users/barrysmith/Src/PETSc/src/vec/is/sf/interface/sf.c
[0]PETSC ERROR: PetscSFBcastBegin() line 917 in 
/Users/barrysmith/Src/PETSc/src/vec/is/sf/interface/sf.c
[0]PETSC ERROR: DMGlobalToLocalBegin() line 1679 in 
/Users/barrysmith/Src/PETSc/src/dm/interface/dm.c
[0]PETSC ERROR: SetInitialValues() line 750 in pf.c
[0]PETSC ERROR: main() line 914 in pf.c
Abort trap: 6

  A quick check of the code shows the problem:

#define MPI_Type_extent(datatype,extent) *(extent) = datatype
#define MPI_Type_size(datatype,size) *(size) = datatype

Note that datatype in MPI uni is supposed to encode the size of the data type;
I'm not sure whether that convention was respected in the new PETSc code.
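
To make the convention concrete, here is a minimal sketch of what I mean (the
datatype definitions below are assumptions for illustration, not the actual
mpiuni header):

/* Sketch: assume MPI uni encodes each datatype as its size in bytes */
#define MPI_INT     sizeof(int)      /* assumed encoding: value == byte size */
#define MPI_DOUBLE  sizeof(double)

/* Under that assumption the two macros above are consistent, because the
   datatype value already *is* the size/extent: */
#define MPI_Type_extent(datatype,extent) (*(extent) = (datatype))
#define MPI_Type_size(datatype,size)     (*(size)   = (datatype))

/* Any code that passes a datatype whose value is not a byte count (say, a
   handle or an enum index) gets a meaningless size back from these macros. */

If the new SF/two-sided code constructs or passes datatypes that don't follow
that encoding, MPI_Type_size() would silently return garbage, which could
explain the failure above.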

So, do we plan to continue to support MPI uni, and is that possible (relatively
easily)?  Is MPI uni tested properly with the new functionality?

   Barry

