Satish, Jed,

Thank you for the replies and suggested solutions.
MPI_INTEGER4 seems to work. However, the MPIU_INTEGER suggested by Satish gives the same error as MPIU_INT did, even though I include petscsys.h. Could this be related to the PETSc version? For compatibility reasons I am using the 2.3.3 distribution.

regards,
Wienand

On Wed, Aug 24, 2011 at 10:20 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> On Wed, 24 Aug 2011, Jed Brown wrote:
>
> > On Wed, Aug 24, 2011 at 09:33, Wienand Drenth <w.drenth at gmail.com> wrote:
> >
> > > The program compiles without trouble, but running it gives me an error.
> > > It is, I think, related to the MPI_SUM operation:
> > >
> > > [walrus:463] *** An error occurred in MPI_Scan: the reduction operation
> > > MPI_SUM is not defined on the MPI_BYTE datatype
> > > [walrus:463] *** on communicator MPI_COMM_WORLD
> > > [walrus:463] *** MPI_ERR_OP: invalid reduce operation
> > > [walrus:463] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
> > >
> > > In addition, I needed to declare MPIU_INT explicitly as an integer; this
> > > was not required for the C program.
> >
> > I believe MPIU_INT (which is normally made to match PetscInt) is not
> > defined in Fortran. You can use MPI_INTEGER4 or MPI_INTEGER8 as needed.
> >
> > Perhaps PETSc should define MPIU_INT, MPIU_SCALAR, etc. in Fortran too?
>
> MPI does MPI_INT (C) vs. MPI_INTEGER (Fortran) - so we have
> MPIU_INTEGER on the Fortran side. [in finclude/petscsys.h]
>
> And we do have MPIU_SCALAR on the Fortran side.
>
> Satish
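
For reference, a minimal self-contained sketch (not the actual program from this thread; the program name and variable names are made up for illustration) of the pattern Jed describes: an inclusive prefix sum with MPI_Scan, using MPI_INTEGER4 to match buffers declared as 4-byte integers. With a 64-bit-index PETSc build, MPI_INTEGER8 would be the matching datatype; per Satish, MPIU_INTEGER from finclude/petscsys.h selects the right one automatically (though apparently not in the 2.3.3 distribution).

    ! Minimal sketch: inclusive prefix sum across ranks via MPI_Scan.
    ! The buffers are declared integer(kind=4) so that MPI_INTEGER4 is
    ! the matching datatype. MPI_SUM is defined for MPI_INTEGER4; it is
    ! not defined for MPI_BYTE, which is what caused the original error.
    program scan_sketch
      implicit none
      include 'mpif.h'
      integer :: ierr, rank
      integer(kind=4) :: nlocal, nsum

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

      ! Each rank contributes one value; nsum receives the running
      ! total over ranks 0..rank.
      nlocal = rank + 1
      call MPI_Scan(nlocal, nsum, 1, MPI_INTEGER4, MPI_SUM, &
                    MPI_COMM_WORLD, ierr)

      print *, 'rank', rank, ': inclusive sum =', nsum

      call MPI_Finalize(ierr)
    end program scan_sketch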
