Are you setting PETSC_COMM_WORLD before these calls? Perhaps on some 
processes but not on others?
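
To illustrate the point: a minimal sketch of what a consistent setup might look like, assuming the application restricts PETSc to a sub-communicator (here just MPI_COMM_WORLD as a stand-in; the communicator name is hypothetical). The key constraints are that every process entering PetscInitialize must assign the same communicator to PETSC_COMM_WORLD, and must do so before the call, since PetscInitialize performs collective operations (e.g. MPI_Allreduce) on that communicator.

```cpp
#include <petscsys.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    // Hypothetical sub-communicator for the solver; in a real library
    // setup this might come from MPI_Comm_split. Every rank that will
    // call PetscInitialize* must agree on this communicator.
    MPI_Comm solverComm = MPI_COMM_WORLD;

    // Must happen BEFORE PetscInitialize*, and identically on all
    // participating ranks -- setting it on some ranks but not others
    // leads to mismatched collectives and a hang.
    PETSC_COMM_WORLD = solverComm;

    PetscErrorCode ierr = PetscInitializeNoArguments();
    if (ierr) return ierr;

    ierr = PetscFinalize();
    MPI_Finalize();
    return ierr;
}
```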

   Barry

> On Dec 15, 2014, at 7:52 AM, Matthew Knepley <[email protected]> wrote:
> 
> On Mon, Dec 15, 2014 at 7:12 AM, Florian Lindner <[email protected]> wrote:
> Hello,
> 
> since our application has two possible entry paths, petsc could be 
> initialized at different positions. When used as a library (in contrast to a 
> standalone executable), the code looks like:
> 
>     PetscErrorCode ierr;
>     std::cout << "Petsc before PetscInitializeNoArguments()"  << std::endl;
>     ierr = PetscInitializeNoArguments(); CHKERRV(ierr);
>     std::cout << "Petsc after PetscInitializeNoArguments()"  << std::endl;
> 
> It never gets to "Petsc after..." and instead hangs and prints this error 
> message:
> 
> Internal Error: invalid error code 609e0e (Ring ids do not match) in 
> MPIR_Allreduce_impl:712
> [0]PETSC ERROR: #1 PetscWorldIsSingleHost() line 99 in 
> /data2/scratch/lindner/petsc/src/sys/utils/pdisplay.c
> [0]PETSC ERROR: #2 PetscSetDisplay() line 123 in 
> /data2/scratch/lindner/petsc/src/sys/utils/pdisplay.c
> [0]PETSC ERROR: #3 PetscOptionsCheckInitial_Private() line 324 in 
> /data2/scratch/lindner/petsc/src/sys/objects/init.c
> [0]PETSC ERROR: #4 PetscInitialize() line 881 in 
> /data2/scratch/lindner/petsc/src/sys/objects/pinit.c
> [0]PETSC ERROR: #5 SolverInterfaceImpl() line 120 in 
> src/precice/impl/SolverInterfaceImpl.cpp
> 
> Which is rather incomprehensible to me... What could be the cause of that?
> 
> It means that not all procs called PetscInitialize(). We call MPI_Allreduce(), 
> which needs all procs in MPI_COMM_WORLD.
> 
>   Thanks,
> 
>     Matt
>  
> Thanks,
> Florian
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener