Most likely the tool you are using to launch the parallel program is wrong 
for the MPI you have linked PETSc with. Are you starting the program with 
mpiexec? Is that mpiexec the one that goes with the MPI (mpicc or mpif90) that 
you built PETSc with? 

   What happens if you compile a trivial MPI-only code with that mpicc and then 
try to run it in parallel with that mpiexec?
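
   A minimal sketch of such a trivial MPI-only test (no PETSc involved; the
file name hello_mpi.c is just an example):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    /* Initialize MPI; this is where a mismatched launcher typically fails */
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each process reports its rank; with -n 2 you should see two lines */
    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}
```

Compile it with the same mpicc you configured PETSc with and launch it with
the mpiexec from that same MPI installation, for example
   mpicc hello_mpi.c -o hello_mpi
   mpiexec -n 2 ./hello_mpi
If this already fails or hangs with more than one process, the problem is in
your MPI setup (mismatched mpiexec), not in PETSc.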


   Barry

On Aug 21, 2013, at 5:05 PM, Bishesh Khanal <[email protected]> wrote:

> Dear all,
> My program runs fine when using just one processor, and valgrind shows no 
> errors either, but when using more than one processor I get the following errors:
> 
> [0]PETSC ERROR: PetscOptionsInsertFile() line 461 in 
> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
> [0]PETSC ERROR: PetscOptionsInsert() line 623 in 
> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
> [0]PETSC ERROR: PetscInitialize() line 769 in 
> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/pinit.c
> PETSC ERROR: Logging has not been enabled.
> You might have forgotten to call PetscInitialize().
> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> 
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   EXIT CODE: 56
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
> 
> I have not forgotten to call PetscInitialize, if that helps!
> Thanks,
> Bishesh
