Hi, today I installed openmpi-1.9a1r32664 on my machines (Solaris 10 Sparc (tyr), Solaris 10 x86_64 (sunpc1), and openSUSE Linux 12.1 x86_64 (linpc1)) with Sun C 5.12 and gcc-4.9.0.
With my gcc-4.9.0 build on Linux I get the following internal failure. The other errors, which I have already reported, still occur with Sun C.

linpc1 small_prog 118 ompi_info | grep -e MPI: -e "C compiler:"
                Open MPI: 1.9a1r32664
              C compiler: gcc
linpc1 small_prog 118 mpicc init_finalize.c
linpc1 small_prog 119 mpiexec -np 1 a.out
Hello!
linpc1 small_prog 120 mpiexec -np 2 a.out
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_proc_complete_init failed
  --> Returned "(null)" (-27) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[linpc1:31592] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:startup:internal-failure
[linpc1:31592] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
linpc1 small_prog 121 gdb mpiexec
GNU gdb (GDB) SUSE (7.3-41.1.2)
Copyright (C) 2011 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-suse-linux".
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>...
Reading symbols from /usr/local/openmpi-1.9_64_gcc/bin/mpiexec...done.
(gdb) run -np 2 a.out
Starting program: /usr/local/openmpi-1.9_64_gcc/bin/mpiexec -np 2 a.out
Missing separate debuginfo for /lib64/ld-linux-x86-64.so.2
Try: zypper install -C "debuginfo(build-id)=f20c99249f5a5776e1377d3bd728502e3f455a3f"
Missing separate debuginfo for /usr/lib64/libnuma.so.1
Try: zypper install -C "debuginfo(build-id)=a459130c584aae8d867df651f0d99b8f359c369c"
Missing separate debuginfo for /usr/lib64/libpciaccess.so.0
Try: zypper install -C "debuginfo(build-id)=a86c28f439ba7254afee2afed26e9d6c14a4a9f2"
Missing separate debuginfo for /lib64/libdl.so.2
Try: zypper install -C "debuginfo(build-id)=8d32fdb9682242cc2ebc1d9e6d717c6eaa51714e"
Missing separate debuginfo for /lib64/librt.so.1
Try: zypper install -C "debuginfo(build-id)=b38afcf428f2107c56c0939b59ef737a5571348c"
Missing separate debuginfo for /lib64/libm.so.6
Try: zypper install -C "debuginfo(build-id)=8ee6418257efac9e7fbadc657c30c62e0a002d57"
Missing separate debuginfo for /lib64/libutil.so.1
Try: zypper install -C "debuginfo(build-id)=ae21e2f7efccaf6967e62a9691d2cd2f1533d6d3"
Missing separate debuginfo for /lib64/libpthread.so.0
Try: zypper install -C "debuginfo(build-id)=1f368f83b776815033caab6e389d7030bba4593e"
[Thread debugging using libthread_db enabled]
Missing separate debuginfo for /lib64/libc.so.6
Try: zypper install -C "debuginfo(build-id)=7b169b1db50384b70e3e4b4884cd56432d5de796"
[New Thread 0x7ffff4939700 (LWP 31620)]
Detaching after fork from child process 31621.
Detaching after fork from child process 31622.
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_proc_complete_init failed
  --> Returned "(null)" (-27) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[linpc1:31603] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:startup:internal-failure
[linpc1:31603] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[Thread 0x7ffff4939700 (LWP 31620) exited]
[Inferior 1 (process 31603) exited with code 01]
(gdb) bt
No stack.
(gdb)

I would be grateful if somebody could fix this problem. Thank you very much in advance for any help.

Kind regards
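P.S.: For completeness, init_finalize.c is essentially just the usual minimal init/finalize test; my exact file may differ in details, but it is equivalent to:

```c
/* init_finalize.c - minimal reproducer: the failure above happens
 * inside MPI_Init itself, before any communication takes place.
 * Build and run:
 *   mpicc init_finalize.c
 *   mpiexec -np 2 a.out
 */
#include <stdio.h>
#include "mpi.h"

int main (int argc, char *argv[])
{
  MPI_Init (&argc, &argv);      /* fails with -np 2, works with -np 1 */
  printf ("Hello!\n");
  MPI_Finalize ();
  return 0;
}
```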