Dear Ralph,
Thank you for your reply. Do you know where I could find libcrypto.so.0.9.8?

Best,
On Oct 3, 2018, at 9:41 PM, Ralph H Castain <r...@open-mpi.org> wrote:

Actually, I see that you do have the tm components built, but they cannot be 
loaded because libcrypto is missing from your LD_LIBRARY_PATH.
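
For example, a rough sketch of how to track the library down and expose it, assuming a Bash shell (the OpenSSL install path below is only a placeholder; if nothing turns up, your distro may offer a compatibility package, historically called openssl098e on RHEL-family systems, but check with your admins):

# check the runtime linker cache first, then search the filesystem (can be slow)
ldconfig -p | grep libcrypto
find / -name 'libcrypto.so.0.9.8' 2>/dev/null

# if it turns up under e.g. /opt/openssl-0.9.8/lib (placeholder path),
# expose it to the loader before invoking mpirun:
export LD_LIBRARY_PATH=/opt/openssl-0.9.8/lib:$LD_LIBRARY_PATH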


On Oct 3, 2018, at 12:33 PM, Ralph H Castain <r...@open-mpi.org> wrote:

Did you configure OMPI --with-tm=<path-to-PBS-libs>? It looks like we didn't 
build PBS support and so we only see one node with a single slot allocated to 
it.
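
For example, roughly like this, assuming your PBS/Torque installation lives under /opt/pbs (a placeholder; substitute the real paths on your cluster):

# /opt/pbs and the install prefix are placeholders for your actual paths
./configure --with-tm=/opt/pbs --prefix=$HOME/openmpi
make -j4 && make install

# the tm plugins should then show up here:
ompi_info | grep tm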


On Oct 3, 2018, at 12:02 PM, Castellana Michele <michele.castell...@curie.fr> wrote:

Dear all,
I am having trouble running an MPI code across multiple nodes on a new computer 
cluster, which uses PBS. Here is a minimal example, where I want to run two MPI 
processes, each on a different node. The PBS script is

#!/bin/bash
#PBS -l walltime=00:01:00
#PBS -l mem=1gb
#PBS -l nodes=2:ppn=1
#PBS -q batch
#PBS -N test
mpirun -np 2 ./code.o

and when I submit it with

$ qsub script.sh

I get the following message in the PBS error file

$ cat test.e1234
[shbli040:08879] mca_base_component_repository_open: unable to open mca_plm_tm: 
libcrypto.so.0.9.8: cannot open shared object file: No such file or directory 
(ignored)
[shbli040:08879] mca_base_component_repository_open: unable to open mca_oob_ud: 
libibverbs.so.1: cannot open shared object file: No such file or directory 
(ignored)
[shbli040:08879] mca_base_component_repository_open: unable to open mca_ras_tm: 
libcrypto.so.0.9.8: cannot open shared object file: No such file or directory 
(ignored)
--------------------------------------------------------------------------
There are not enough slots available in the system to satisfy the 2 slots
that were requested by the application:
  ./code.o

Either request fewer slots for your application, or make more slots available
for use.
--------------------------------------------------------------------------
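
A possible workaround while the tm components fail to load might be to hand 
mpirun the PBS node list explicitly (a sketch, assuming passwordless ssh 
between the compute nodes, since without tm support mpirun falls back to its 
ssh launcher):

# $PBS_NODEFILE is the list of allocated hosts that PBS writes for each job
mpirun -np 2 -hostfile $PBS_NODEFILE ./code.o

but I would prefer to fix the underlying problem.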

The PBS version is

$ qstat --version
Version: 6.1.2

and here is some additional information from the MPI compiler wrapper

$ mpicc -v
Using built-in specs.
COLLECT_GCC=/bin/gcc
COLLECT_LTO_WRAPPER=/usr/libexec/gcc/x86_64-redhat-linux/4.8.5/lto-wrapper
Target: x86_64-redhat-linux
[…]
Thread model: posix
gcc version 4.8.5 20150623 (Red Hat 4.8.5-28) (GCC)

Does anyone know what the issue may be here?

Thank you
Best,

_______________________________________________
users mailing list
users@lists.open-mpi.org
https://lists.open-mpi.org/mailman/listinfo/users
