(1) How did you execute it? mpiexec command line, etc.?

In a number of ways; it does not seem to make any difference. Always from
the command line: I tried mpiexec, mpirun, and just running the executable
without any wrapper (which used to work before, if one wanted to run a single
process). I also tried different numbers of processes, from 1 to 8, with
no difference. In any case, if I run the code under gdb (or a graphical
front-end to gdb, like ddd), I can see that it hangs when executing the
MPI_Init() call, or, in some of the scalapack example codes, in the first
call that implicitly tries to initialise MPI.
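
For concreteness, the invocations I tried all look like the following (a
sketch, not an exhaustive list; sample_printf is the "hello world" binary
whose ldd output appears further below):

$ ./sample_printf               # no wrapper, single process: hangs
$ mpirun -np 4 ./sample_printf  # hangs, for any -np from 1 to 8
$ mpiexec -n 4 ./sample_printf  # same
$ gdb ./sample_printf           # "run" under gdb: stuck inside MPI_Init()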


(2) Can you give the output of:

$ dpkg -l | grep pmix

gmulas@capitanata:~/transfer/debsrc$ dpkg -l | grep pmix
ii  libpmix2:amd64            3.0.0-1            amd64  Process Management Interface (Exascale) library


$ dpkg -l | grep psm

gmulas@capitanata:~/transfer/debsrc$ dpkg -l | grep psm
ii  gir1.2-osmgpsmap-1.0      1.1.0-3            amd64  GTK+ library to embed OpenStreetMap maps - Python bindings
ii  libcupsmime1:amd64        2.2.8-5            amd64  Common UNIX Printing System(tm) - MIME library
ii  libosmgpsmap-1.0-1:amd64  1.1.0-3            amd64  GTK+ library to embed OpenStreetMap maps
ii  libpsm-infinipath1        3.3+20.604758e7-5  amd64  PSM Messaging library for Intel Truescale adapters
ii  libpsm2-2                 10.3.58-1          amd64  Intel PSM2 library
ii  psmisc                    23.1-1+b1          amd64  utilities that use the proc file system

(3) Any changes to /etc/openmpi/* files or environment variables?

No changes to /etc/openmpi/*. Here is my complete environment:

gmulas@capitanata:~/transfer/debsrc$ env
LC_ALL=it_IT.utf8
LC_MEASUREMENT=it_IT.UTF-8
LC_PAPER=it_IT.UTF-8
LC_MONETARY=it_IT.UTF-8
XDG_MENU_PREFIX=gnome-
LANG=it_IT.utf8
GDM_LANG=it_IT.utf8
LESS=-eiM
DISPLAY=:0
OLDPWD=/home/gmulas/transfer/debsrc/scalapack-2.0.2
DUSTEM_FILTER_DIR=/home/gmulas/PAHmodels/DUSTEM/Data/FILTERS/
EDITOR=joe
COLORTERM=truecolor
USERNAME=gmulas
NO_AT_BRIDGE=1
XDG_VTNR=2
SSH_AUTH_SOCK=/run/user/1000/keyring/ssh
PARA_ARCH=MPI
CLASSPATH=/usr/local/JMF-2.1.1e/lib/jmf.jar:.:/usr/local/JMF-2.1.1e/lib/jmf.jar:.
XDG_SESSION_ID=20
USER=gmulas
DUSTEM_DAT=/tmp/
DUSTEM_SOFT_DIR=/home/gmulas/PAHmodels/DUSTEM
PAGER=/usr/bin/less -eiMs
DESKTOP_SESSION=gnome-xorg
JMFHOME=/usr/local/JMF-2.1.1e
GNOME_TERMINAL_SCREEN=/org/gnome/Terminal/screen/7aa5ce9c_afbf_4166_9720_8b3d3187d254
PWD=/home/gmulas/transfer/debsrc
HOME=/home/gmulas
TMP=/tmp/user/1000
DUSTEM_DRAINE_DATA_DIR=/home/gmulas/PAHmodels/DUSTEM/Data/DRAINE/
SSH_AGENT_PID=13026
UPARM=/home/gmulas/uparm
QT_ACCESSIBILITY=1
MONKEYSPHERE_VALIDATION_AGENT_SOCKET=http://127.0.0.1:35619
XDG_SESSION_TYPE=x11
MIDASHOME=/usr/local/midas
XDG_DATA_DIRS=/usr/share/gnome:/home/gmulas/.local/share/flatpak/exports/share/:/var/lib/flatpak/exports/share/:/usr/local/share/:/usr/share/:
DUSTEM_RES=/tmp/
XDG_SESSION_DESKTOP=gnome-xorg
GJS_DEBUG_OUTPUT=stderr
LC_NUMERIC=it_IT.UTF-8
DUSTEM_WHICH=DESERT
GTK_MODULES=gtk-vector-screenshot:gail:atk-bridge
WINDOWPATH=2
TURBOMOLE_SYSNAME=em64t-unknown-linux-gnu
TERM=xterm-256color
SHELL=/bin/bash
VTE_VERSION=5202
TEMPDIR=/tmp/user/1000
TURBODIR=/usr/local/tmoleX/TURBOMOLE
XDG_CURRENT_DESKTOP=GNOME
GPG_AGENT_INFO=/run/user/1000/gnupg/S.gpg-agent:0:1
GNOME_TERMINAL_SERVICE=:1.319
SHLVL=1
XDG_SEAT=seat0
DUSTEM_PAHUV=DESERT
TEMP=/tmp/user/1000
GDMSESSION=gnome-xorg
LESSCHARSET=utf-8
GNOME_DESKTOP_SESSION_ID=this-is-deprecated
LOGNAME=gmulas
DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1000/bus
XDG_RUNTIME_DIR=/run/user/1000
XAUTHORITY=/run/user/1000/gdm/Xauthority
UDEdir=
JACK_START_SERVER=1
PATH=/usr/local/tmoleX/TURBOMOLE/bin/em64t-unknown-linux-gnu_mpi:/usr/local/tmoleX/TURBOMOLE/scripts:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/gOpenMol-3.00/bin
MIDVERS=13SEP
GJS_DEBUG_TOPICS=JS ERROR;JS LOG
SESSION_MANAGER=local/capitanata:@/tmp/.ICE-unix/13067,unix/capitanata:/tmp/.ICE-unix/13067
PIPE_HOME=/usr/local/midas/13SEP/pipeline
LC_TIME=it_IT.UTF-8
_=/usr/bin/env

And, in case it may help, here is the output of ldd on the simple "hello
world" MPI code I sent you, which hangs here:

gmulas@capitanata:~/PAHmodels/anharmonica-scalapack$ ldd sample_printf
        linux-vdso.so.1 (0x00007ffd034e3000)
        libmpi.so.40 => /usr/lib/x86_64-linux-gnu/libmpi.so.40 (0x00007f94cc309000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f94cc2e8000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f94cc12b000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f94cc126000)
        libopen-rte.so.40 => /usr/lib/x86_64-linux-gnu/libopen-rte.so.40 (0x00007f94cc06f000)
        libopen-pal.so.40 => /usr/lib/x86_64-linux-gnu/libopen-pal.so.40 (0x00007f94cbfc2000)
        librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f94cbfb6000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f94cbe22000)
        libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f94cbe1d000)
        libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007f94cbddc000)
        libevent-2.1.so.6 => /usr/lib/x86_64-linux-gnu/libevent-2.1.so.6 (0x00007f94cbb86000)
        libevent_pthreads-2.1.so.6 => /usr/lib/x86_64-linux-gnu/libevent_pthreads-2.1.so.6 (0x00007f94cb983000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f94cc4a8000)
        libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007f94cb776000)
        libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007f94cb56c000)


Please let me know whatever I can do to help pinpoint this. I need to be
able to use my computer to develop my MPI codes, and I cannot understand at
all why it abruptly stopped working. Also, if you tell me it does work
properly on another current sid system, I'd like to find out what makes the
difference.

Thanks in advance
Giacomo Mulas

--
_________________________________________________________________

Giacomo Mulas <giacomo.mu...@inaf.it>
_________________________________________________________________

INAF - Osservatorio Astronomico di Cagliari
via della scienza 5 - 09047 Selargius (CA)

tel.   +39 070 71180255
mob. : +39 329  6603810
_________________________________________________________________

"When the storms are raging around you, stay right where you are"
                         (Freddie Mercury)
_________________________________________________________________
