I did a quick test to see whether I can use Podman in combination with Open MPI:

[test@test1 ~]$ mpirun --hostfile ~/hosts podman run quay.io/adrianreber/mpi-test /home/mpi/hello
Hello, world (1 procs total)
--> Process # 0 of 1 is alive. ->789b8fb622ef
Hello, world (1 procs total)

Note that each containerized process reports a world size of 1 instead of joining a single MPI job.
Adrian, the MPI application relies on some environment variables (they typically start with OMPI_ and PMIX_). The MPI application internally uses a PMIx client that must be able to contact a PMIx server; one is embedded in mpirun and in the orted daemon(s) spawned on the remote hosts. If those variables do not reach the process inside the container, each rank runs as an independent singleton.
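To illustrate the point, a process can tell whether it was launched by mpirun simply by looking for one of those variables. This is a minimal sketch (the helper name under_mpirun is mine, not from the thread; OMPI_COMM_WORLD_RANK is one of the variables Open MPI sets per process):

```shell
# Detect whether this shell was started by Open MPI's mpirun by checking
# for OMPI_COMM_WORLD_RANK, which mpirun exports into every rank's
# environment. Inside a plain "podman run" this variable is absent unless
# it is explicitly forwarded into the container.
under_mpirun() {
    [ -n "${OMPI_COMM_WORLD_RANK+x}" ]
}

if under_mpirun; then
    echo "rank ${OMPI_COMM_WORLD_RANK}"
else
    echo "not launched by mpirun"
fi
```

This is exactly why the bare "podman run" above printed two singleton results: the container scrubbed these variables, so the PMIx client never found its server.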
Gilles, thanks for pointing out the environment variables. I quickly created a wrapper which tells Podman to re-export all OMPI_ and PMIX_ variables (grep "\(PMIX\|OMPI\)"). Now it works:

$ mpirun --hostfile ~/hosts ./wrapper -v /tmp:/tmp --userns=keep-id --net=host mpi-test /home/mpi/hello
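The wrapper script itself is not shown in the thread; a minimal sketch of how such a wrapper could look, assuming it turns every OMPI_/PMIX_ variable into a "-e NAME" flag and hands everything else through to podman run (the function name collect_mpi_env is mine):

```shell
#!/bin/sh
# Hypothetical wrapper sketch: forward every OMPI_* and PMIX_* variable
# from the environment mpirun set up into the container, then pass the
# remaining arguments (volumes, networking, image, command) to podman run.

# Emit one "-e NAME" flag per OMPI_/PMIX_ variable. With "-e NAME" and no
# "=value", podman copies the variable's current value into the container.
collect_mpi_env() {
    env | grep '^\(OMPI\|PMIX\)_' | cut -d= -f1 | \
    while read -r name; do
        printf -- '-e %s ' "$name"
    done
}

# Real invocation (commented out so the sketch stays self-contained):
# exec podman run $(collect_mpi_env) "$@"
```

mpirun then launches this script once per rank, so each container receives its own rank-specific OMPI_/PMIX_ values and the PMIx client inside can reach the server in mpirun/orted.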
Not really a relevant reply, but Nomad has task drivers for Docker and Singularity: https://www.hashicorp.com/blog/singularity-and-hashicorp-nomad-a-perfect-fit I'm not sure whether it would be easier to set up an MPI environment with Nomad, though.

On Thu, 11 Jul 2019 at 11:08, Adrian Reber via