I compiled a fresh copy of the 4.1.3 branch on my M1 laptop, and I can run
both MPI and non-MPI apps without any issues.

Try running `lldb -- mpirun -np 1 hostname`, type `run` at the (lldb)
prompt, and once it deadlocks, hit CTRL+C to get back into the debugger;
then `bt` (backtrace) shows where it is waiting.
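
In practice the session looks something like this (a sketch; the frames
you see and the thread that is stuck will differ, and you may need the
full path from `which mpirun` if lldb cannot find the binary):

  $ lldb -- mpirun -np 1 hostname
  (lldb) run                    # mpirun starts, then hangs
  ^C                            # CTRL+C stops it inside the debugger
  (lldb) bt                     # backtrace of the current thread
  (lldb) thread backtrace all   # all threads, often more informative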

George.


On Wed, May 4, 2022 at 11:28 AM Scott Sayres via users <
users@lists.open-mpi.org> wrote:

> Thanks for looking at this, Jeff.
> No, I cannot use mpirun to launch a non-MPI application.  The command
> "mpirun -np 2 hostname" also hangs.
>
> I get the following output when I add the -d option (I've replaced the
> server name with hashtags):
>
> [scotts-mbp.3500.dhcp.###:05469] procdir: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501/pid.5469/0/0
> [scotts-mbp.3500.dhcp.###:05469] jobdir: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501/pid.5469/0
> [scotts-mbp.3500.dhcp.###:05469] top: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501/pid.5469
> [scotts-mbp.3500.dhcp.###:05469] top: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501
> [scotts-mbp.3500.dhcp.###:05469] tmp: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T/
> [scotts-mbp.3500.dhcp.###:05469] sess_dir_cleanup: job session dir does not exist
> [scotts-mbp.3500.dhcp.###:05469] sess_dir_cleanup: top session dir not empty - leaving
> [scotts-mbp.3500.dhcp.###:05469] procdir: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501/pid.5469/0/0
> [scotts-mbp.3500.dhcp.###:05469] jobdir: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501/pid.5469/0
> [scotts-mbp.3500.dhcp.###:05469] top: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501/pid.5469
> [scotts-mbp.3500.dhcp.###:05469] top: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T//ompi.scotts-mbp.501
> [scotts-mbp.3500.dhcp.###:05469] tmp: /var/folders/l0/94hsdtwj09xd62d90nfh_3h00000gn/T/
> [scotts-mbp.3500.dhcp.###:05469] [[48286,0],0] Releasing job data for [INVALID]
>
> Can you recommend a way to find where mpirun gets stuck?
> Thanks!
> Scott
>
> On Wed, May 4, 2022 at 6:06 AM Jeff Squyres (jsquyres) <jsquy...@cisco.com>
> wrote:
>
>> Are you able to use mpirun to launch a non-MPI application?  E.g.:
>>
>> mpirun -np 2 hostname
>>
>> And if that works, can you run the simple example MPI apps in the
>> "examples" directory of the MPI source tarball (the "hello world" and
>> "ring" programs)?  E.g.:
>>
>> cd examples
>> make
>> mpirun -np 4 hello_c
>> mpirun -np 4 ring_c
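>>
>> If you don't have the tarball's examples directory handy, a minimal
>> equivalent of hello_c is the sketch below (standard MPI calls only; the
>> actual hello_c.c in the tarball differs slightly):
>>
>> #include <stdio.h>
>> #include <mpi.h>
>>
>> int main(int argc, char *argv[])
>> {
>>     int rank, size;
>>     MPI_Init(&argc, &argv);                /* start the MPI runtime */
>>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
>>     MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of ranks */
>>     printf("Hello, world, I am %d of %d\n", rank, size);
>>     MPI_Finalize();                        /* clean shutdown */
>>     return 0;
>> }
>>
>> Compile and run it the same way:
>>
>> mpicc hello.c -o hello
>> mpirun -np 4 ./hello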
>>
>> --
>> Jeff Squyres
>> jsquy...@cisco.com
>>
>> ________________________________________
>> From: users <users-boun...@lists.open-mpi.org> on behalf of Scott Sayres
>> via users <users@lists.open-mpi.org>
>> Sent: Tuesday, May 3, 2022 1:07 PM
>> To: users@lists.open-mpi.org
>> Cc: Scott Sayres
>> Subject: [OMPI users] mpirun hangs on m1 mac w openmpi-4.1.3
>>
>> Hello,
>> I am new to Open MPI, but would like to use it for ORCA calculations and
>> plan to run codes on the 10 processors of my MacBook Pro.  I installed it
>> manually and also through Homebrew, with similar results.  I can compile
>> codes with mpicc and run them as native executables, but everything I
>> attempt with mpirun or mpiexec just freezes.  I can end the program by
>> pressing Ctrl+C twice, but it continues to run in the background and I
>> have to 'kill <pid>'.  Even something as simple as 'mpirun uname'
>> freezes.
>>
>> I have tried one installation via 'arch -arm64 brew install openmpi',
>> and a second by downloading the source tarball and running './configure
>> --prefix=/usr/local', 'make all', and 'make install'.
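>>
>> For reference, the full from-source sequence was roughly as follows (the
>> tarball name is assumed for the 4.1.3 release, and sudo may be needed to
>> install into /usr/local):
>>
>> tar xf openmpi-4.1.3.tar.bz2
>> cd openmpi-4.1.3
>> ./configure --prefix=/usr/local
>> make all
>> sudo make install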
>>
>> The commands 'which mpicc', 'which mpirun', etc., find them on the
>> path... it just hangs.
>>
>> Can anyone suggest how to fix the problem of the program hanging?
>> Thanks!
>> Scott
>>
>
>
> --
> Scott G Sayres
> Assistant Professor
> School of Molecular Sciences (formerly Department of Chemistry &
> Biochemistry)
> Biodesign Center for Applied Structural Discovery
> Arizona State University
>
