Re: [OMPI users] IpV6 Openmpi mpirun failed

2017-10-19 Thread Mukkie
Ok, thanks. In that case, can we reopen this issue to get an update from the participants? Cordially, Muku. On Thu, Oct 19, 2017 at 4:37 PM, r...@open-mpi.org wrote: > Actually, I don’t see any related changes in OMPI master, let alone the > branches. So far as I can

Re: [OMPI users] IpV6 Openmpi mpirun failed

2017-10-19 Thread r...@open-mpi.org
Actually, I don’t see any related changes in OMPI master, let alone the branches. So far as I can tell, the author never actually submitted the work. > On Oct 19, 2017, at 3:57 PM, Mukkie wrote: > > FWIW, my issue is related to this one. >

Re: [OMPI users] IpV6 Openmpi mpirun failed

2017-10-19 Thread Mukkie
FWIW, my issue is related to this one: https://github.com/open-mpi/ompi/issues/1585 I have version 3.0.0, and the above issue was closed saying the fixes went into 3.1.0. However, I don't see the code changes for this issue. Cordially, Muku. On Wed, Oct 18, 2017 at 3:52 PM, Mukkie

Re: [OMPI users] IpV6 Openmpi mpirun failed

2017-10-18 Thread Mukkie
Thanks for your suggestion. However, my firewalls are already disabled on both machines. Cordially, Muku. On Wed, Oct 18, 2017 at 2:38 PM, r...@open-mpi.org wrote: > Looks like there is a firewall or something blocking communication between > those nodes? > > On Oct 18,

Re: [OMPI users] IpV6 Openmpi mpirun failed

2017-10-18 Thread r...@open-mpi.org
Looks like there is a firewall or something blocking communication between those nodes? > On Oct 18, 2017, at 1:29 PM, Mukkie wrote: > > Adding a verbose output. Please check for failed and advise. Thank you. > > [mselvam@ipv-rhel73 examples]$ mpirun -hostfile host

Re: [OMPI users] IpV6 Openmpi mpirun failed

2017-10-18 Thread Mukkie
Adding a verbose output. Please check for failed and advise. Thank you. [mselvam@ipv-rhel73 examples]$ mpirun -hostfile host --mca oob_base_verbose 100 --mca btl tcp,self ring_c [ipv-rhel73:10575] mca_base_component_repository_open: unable to open mca_plm_tm: libtorque.so.2: cannot open shared
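The `mca_base_component_repository_open: unable to open mca_plm_tm: libtorque.so.2` line in the output above means the dynamic linker could not resolve the Torque library that the `plm_tm` launcher plugin links against; this is typically harmless unless jobs are launched under Torque/PBS. As a quick, MPI-independent way to reproduce that kind of failure, the sketch below probes whether a given shared library is loadable. It is an illustrative diagnostic, not part of Open MPI; the library name is taken from the error message and may simply not exist on a non-Torque system.

```python
import ctypes

def can_load(libname):
    """Return True if the dynamic linker can resolve and load `libname`."""
    try:
        ctypes.CDLL(libname)
        return True
    except OSError:
        # Same class of failure as "cannot open shared object file"
        return False

# Library name copied from the mpirun warning; False here just means
# Torque's client library is not installed or not on the linker path.
print(can_load("libtorque.so.2"))
```

If this prints `False` and you are not using Torque, the warning can be ignored or silenced (e.g. by excluding the `tm` component); the actual connectivity problem lies elsewhere.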

[OMPI users] IpV6 Openmpi mpirun failed

2017-10-18 Thread Mukkie
Hi, I have two IPv6-only machines. I configured/built OMPI version 3.0 with --enable-ipv6. I want to verify a simple MPI communication call over TCP/IP between these two machines. I am using the ring_c and connectivity_c examples. Issuing from one of the host machines… [mselvam@ipv-rhel73
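Before debugging at the MPI layer, it can help to confirm that raw IPv6 TCP connectivity works between the two hosts at all, independent of Open MPI (this also tests the firewall hypothesis raised later in the thread). The sketch below is a minimal, hypothetical check using plain sockets: it demos an echo round trip on the IPv6 loopback, but in practice you would run the listener on one machine and point `check_ipv6_tcp()` at its address from the other.

```python
import socket
import threading

def echo_once(srv):
    """Server side: accept one connection and echo its first message back."""
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

def check_ipv6_tcp(host, port, payload=b"ping"):
    """Client side: returns the echoed payload if IPv6 TCP works end to end."""
    with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        cli.sendall(payload)
        return cli.recv(1024)

if __name__ == "__main__":
    # Loopback demo; replace "::1" with the peer's IPv6 address across hosts.
    with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as srv:
        srv.bind(("::1", 0))  # port 0: let the OS pick a free port
        srv.listen(1)
        port = srv.getsockname()[1]
        t = threading.Thread(target=echo_once, args=(srv,))
        t.start()
        print(check_ipv6_tcp("::1", port))  # b'ping' on success
        t.join()
```

If the cross-host round trip succeeds here but `mpirun` still hangs, the problem is more likely in Open MPI's IPv6 address handling (as issue #1585 suggests) than in the network path itself.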