Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-08-19 Thread Philippe
…sh a connect to port 1. Thanks so much! p. Still here... still trying... ;-) On Tue, Jul 27, 2010 at 12:58 AM, Ralph Castain <r...@open-mpi.org> wrote: > Use what hostname returns - don't worry about IP addresses as we'll discover them. On Jul 26, 2010, at 10:45 PM, …

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-08-19 Thread Philippe
…surgery tomorrow morning, so please forgive the delay. On Thu, Aug 19, 2010 at 11:13 AM, Philippe <phil...@mytoaster.net> wrote: >> Ralph, I'm able to use the generic module when the processes are on different machines. What …

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-08-23 Thread Philippe
…t get_node_rank is returning the wrong value for the second process (your rank=1). If you want to dig deeper, look at the orte/mca/ess/generic code where it generates the nidmap and pidmap. There is a bug down there somewhere that gives the wrong answer when ppn > 1. On Thu …

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-08-23 Thread Philippe
On Aug 23, 2010, at 1:15 PM, Philippe wrote: >> I took a look at the code but I'm afraid I don't see anything wrong. >> p. >> On Thu, Aug 19, 2010 at 2:32 PM, Ralph Castain <r...@open-mpi.org> wrote: >>> Yes, that is correct - we reserv…

[OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-06-25 Thread Philippe
…, which makes me feel better about the sequence of MPI calls.) Regards, Philippe. Background: I intend to use Open MPI to transport data inside a much larger application. Because of that, I cannot use mpiexec. Each process is started by our own "job management" and uses a name server to …
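The pattern described in this thread (a server accepting connections from independently launched clients, with a name server for rendezvous) corresponds to the standard MPI-2 dynamic-process calls. A minimal server-side sketch is below; the service name "my-server" is an assumption, and running this without mpiexec additionally requires an external rendezvous service (such as ompi-server) plus matching MCA settings, which are outside this sketch.

```c
/* Hedged sketch of the MPI-2 dynamic-connection pattern from the thread.
 * The service name "my-server" is an illustrative assumption. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm client;

    MPI_Init(&argc, &argv);

    /* Server side: open a port, publish it under a well-known name,
     * then block until a client connects. */
    MPI_Open_port(MPI_INFO_NULL, port);
    MPI_Publish_name("my-server", MPI_INFO_NULL, port);
    MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &client);

    /* ... exchange data over 'client' with ordinary MPI calls ... */

    MPI_Comm_disconnect(&client);
    MPI_Unpublish_name("my-server", MPI_INFO_NULL, port);
    MPI_Close_port(port);
    MPI_Finalize();
    return 0;
}

/* A client would do the inverse:
 *   MPI_Lookup_name("my-server", MPI_INFO_NULL, port);
 *   MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &server);
 */
```

Each additional client repeats the lookup/connect pair, while the server loops on MPI_Comm_accept; the thread's reported failures occur once more than two clients join this way over InfiniBand.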

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-18 Thread Philippe
…band messaging system. That system has zero connection to any BTL, so it should crash either way. Regardless, I will play with this a little as time allows. Thanks for the reproducer! On Jun 25, 2010, at 7:23 AM, Philippe wrote: >> Hi, …

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-21 Thread Philippe
…I_thread_init, since it's using a lot of threads). Is there documentation or an example I can use to see what information I can pass to the processes to enable that? Is it just environment variables? Many thanks! p. On Jul 18, 2010, at 4:09 PM, Philippe wrote: >> Ralph, …
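The truncated "…I_thread_init" above presumably refers to thread-aware MPI initialization. A minimal sketch of the standard call for a heavily threaded application is below; note that Open MPI releases of this era often did not grant MPI_THREAD_MULTIPLE unless built with thread support, so checking the provided level matters.

```c
/* Sketch: requesting full thread support at initialization and
 * verifying what the library actually granted. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    if (provided < MPI_THREAD_MULTIPLE)
        fprintf(stderr, "warning: only thread level %d provided\n", provided);
    /* ... multi-threaded MPI work ... */
    MPI_Finalize();
    return 0;
}
```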

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-22 Thread Philippe
On Wed, Jul 21, 2010 at 10:44 AM, Ralph Castain <r...@open-mpi.org> wrote: > On Jul 21, 2010, at 7:44 AM, Philippe wrote: >> Ralph, sorry for the late reply - I was away on vacation. > No problem at all! >> Regardin…

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-22 Thread Philippe
…; * OMPI_MCA_oob_tcp_static_ports=6000-6010 <== obviously, replace this with your range. You will need a port range that is at least equal to the ppn for the job (each proc on a node will take one of the provided ports). That should do it. I compute everything else I ne…

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-27 Thread Philippe
Anything at r23478 or above will have the new module. Let me know how it works for you. I haven't tested it myself, but am pretty sure it should work. On Jul 22, 2010, at 3:22 PM, Philippe wrote: >> Ralph, thank you so much!! …

[OMPI users] crash with hardware virtualization

2011-10-13 Thread Philippe Gouret
…installed Open MPI 1.4.3 and tree-ppuzzle 5.2. When I run the same command, "mpirun -np 8 tree-ppuzzle data_file", it crashes after a few seconds. The error message is: [philippe-VirtualBox:01542] Signal: Segmentation fault (11) [philippe-VirtualBox:01542] Signal code: Address not …

[OMPI users] configuration and compilation outputs

2010-03-29 Thread Philippe GOURET
Attachment: outputs.tar.gz