Re: [OMPI users] Ok, I've got OpenMPI set up, now what?!

2010-07-19 Thread Jed Brown
On Mon, 19 Jul 2010 13:33:01 -0600, Damien Hocking wrote:
> It does. The big difference is that MUMPS is a 3-minute compile, and
> PETSc, erm, isn't. It's... longer...
FWIW, PETSc takes less than 3 minutes to build (after configuration) for me (I build it every day).

Re: [OMPI users] Ok, I've got OpenMPI set up, now what?!

2010-07-19 Thread Damien Hocking
It does. The big difference is that MUMPS is a 3-minute compile, and PETSc, erm, isn't. It's... longer...
D
On 19/07/2010 12:56 PM, Daniel Janzon wrote:
> Thanks a lot! PETSc seems to be really solid and integrates with MUMPS
> suggested by Damien. All the best, Daniel Janzon
> On 7/18/10,

Re: [OMPI users] openmpi v1.5?

2010-07-19 Thread Jeff Squyres
I'm actually waiting for *1* more bug fix before we consider 1.5 "complete".
On Jul 19, 2010, at 3:24 PM, Jed Brown wrote:
> On Mon, 19 Jul 2010 15:16:59 -0400, Michael Di Domenico wrote:
>> Since I am an SVN neophyte can anyone tell me when openmpi 1.5 is
>>

Re: [OMPI users] openmpi v1.5?

2010-07-19 Thread Jed Brown
On Mon, 19 Jul 2010 15:16:59 -0400, Michael Di Domenico wrote:
> Since I am an SVN neophyte can anyone tell me when openmpi 1.5 is
> scheduled for release?
https://svn.open-mpi.org/trac/ompi/milestone/Open%20MPI%201.5
> And whether the Slurm srun changes are going to

[OMPI users] openmpi v1.5?

2010-07-19 Thread Michael Di Domenico
Since I am an SVN neophyte, can anyone tell me when openmpi 1.5 is scheduled for release? And whether the Slurm srun changes are going to make it in? Thanks

Re: [OMPI users] Ok, I've got OpenMPI set up, now what?!

2010-07-19 Thread Daniel Janzon
Thanks a lot! PETSc seems to be really solid and integrates with MUMPS suggested by Damien.
All the best, Daniel Janzon
On 7/18/10, Gustavo Correa wrote:
> Check PETSc:
> http://www.mcs.anl.gov/petsc/petsc-as/
> On Jul 18, 2010, at 12:37 AM, Damien wrote:
>> You

Re: [OMPI users] Dynamic processes connection and segfault on MPI_Comm_accept

2010-07-19 Thread Edgar Gabriel
Hm, so I am not sure how to approach this. First of all, the test case works for me. I used up to 80 clients, and for both optimized and non-optimized compilation. I ran the tests with trunk (not with 1.4 series, but the communicator code is identical in both cases). Clearly, the patch from Ralph

Re: [OMPI users] MPE logging GUI

2010-07-19 Thread Stefan Kuhne
On 19.07.2010 16:32, Anthony Chan wrote:
> Just curious, is there any reason you are looking for another
> tool to view slog2 file ?
Hello Anthony,
I'm looking for a clearer tool. I find Jumpshot a little bit overloaded.
Regards, Stefan Kuhne

[OMPI users] openib issues

2010-07-19 Thread Eloi Gaudry
Hi, I've been working on a random segmentation fault that seems to occur during a collective communication when using the openib btl (see [OMPI users] [openib] segfault when using openib btl). During my tests, I've come across different issues reported by OpenMPI-1.4.2: 1/
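A common first step when isolating transport-related faults like this (a general Open MPI debugging technique, not something Eloi describes in the truncated preview) is to exclude the openib BTL so traffic falls back to other transports; a sketch as an MCA parameters file entry, assuming a per-user `~/.openmpi/mca-params.conf`:

```
# Exclude the openib BTL; point-to-point traffic falls back to the
# remaining transports (e.g. tcp, sm, self). If the segfault stops,
# the InfiniBand path is implicated.
btl = ^openib
```

The same selection can be passed per run on the `mpirun` command line with `--mca btl ^openib`.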

Re: [OMPI users] MPE logging GUI

2010-07-19 Thread Anthony Chan
Just curious, is there any reason you are looking for another tool to view slog2 files?
A.Chan
----- "Stefan Kuhne" wrote:
> Hello,
> does anybody know another tool than Jumpshot to view an MPE logging
> file?
> Regards,
> Stefan Kuhne

Re: [OMPI users] MPICH2 is working OpenMPI Not

2010-07-19 Thread Scott Atchley
Hi Bibrak,
The message about malloc looks like an MX message. Which interconnects did you compile support for? If you are using MX, does it appear when you run with:
$ mpirun --mca pml cm -np 4 ./exec 98
which uses the MX MTL instead of the MX BTL.
Scott
On Jul 18, 2010, at 9:23 AM, Bibrak Qamar
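Scott's MCA flag can also be set persistently rather than per command; a hedged sketch as an MCA parameters file entry (the per-user location `~/.openmpi/mca-params.conf` is an assumption about the local setup):

```
# Select the cm PML so point-to-point messages go through an MTL
# (here the MX MTL) instead of the default ob1 PML + MX BTL path.
pml = cm
```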

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-19 Thread Ralph Castain
I'm wondering if we can't make this simpler. What launch environment are you operating under? I know you said you can't use mpiexec, but I'm wondering if we could add support for your environment to mpiexec so you could.
On Jul 18, 2010, at 4:09 PM, Philippe wrote:
> Ralph,
> thanks for

Re: [OMPI users] MPI process dies with a route error when using dynamic process calls to connect more than 2 clients to a server with InfiniBand

2010-07-19 Thread Ralph Castain
On Jul 18, 2010, at 4:09 PM, Philippe wrote:
> Ralph,
> thanks for investigating.
> I've applied the two patches you mentioned earlier and ran with the
> ompi server. Although I was able to run our standalone test, when I
> integrated the changes to our code, the processes entered a crazy

[OMPI users] MPE logging GUI

2010-07-19 Thread Stefan Kuhne
Hello,
does anybody know another tool than Jumpshot to view an MPE logging file?
Regards, Stefan Kuhne

Re: [OMPI users] is loop unrolling safe for MPI logic?

2010-07-19 Thread Tim Prince
On 7/18/2010 9:09 AM, Anton Shterenlikht wrote:
> On Sat, Jul 17, 2010 at 09:14:11AM -0700, Eugene Loh wrote:
>> Jeff Squyres wrote:
>>> On Jul 17, 2010, at 4:22 AM, Anton Shterenlikht wrote:
>>>> Is loop vectorisation/unrolling safe for MPI logic? I presume it
>>>> is, but are there
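The question above is whether unrolling can change program semantics. As a language-neutral illustration (Python here, not from the thread), a manually 4-way unrolled reduction yields exactly the same result as the naive loop; this is why unrolling purely local computation around MPI calls is safe, provided the number and order of the MPI calls themselves are untouched:

```python
def dot_naive(a, b):
    """Straightforward multiply-accumulate over paired elements."""
    s = 0
    for i in range(len(a)):
        s += a[i] * b[i]
    return s

def dot_unrolled4(a, b):
    """Same reduction, manually unrolled by a factor of 4."""
    s = 0
    n = len(a)
    i = 0
    while i + 4 <= n:
        s += (a[i] * b[i] + a[i + 1] * b[i + 1]
              + a[i + 2] * b[i + 2] + a[i + 3] * b[i + 3])
        i += 4
    while i < n:  # remainder loop for lengths not divisible by 4
        s += a[i] * b[i]
        i += 1
    return s

if __name__ == "__main__":
    xs = list(range(10))
    ys = list(range(10, 20))
    assert dot_naive(xs, ys) == dot_unrolled4(xs, ys)
    print(dot_naive(xs, ys))  # → 735
```

Note that for floating-point data the unrolled form changes the order of additions, so results may differ in the last bits even though the logic is equivalent; with integer arithmetic, as above, the two are identical.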