Re: [OMPI users] Open-MPI 1.2 and GM

2007-03-27 Thread George Bosilca
Justin, There is no GM MTL. Therefore, the first mpirun allows the use of every available BTL, while the second one doesn't allow intra-node or self communication. The correct mpirun command line should be: mpirun -np 4 --mca btl gm,self ... george. On Mar 27, 2007, at 12:18 PM, Justin …
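
[A minimal sketch, not from the thread, to smoke-test the suggested BTL selection. The file name and build command are illustrative; only the mpirun line comes from George's message.]

    /* hello_btl.c -- verify the job launches with the requested BTLs.
     *   mpicc hello_btl.c -o hello_btl
     *   mpirun -np 4 --mca btl gm,self ./hello_btl
     */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("rank %d of %d\n", rank, size);
        MPI_Finalize();
        return 0;
    }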

Re: [OMPI users] Issues with Get/Put and IRecv

2007-03-27 Thread Mike Houston
Well, mpich2 and mvapich2 are working smoothly for my app. mpich2 over GigE is also giving ~2X the performance of Open MPI in the cases where Open MPI does work. After the paper deadline, I'll attempt to package up a simple test case and send it to the list. Thanks! -Mike Mike Houston …

Re: [OMPI users] [Re: Memory leak in openmpi-1.2?]

2007-03-27 Thread Mohamad Chaarawi
Hello Mr. Van der Vlies, We are currently looking into this problem and will send out an email as soon as we identify and fix it. Thank you, > Subject: Re: [OMPI users] Memory leak in openmpi-1.2? > Date: Tue, 27 Mar 2007 13:58:15 +0200 > From: Bas van der Vlies > …

[OMPI users] Open-MPI 1.2 and GM

2007-03-27 Thread Justin Bronder
Having a user who requires some of the features of gfortran in 4.1.2, I recently began building a new image. The issue is that "-mca btl gm" fails while "-mca mtl gm" works. I have not yet done any benchmarking, as I was wondering whether the move to an MTL is part of the upgrade. Below are the …

Re: [OMPI users] MPI processes swapping out

2007-03-27 Thread Heywood, Todd
I tried the trunk version with "--mca btl tcp,self". Essentially, system time changes to idle time, since empty polling is being replaced by blocking (right?). Page faults drop to 0, though. It is interesting, since you can now see what is going on, with distinct phases of user time and idle time …
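
[The spin-versus-block behavior Todd describes can also be approximated at the application level. A sketch, assuming a poll-and-sleep loop; the function name and the 1 ms interval are illustrative, not from the thread.]

    /* Instead of MPI_Recv, which Open MPI progresses by busy-polling
     * (burning CPU as user/system time), poll with MPI_Test and sleep
     * between probes so the wait shows up as idle time instead. */
    #include <mpi.h>
    #include <unistd.h>

    static void relaxed_recv(void *buf, int count, MPI_Datatype type,
                             int src, int tag, MPI_Comm comm)
    {
        MPI_Request req;
        int done = 0;
        MPI_Irecv(buf, count, type, src, tag, comm, &req);
        while (!done) {
            MPI_Test(&req, &done, MPI_STATUS_IGNORE);
            if (!done)
                usleep(1000);  /* yield the CPU between probes */
        }
    }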

Re: [OMPI users] very long linking time with mixed-language libraries

2007-03-27 Thread Jeff Squyres
I notice that you are using the "medium" sized F90 bindings. Do these FAQ entries help? http://www.open-mpi.org/faq/?category=mpi-apps#f90-mpi-slow-compiles http://www.open-mpi.org/faq/?category=building#f90-bindings-slow-compile On Mar 27, 2007, at 2:21 AM, de Almeida, Valmor F. wrote:

Re: [OMPI users] Memory leak in openmpi-1.2?

2007-03-27 Thread Bas van der Vlies
Bas van der Vlies wrote: Hello, We are testing Open MPI version 1.2 on Debian etch with openib. Some of our users run ScaLAPACK/BLACS jobs that run for a long time and use a lot of MPI_Comm functions. We have made a small C example that tests whether MPI can handle this situation (see …
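
[A sketch of the kind of stress test described: repeatedly create and free communicators, as long-running ScaLAPACK/BLACS jobs do, while watching resident memory. This is illustrative, not Bas's actual test program; the iteration counts are arbitrary.]

    /*   mpicc commleak.c -o commleak
     *   mpirun -np 2 ./commleak   (watch RSS with top or ps)      */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int i;
        MPI_Comm dup;
        MPI_Init(&argc, &argv);
        for (i = 0; i < 1000000; i++) {
            MPI_Comm_dup(MPI_COMM_WORLD, &dup);
            MPI_Comm_free(&dup);  /* memory should be reclaimed here */
            if (i % 100000 == 0)
                printf("iteration %d\n", i);
        }
        MPI_Finalize();
        return 0;
    }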

[OMPI users] very long linking time with mixed-language libraries

2007-03-27 Thread de Almeida, Valmor F.
Hello, I am using mpic++ to create a program that combines C++ and F90 libraries. The libraries are created with mpic++ and mpif90. OpenMPI-1.2 was built using gcc-4.1.1 (the output of ompi_info follows below). The final linking stage takes quite a long time compared to the creation of the …