[OMPI devel] Broken TotalView behavior in 1.5.4

2011-08-19 Thread David Gunter
mpirun -a -n 16 my.exe The same application built against ompi-1.4.3 debugs just fine under TotalView. Any thoughts? -david -- David Gunter HPC-3: Infrastructure Team Los Alamos National Laboratory
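For reference, a minimal sketch of the two usual ways to get an Open MPI job of that era under TotalView (flag spellings assumed from era documentation, not taken from this thread):

    totalview mpirun -a -n 16 my.exe   # start mpirun under TotalView; -a passes the rest to mpirun
    mpirun -tv -n 16 my.exe            # ask mpirun to re-exec itself under TotalView

The report is that the first form works for the same application built against 1.4.3 but misbehaves against 1.5.4.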

Re: [OMPI devel] Broken TotalView behavior in 1.5.4

2011-08-19 Thread David Gunter
Just a follow-up. I've built OMPI-1.5.4 on a non-SLURM, PBS-based system and TotalView behaves the same (incorrect) way. Still not sure what is happening. -david -- David Gunter HPC-3: Infrastructure Team Los Alamos National Laboratory On Aug 19, 2011, at 1:17 PM, David Gunter

Re: [OMPI devel] Inherent limit on #communicators?

2009-04-30 Thread David Gunter
Just to throw out more info on this, the test code runs fine on previous versions of OMPI. It only hangs on the 1.3 line when the cid reaches 65536. -david -- David Gunter HPC-3: Parallel Tools Team Los Alamos National Laboratory On Apr 30, 2009, at 12:28 PM, Edgar Gabriel wrote: cid
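The original test code isn't shown in the preview; the following is a minimal reproducer sketch under the assumption that simply cycling communicators past the 16-bit cid boundary triggers the hang (i.e., that freed cids are not recycled on the 1.3 line):

    /* cid_cycle.c - hypothetical reproducer: dup and free a communicator
     * more than 2^16 times; if freed cids are never reused, the cid
     * counter crosses 65536 and, per the report, the 1.3 line hangs. */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dup;
        int i, rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        for (i = 0; i < 70000; i++) {          /* well past 2^16 */
            MPI_Comm_dup(MPI_COMM_WORLD, &dup);
            MPI_Comm_free(&dup);
            if (rank == 0 && i % 10000 == 0)
                printf("iteration %d\n", i);
        }
        MPI_Finalize();
        return 0;
    }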

Re: [OMPI devel] Inherent limit on #communicators?

2009-04-30 Thread David Gunter
      end if
      end do
c
      if (icolor.eq.0) call mpi_comm_free(local_comm, ierr)
      call MPI_barrier(MPI_COMM_WORLD,ierr)
      call MPI_FINALIZE(IERR)
      print *, myid, ierr
      end
-david -- David Gunter HPC-3: Parallel Tools Team Los Alamos National Laboratory On Apr 30, 2009, a

[OMPI devel] Ticket #1982 - Fortran MPI_IN_PLACE issue

2009-09-22 Thread David Gunter
0x2040 == (0x6020/17, 0x6024/18, 0x2040/32, 0x602c/20) Fortran MPI_BOTTOM is 32 I still don't see what the problem is between the two different versions of OMPI. OS X 10.5.8, GCC 4.4.1, most recent libtool, autoconf, automake and m4. -david -- David Gunter HPC-3: Parallel Tools
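For readers following the address dump above: Fortran has no C-style sentinel pointers, so MPI_IN_PLACE and MPI_BOTTOM are ordinary Fortran variables that the C layer recognizes purely by address. A sketch of that scheme (symbol and function names here are illustrative, not OMPI's actual internals):

    #include <mpi.h>

    /* In reality this symbol would be defined by the Fortran layer's
     * common block; defined here only so the sketch stands alone. */
    int mpi_fortran_in_place_;

    /* Map a buffer address coming from Fortran onto the C sentinel.
     * If two builds resolve the Fortran symbol to different addresses
     * (as the differing hex values above suggest), this test fails and
     * MPI_IN_PLACE is silently treated as a real buffer. */
    static void *f2c_buffer(void *fortran_addr)
    {
        if (fortran_addr == (void *) &mpi_fortran_in_place_)
            return MPI_IN_PLACE;
        return fortran_addr;
    }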

Re: [OMPI devel] Ticket #1982 - Fortran MPI_IN_PLACE issue

2009-09-22 Thread David Gunter
I meant to say "configure", not "configure.in" below. -- David Gunter HPC-3: Parallel Tools Team Los Alamos National Laboratory On Sep 22, 2009, at 8:05 AM, David Gunter wrote: I've been playing around with Jeff's "bogus" tarball and I, too, see i

Re: [OMPI devel] Ticket #1982 - Fortran MPI_IN_PLACE issue

2009-09-22 Thread David Gunter
I don't believe I have an account to add comments - I would appreciate one! Thanks, david -- David Gunter HPC-3: Parallel Tools Team Los Alamos National Laboratory On Sep 22, 2009, at 8:24 AM, Jeff Squyres wrote: Thanks! I added these comments to #1982 (don't hesitate to add

[OMPI devel] SC09 gatherings?

2009-10-08 Thread David Gunter
I'm making travel plans for SC09 and was wondering if there is going to be an OMPI gathering during that time. During the week would be grand, but I'd hate to make plans to depart Friday (11/20) only to discover that's the date. Are there plans in the works already? -- Dav

Re: [OMPI devel] SC09 gatherings?

2009-10-08 Thread David Gunter
I just found the BOF session schedule for Wednesday of that week...sorry for the spam. -david -- David Gunter HPC-3: Parallel Tools Team Los Alamos National Laboratory On Oct 8, 2009, at 2:59 PM, David Gunter wrote: I'm making travel plans for SC09 and was wondering if there is going

Re: [OMPI devel] NEWS file for 1.3.4

2009-10-28 Thread David Gunter
SLURM 2.0.5 is the new requirement for LANL as far as upgrading to OMPI 1.3.4. It's been sufficiently pounded into my skull (can't vouch for others) that I would think nothing needs to be said in the NEWS file. -david -- David Gunter HPC-3: Infrastructure Team Los Alamo

Re: [OMPI devel] Open MPI v1.3.4rc4 is out

2009-11-05 Thread David Gunter
        components_available,
        NULL);
    return OPAL_ERR_NOT_FOUND;
}
-david -- David Gunter HPC-3: Infrastructure Team Los Alamos National Laboratory Sam Gutierrez wrote: > Hi All, > I just built OMPI 1.3.4rc4 on one of our Roadrunner machines. When I >

Re: [OMPI devel] Open MPI v1.3.4rc4 is out

2009-11-05 Thread David Gunter
I used one of the LANL platform files to build, $ configure --with-platform=contrib/platform/lanl/rr-class/debug-panasas-nocell Did the same thing with the non-debug platform file and it dies in the same location. -david -- David Gunter HPC-3: Infrastructure Team Los Alamos National

Re: [OMPI devel] Open MPI v1.3.4rc4 is out

2009-11-05 Thread David Gunter
Oh, good catch. I'm not sure who updates the platform files or who would have added the "carto" option to the no_build list. It's the only difference between the 1.3.4 platform files and the previous ones, save for some compiler flags. -david -- David Gunter HPC-3:
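For context, OMPI platform files are just canned configure arguments expressed as shell-style assignments; the entry in question would look something like the following (illustrative, not the actual LANL file):

    # contrib/platform/lanl/... (illustrative): exclude frameworks from
    # the build; maps to configure's --enable-mca-no-build option.
    enable_mca_no_build=carto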

Re: [OMPI devel] Open-MPI on TIPC

2010-05-12 Thread David Gunter
ich.de/nic-series/volume38/bounanos.pdf You might see what they did to port TIPC over. -david -- David Gunter HPC-3: Infrastructure Team Los Alamos National Laboratory On May 12, 2010, at 10:54 AM, Nils Carlson wrote: > Hi, > > I'm wondering if anyone has looked at adding
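On the question itself: transports in Open MPI land as BTL components under the MCA framework. By analogy with the existing tcp BTL, a TIPC port would look roughly like this (file layout is a sketch, not an existing component):

    ompi/mca/btl/tipc/
        btl_tipc_component.c    # component open/init: discover TIPC endpoints
        btl_tipc.c              # module entry points: prepare, send, put/get
        btl_tipc_endpoint.c     # per-peer connection state
        configure.m4            # probe for TIPC headers/libs at configure time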

[OMPI devel] Unable to build OMPI 1.4.2 and newer w/Intel 10 or Intel 11 compilers

2010-08-20 Thread David Gunter
e/asm-generic/ioctl.h and is well described here: https://bugzilla.redhat.com/show_bug.cgi?id=473947 The "ugly" fix recommended in the write-up works for the simple reproducer code given, but has anyone figured out how to apply such a fix to the OMPI code? Thanks, david -- David Gu
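For anyone trying it, a sketch of the kind of header override the bug report describes, assuming (as that report does) that the failure is icc mis-evaluating the _IOC_TYPECHECK size check; this is a hypothetical per-translation-unit shim, not a fix actually applied to the OMPI tree:

    /* icc_ioctl_shim.h (hypothetical): pull in the kernel header first,
     * then replace the size-check macro icc trips over with a plain
     * sizeof, so later _IOW/_IOR expansions compile cleanly. */
    #include <asm-generic/ioctl.h>
    #undef _IOC_TYPECHECK
    #define _IOC_TYPECHECK(t) (sizeof(t))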

[OMPI devel] Fwd: Unable to build OMPI 1.4.2 and newer w/Intel 10 or Intel 11 compilers

2010-09-01 Thread David Gunter
I tried the same build with the 1.4.3rc1 release and hit the same error. -david -- David Gunter HPC-3: Infrastructure Team Los Alamos National Laboratory Begin forwarded message: > From: David Gunter > Date: August 20, 2010 2:20:40 PM MDT > To: Open MPI Developers > Subjec