Hello,
I tried to install Open MPI 1.2, but I ran into some problems when
compiling files with POE. When Open MPI 1.2.1 was released, I saw in the
bug fixes that this problem had been fixed. I tried again, but it still
doesn't work. The problem comes from orte/mca/pls/poe/pls_poe_module.c.
A static
I have previously been running parallel VASP happily with an old,
prerelease version of OpenMPI:
[terry@nocona Vasp.4.6-OpenMPI]$
head /home/terry/Install_trees/OpenMPI-1.0rc6/config.log
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
Hi Laurent,
Unfortunately, as far as I know, none of the current Open MPI developers has
access to a system with POE, so the POE process launcher has fallen into
disrepair. Attached is a patch that should allow you to compile (however, you
may also need to add #include to pls_poe_module.c).
Hi Tim,
Ok, thank you for all these clarifications. I also added "static int
pls_poe_cancel_operation(void)" similarly to what you did, and the
compilation can continue. But then I hit another problem. In
ompi/mpi/cxx/mpicxx.cc, three variables are already defined. The
preprocessor sets them to the constant
I'm trying to run a job specifically over tcp and the eth1 interface.
It seems to be barfing trying to listen via IPv6. I don't want IPv6.
How can I disable it?
Here's my mpirun line:
[root@vic12-10g ~]# mpirun --n 2 --host vic12,vic20 --mca btl self,tcp -mca
btl_tcp_if_include eth1
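For completeness, a sketch of the two knobs discussed in this thread — one at
build time, one at run time. The host names are carried over from the command
above and the program name is hypothetical; exact option spellings can vary
across Open MPI releases:

```shell
# Build-time: compile IPv6 support out entirely (the workaround
# advised later in this thread for 1.2-era releases).
./configure --disable-ipv6 ...

# Run-time: keep the TCP BTL on one interface.  btl_tcp_if_include
# limits MPI traffic to the listed interface(s).
mpirun -n 2 --host vic12,vic20 \
       --mca btl self,tcp \
       --mca btl_tcp_if_include eth1 \
       ./my_mpi_program   # hypothetical program name
```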
On Thursday 10 May 2007 11:35 am, Laurent Nguyen wrote:
> Hi Tim,
>
> Ok, thank you for all these clarifications. I also added "static int
> pls_poe_cancel_operation(void)" similarly to what you did, and the
> compilation can continue. But then I hit another problem. In
> ompi/mpi/cxx/mpicxx.cc, three variables
Brian --
Didn't you add something to fix exactly this problem recently? I
have a dim recollection of seeing a commit go by about this...?
(I advised Steve in IM to use --disable-ipv6 in the meantime)
On May 10, 2007, at 1:25 PM, Steve Wise wrote:
I'm trying to run a job specifically
On Thu, 2007-05-10 at 20:07 -0400, Jeff Squyres wrote:
> Brian --
>
> Didn't you add something to fix exactly this problem recently? I
> have a dim recollection of seeing a commit go by about this...?
>
> (I advised Steve in IM to use --disable-ipv6 in the meantime)
>
Yes, disabling it
Good to know. This suggests that a proper VASP build with Open
MPI should work; perhaps there's some secret sauce in the
Makefile somewhere...? Off list, someone cited the following to me:
-
VASP also has a forum for things like this.
I am a newbie to Open MPI. I have just compiled a program with -g -pg (an
MPI program with a listener thread, in which all MPI calls except
initialization and MPI_Finalize are placed) and ran it. However, it
crashes and I can't find any core dump, even though I set the core dump
max size to
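A common first check for the missing core file, sketched below: the core-size
limit must be in effect in the environment where each rank actually runs, not
just in the launching shell. Also note that -pg's profile output (gmon.out)
is written only on normal exit, so a crashing run produces no profile either.

```shell
# Allow core files in the current shell; with mpirun, the limit must
# also be raised on every node where ranks run (e.g. via the remote
# shell's startup file), not just on the launch node.
ulimit -c unlimited
ulimit -c          # should now report: unlimited

# Cores are written to each rank's working directory unless kernel
# settings such as /proc/sys/kernel/core_pattern redirect them.
```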
On Thursday 10 May 2007 07:19 pm, Code Master wrote:
> I am a newbie to Open MPI. I have just compiled a program with -g -pg (an
> MPI program with a listener thread, in which all MPI calls except
> initialization and MPI_Finalize are placed) and ran it. However, it
> crashes and I can't
Folks,
MTT version 2.1 is live! Please do an "svn up" at your
earliest convenience. As always, report any issues you
encounter to this mailing list.
Thanks,
Ethan
On Wed, May/09/2007 09:09:46PM, Jeff Squyres wrote:
> All --
>
> We are just about to release MTT version 2.1. Woo hoo!
>
>