On Mon, 19 Jul 2010 13:33:01 -0600, Damien Hocking wrote:
> It does. The big difference is that MUMPS is a 3-minute compile, and
> PETSc, erm, isn't. It's... longer...
FWIW, PETSc takes less than 3 minutes to build (after configuration) for
me (I build it every day). Building MUMPS (with depend
On 7/18/2010 9:09 AM, Anton Shterenlikht wrote:
On Sat, Jul 17, 2010 at 09:14:11AM -0700, Eugene Loh wrote:
Jeff Squyres wrote:
On Jul 17, 2010, at 4:22 AM, Anton Shterenlikht wrote:
Is loop vectorisation/unrolling safe for MPI logic?
I presume it is, but are there situati
Hello,
does anybody know a tool other than Jumpshot to view an MPE logging file?
Regards,
Stefan Kuhne
On Jul 18, 2010, at 4:09 PM, Philippe wrote:
> Ralph,
>
> thanks for investigating.
>
> I've applied the two patches you mentioned earlier and ran with the
> ompi server. Although I was able to run our standalone test, when I
> integrated the changes to our code, the processes entered a crazy
I'm wondering if we can't make this simpler. What launch environment are you
operating under? I know you said you can't use mpiexec, but I'm wondering if we
could add support for your environment to mpiexec so you could.
On Jul 18, 2010, at 4:09 PM, Philippe wrote:
> Ralph,
>
> thanks for inv
Hi Bibrak,
The message about malloc looks like an MX message. Which interconnects did you
compile support for?
If you are using MX, does it appear when you run with:
$ mpirun --mca pml cm -np 4 ./exec 98
which uses the MX MTL instead of the MX BTL.
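If it helps, you can list which components a given build actually contains with ompi_info; a minimal sketch (the grep pattern is an assumption about the output format):

```shell
# Show which BTL and MTL components this Open MPI build was compiled with;
# "mx" should appear in the output if MX support was built in.
ompi_info | grep -E "MCA (btl|mtl)"
```

If no mx component is listed, the malloc message is coming from somewhere else.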
Scott
On Jul 18, 2010, at 9:23 AM, Bibrak Qamar
Just curious, is there any reason you are looking for another
tool to view a slog2 file?
A.Chan
- "Stefan Kuhne" wrote:
> Hello,
>
> does anybody know a tool other than Jumpshot to view an MPE logging
> file?
>
> Regards,
> Stefan Kuhne
>
>
Hi,
I've been working on a random segmentation fault that seems to occur during a
collective communication when using the openib btl (see [OMPI users] [openib]
segfault when using openib btl).
During my tests, I've come across different issues reported by OpenMPI-1.4.2:
1/
[[12770,1],0][btl_o
On 19.07.2010 16:32, Anthony Chan wrote:
Hello Anthony,
>
> Just curious, is there any reason you are looking for another
> tool to view a slog2 file?
>
I'm looking for a clearer tool.
I find Jumpshot a little bit overloaded.
Regards,
Stefan Kuhne
Hm, so I am not sure how to approach this. First of all, the test case
works for me. I used up to 80 clients, and for both optimized and
non-optimized compilation. I ran the tests with trunk (not with 1.4
series, but the communicator code is identical in both cases). Clearly,
the patch from Ralph i
Thanks a lot! PETSc seems to be really solid and integrates with MUMPS
suggested by Damien.
All the best,
Daniel Janzon
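(For anyone else heading down this path: PETSc's configure can download and build MUMPS and its dependencies for you. A hedged sketch, using PETSc's --download-* convention; the exact package set needed may differ by PETSc version:

```shell
# Let PETSc fetch and build MUMPS plus the libraries it depends on.
./configure --download-mumps --download-scalapack --download-blacs \
            --download-parmetis --download-metis
make all
```

)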
On 7/18/10, Gustavo Correa wrote:
> Check PETSc:
> http://www.mcs.anl.gov/petsc/petsc-as/
>
> On Jul 18, 2010, at 12:37 AM, Damien wrote:
>
>> You should check out the MUMPS pa
Since I am an SVN neophyte, can anyone tell me when Open MPI 1.5 is
scheduled for release? And whether the Slurm srun changes are going
to make it in?
thanks
On Mon, 19 Jul 2010 15:16:59 -0400, Michael Di Domenico
wrote:
> Since I am an SVN neophyte, can anyone tell me when Open MPI 1.5 is
> scheduled for release?
https://svn.open-mpi.org/trac/ompi/milestone/Open%20MPI%201.5
> And whether the Slurm srun changes are going to make it in?
https://svn.open-m
I'm actually waiting for *1* more bug fix before we consider 1.5 "complete".
On Jul 19, 2010, at 3:24 PM, Jed Brown wrote:
> On Mon, 19 Jul 2010 15:16:59 -0400, Michael Di Domenico
> wrote:
>> Since I am an SVN neophyte, can anyone tell me when Open MPI 1.5 is
>> scheduled for release?
>
> https
It does. The big difference is that MUMPS is a 3-minute compile, and
PETSc, erm, isn't. It's... longer...
D
On 19/07/2010 12:56 PM, Daniel Janzon wrote:
Thanks a lot! PETSc seems to be really solid and integrates with MUMPS
suggested by Damien.
All the best,
Daniel Janzon
On 7/18/10, Gustavo