On Jun 8, 2018, at 11:38 AM, Bennet Fauber wrote:
>
> Hmm. Maybe I had insufficient error checking in our installation process.
>
> Can you run make and make install after the configure fails? Perhaps I
> somehow got an installation despite the configure status?
If it's a fresh tarball
Jeff,
Hmm. Maybe I had insufficient error checking in our installation process.
Can you run make and make install after the configure fails? Perhaps I
somehow got an installation despite the configure status?
-- bennet
On Fri, Jun 8, 2018 at 11:32 AM Jeff Squyres (jsquyres) via users wrote:
Hmm. I'm confused -- can we clarify?
I just tried configuring Open MPI v3.1.0 on a RHEL 7.4 system with the RHEL
hwloc RPM installed, but *not* the hwloc-devel RPM. Hence, no hwloc.h (for
example).
When specifying an external hwloc, configure did fail, as expected:
-----
$ ./configure
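That snippet is cut off in the archive. As a rough sketch only (the flag and
the failure description below are inferred from the surrounding text, not
copied from the actual output), the run being described would look something
like this:

    $ ./configure --with-hwloc=external
    ...
    (configure aborts with an error because hwloc.h, which the
     hwloc-devel package provides, cannot be found)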
> On Jun 8, 2018, at 8:10 AM, Bennet Fauber wrote:
>
> Further testing shows that it was the failure to find the hwloc-devel files
> that seems to be the cause of the failure. I compiled and ran without the
> additional configure flags, and it still seems to work.
>
> I think it issued a
Further testing shows that it was the failure to find the hwloc-devel files
that seems to be the cause of the failure. I compiled and ran without the
additional configure flags, and it still seems to work.
I think it issued a two-line warning about this. Is that something that
should result in
nt of slurmd.log?
>>
>
> I will reply separately with this, as I have to coordinate with the
> cluster administrator, who is not in yet.
>
> Please note, also, that I was able to build this successfully after
> installing the hwloc-devel package and adding
I will reply separately with this, as I have to coordinate with the
cluster administrator, who is not in yet.
Please note, also, that I was able to build this successfully after installing
the hwloc-devel package and adding the --disable-dlopen and
--enable-shared options to configure.
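For context, a minimal sketch of the fix being described; the package name and
the two flags are from the thread, everything else (ordering, any other
options) is assumed:

    $ sudo yum install hwloc-devel              # provides hwloc.h
    $ ./configure --disable-dlopen --enable-shared
    $ make && make install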
Thanks,
-- bennet
> On Jun 7, 2018, at 8:05 AM, r...@open-mpi.org wrote:
>
> Odd - Artem, do you have any suggestions?
I rebuilt and examined the logs more closely. There was a warning
about a failure with the external hwloc, and that led to finding that
the CentOS hwloc-devel package was not installed.
I also added the options that we have been using for a while,
--disable-dlopen and --enable-shared, to the
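A quick way to confirm the missing package on a CentOS/RHEL node (hedged; the
exact output varies by release):

    $ rpm -q hwloc-devel
    package hwloc-devel is not installed
    $ sudo yum install hwloc-devel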
Odd - Artem, do you have any suggestions?
> On Jun 7, 2018, at 7:41 AM, Bennet Fauber wrote:
>
> Thanks, Ralph,
>
> I just tried it with
>
>srun --mpi=pmix_v2 ./test_mpi
>
> and got these messages
>
>
> srun: Step created for job 89
> [cav02.arc-ts.umich.edu:92286] PMIX ERROR:
Thanks, Ralph,
I just tried it with
srun --mpi=pmix_v2 ./test_mpi
and got these messages
srun: Step created for job 89
[cav02.arc-ts.umich.edu:92286] PMIX ERROR: OUT-OF-RESOURCE in file
client/pmix_client.c at line 234
[cav02.arc-ts.umich.edu:92286] OPAL ERROR: Error in file
I think you need to set your MpiDefault to pmix_v2 since you are using a PMIx
v2 library
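A minimal sketch of that change in slurm.conf (hypothetical excerpt, assuming a
stock Slurm 17.11 setup):

    # /etc/slurm/slurm.conf
    MpiDefault=pmix_v2

After editing, the daemons need to re-read the configuration, for example with
scontrol reconfigure or a restart of slurmctld/slurmd.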
> On Jun 7, 2018, at 6:25 AM, Bennet Fauber wrote:
>
> Hi, Ralph,
>
> Thanks for the reply, and sorry for the missing information. I hope
> this fills in the picture better.
>
> $ srun --version
>
Hi, Ralph,
Thanks for the reply, and sorry for the missing information. I hope
this fills in the picture better.
$ srun --version
slurm 17.11.7
$ srun --mpi=list
srun: MPI types are...
srun: pmix_v2
srun: openmpi
srun: none
srun: pmi2
srun: pmix
We have pmix configured as the default in
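That last line is cut off in the archive. One hedged way to confirm the
site-wide default plugin (the output line below is illustrative):

    $ scontrol show config | grep -i MpiDefault
    MpiDefault              = pmix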
You didn’t show your srun direct launch cmd line or what version of Slurm is
being used (and how it was configured), so I can only provide some advice. If
you want to use PMIx, then you have to do two things:
1. Slurm must be configured to use PMIx - depending on the version, that might
be
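The rest of that message is cut off in the archive. As a rough, hedged sketch
(paths are placeholders, not values from the thread), getting both sides to
agree on PMIx usually means building Slurm's PMIx plugin against the installed
PMIx and pointing Open MPI at that same library:

    # Slurm: build with PMIx support
    $ ./configure --with-pmix=/opt/pmix/2.0.2 ...

    # Open MPI: build against the same external PMIx, with Slurm support
    #   (matching --with-libevent/--with-hwloc flags may also be needed,
    #    depending on how that PMIx was built)
    $ ./configure --with-slurm --with-pmix=/opt/pmix/2.0.2 ...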
We are trying out MPI on an aarch64 cluster.
Our system administrators installed SLURM and PMIx 2.0.2 from .rpm.
I compiled OpenMPI with the ARM-distributed gcc/7.1.0, using the
configure flags shown in this snippet from the top of config.log:
It was created by Open MPI configure 3.1.0, which
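The config.log snippet is truncated above. For illustration only, the top of a
config.log from such a build typically continues like this; the Autoconf
version and the flags on the invocation line are assumptions, not values from
the thread:

    It was created by Open MPI configure 3.1.0, which was
    generated by GNU Autoconf 2.69.  Invocation command line was

      $ ./configure --with-slurm --with-pmix=/opt/pmix/2.0.2 \
          --disable-dlopen --enable-shared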