> On 14 Jan 2019, at 11:27, Kenneth Hoste <[email protected]> wrote:
> 
> Dear Damian,
> 
> On 14/01/2019 10:59, Alvarez, Damian wrote:
>> Hi Kenneth,
>> Wouldn't compiling OpenMPI 4.0 with --enable-mpi1-compatibility be an 
>> option? (See https://www.open-mpi.org/faq/?category=mpi-removed)
> 
> Maybe, but I would like to avoid i) using a new major release, and ii) using
> a non-default configuration in a common toolchain like foss/2019a.

I concur.
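
For reference, what OpenMPI 4.0 dropped are the MPI-1 routines that were removed
from the standard in MPI-3.0 (the FAQ page Damian linked lists them: MPI_Address,
MPI_Type_struct, MPI_Type_extent, MPI_Errhandler_set, the MPI_LB/MPI_UB markers,
etc.); --enable-mpi1-compatibility merely re-enables those legacy symbols. Below is
a minimal C sketch (mine, purely illustrative, not something from this thread) of
the affected pattern, with the removed routines named in comments next to their
long-standing MPI-2 replacements:

/* Building an MPI datatype for a C struct: the calls used here have been
 * available since MPI-2; the MPI-1 names in the comments are the ones that
 * OpenMPI 4.0 no longer provides unless --enable-mpi1-compatibility is used. */
#include <mpi.h>

struct particle { int id; double pos[3]; };

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    struct particle p;
    int blocklens[2] = {1, 3};
    MPI_Aint base, disp[2];
    MPI_Datatype types[2] = {MPI_INT, MPI_DOUBLE};
    MPI_Datatype ptype;

    /* MPI_Address() was removed; MPI_Get_address() is the replacement. */
    MPI_Get_address(&p, &base);
    MPI_Get_address(&p.id, &disp[0]);
    MPI_Get_address(&p.pos, &disp[1]);
    disp[0] -= base;
    disp[1] -= base;

    /* MPI_Type_struct() was removed; MPI_Type_create_struct() replaces it. */
    MPI_Type_create_struct(2, blocklens, disp, types, &ptype);
    MPI_Type_commit(&ptype);

    /* ... use ptype in sends/receives ... */

    MPI_Type_free(&ptype);
    MPI_Finalize();
    return 0;
}

Libraries that still call the removed names (apparently including ScaLAPACK, per
Kenneth's note below) need either the compatibility flag or a small port along
these lines, which is another argument for staying on OpenMPI 3.x for 2019a.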

I am actually seeing strange problems with my own Asap molecular dynamics 
package when compiled with OpenMPI 4.0.0.  I have decided to wait for 4.0.1 
before looking further into it.  Never use X.0.0 versions for production :-)

Jakob

> 
> 
> regards,
> 
> Kenneth
> 
>> Best,
>> Damian
>> On 12.01.19, 19:15, "[email protected] on behalf of Kenneth 
>> Hoste" <[email protected] on behalf of 
>> [email protected]> wrote:
>>     Dear EasyBuilders,
>>
>>     Based on the problems that several people are seeing with Intel MPI 2019
>>     update 1 (see also notes of the last EasyBuild conf call [1]), we
>>     concluded that it's better to stick with Intel MPI 2018 update 4 for the
>>     intel/2019a toolchain.
>>     I will change the pull request accordingly soon.
>>     For the Intel compilers and Intel MKL, we'll use the latest release
>>     (2019 update 1).
>>
>>     As for foss/2019a: OpenMPI 4.0 dropped some old/'obscure' parts of MPI,
>>     which seems to affect several libraries/tools, incl. ScaLAPACK, so we'll
>>     stick to the latest OpenMPI 3.x release for the time being.
>>
>>     regards,
>>
>>     Kenneth
>>
>>     [1]
>>     https://github.com/easybuilders/easybuild/wiki/Conference-call-notes-20190109#2019a-update-of-common-toolchains
>>
>>     On 08/01/2019 07:12, pramod kumbhar wrote:
>>     > Same experience on our systems: we encountered multiple issues,
>>     > specifically with libfabric (which is the default with Intel MPI 2019).
>>     >
>>     > -Pramod
>>     >
>>     > On Mon, Jan 7, 2019 at 4:45 PM Alvarez, Damian <[email protected]> wrote:
>>     >
>>     >     A word of caution regarding Intel MPI 2019: they changed a lot of
>>     >     things under the hood, and we have seen lots of issues on our
>>     >     systems with relatively large jobs (1.5K+ MPI processes). Basically,
>>     >     most collective algorithms don't make it through. We have seen that
>>     >     on 2 different InfiniBand systems; I am unsure about OmniPath, or
>>     >     about the underlying cause (maybe libfabric issues?), but with Intel
>>     >     MPI 2018 we haven't seen these problems.
>>     >
>>     >     Best,
>>     >     Damian
>>     >
>>     >     On 07.01.19, 16:37, "[email protected] on behalf of Kenneth
>>     >     Hoste" <[email protected] on behalf of [email protected]> wrote:
>>     >
>>     >          Dear EasyBuilders,
>>     >
>>     >          As is tradition at the start of the year, I have started looking at
>>     >          updating the 'foss' and 'intel' common toolchains, currently for the
>>     >          2019a update.
>>     >          The plan is to include these in the upcoming EasyBuild v3.8.1 release,
>>     >          which I hope to publish in a couple of weeks.
>>     >
>>     >          Current proposals are:
>>     >
>>     >          * foss/2019a:
>>     >             (see https://github.com/easybuilders/easybuild-easyconfigs/pull/7371)
>>     >
>>     >             - GCC 8.2.0 + binutils 2.31.1 [LATEST for both]
>>     >
>>     >             - OpenMPI 3.1.3 (latest 3.x)
>>     >
>>     >             - OpenBLAS 0.3.5 [LATEST]
>>     >               + ScaLAPACK 2.0.2 [LATEST]
>>     >
>>     >              - FFTW 3.3.8 [LATEST]
>>     >
>>     >          => There also is OpenMPI 4.0.0, but we generally stay away from new
>>     >          major versions of toolchain components for the common toolchains.
>>     >
>>     >          Does anyone have a detailed view on how OpenMPI v4.0.0 compares
>>     >          to v3.1.3, and whether or not we should consider v4.0.0?
>>     >
>>     >
>>     >          * intel/2019a:
>>     >             (see https://github.com/easybuilders/easybuild-easyconfigs/pull/7372)
>>     >
>>     >             - Intel C/C++/Fortran compilers 2019.1.144 [LATEST]
>>     >               (a.k.a. 2019 update 1, a.k.a. 19.0.1.144 according to "icc -V")
>>     >               on top of GCC 8.2.0 + binutils 2.31.1
>>     >
>>     >             - Intel MPI 2019.1.144 (a.k.a. 2019 update 1) [LATEST]
>>     >
>>     >             - Intel MKL 2019.1.144 (a.k.a. 2019 update 1) [LATEST]
>>     >
>>     >          => Any positive/negative experiences with the latest version of the
>>     >          Intel tools, or any remarks that may be relevant to the version we pick
>>     >          for intel/2019a?
>>     >
>>     >
>>     >          I'm currently testing these foss/2019a and intel/2019a proposals with a
>>     >          bunch of existing easyconfigs using the */2018b equivalent.
>>     >
>>     >          A couple of minor compilation problems popped up when compiling with GCC
>>     >          8.2.0 (for HDF5, NASM, ...), but they were resolved trivially by simply
>>     >          updating to the latest versions of these dependencies.
>>     >
>>     >
>>     >          The 2019a toolchain definitions will also be discussed during the
>>     >          EasyBuild conf call this Wednesday (Jan 9th).
>>     >
>>     >
>>     >
>>     >          regards,
>>     >
>>     >          Kenneth
>>     >
>>     >
>>     >
>>     >
>>     >
>>     
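
On Damian's point above about collectives failing with Intel MPI 2019 on 1.5K+ MPI
processes: a small smoke test that just exercises a couple of collectives at the
target job size is a cheap way to vet an MPI release before it lands in a common
toolchain. A sketch (my own, purely illustrative, not something used in this thread):

#include <mpi.h>
#include <stdio.h>

/* Tiny collective smoke test: run it at the job sizes where problems were
 * reported (1500+ ranks) and compare behaviour across MPI installations. */
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* MPI_Allreduce: every rank contributes its rank number. */
    long local = rank, sum = 0;
    MPI_Allreduce(&local, &sum, 1, MPI_LONG, MPI_SUM, MPI_COMM_WORLD);
    long expected = (long)size * (size - 1) / 2;

    /* MPI_Bcast: rank 0 distributes a token to everyone. */
    int token = (rank == 0) ? 42 : -1;
    MPI_Bcast(&token, 1, MPI_INT, 0, MPI_COMM_WORLD);

    /* Combine the per-rank checks into a single verdict. */
    int ok = (sum == expected) && (token == 42);
    int all_ok = 0;
    MPI_Allreduce(&ok, &all_ok, 1, MPI_INT, MPI_LAND, MPI_COMM_WORLD);

    if (rank == 0)
        printf("collectives %s on %d ranks\n", all_ok ? "OK" : "FAILED", size);

    MPI_Finalize();
    return all_ok ? 0 : 1;
}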

--
Jakob Schiøtz, professor, Ph.D.
Department of Physics
Technical University of Denmark
DK-2800 Kongens Lyngby, Denmark
http://www.fysik.dtu.dk/~schiotz/


