Re: [easybuild] 2019a update of common toolchains

2019-01-14 Thread Jakob Schiøtz


> On 14 Jan 2019, at 11:27, Kenneth Hoste  wrote:
> 
> Dear Damian,
> 
> On 14/01/2019 10:59, Alvarez, Damian wrote:
>> Hi Kenneth,
>> Wouldn't compiling OpenMPI 4.0 with --enable-mpi1-compatibility be an 
>> option? (See https://www.open-mpi.org/faq/?category=mpi-removed)
> 
> Maybe, but I would like to avoid i) using a new major release, ii) using a 
> non-default configuration in a common toolchain like foss/2019a.

I concur.

I am actually seeing strange problems with my own Asap molecular dynamics 
package when compiled with OpenMPI 4.0.0.  I have decided to wait for 4.0.1 
before looking further into it.  Never use X.0.0 versions for production :-)

Jakob

Re: [easybuild] 2019a update of common toolchains

2019-01-14 Thread Kenneth Hoste

Dear Damian,

On 14/01/2019 10:59, Alvarez, Damian wrote:

Hi Kenneth,

Wouldn't compiling OpenMPI 4.0 with --enable-mpi1-compatibility be an option? 
(See https://www.open-mpi.org/faq/?category=mpi-removed)


Maybe, but I would like to avoid i) using a new major release, ii) using 
a non-default configuration in a common toolchain like foss/2019a.



regards,

Kenneth




Re: [easybuild] 2019a update of common toolchains

2019-01-14 Thread Alvarez, Damian
Hi Kenneth,

Wouldn't compiling OpenMPI 4.0 with --enable-mpi1-compatibility be an option? 
(See https://www.open-mpi.org/faq/?category=mpi-removed)
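For anyone trying this: re-enabling the removed MPI-1 interfaces is a configure-time switch, per the FAQ page above. A minimal build sketch, assuming an OpenMPI 4.0.x source tree (the install prefix is a placeholder, not from this thread):

```shell
# Sketch only: configure OpenMPI 4.0.x with the MPI-1 compatibility
# layer re-enabled; the install prefix below is a placeholder.
./configure --prefix=/opt/openmpi/4.0.0 --enable-mpi1-compatibility
make -j 8
make install
```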

Best,
Damian


Re: [easybuild] 2019a update of common toolchains

2019-01-12 Thread Kenneth Hoste

Dear EasyBuilders,

Based on the problems that several people are seeing with Intel MPI 2019 
update 1 (see also notes of the last EasyBuild conf call [1]), we 
concluded that it's better to stick with Intel MPI 2018 update 4 for the 
intel/2019a toolchain.

I will change the pull request accordingly soon.
For the Intel compilers and Intel MKL, we'll use the latest release 
(2019 update 1).


As for foss/2019a: OpenMPI 4.0 dropped some old/'obscure' parts of MPI, 
which seems to affect several libraries/tools, incl. ScaLAPACK, so we'll 
stick to the latest OpenMPI 3.x release for the time being.
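One quick way to check whether a given MPI installation still provides the MPI-1 interfaces that OpenMPI 4.0 dropped is to try compiling a call to one of the removed functions, e.g. MPI_Address. A sketch, assuming the mpicc of the installation under test is on $PATH:

```shell
# Probe sketch: MPI_Address was removed in MPI-3.0 and is absent from
# default OpenMPI 4.0 builds (unless --enable-mpi1-compatibility was used).
cat > mpi1_probe.c <<'EOF'
#include <mpi.h>
int main(int argc, char **argv) {
    int x;
    MPI_Aint addr;
    MPI_Init(&argc, &argv);
    MPI_Address(&x, &addr);   /* removed MPI-1 call */
    MPI_Finalize();
    return 0;
}
EOF
if mpicc mpi1_probe.c -o mpi1_probe 2>/dev/null; then
    echo "MPI-1 compatibility symbols are available"
else
    echo "MPI-1 symbols are gone; expect breakage in software still using them"
fi
```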



regards,

Kenneth

[1] 
https://github.com/easybuilders/easybuild/wiki/Conference-call-notes-20190109#2019a-update-of-common-toolchains



Re: [easybuild] 2019a update of common toolchains

2019-01-07 Thread pramod kumbhar
Same experience on our systems: we encountered multiple issues,
specifically with libfabric (which is the default with Intel MPI 2019).

-Pramod



Re: [easybuild] 2019a update of common toolchains

2019-01-07 Thread Bart Oldeman
Hi Damian,

I have no idea if it helps, but a new version of libfabric has just been
released:
https://github.com/ofiwg/libfabric/releases/tag/v1.7.0

whereas according to
https://software.intel.com/en-us/articles/intel-mpi-library-release-notes-linux
Intel MPI 2019.1 ships with a customized 1.7.0 alpha version.
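If the bundled alpha libfabric is the suspect, Intel MPI 2019 can be pointed at an external libfabric build instead. A hedged sketch, assuming a local libfabric 1.7.0 install (paths and provider choice are placeholders; variable names are the ones documented for Intel MPI 2019 and libfabric):

```shell
# Sketch: use an external libfabric 1.7.0 with Intel MPI 2019 instead of
# the bundled alpha copy (install path below is a placeholder).
export I_MPI_OFI_LIBRARY_INTERNAL=0      # skip the libfabric shipped with Intel MPI
export LD_LIBRARY_PATH=/opt/libfabric/1.7.0/lib:$LD_LIBRARY_PATH
export FI_PROVIDER=verbs                 # e.g. the verbs provider on InfiniBand
mpirun -np 4 ./my_mpi_app
```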

Bart


-- 
Dr. Bart E. Oldeman | bart.olde...@mcgill.ca | bart.olde...@calculquebec.ca
Scientific Computing Analyst / Analyste en calcul scientifique
McGill HPC Centre / Centre de Calcul Haute Performance de McGill |
http://www.hpc.mcgill.ca
Calcul Québec | http://www.calculquebec.ca
Compute/Calcul Canada | http://www.computecanada.ca
Tel/Tél: 514-396-8926 | Fax/Télécopieur: 514-396-8934


Re: [easybuild] 2019a update of common toolchains

2019-01-07 Thread Alvarez, Damian
A word of caution regarding Intel MPI 2019: they changed a lot of things under
the hood, and we have seen lots of issues on our systems with relatively large
jobs (1.5K+ MPI processes). Basically, most collective algorithms don't make it
through. We have seen this on 2 different InfiniBand systems; I am unsure about
OmniPath, or about the underlying cause (maybe libfabric issues?), but with
Intel MPI 2018 we haven't seen these problems.

Best,
Damian

On 07.01.19, 16:37, "easybuild-requ...@lists.ugent.be on behalf of Kenneth
Hoste" wrote:

Dear EasyBuilders,

By tradition at the start of the year, I have started looking at
updating the 'foss' and 'intel' common toolchains, currently for the
2019a update.
The plan is to include these in the upcoming EasyBuild v3.8.1 release,
which I hope to release in a couple of weeks.

Current proposals are:

* foss/2019a:
  (see https://github.com/easybuilders/easybuild-easyconfigs/pull/7371)

  - GCC 8.2.0 + binutils 2.31.1 [LATEST for both]

  - OpenMPI 3.1.3 (latest 3.x)

  - OpenBLAS 0.3.5 [LATEST]
    + ScaLAPACK 2.0.2 [LATEST]

  - FFTW 3.3.8 [LATEST]

  => There is also OpenMPI 4.0.0, but we generally stay away from new
     major versions of toolchain components for the common toolchains.

     Does anyone have a detailed view on how OpenMPI v4.0.0 compares
     to v3.1.3, and whether or not we should consider v4.0.0?


* intel/2019a
  (see https://github.com/easybuilders/easybuild-easyconfigs/pull/7372)

  - Intel C/C++/Fortran compilers 2019.1.144 [LATEST]
    (a.k.a. 2019 update 1, a.k.a. 19.0.1.144 according to "icc -V")
    on top of GCC 8.2.0 + binutils 2.31.1

  - Intel MPI 2019.1.144 (a.k.a. 2019 update 1) [LATEST]

  - Intel MKL 2019.1.144 (a.k.a. 2019 update 1) [LATEST]

  => Any positive/negative experiences with the latest version of the
     Intel tools, or any remarks that may be relevant to the version we
     pick for intel/2019a?


I'm currently testing these foss/2019a and intel/2019a proposals with a
bunch of existing easyconfigs using the */2018b equivalent.

A couple of minor compilation problems popped up when compiling with GCC
8.2.0 (for HDF5, NASM, ...), but they were resolved trivially by simply
updating to the latest version of these dependencies.


The 2019a toolchain definitions will also be discussed during the
EasyBuild conf call this Wednesday (Jan 9th).



regards,

Kenneth






Forschungszentrum Juelich GmbH
52425 Juelich
Sitz der Gesellschaft: Juelich
Eingetragen im Handelsregister des Amtsgerichts Dueren Nr. HR B 3498
Vorsitzender des Aufsichtsrats: MinDir Dr. Karl Eugen Huthmacher
Geschaeftsfuehrung: Prof. Dr.-Ing. Wolfgang Marquardt (Vorsitzender),
Karsten Beneke (stellv. Vorsitzender), Prof. Dr.-Ing. Harald Bolt,
Prof. Dr. Sebastian M. Schmidt