Hi Eric,

On 09/07/15 16:48, Gregory, Eric wrote:
> Has anyone tried building this impi yet?
>
> For me it fails with a message:
>
>   Failed to move contents of
> /usr/local/software/jureca/Stage2/software/Toolchain/iccifort/2015.3.187/impi/5.1.0.038/impi/5.1.0.038
>  to
> /usr/local/software/jureca/Stage2/software/Toolchain/iccifort/2015.3.187/impi/5.1.0.038:
>  [Errno 2] No such file or directory.....
>
> That is, the source directory:
>
> .../iccifort/2015.3.187/impi/5.1.0.038/impi/5.1.0.038/
>
> does not exist, but the installer has built a directory:
>
> .../iccifort/2015.3.187/impi/5.1.0.038/impi/5.1.0.079
>
> I wonder if Intel got the version number internally inconsistent somehow.
>
> It seems to work if I rename the source and change the version from
>
> "5.1.0.038" -> "5.1.0.079"
>
> Does anyone else have this experience?
I just gave this a spin too; it seems like Intel screwed up the version in the tarball...

The actual version should be 5.1.0.079 (.038 makes no sense, since the previous version was 5.0.3.048).

Here's the easyconfig I'm using now; this works: https://github.com/hpcugent/easybuild-easyconfigs/pull/1793
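Until that pull request lands, the version-bump workaround Eric describes can be sketched roughly like this (the easyconfig file names and contents below are stand-ins I made up for illustration, not the actual files):

```shell
#!/bin/sh
# Sketch of the workaround (hypothetical file names): copy the easyconfig
# with the bogus version string bumped to the version the Intel installer
# actually creates, so EasyBuild finds the 5.1.0.079 directory.

# stand-in for the original easyconfig (contents are an assumption)
printf 'version = "5.1.0.038"\n' > impi-5.1.0.038.eb

# replace every occurrence of the wrong version with the real one
sed 's/5\.1\.0\.038/5.1.0.079/g' impi-5.1.0.038.eb > impi-5.1.0.079.eb

cat impi-5.1.0.079.eb   # prints: version = "5.1.0.079"
```

The same substitution would of course also need to match whatever name the downloaded tarball carries, which I haven't checked here.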


regards,

Kenneth


> -Eric


-------------------------------------------------------
We (EB) really need to come up with a more timely manner of
declaring toolchains.

It's interesting... while reading this morning's mail (probably
while the telecon was going on across the pond), I noticed a
notification from Intel for a new MPI release.
It is always tempting to wait 'just a little bit' for that just
released or soon to be released change, but that often leads to even
more 'just a bit longer' (I do this far too often myself).

For our users who want to chase version numbers, I leave that to them
as something they can build privately.

The immediate question that arose was "oh shit, now WTF do I call a
toolchain with this change?".

To which, there is no clear answer.
I think the answer is to not worry about the details and not overthink
toolchains.  I don't really want to chase toolchain components but
instead want a tested and fairly stable toolchain.  Preferably with
just one or two updates a year (versus the CentOS 6 toolchain which is
several years old).

I'm looking forward to foss-2015b whether it has GCC 5.1, 5.2 or even
4.X.  Based strictly on version numbers (without any insight as to
significant differences), I would suggest GCC 5.1 as something stable.

I could probably live with foss-2015a which seems to work well and
wait until the November hack-a-thon for foss-2015b.

Note: intel-2015b, if I decide to use it, will likely be 2015B with
GCC 4.9.2 or 4.9.3.
You might try naming your site specific toolchains tamu-intel-2015X so
as to more visibly indicate they are site specific.

Of course, with Intel's new MPI this morning, I need something new.
Do you actually really need to instantly upgrade to the latest intel
compiler?  There might be some practical reasons (especially if Intel
does not leave older versions around).

I just wish y'all had all the answers for me before I had the
question. :)
They do have a good number of answers to questions I'm glad I don't
even need to think about asking.

I'm mostly an easybuild freeloader.  I thank the easybuild developers
who do monitor the toolchain and other software changes and do a lot
of testing and validation of the easyconfig files.  This saves me a
lot of work determining which toolchain components to use.

Stuart
--
I've never been lost; I was once bewildered for three days, but never lost!
                                         --  Daniel Boone


------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------
Forschungszentrum Juelich GmbH
52425 Juelich
Sitz der Gesellschaft: Juelich
Eingetragen im Handelsregister des Amtsgerichts Dueren Nr. HR B 3498
Vorsitzender des Aufsichtsrats: MinDir Dr. Karl Eugen Huthmacher
Geschaeftsfuehrung: Prof. Dr.-Ing. Wolfgang Marquardt (Vorsitzender),
Karsten Beneke (stellv. Vorsitzender), Prof. Dr.-Ing. Harald Bolt,
Prof. Dr. Sebastian M. Schmidt
------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------
