Hi Mikael, you could actually go one step further and also compile OpenBLAS/LAPACK or install MKL at the GCCcore level. FZ Jülich does something like that now: https://apps.fz-juelich.de/jsc/llview/jureca_modules/Core/gcccoremkl/7.3.0-2019.0.117.xhtml (they use a separate SciPy-Stack module at the GCCcore level).
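For concreteness, a minimal sketch of what an OpenBLAS easyconfig at the GCCcore level might look like — the easyblock, version, and source details here are illustrative assumptions, not taken from the Jülich setup:

```python
# Hypothetical easyconfig sketch: OpenBLAS built at the GCCcore level,
# so a GCCcore-level numpy could link against it.
# Version, sources, and easyblock are illustrative only.
easyblock = 'ConfigureMake'

name = 'OpenBLAS'
version = '0.3.1'

homepage = 'https://www.openblas.net/'
description = "OpenBLAS: an optimized BLAS library"

# The key point: the toolchain is GCCcore, not foss/intel,
# so the library is shared across the compiler-level toolchains.
toolchain = {'name': 'GCCcore', 'version': '7.3.0'}

source_urls = ['https://github.com/xianyi/OpenBLAS/archive/']
sources = ['v%(version)s.tar.gz']

moduleclass = 'numlib'
```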
In that case you can have an almost fully featured Python with numpy etc. at GCCcore, except for mpi4py. This assumes there is no benefit to an Intel-compiled numpy. I believe there isn't really (all the performance comes from MKL), but there was an issue with the exp() function, which is taken from libm instead of libimf (see https://github.com/numpy/numpy/issues/8823). Damian, can you tell us how you did that in your most recent iteration?

One could even go further and install Python at the 'dummy' level, or some kind of central "GCCcorecore"; after all, the Python binary and libpython do not link to any libraries of GCCcore (libstdc++), but just to system libraries such as glibc. Although I noticed that in newer toolchains (foss-2018b with GCC 7.3.0) libssp is linked in, I think this is ABI compatible. Remember that the reason for GCCcore is to provide modern C++ headers and libraries to Intel and other compilers; for pure C code we don't need to worry about C++ ABI issues.

The problem with the dummy toolchain is that it's not really tied to anything and sets no flags; in practice it just works out to the system gcc -- it's really most appropriate for pre-compiled binaries, I think. GCCcore-system gets around that, but in a hierarchy it is a sister of GCCcore-7.3.0, and I think it should sit under it. So I believe there could be something sandwiched in between dummy and GCCcore (that would install things at the "Core" level in a hierarchy, instead of under GCCcore/x.y.z), but I am not sure what it should be or what it should be called. And the GCC "in between" can then still be something newer than the system GCC (which, as we know, could be as old as 4.4).

Bart

On Thu, 13 Dec 2018 at 12:13, Mikael Öhman <[email protected]> wrote:

> Hi easybuilders,
>
> We have for the 2018b toolchain been running with a shared libpython at
> GCCcore level, which has also let us move a lot of additional packages
> (e.g. Qt) down to GCCcore level as well, significantly reducing the
> pointless duplicated packages (primarily from the graphical stack).
> Mixing the Python interpreter with packages from gcc + icc works just
> fine. I'm hoping for something like this to get into upstream EB.
>
> There were a few fixable issues getting EB to build packages with intel,
> but the main issue is just the name. We used the name "Python-core" @
> GCCcore, and had "Python" as a Python package bundle on top of this (per
> toolchain), so it was completely backwards-compatible for users.
>
> But the EB experience is a bit painful; there are quite a few places
> where the name matters. For example, we don't get %(pyver)s as a result,
> and there are a couple of minor annoyances like this.
>
> 1. These issues could all be addressed, but the fix isn't always
> necessarily very beautiful:
> https://github.com/easybuilders/easybuild-framework/pull/2559
>
> 2. Or just exclude numpy, scipy, pandas, ... from the "Python" package,
> allowing it to be moved to GCCcore. Use a separate package for numpy,
> etc. at the toolchain level. It still doesn't cause an explosion of
> packages to keep these MPI/BLAS-dependent packages separate, as it's
> just ~5 of them.
>
> I would prefer the latter; it requires no hacks in EB, and lets us build
> these heavy packages separately, which is easier to debug than the
> bundle. I've also found myself explaining that the Python module
> includes numpy as often as I have to explain to users that they need to
> load an additional module, so I don't really think there is much
> difference from the user perspective.
>
> Thoughts? I could live with any solution, but I think libpython simply
> must get down to GCCcore one way or another.
>
> Best regards, Mikael

--
Dr. Bart E. Oldeman | [email protected] | [email protected]
Scientific Computing Analyst / Analyste en calcul scientifique
McGill HPC Centre / Centre de Calcul Haute Performance de McGill | http://www.hpc.mcgill.ca
Calcul Québec | http://www.calculquebec.ca
Compute/Calcul Canada | http://www.computecanada.ca
Tel/Tél: 514-396-8926 | Fax/Télécopieur: 514-396-8934

