2015-01-25 16:46 GMT+01:00 Nathaniel Smith <n...@pobox.com>:

> On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner <cmkleff...@gmail.com>
> wrote:
> >
> > 2015-01-23 0:23 GMT+01:00 Nathaniel Smith <n...@pobox.com>:
> >>
> >> On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner <cmkleff...@gmail.com>
> >> wrote:
> >> > OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
> >> > wheels mentioned above are dependent on the installation of the
> >> > OpenBLAS-based numpy and won't work, e.g., with an installed numpy-MKL.
> >>
> >> This sounds like it probably needs to be fixed before we can recommend
> >> the scipy wheels for anyone? OTOH it might be fine to start
> >> distributing numpy wheels first.
> >
> >
> > I very much prefer dynamic linking to numpy\core\libopenblas.dll instead
> > of static linking, to avoid bloat. This matters because libopenblas.dll
> > is a heavy library (around 30 MB for amd64). As a consequence, all
> > packages with dynamic linkage to OpenBLAS depend on numpy-openblas. This
> > is no different from scipy-MKL, which has a dependency on numpy-MKL --
> > see C. Gohlke's site.
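To illustrate the dynamic-linking approach, here is a sketch (only a sketch;
the exact wheel layout is an assumption) of how a package built against
numpy's bundled OpenBLAS could preload the DLL before importing its own
extension modules:

    # Sketch only: preload numpy's bundled OpenBLAS so that extension modules
    # linked against libopenblas.dll resolve to this copy rather than
    # whatever happens to be on the system PATH. Assumes the wheel ships the
    # library as numpy\core\libopenblas.dll.
    import ctypes
    import os

    import numpy

    _dll = os.path.join(os.path.dirname(numpy.__file__), "core",
                        "libopenblas.dll")
    if os.path.exists(_dll):
        ctypes.CDLL(_dll)
    else:
        raise ImportError("numpy does not ship the expected libopenblas.dll")
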
>
> The difference is that if we upload this as the standard scipy wheel,
> and then someone goes "hey, look, a new scipy release just got
> announced, 'pip upgrade scipy'", then the result will often be that
> they just get random unexplained crashes. I think we should try to
> avoid that kind of outcome, even if it means making some technical
> compromises. The whole idea of having the wheels is to make fetching
> particular versions seamless and robust, and the other kinds of builds
> will still be available for those willing to invest more effort.
>
> One solution would be for the scipy wheel to explicitly depend on a
> numpy+openblas wheel, so that someone doing 'pip install scipy' also
> forced a numpy upgrade. But I think we should forget about trying this
> given the current state of python packaging tools: pip/setuptools/etc.
> are not really sophisticated enough to let us do this without a lot of
> kluges and compromises, and anyway it is nicer to allow scipy and
> numpy to be upgraded separately.
>

I've learned that marking numpy with something like "numpy+openblas" is
called a "local version identifier":
https://www.python.org/dev/peps/pep-0440/#local-version-identifiers
These identifiers are not allowed on PyPI, however.
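
For illustration, here is a minimal sketch of how such a version string is
parsed under PEP 440, using the third-party "packaging" library (the version
number itself is just an example):

    # Minimal sketch of a PEP 440 local version identifier, using the
    # third-party "packaging" library (pip install packaging). The version
    # string is only an example, not an actual numpy release.
    from packaging.version import Version

    v = Version("1.9.1+openblas")
    print(v.public)  # "1.9.1"    -> the part an index like PyPI accepts
    print(v.local)   # "openblas" -> the local build tag that PyPI rejects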

>
> Another solution would be to just include openblas in both. This
> bloats downloads, but I'd rather waste 30 MiB than waste users' time
> fighting with random library incompatibility nonsense that they don't
> care about.
>
> Another solution would be to split the openblas library off into its
> own "python package", that just dropped the binary somewhere where it
> could be found later, and then have both the numpy and scipy wheels
> depend on this package.
>

Creating a dedicated OpenBLAS package and adding it as a dependency of
numpy/scipy would also allow independent upgrade paths for OpenBLAS, numpy
and scipy. The API of OpenBLAS seems to be stable enough to allow for that.
Having an additional package dependency is only a minor problem, as pip can
handle it automatically for the user.
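
As a purely hypothetical sketch (the package name "openblas" and the helper
functions are invented here, not an existing API), such a package could do
little more than expose the location of the bundled DLL:

    # Hypothetical Python shim of a dedicated OpenBLAS wheel. The package
    # name and helper names are made up for illustration only.
    import os

    def get_library_dir():
        """Directory containing the bundled libopenblas shared library."""
        return os.path.join(os.path.dirname(__file__), "lib")

    def get_library_path():
        """Full path to the bundled libopenblas.dll."""
        return os.path.join(get_library_dir(), "libopenblas.dll")

numpy and scipy wheels could then list this package as a dependency and
preload the DLL (e.g. via ctypes.CDLL(openblas.get_library_path())) at
import time, so that only one copy of the ~30 MB library lives on disk.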


> We could start with the brute force solution (just including openblas
> in both) for the first release, and then upgrade to the fancier
> solution (both depend on a separate package) later.
>
> >> > For the numpy 32-bit builds there are 3 failures for special FP value
> >> > tests, due to a bug in mingw-w64 that is still present. All scipy
> >> > versions show 7 failures with some numerical noise that could be
> >> > ignored (or corrected with relaxed asserts in the test code).
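
(To make "relaxed asserts" concrete, a sketch with made-up numbers: the test
compares with a loosened tolerance instead of requiring exact equality.)

    # Sketch with made-up numbers: tolerate numerical noise in a test by
    # loosening the comparison tolerance rather than asserting exact equality.
    import numpy as np
    from numpy.testing import assert_allclose

    expected = np.array([1.0, 2.0, 3.0])
    result = expected + 1e-13   # stands in for a value computed in the test

    # An exact comparison would flag this as a failure; the relaxed tolerance
    # treats the tiny difference as noise.
    assert_allclose(result, expected, rtol=1e-12, atol=0.0)
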
> >> >
> >> > PRs for numpy and scipy are in preparation. The mingw-w64 compiler
> >> > used for building can be found at
> >> > https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.
> >>
> >> Correct me if I'm wrong, but it looks like there aren't any details on
> >> how exactly the compiler was set up? Which is fine, I know you've been
> >> doing a ton of work on this and it's much appreciated :-). But
> >> eventually I do think a prerequisite for us adopting these as official
> >> builds is that we'll need a text document (or an executable script!)
> >> that walks through all the steps in setting up the toolchain etc., so
> >> that someone starting from scratch could get it all up and running.
> >> Otherwise we run the risk of eventually ending up back where we are
> >> today, with a creaky old mingw binary snapshot that no-one knows how
> >> it works or how to reproduce...
> >
> >
> > This has to be done and is in preparation, but not ready for consumption
> > right now. Some preliminary information is given here:
> >
> > https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/mingwstatic-2014-11-readme.md
>
> Right, I read that :-). There's no way that I could sit down with that
> document and a clean Windows install and replicate your mingw-w64
> toolchain, though :-). Which, like I said, is totally fine at this
> stage in the process, I just wanted to make sure that this step is on
> the radar, b/c it will eventually become crucial.
>
> -n
>
> --
> Nathaniel J. Smith
> Postdoctoral researcher - Informatics - University of Edinburgh
> http://vorpus.org
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
