On 14 Aug 2015 00:38, Nathaniel Smith wrote:
> Windows and OS X don't (reliably) have any package manager. So PyPI
> *is* inevitably going to contain non-Python shared libraries or
> statically linked modules or something like that. (And in fact it
> already contains such things today.) I'm not sure what the alternative
> would even be.

Windows 10 has a package manager (OneGet: http://blogs.technet.com/b/keithmayer/archive/2014/04/16/what-s-new-in-powershell-getting-started-with-oneget-in-one-line-with-powershell-5-0.aspx), but I don't think it will be particularly helpful here. The Windows model has always been to share only system libraries; each application is expected to keep its own dependencies local.

I actually like two ideas for Windows (it's not clear to me how well they apply on other platforms), both of which have been mentioned in the past:

* PyPI packages that are *very* thin wrappers around a shared library

For example, maybe "libpng" shows up on PyPI, and packages can then depend on it. It takes some care on the part of the publisher to maintain version-to-version compatibility (or maybe wheel/setup.py/.cfg grows a way to declare vendored dependencies?), but this should be possible today.
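To make that concrete, here's a rough sketch of what the Python shim in such a "libpng" wrapper package might expose, modeled loosely on numpy's get_include(). Everything here is invented for illustration; no such package exists on PyPI in this form:

```python
# Hypothetical layout for a "libpng" wrapper package on PyPI. The
# package ships only the built shared library plus png.h, and other
# packages' build scripts call these helpers to locate them.
import os

_PKG_DIR = os.path.dirname(os.path.abspath(__file__))

def get_library_dir():
    """Directory containing the vendored libpng DLL/.so."""
    return os.path.join(_PKG_DIR, "lib")

def get_include_dir():
    """Directory containing png.h for downstream builds."""
    return os.path.join(_PKG_DIR, "include")

def get_library_name():
    """Base name to pass to the linker (-lpng16 / png16.lib)."""
    return "png16"
```

A downstream setup.py would then add get_include_dir() to its include path and link against get_library_name() from get_library_dir().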

* "Packages" that contain shared sources

One big problem on Windows is that there's no standard place to put library sources, so build tools can't find them. If a package declared "build requires libpng x.y source", then there could be tarballs "somewhere" (or even links to public version control) containing that version of the source, and the build tools could add the path references to include it.

I don't have numbers, but I do know that once a C compiler is available the next easiest problem to solve is getting and referencing sources.
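As a sketch of what resolving such a declaration might look like: a build tool maps the declared name and version to a canonical source tarball, then computes where the unpacked sources land so it can add the include path. The registry, URL, and cache layout below are all invented:

```python
# Sketch: resolving a declared "build requires libpng 1.6.18 source"
# to a tarball URL and a local include directory.
import os

# Hypothetical registry mapping (name, version) to a canonical source
# tarball URL; no such registry exists today.
SOURCE_REGISTRY = {
    ("libpng", "1.6.18"): "https://example.invalid/libpng-1.6.18.tar.gz",
}

def resolve_source(name, version, cache_root):
    """Return (tarball_url, include_dir) for a declared source dep.

    A real tool would download and unpack the tarball; here we only
    compute where the unpacked sources would live.
    """
    url = SOURCE_REGISTRY[(name, version)]
    include_dir = os.path.join(cache_root, "src", "{}-{}".format(name, version))
    return url, include_dir

url, include_dir = resolve_source("libpng", "1.6.18", "/tmp/buildcache")
```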

> Given that, the only situation I can see where we would ever
> distribute wheels that require system blas on Linux, is if we were
> able to do it alongside wheels that do not require system blas, and
> pip were clever enough to reliably always pick the latter except in
> cases where the system blas was actually present and working.

I think something similar came up back when we were discussing SSE support in Windows wheels. I'd love to see packages be able to run system checks to determine their own platform string (maybe via a pip/wheel extension?) before a wheel is selected and downloaded. I think that would actually solve a lot of these issues.
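Roughly the kind of pre-download check I mean, sketched in Python: probe the system, then extend the list of platform tags the installer would consider, in preference order. The "_sysblas" tag suffix is made up; real wheel platform tags carry no such feature information:

```python
# Sketch of a system probe that widens the acceptable platform tags
# before wheel selection. Uses ctypes.util.find_library to test
# whether a system BLAS is loadable at all.
import ctypes.util

def candidate_platform_tags(base_tag):
    """Return platform tags in preference order after probing."""
    tags = [base_tag]
    # If a system BLAS is present, prefer wheels built against it
    # (hypothetical "_sysblas" tag); otherwise only the plain tag.
    if ctypes.util.find_library("blas"):
        tags.insert(0, base_tag + "_sysblas")
    return tags

tags = candidate_platform_tags("linux_x86_64")
```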

> This means that when you build, e.g., scipy, then you get a binary
> that depends on things like the in-memory layout of numpy's internal
> objects. We'd like it to be the case that when we release a new
> version of numpy, pip could realize "hey, this new version says it has
> an incompatible ABI that will break your currently installed version
> of scipy -- I'd better fetch a new version of scipy as well, or at
> least rebuild the same version I already have". Notice that at the
> time scipy is built, it is not known which future version of numpy
> will require a rebuild. There are a lot of ways this might work on
> both the numpy and pip sides -- definitely fodder for a separate
> thread -- but that's the basic problem.

There was discussion of an "incompatible_with" metadata item at one point. Could numpy include {incompatible_with: "scipy<x.y"} in such a release? Or would that not be possible?

Cheers,
Steve



_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig
