Hi,

On Thu, Jan 21, 2016 at 2:05 AM, M.-A. Lemburg <m...@egenix.com> wrote:
> On 21.01.2016 10:31, Nick Coghlan wrote:
>> On 21 January 2016 at 19:03, M.-A. Lemburg <m...@egenix.com> wrote:
>>> By using the version based approach, we'd not run into this
>>> problem and gain a lot more.
>>
>> I think it's better to start with a small core that we *know* works,
>> then expand later, rather than trying to make the first iteration too
>> wide. The "manylinux1" tag itself is versioned (hence the "1" at the
>> end), so "manylinux2" may simply have *more* libraries defined, rather
>> than newer ones.
>
> My argument is that the file based approach taken by the PEP
> is too limiting to actually make things work for a large
> set of Python packages.
>
> It will basically only work for packages that do not interface
> to other external libraries (except for the few cases listed in
> the PEP, e.g. X11, GL, which aren't always installed or
> available either).
>
> IMO, testing the versions of a set of libraries is a safer
> approach. It's perfectly fine to have a few dependencies
> not work in a module because an optional system package is not
> installed, e.g. say a package comes with UIs written in
> Qt and one in GTK.
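(As a concrete, purely hypothetical sketch of the version-based test the quote above proposes — not anything the PEP specifies — an installer could probe the client machine for a library and compare its version against what the wheel declares. The library name, the `png_access_version_number()` entry point, and the version bounds here are illustrative assumptions:)

```python
# Hypothetical sketch (not part of the PEP): probe the client machine
# for a compatible libpng at install time, instead of relying on a
# fixed whitelist of library files.  Names and bounds are illustrative.
import ctypes
import ctypes.util

def libpng_version():
    """Return the installed libpng version as (major, minor, patch),
    or None if no libpng can be found on this machine."""
    name = ctypes.util.find_library("png")
    if name is None:
        return None
    lib = ctypes.CDLL(name)
    lib.png_access_version_number.restype = ctypes.c_uint32
    # png_access_version_number() encodes 1.6.37 as 10637
    v = lib.png_access_version_number()
    return (v // 10000, (v // 100) % 100, v % 100)

def compatible(version, minimum=(1, 2, 0)):
    """True if the detected version satisfies the wheel's declared minimum."""
    return version is not None and version >= minimum
```

(Under this scheme, pip would call something like `compatible(libpng_version())` before installing and fail with a helpful message when it returns False.)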
Please forgive my slowness, but I don't understand exactly what you
mean. Can you give a specific example?

Say my package depends on libpng. Call the machine I'm installing on
the client machine.

Are you saying that, when I build a wheel, I should specify to the
wheel what versions of libpng I can tolerate on the client machine,
and, if the client does not have a compatible version, then pip should
raise an error, perhaps with a useful message about how to get libpng?

If you do mean that, how do you want the PEP changed?

Best,

Matthew
_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig