On Tue, 18 Sep 2018 at 11:51, Joni Orponen <j.orpo...@4teamwork.ch> wrote:
>
> On Mon, Sep 17, 2018 at 6:07 PM Antoine Pitrou <anto...@python.org> wrote:
>>
>> Paul Moore wrote:
>> > I'm not really familiar with manylinux1, but I'd be concerned if we
>> > started getting bug reports on pip because we installed a library that
>> > claimed to be manylinux1 and was failing because it wasn't. (And yes,
>> > packaging errors like this are a common source of pip bug reports).
>> >
>> > It seems to me that it's defeating the purpose of having standards if
>> > people aren't willing to follow them...
>>
>> I agree with that. OTOH it seems providing binary wheels is generally a
>> strong demand from the community. I would be fine with only providing
>> conda packages myself.
>
>
> The biggest demand seems to be for developer convenience of quick downloads /
> installs, and it comes from people who have not delved very deeply into the
> gnarly black arts of cross compilation and forwards / backwards compatibility
> maintenance.
>
> Deployment bandwidth costs and install times are a second-tier use, but still
> a real concern to any parties who should consider sponsoring any effort going
> towards solving anything within this scope, as solving their gripes would
> save them money.
>
>> By the way other packages are already doing worse:
>> https://github.com/tensorflow/tensorflow/issues/8802
>
>
> Domain-specific packages with real industry needs will need to deviate from
> any standard put forth, as the world of the bleeding edge moves faster than
> the standards can.

Implementing the necessary changes [1] to support manylinux2010 across
the various tools and components isn't hard (so nobody is going to do
it for the intellectual challenge); it's just tedious (so nobody is
likely to do it for fun, either).
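
To make "tedious, not hard" a bit more concrete: on the installer side it
mostly comes down to teaching pip to recognise the new platform tag, which
PEP 513/571 frame in terms of a runtime glibc version check. A minimal
sketch of that check (not pip's actual implementation) might look like:

    import ctypes

    def glibc_version_string():
        # Ask the C library for its own version; returns e.g. "2.17",
        # or None on non-glibc platforms (musl, Windows, macOS, ...).
        try:
            process_namespace = ctypes.CDLL(None)
            gnu_get_libc_version = process_namespace.gnu_get_libc_version
        except (OSError, AttributeError):
            return None
        gnu_get_libc_version.restype = ctypes.c_char_p
        version = gnu_get_libc_version()
        if isinstance(version, bytes):
            version = version.decode("ascii")
        return version

    def is_manylinux2010_compatible():
        # manylinux2010 is pinned to the CentOS 6 era baseline (glibc 2.12).
        version = glibc_version_string()
        if version is None:
            return False
        major, minor = (int(part) for part in version.split(".")[:2])
        return (major, minor) >= (2, 12)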

Unfortunately, even large companies like Google are mostly sitting
back and passively consuming Python packaging projects maintained by
folks in their spare time (see
https://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#sustainability-and-the-bystander-effect).

Google could likely address the bulk of Python's near-term Linux
packaging maintainability problems simply by advertising for a
full-time "Python project packaging at Google" maintainer role, and
setting whoever they hire loose on the problems of getting
manylinux2010 rolled out, defining manylinux2014 (the RHEL/CentOS
7/Ubuntu 14.04 baseline), and defining official support for
distro-specific wheels (such that they can publish their Ubuntu-only
tensorflow packages properly, without messing up the package
installation experience for users of other distros).
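
For anyone not steeped in the tag details: a Linux wheel filename currently
carries either a manylinux tag (a promise about the old glibc/toolchain
baseline it was built against) or the plain linux tag, which PyPI refuses
on upload. A distro-specific form like the last line below is purely
hypothetical until a spec actually defines it:

    pkg-1.0-cp36-cp36m-manylinux1_x86_64.whl      # CentOS 5 era baseline
    pkg-1.0-cp36-cp36m-manylinux2010_x86_64.whl   # CentOS 6 era baseline
    pkg-1.0-cp36-cp36m-linux_x86_64.whl           # unspecified Linux; PyPI rejects it
    pkg-1.0-cp36-cp36m-ubuntu_16_04_x86_64.whl    # hypothetical distro-specific tag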

Google as a company knows how that works, since they do things properly
for other aspects of Python development (such as gradual typing).
That's why their deliberate violation of the manylinux1 spec with
their tensorflow packages is so egregious - for whatever reason, it
hasn't occurred to them that the only reason their tensorflow
workflows currently work without massive build times is that we're
not actively enforcing spec compliance at the PyPI level. If PyPI
were ever to start running auditwheel on upload, and subsequently
hide manylinux wheels that failed the check (so that installers could
properly meet the binary compatibility assurances nominally offered
by the specifications), those non-compliant tensorflow wheels would
simply stop being installable.
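
For concreteness, that enforcement is essentially what auditwheel already
checks locally, so an upload-time gate could start out as something along
these lines (a rough sketch; the output scraping and the hook itself are
assumptions on my part, not an existing PyPI feature):

    import subprocess

    def wheel_matches_claimed_tag(wheel_path, claimed_tag="manylinux1_x86_64"):
        # "auditwheel show" reports which manylinux tag (if any) the wheel's
        # bundled binaries and versioned symbol references are consistent with.
        result = subprocess.run(
            ["auditwheel", "show", wheel_path],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE,
            universal_newlines=True,
        )
        if result.returncode != 0:
            return False
        # Crude check against the human-readable report; a real gate would
        # call into auditwheel's internals rather than scrape its output.
        return claimed_tag in result.stdout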

Cheers,
Nick.

[1] https://github.com/pypa/manylinux/issues/179

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia