> * One 3.x version at a time. Doesn't line up with cpython's support terms.

folks, deep breath here: this is much more important than the one-line
summary suggests.

for some background: i have been using debian since 1996 and python for
20+ years, dating back to python 2.0.

due to the massive amount of accumulated software (several million
development source code files) i run a rolling debian/testing system:
i *never* do an "apt-get dist-upgrade", and *always* simply perform an
on-demand install of a given package and let apt sort itself out, often
to the point of having some innocuous package end up pulling in a new
libc6-dev and binutils.

generally this works extremely well: in 20 years the only really serious
problems to have occurred are, yes, you guessed it: python-related.

correction: the situation around python 2.2, 2.3 through to 2.5 was
perfectly fine; i will touch on this again below.

now, i do hesitate to mention this, but what i am about to tell you has
been raised, over the years, by *multiple* people - including debian
python package maintainers - on *multiple* separate occasions, and each
and every time it has been ignored and the bugreport closed without
due consideration.

if you recall the situation with ubuntu, they created two consecutive
releases in which python 2.5 was the *only* version supported in one,
and python 2.6 was the *only* version available in the next: anyone who
performed an "apt-get dist-upgrade" had python REMOVED from their
running system, and, given that apt itself was critically dependent
on python at the time, people ended up with a permanently unusable
system.

debian's python packaging is rushing ahead so fast at the moment that it
is in danger of running into something similar.

the key issue is that python3-dev is an exclusive "narrow window" that,
on each upgrade, moves to one *AND ONLY ONE* version of python:
python3.6-dev, then python3.7-dev, then python3.8-dev, and now
python3.9-dev.

this is not a problem for the majority of packages that are pure python.
it is however a REALLY SERIOUS problem for c-based modules, and
for boost.python applications it is absolute hell on earth (boost is
a long-term known thorn in the side of developers, and the python bindings
add a layer of extra special hell on top of that).

let's go through a scenario, going back in time somewhat:

* developer (me) uses debian/testing which, nominally, is by now
  mostly debian 9 packages, including (mostly) python 3.6
* debian/testing flips python3-dev to python3.7-dev and rebuilds
  begin
* ALL PACKAGES WHICH DEPEND ON 3.6 NOW BREAK

due to the hard-coded numbering, *ANY* package containing
c-based modules has debian packaging which now says
"this package now EXCLUSIVELY depends on libpython3.7"

in that package - which is **NOT** named python3.7-{packagename}
but plain **python3**-{packagename} - there is **ONLY** an extension
built against python3.7: the build against (and dependency on)
python3.6 has been **REMOVED**

that package, small as it is, ends up with a dependency cascade
that can result in literally HUNDREDS of packages being removed
and/or force-upgraded unnecessarily.
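
to make that concrete, here is roughly the shape of the generated
metadata for such a package (an illustrative mock-up - the package name
is made up and the version bounds are from memory, not copied from any
real control file):

  Package: python3-somemodule
  Depends: python3 (>= 3.7~), python3 (<< 3.8), libpython3.7 (>= 3.7.0)

  # and the single compiled artefact it ships:
  /usr/lib/python3/dist-packages/somemodule/_native.cpython-37m-x86_64-linux-gnu.so

the moment the default python3 moves to 3.8, that "<< 3.8" bound makes
the package uninstallable alongside the new interpreter until it has
been rebuilt, and apt's only way of "resolving" the conflict is the mass
removal described above.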

thus, i end up in a really serious situation, just like with ubuntu about
10 years ago, where the only option apt offers is to literally remove
everything.  given that i am trying to do development work on which i
rely for income, removing critical packages is simply not an option.

now, where it gets particularly bad is during the cross-over point
where some of the packages have not been re-built yet, and some
have.  this happened very recently with the 3.7 to 3.8 cross-over:
an innocuous and simple package had a dependency cascade
where half the required packages - all of which were c-based -
had not even been built yet.

numpy and scipy, the two "best known" debian python software
packages, have known about this for a long, long time.  they
quietly solved it by explicitly building their c extensions for
**MULTIPLE** python versions - 3.5, 3.6, 3.7, etc. etc. - side by
side in the same package.

this "rolling release" approach basically solves absolutely everything,
and causes no problems whatsoever of any kind.

if i may repeat that again so as to emphasise the deep significance:
numpy and scipy have *absolutely no problems whatsoever* when
it comes to these critical long-term rolling upgrades.
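
for anyone who wants to see what that looks like mechanically, the
standard way to get it in debian packaging is to build against *every*
supported interpreter rather than only the default one - roughly the
python3-all-dev plus pybuild pattern below (a generic sketch, not
numpy's actual debian/rules):

  # debian/control (excerpt)
  Build-Depends: debhelper-compat (= 13),
                 python3-all-dev,
                 python3-setuptools

  # debian/rules
  #!/usr/bin/make -f
  %:
  	dh $@ --with python3 --buildsystem=pybuild

with python3-all-dev in place, pybuild compiles the extensions once per
interpreter reported by "py3versions --supported", so the one binary
package can carry e.g. both a cpython-38 and a cpython-39 .so and
survive the transition in either direction.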

i cannot emphasise enough how serious this really is.  it is
asking for trouble, and, more than that, has been reported multiple
times and been ignored each and every time.

unfortunately, very recently, it gets even worse.

i was extremely dismayed and alarmed, recently, to learn, on trying
to install a package, that there has been no "stabilisation time"
allowed, and that the latest version of debian/testing has *already*
moved on to python 3.9.

now, if the latest version of debian/testing had stayed with python 3.8,
there would be no problem.  debian/testing would stabilise and become
debian/11, and there would be some peace at least for a while.

it gets worse.

one of our developers, who has never used python before, has reported
that with the python astor package, there's a serious problem:

  File "<frozen importlib._bootstrap>", line 627, in _load_backward_compatible
  File "<frozen zipimport>", line 259, in load_module
  File 
"/opt/python39/lib/python3.9/site-packages/astor-0.8.1-py3.9.egg/astor/__init__.py",
line 24, in <module>
NotADirectoryError: [Errno 20] Not a directory:
'/opt/python39/lib/python3.9/site-packages/astor-0.8.1-py3.9.egg/astor/VERSION'

astor/VERSION is a file, not a directory.  this error should not in the
least bit be happening.
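
for what it is worth (and hedging heavily, since i have not dug through
astor's internals myself): that error signature is the classic one you
get when package data is read via __file__ from inside a zipped .egg -
the .egg is a zip *file*, so using it as a directory component of an
open() path is exactly what raises NotADirectoryError.  a tiny sketch of
the difference, with function names of my own invention:

  import importlib.resources
  import os

  def read_version_unzipped(pkg_dir):
      # works only when the package is unpacked on disk; if pkg_dir
      # points inside a zipped .egg, the egg is a file sitting where a
      # directory is expected, hence NotADirectoryError
      with open(os.path.join(pkg_dir, 'VERSION')) as f:
          return f.read().strip()

  def read_version_zip_safe():
      # lets the import machinery fetch the data, whether the package
      # lives in a plain directory or inside a zip/egg
      return importlib.resources.read_text('astor', 'VERSION').strip()

whether the real fault lies with the package, with how the egg was
built, or with setuptools itself i leave open - the point for this
thread is that it landed in the middle of the transition.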

here's where the mistake of the narrow focus really hits home: you're
halfway through an upgrade in which everything is being rebuilt against
python 3.9, and it turns out that there's a serious problem with either
setuptools or with python 3.9 itself.

you can't now "back out" of python 3.9 in debian/testing - it's far
too late for that.  you're now - whilst in the middle of a major
upgrade / rebuild - critically dependent on scrambling to fix
something in a moving target (python 3.9).

if there were a policy in place which followed EXACTLY what numpy and
scipy already do - build c-based packages so that they remain STABLE
across these transitions - there would be no problem whatsoever.

the lesson from ubuntu's mistake, 10+ years ago, seems not to have been learned.

to fix this, there needs to be a stable overlap during which multiple
compiled versions are installed and maintained.  the rule needs to be
(a sketch of where that overlap would be expressed follows the list):

* any given version of python *MUST* cross over from one debian/stable
  to another debian/stable.
* only when a given version of python has been in TWO versions of
  debian/stable may it be considered for retirement / removal
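
mechanically, debian already has a single place that says which
interpreters are "supported" at any given moment - the defaults file
that py3versions reads.  a hypothetical excerpt (version numbers
invented purely for illustration) of what an overlap period would look
like in /usr/share/python3/debian_defaults:

  [DEFAULT]
  default-version = python3.8
  supported-versions = python3.8, python3.9

as long as both interpreters stay listed there for the whole overlap,
every c extension built the numpy way keeps working against both, and
nobody's running system falls apart mid-transition.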

yes, this does mean that if there are two major releases of python within
a debian stable-to-stable cycle, you end up with *FOUR* versions of
python built and supported.  this is just how it is - it's exactly how it
is with numpy and scipy, which are doing things correctly.

python itself seems to be on an accelerated path at the moment (this
is not a good thing), and it's up to you - the distro - to put in place
policies that provide stable, reliable, long-term packaging.

right now, things are not in the least bit stable (not referring to
debian/stable, just stability "in general").  no, it is not an option
for our team to use debian/stable itself, because we have a mixture of
software packages that are not even in debian, and have to be
installed from git repositories.

i leave you with these insights.  there *is* a solution: do what numpy
and scipy do.

any questions please feel free to ask.

best,

l.
