On Sun, May 12, 2024 at 8:39 PM Gael Varoquaux <gael.varoqu...@normalesup.org> wrote:

> On Fri, May 10, 2024 at 04:42:49PM +0200, Ralf Gommers wrote:
> > It gets ever easier to install new Python versions, with
> > pyenv/conda/etc. The "my single Python install comes from python.org and
> > I'm using the same one because I am afraid to upgrade" situation is much
> > less of an issue than it was 10 years ago. And it's caused mostly by
> > users having knowledge gaps. So yes, it can be a pain for them, but
> > they'll have to learn at some point anyway. The same goes for "my old
> > HPC cluster uses ..." - it's often an even older Python anyway, and there
> > are tons of reasons why you don't want your cluster-installed stack -
> > learn to use Spack or Conda, and you'll be much happier in the long run.
>
> IMHO the view that it's a tooling/knowledge-gap problem is a bit
> disconnected from users. I meet many users who either
>
> 1. cannot control the Python version they run, or even the base
> environment, because of company culture (decisions at the company level on
> these constraints). Maybe upper management is completely misguided here,
> but then upper management must be targeted, and it is not clear at all that
> constraints on packages are the right way to do it, as they typically never
> run any code.
>
> 2. have environments with many dependencies that get into gridlocks of
> dependencies that cannot be mutually satisfied. For my own work it is
> becoming increasingly frequent for me to spawn multiple environments that
> then talk to each other (e.g. via files) to work around these problems.
>
>
> Problem 1 is probably something that user organizations should change, and
> we need to understand why they lock Python versions.


I don't think the problem is what you think it is. It's typically general
IT policies, meaning you cannot do things like download and install random
executables in places like C:\. This is quite common, and it's mostly a
matter of some Python installers (the python.org ones in particular)
falling under such a general policy. A Python-tooling-specific block is
very rare - I don't think I have ever seen one. If you can run `pip` and
install to, for example, locations under your home directory, then you can
almost always also run pyenv/conda/mamba & co to install different Python
versions. And if you cannot run `pip`, there is nothing to discuss w.r.t.
release policy; new package versions don't matter.
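To make the point concrete, here's a minimal sketch of what a user-level
setup can look like on a locked-down machine (paths and version numbers are
hypothetical; assuming a Unix-like system where `python3` is available):

```shell
# Everything below writes only under $HOME, so no admin rights are needed.

# An isolated environment using the system interpreter:
python3 -m venv "$HOME/envs/scratch"
"$HOME/envs/scratch/bin/python" -m pip install --upgrade pip

# If a *different* Python version is needed, user-level tools such as
# pyenv or conda/mamba install interpreters under $HOME as well, e.g.:
#   pyenv install 3.11.9
#   conda create -n py311 python=3.11
```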

> It could be a QA issue, and this might reveal both a lack of good practices
> for QA (i.e. testing) but also the instability of the ecosystem, which
> creates a fear of upgrading. We should not be too quick to dismiss these
> organizations as rife with bad practices that could easily be changed, as
> even tech-savvy organizations (such as Google, I believe) run into these
> problems.
>

Google (along with many other large tech companies) uses a monorepo. That
means there is only a single version of any given package in use. This can
be a great strategy once you're large enough; it has costs too, but it
certainly shouldn't be dismissed as a bad or uninformed one. A common
support window like SPEC 0 is actually helpful there.


> Problem 2 is not a problem solvable by users: it comes from the fact that
> dependency version windows are too narrow. Without broad windows on
> dependencies, the more dependencies one has, the more likely one is to end
> up in an unsatisfiable situation. For this last reason, I strongly
> advocate that packages, in particular core packages, try hard to be as
> permissive as reasonably possible on dependencies.
>

You don't give any details, but the one common place I'm aware of where
narrow support windows are the norm is when deep learning / CUDA are used.
Many deep learning packages use something like a 3-6 month deprecation
policy and a 6-12 month support window, and then often depend on one or at
most two versions of CUDA and cuDNN. This makes it very challenging to work
with more than one deep learning framework, or to use one in combination
with other complex packages like Ray. So yes, when you have lots of
dependencies like that, you will have a hard time.
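The gridlock such narrow pins produce can be sketched in a few lines. The
version ranges below are purely hypothetical (no real framework's pins are
being quoted), but the structure of the conflict is the one described above:

```python
# Illustrative only: hypothetical CUDA pins from two deep learning
# frameworks, expressed as half-open version ranges (low, high).
framework_a = ((12, 1), (12, 2))   # e.g. requires cuda >=12.1,<12.2
framework_b = ((11, 8), (12, 0))   # e.g. requires cuda >=11.8,<12.0

def ranges_overlap(x, y):
    """Two half-open ranges overlap iff the larger of the lower bounds
    is strictly below the smaller of the upper bounds."""
    return max(x[0], y[0]) < min(x[1], y[1])

# No CUDA version satisfies both pins, so a single environment is
# impossible - hence the workaround of multiple cooperating environments.
print(ranges_overlap(framework_a, framework_b))  # False
```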

Given you work with deep learning, I'll assume that's the issue you are
seeing. I don't think we have evidence that a 3-year support window is too
narrow, and we do understand the cost of longer windows. I'll note also
that SPEC 0 specifies a minimum of 2 years after initial release (i.e.
looking back in time), but due to backwards compatibility policies the
support also extends at least 1-1.5 years forward in time. So there's a
>=3 year window of support for feature releases of core packages in the
scientific Python ecosystem.
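A back-of-the-envelope sketch of that window arithmetic (the dates and the
2-year cutoff here are illustrative; SPEC 0 itself is the authoritative
source):

```python
from datetime import date

def within_spec0_window(release: date, today: date, years: float = 2.0) -> bool:
    """Illustrative check: is a dependency version still inside a
    SPEC 0-style minimum support window of `years` after its release?"""
    return (today - release).days <= years * 365.25

# A version released 2022-06-01, checked on 2024-05-12 (the date of this
# email), is still within the 2-year minimum window:
print(within_spec0_window(date(2022, 6, 1), date(2024, 5, 12)))  # True
```

Combined with a project's own forward-looking deprecation lead time of
roughly 1-1.5 years, this backward-looking minimum yields the >=3 year
total window described above.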

Cheers,
Ralf
_______________________________________________
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/