M.-A. Lemburg wrote:
> Orestis Markou wrote:
>> M.-A. Lemburg wrote:
>>> Just to avoid any misunderstandings: I don't have anything
>>> against argparse or adding it to the stdlib.
>>>
>>> I'm only concerned about the effects of deprecating modules
>>> that are in wide-spread use, in this case getopt and optparse,
>>> without providing a drop-in API compatible replacement.
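(For anyone following along, the API gap is concrete: the two modules expose entirely different interfaces, so a script can't switch just by changing an import. A minimal sketch - the option names here are purely illustrative:)

```python
# Parse the same command line with getopt and argparse, to show
# that the two APIs have nothing in common shape-wise.
import getopt
import argparse

argv = ["-o", "out.txt", "input.txt"]

# getopt: returns raw (flag, value) pairs plus leftover arguments.
opts, args = getopt.getopt(argv, "o:")
assert opts == [("-o", "out.txt")]
assert args == ["input.txt"]

# argparse: declarative option spec, returns a namespace object.
parser = argparse.ArgumentParser()
parser.add_argument("-o", dest="output")
parser.add_argument("infile")
ns = parser.parse_args(argv)
assert ns.output == "out.txt"
assert ns.infile == "input.txt"
```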
>> A deprecated -> removed module can always exist on PyPI and be manually
>> installed (or perhaps such modules can live in their own special place,
>> stdlib-legacy). When a new version of Python comes out, people *do* have
>> to spend some time testing their apps, so if something has moved from
>> the stdlib to stdlib-legacy, they can just install that in their site
>> packages and go on pretending nothing happened.

> True, but the modules in question mainly target small scripts,
> not big applications. And those typically use the default Python
> version found on the system via "#!/usr/bin/env python".

> However, perhaps we can reach some middle ground: we split out
> long-term-support legacy modules into a stdlib-legacy or stdlib-compat
> collection, keep applying patches to them as needed and
> only apply the deprecation mechanism to the documentation
> of these modules (the deprecation warnings tend to upset
> consumers of the script's output and thus cause breakage on
> their own).
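(A small aside on the warning-breakage point: the warnings go to stderr, so any consumer that captures stderr or treats stderr output as failure does break. A script author can shield users today with an explicit filter - a sketch, using a simulated warning rather than a real deprecated import:)

```python
import warnings

# Simulate the warning a deprecated module would emit on import/use.
# With an "always" filter the warning fires (and would be written to
# stderr in a real run); with an "ignore" filter it is silenced.
with warnings.catch_warnings(record=True) as seen:
    warnings.simplefilter("always")
    warnings.warn("the optparse module is deprecated", DeprecationWarning)
assert len(seen) == 1
assert issubclass(seen[0].category, DeprecationWarning)

with warnings.catch_warnings(record=True) as seen:
    warnings.simplefilter("ignore", DeprecationWarning)
    warnings.warn("the optparse module is deprecated", DeprecationWarning)
assert seen == []  # nothing recorded: the warning was filtered out
```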

Applying patches presumes we have tests (or bug reports), and maintaining dead code still has a cost. Of course, whilst a module is in the standard library we should be committed to this (probably more than we currently are, but then we're a volunteer-based project) - but this is one of the reasons we *won't* guarantee to keep any arbitrary module in the standard library forever without a maintainer. That point is orthogonal to any individual question of module removal, however.


> That way, new code will likely start to be written against
> the new modules, while old tested and working code can continue
> to run with the old tested and working modules.


This is an interesting idea, but hard to implement in a meaningful way: if existing scripts using legacy modules are to continue working with new versions of Python, then the legacy modules will have to be installed and on the path by default - and de facto the situation is unchanged.

I do agree that deprecating the documentation long before removing any modules is desirable - and probably achieves the same result as actually partitioning modules. I wouldn't be against a legacy package though as a 'physical' partitioning has some benefits on the path to 'complete deprecation' (whatever that may mean).

>> For me, deprecating modules and adding new ones is more of a social move
>> - it shows that the Python community cares, and it moves Python closer
>> to the bleeding edge. I think it can be done without alienating legacy
>> users.

> I think this points to one of the differences we are seeing in this
> discussion:
>
> One group of people wants the bleeding edge, the other is more
> interested in a stable code base.

I don't expect the standard library to get close to 'the bleeding edge'.

There seem to be three positions:

1) Virtually no changes or improvements to the standard library at all - nothing beyond maintaining compatibility with language changes. (Laura)

2) New modules are acceptable but old modules should remain forever. (Antoine)

3) New modules are acceptable and eventual removal of old modules is desirable. (Brett, myself, Jesse and Frank)

Marc-Andre seems to fall somewhere between 1 and 2 and Orestis wants the bleeding edge. (Sorry for these caricatures but it seems approximately right.)

I'd still like to write a longer piece on why I think 1) isn't possible and 3) is preferable to 2) - but the basic points have all been covered in this thread anyway.

> IMHO, it's fairer to ask the first group to put in a little more
> effort and download whatever they want from PyPI than to put
> the burden on the second group.
>
> After all, people who use bleeding-edge code need to be aware
> of the fact that things can break, so it's better to make that
> choice explicit than to have it made implicitly for everyone.

It also seems fair to say that if people want stability to the point of no change at all, then they should stick to a version of Python that meets their requirements. Language changes and even bugfixes introduce incompatibilities that can't and won't be avoided, so there simply *is not* a guarantee that old code will continue to work forever.

Another idea is to have a standard library 'bleeding-edge' package, containing modules that we expect to move into the standard library in the next version of Python. This could be an additional install (with convenient installers for Mac OS X and Windows).

This slows down the introduction of new modules whilst still bringing them into the standard library 'on probation'. We would still insist on the PEP process for bringing in new modules, and Steven will be writing a PEP for argparse (which will address the specific question of deprecating optparse / getopt), so I don't know if a bleeding-edge package is worth the effort.

Michael

--
http://www.ironpythoninaction.com/

_______________________________________________
stdlib-sig mailing list
stdlib-sig@python.org
http://mail.python.org/mailman/listinfo/stdlib-sig
