Re: Python policy proposed changes

2005-10-20 Thread Antal A. Buss
Hi,

I don't know if this has already been discussed: what if a (slightly)
different approach were taken, such as the one (x)emacs or drscheme use?

At install time, the install script detects which versions are installed
(e.g. emacs and/or xemacs) and then compiles the source for each of them.

That way, it would be possible to install new modules for the default,
legacy and new versions of Python while maintaining only one package,
using package dependencies to know which Python versions to check for.
Version-specific modules would be installed without checking the
installed versions.
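A minimal sketch of such an install-time hook (the module name, paths
and the list of candidate versions are all hypothetical):

import os

SOURCE_DIR = '/usr/share/python-foo'   # hypothetical shared source location

# Probe for the interpreters present on the system and let each one
# byte-compile the sources for itself; .pyc files are version-specific.
for v in ('2.1', '2.2', '2.3', '2.4'):
    interp = '/usr/bin/python' + v
    if os.path.exists(interp):
        # A real hook would compile a copy under each
        # /usr/lib/pythonX.Y/site-packages rather than compile in place.
        os.system('%s /usr/lib/python%s/compileall.py %s'
                  % (interp, v, SOURCE_DIR))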

Regards,

 Antal A. Buss





Re: Python policy proposed changes

2005-10-20 Thread Thomas Viehmann
Antal A. Buss wrote:
 That way, it would be possible to install new modules for the default,
 legacy and new versions of Python while maintaining only one package,
 using package dependencies to know which Python versions to check for.
 Version-specific modules would be installed without checking the
 installed versions.
This is a good idea, if
- the modules are compatible with the involved python versions (this
  could be specified by the maintainer)
- the modules are pure python (i.e. compiling C/C++ extension modules
  on the user's machine is not an option)

Kind regards

T.

-- 
Thomas Viehmann, http://thomas.viehmann.net/





Re: Python policy proposed changes

2005-10-18 Thread Josselin Mouette
On Tuesday 18 October 2005 at 02:57 +0200, Martin v. Löwis wrote:
 Josselin Mouette wrote:
  Apart from a typo and the FSF address, the changes are about which
  packaging variants are mandated, recommending to provide only one
  python-foo package for each module, except when depending applications
  mandate another python version.
  
  This way, we could enforce that policy during the transition, removing
  hundreds of cruft python2.X-foo packages.
 
 I don't like this policy. The python2.X-foo packages are not at all
 cruft; they provide a useful feature.

At what cost? How many gigabytes of mirror space and bandwidth are we
wasting on python2.X-libprout packages that nobody ever uses?

 With the multi-version build, you can support Python versions more 
 recent than the default python version, which is useful for people
 who would like to use more recent versions: they don't have to rebuild
 all their extension modules themselves.

To avoid this, as the policy states, the latest upstream Python should
be the version in unstable. Furthermore, is this needed that badly? It
isn't possible to use a more recent perl version either, but I don't see
that many complaints from perl users.

Even in a situation like the current one, when we're stuck with 2.3 as
the default when there's 2.4 available, there are only a few python
packages which actually need the 2.4 version. In this case, the policy
states they should be built as python2.4-foo, until python2.4 becomes
the default. That's also why modules needed by a lot of binary packages
should be built as multi-binary packages, as there is a chance a script
will require both modules.

But I'm not talking about python-gtk here, I'm talking about those
hundreds of modules actually used by zero or one binary packages. Do we
need multi-binary packages for them? Compared to the waste of human and
computer resources this implies, I'm pretty sure it's not worth the
trouble.

 It also simplifies the transition from one Python version to the next:
 people can build and test their packages against newer versions long
 before Debian updates the default. That way, when the default version
 changes, they just have to turn a switch in the default package. This
 reduces the amount of work necessary in the transition phase itself.

You're completely wrong. Based on the 2.2-to-2.3 experience, multi-binary
packages complicate the transition a lot. Instead of just a rebuild,
they require adding the new version and removing the older ones, on a
case-by-case basis, before going through the NEW queue. Believe me, it's
more work for everyone.

 Of course, supporting versions older than the default version is rarely
 needed, except when there are applications that require such older
 versions. So when 2.4 becomes the default, only 2.4 (and perhaps 2.5)
 packages should be built.

Don't you understand that it's even more added work to remove the legacy
python 2.1 to 2.3 packages in hundreds of packages?
-- 
 .''`.   Josselin Mouette        /\./\
: :' :   [EMAIL PROTECTED]
`. `'    [EMAIL PROTECTED]
  `-     Debian GNU/Linux -- The power of freedom



Re: Python policy proposed changes

2005-10-18 Thread Martin v. Löwis

 At what cost? How many gigabytes of mirror space and bandwidth are we
 wasting on python2.X-libprout packages that nobody ever uses?


I don't know. What is the answer to this question? I wouldn't expect
it to be more than 1 GiB per mirror, though, likely much less. On
i386, for example, the useless python2.[124]-* packages seem to add up
to 59 MiB, if I counted correctly.
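
For what it's worth, a figure like that can be estimated by summing the
Size: fields of the matching stanzas in an uncompressed Packages file;
a rough sketch (the file location is illustrative):

import re

total = 0
package = None
for line in open('Packages'):   # e.g. dists/sid/main/binary-i386/Packages
    if line.startswith('Package: '):
        package = line.split(' ', 1)[1].strip()
    elif line.startswith('Size: '):
        # Count only the versioned module packages under discussion.
        if package and re.match(r'python2\.[124]-', package):
            total = total + int(line.split(' ', 1)[1])

print '%.1f MiB' % (total / 1048576.0)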


 Even in a situation like the current one, when we're stuck with 2.3 as
 the default when there's 2.4 available, there are only a few python
 packages which actually need the 2.4 version.


What do you mean, "actually need"? Every python2.3-foo package actually
needs python2.4. If you have only python2.3-foo installed, and do

~$ python2.4
Python 2.4.1 (#2, May  5 2005, 11:32:06)
[GCC 3.3.5 (Debian 1:3.3.5-12)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import foo
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
ImportError: No module named foo

This is because python2.3-foo installed into python2.3's site-packages,
so it won't be available in python2.4. You really need a separate
package for 2.4.
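
The flip side can be seen from python2.3, where the module actually was
installed (the module name and path here are illustrative):

~$ python2.3 -c "import foo; print foo.__file__"
/usr/lib/python2.3/site-packages/foo.py

Each interpreter searches only its own site-packages directory, so a
module installed and byte-compiled for 2.3 is simply invisible to 2.4.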


 In this case, the policy
 states they should be built as python2.4-foo, until python2.4 becomes
 the default. That's also why modules needed by a lot of binary packages
 should be built as multi-binary packages, as there is a chance a script
 will require both modules.


This I don't understand. You mean, a script might require both
python2.3-foo and python2.4-foo if foo contains an extension module?


 But I'm not talking about python-gtk here, I'm talking about those
 hundreds of modules actually used by zero or one binary packages. Do we
 need multi-binary packages for them? Compared to the waste of human and
 computer resources this implies, I'm pretty sure it's not worth the
 trouble.


It's a policy decision, obviously. I wonder how many users you have
interviewed or what other criteria you have used to decide what is
best for the users. IOW, even if this policy is chosen, it lacks
rationale, IMO.


  Of course, supporting versions older than the default version is rarely
  needed, except when there are applications that require such older
  versions. So when 2.4 becomes the default, only 2.4 (and perhaps 2.5)
  packages should be built.



 Don't you understand that it's even more added work to remove the legacy
 python 2.1 to 2.3 packages in hundreds of packages?


It is more work, but I don't understand why it is significantly more
work. Maintainers just have to remove all traces of 2.1, 2.2, and 2.3
from their debian directory, and rebuild, no?

Anyway, it's hardly "hundreds". I counted 194 python2.3-* packages,
82 python2.2-* packages, and 46 python2.1-* packages. There are also
125 python2.4-* packages, so the majority of the packages have already
been prepared for the transition.

Regards,
Martin





Re: Python policy proposed changes

2005-10-18 Thread Josselin Mouette
On Tuesday 18 October 2005 at 10:24 +0200, Martin v. Löwis wrote:
  Even in a situation like the current one, when we're stuck with 2.3 as
  the default when there's 2.4 available, there are only a few python
  packages which actually need the 2.4 version. 
 
 What do you mean, "actually need"? Every python2.3-foo package actually
 needs python2.4. If you have only python2.3-foo installed, and do
 
 ~$ python2.4
 Python 2.4.1 (#2, May  5 2005, 11:32:06)
 [GCC 3.3.5 (Debian 1:3.3.5-12)] on linux2
 Type "help", "copyright", "credits" or "license" for more information.
 >>> import foo
 Traceback (most recent call last):
   File "<stdin>", line 1, in ?
 ImportError: No module named foo
 
 This is because python2.3-foo installed into python2.3's site-packages,
 so it won't be available in python2.4. You really need a separate
 package for 2.4.

I don't understand your approach to the issue. The default python
version is available through the python command, and all modules should
be available when launching python, not python2.4.

  In this case, the policy
  states they should be built as python2.4-foo, until python2.4 becomes
  the default. That's also why modules needed by a lot of binary packages
  should be built as multi-binary packages, as there is a chance a script
  will require both modules.
 
 This I don't understand. You mean, a script might require both
 python2.3-foo and python2.4-foo if foo contains an extension module?

No, I mean a script using the foo module, which is only available for
python 2.4, might also require the gtk module, as the latter is widely
used. That's a reason to build a module like gtk in a multi-binary
fashion.

 It's a policy decision, obviously. I wonder how many users you have
 interviewed or what other criteria you have used to decide what is
 best for the users. IOW, even if this policy is chosen, it lacks
 rationale, IMO.

The rationale is to save mirror space, bandwidth and Packages file size,
and to make things clearer to our users. It is your rationale that I
wonder about. You want all modules to be available for all python
versions? What does that bring to our users? When using python, you're
using the python binary; you don't want to ask yourself which version
to use.

  Don't you understand that it's even more added work to remove the legacy
  python 2.1 to 2.3 packages in hundreds of packages?
 
 It is more work, but I don't understand why it is significantly more
 work. Maintainers just have to remove all traces of 2.1, 2.2, and 2.3
 from their debian directory, and rebuild, no?

It's the "just" that bothers me. It's not a "just" when you're talking
about rebuilding 200 packages.

 Anyway, it's hardly "hundreds". I counted 194 python2.3-* packages,
 82 python2.2-* packages, and 46 python2.1-* packages. There are also
 125 python2.4-* packages, so the majority of the packages have already
 been prepared for the transition.

You mean that's only 3 percent of the total number of packages that
could be saved?

Regards,
-- 
 .''`.   Josselin Mouette        /\./\
: :' :   [EMAIL PROTECTED]
`. `'    [EMAIL PROTECTED]
  `-     Debian GNU/Linux -- The power of freedom



Re: Python policy proposed changes

2005-10-18 Thread Donovan Baarda
On Tue, 2005-10-18 at 08:23, Josselin Mouette wrote:
 On Tuesday 18 October 2005 at 02:57 +0200, Martin v. Löwis wrote:
  Josselin Mouette wrote:
[...]
 But I'm not talking about python-gtk here, I'm talking about those
 hundreds of modules actually used by zero or one binary packages. Do we
 need multi-binary packages for them? Compared to the waste of human and
 computer resources this implies, I'm pretty sure it's not worth the
 trouble.

Just because there are no Debian packages that depend on them doesn't
mean they aren't used. A simple example is python2.1-mysqldb. I'm pretty
sure there would be no Debian packages that use it, but anyone
developing/testing for python2.1 that needs to talk to a mysql database
will need it. The fact that Ubuntu has it made my life much easier...

  It also simplifies the transition from one Python version to the next:
  people can build and test their packages against newer versions long
  before Debian updates the default. That way, when the default version
  changes, they just have to turn a switch in the default package. This
  reduces the amount of work necessary in the transition phase itself.
 
 You're completely wrong. Based on the 2.2-to-2.3 experience, multi-binary
 packages complicate the transition a lot. Instead of just a rebuild,
 they require adding the new version and removing the older ones, on a
 case-by-case basis, before going through the NEW queue. Believe me, it's
 more work for everyone.

This is why I suggested that the policy recommend separate python-foo
and pythonX.Y-foo source packages for multi-version support. Have the
pythonX.Y-foo source package build all the pythonX.Y-foo binary
packages, and have a simple python-foo source package build the small
binary python-foo wrapper package. This way, when python transitions,
only the small python-foo source package needs to have its dependencies
updated, so only the small python-foo binary wrapper package gets
re-built. The pythonX.Y-foo source package and pythonX.Y-foo binary
packages don't need to be touched.
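
In control-file terms, the scheme might look like this (package names
and version bounds are purely illustrative):

Built from the pythonX.Y-foo source package, which is rebuilt only when
foo itself changes:

  Package: python2.3-foo
  Depends: python2.3

  Package: python2.4-foo
  Depends: python2.4

Built from the tiny python-foo source package, the only one touched
when the default changes:

  Package: python-foo
  Depends: python (>= 2.3), python (<< 2.4), python2.3-foo

When 2.4 becomes the default, only the wrapper's Depends line changes,
to python (>= 2.4), python (<< 2.5), python2.4-foo.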

  Of course, supporting versions older than the default version is rarely
  needed, except when there are applications that require such older
  versions. So when 2.4 becomes the default, only 2.4 (and perhaps 2.5)
  packages should be built.
 
 Don't you understand that it's even more added work to remove the legacy
 python 2.1 to 2.3 packages in hundreds of packages?

So why remove them? If they aren't being re-built, they are not being
updated, and hence are not being re-mirrored. Sure, they are using disk
space on the mirrors, but it's not that much, and they are not eating
bandwidth, which is the bit that really hurts.

Rather than add the burden of removing legacy packages to the
transition process, by de-coupling them Python can transition first, and
then the legacy packages can be kept or removed at the individual
package maintainer's discretion later.

This only applies to multi-version supporting packages... 

-- 
Donovan Baarda [EMAIL PROTECTED]





Re: Python policy proposed changes

2005-10-17 Thread Martin v. Löwis

Josselin Mouette wrote:

 Apart from a typo and the FSF address, the changes are about which
 packaging variants are mandated, recommending to provide only one
 python-foo package for each module, except when depending applications
 mandate another python version.
 
 This way, we could enforce that policy during the transition, removing
 hundreds of cruft python2.X-foo packages.


I don't like this policy. The python2.X-foo packages are not at all
cruft; they provide a useful feature.

With the multi-version build, you can support Python versions more
recent than the default python version, which is useful for people
who would like to use more recent versions: they don't have to rebuild
all their extension modules themselves.

It also simplifies the transition from one Python version to the next:
people can build and test their packages against newer versions long
before Debian updates the default. That way, when the default version
changes, they just have to turn a switch in the default package. This
reduces the amount of work necessary in the transition phase itself.

Of course, supporting versions older than the default version is rarely
needed, except when there are applications that require such older
versions. So when 2.4 becomes the default, only 2.4 (and perhaps 2.5)
packages should be built.

Regards,
Martin





Re: Python policy proposed changes

2005-10-13 Thread Donovan Baarda
On Tue, 2005-10-11 at 20:23, Josselin Mouette wrote:
 On Monday 10 October 2005 at 17:01 +0100, Donovan Baarda wrote:
  In 2.2.2, I would remove the "only" from "only supports python versions
  different from the current default one"... You can use this for
  packages that support the current default one as well as other versions.
 
 The next section deals with such a scheme.

OK.

  In 2.2.3, part 1, I would recommend (mandate?) that the python-foo dummy
  wrapper package should have its own tiny source package, separate from
  the pythonX.Y-foo source package. This way, only the small python-foo
  wrapper package needs to be updated and re-built when the default python
  changes.
 
 I think the ftp-masters and/or release team would object to having so
 many more source packages to deal with; I've heard it would cause
 problems when we came up with this kind of idea for GNOME metapackages.

What is worse: an extra, very small source package, or a single large
source package that generates x large + 1 small binary packages (where
x = the number of python versions supported) and gets upgraded more
frequently?

As an end user on a modem, I know that downloading all the binary
packages again, when really only the python-foo wrapper's dependencies
changed, hurts much more; and I suspect this hurts the mirrors too.

  For 2.2.3 part 2, I don't know what to do... there has been no progress
  in making this work for a long time, and I don't know if there ever will
  be. It's probably still worth documenting as a wish list, but I can't
  help but think it's overly complicated somehow.
 
 I agree, maybe we should remove documentation about things that don't work.

But it is handy to have wishlist stuff documented somewhere... perhaps a
wishlist bug?

  Another alternative for 2.2.3 part 2 is the practice adopted by some
  pure-python packages of putting modules in /usr/lib/site-python, which
  can be found on the default sys.path of all versions of python. The
  ugliness of this is that the .pyc files get recompiled for different
  versions of python, and will be re-written if the user has write access
  there. For all its drawbacks, it is a much lighter alternative to the
  mass-of-symlinks approach...
 
 I wouldn't recommend this method, because of that possible .pyc
 autogeneration and the stale files it leaves in the filesystem.

It doesn't really leave stale .pyc files, as they should still get
cleaned up when the package is removed. What it does mean is that the
.pyc files that are there might not be compiled for the right version.
To help ensure that they are compiled for the right version, the
python.postinst should re-compile everything in /usr/lib/site-python...
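
A minimal sketch of that postinst step (purely illustrative, not taken
from any real maintainer script):

import compileall

# Re-byte-compile the shared pure-python modules with whatever python
# is now the default; force=1 rewrites .pyc files even if their
# timestamps look current, replacing ones left by the old version.
compileall.compile_dir('/usr/lib/site-python', force=1)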

Another option would be for python.postinst to trigger a
dpkg-reconfigure of everything that has Depends: python.


-- 
Donovan Baarda [EMAIL PROTECTED]





Re: Python policy proposed changes

2005-10-13 Thread Donovan Baarda
On Tue, 2005-10-11 at 20:29, Josselin Mouette wrote:
 On Monday 10 October 2005 at 17:14 +0100, Donovan Baarda wrote:
  The best people to decide which packages need to support which old
  versions of python are the package maintainers. They know this based on
  the requests and bug reports from the people that need them. It is up to
  them to balance the hassle of packaging them against the hassle of
  dealing with complaints.
 
 You are right, that's up to the maintainer.

BTW, at a large search engine company where I now work, it has proven
useful that Ubuntu still has legacy Python packages for v2.1 through
v2.4. If it didn't, we would be forced to build/support our own (which
we may yet need to do).

So even if Debian chooses not to support legacy versions of Python,
please keep a framework that makes it easy to add extra packages to
support it :-)

-- 
Donovan Baarda [EMAIL PROTECTED]





Re: Python policy proposed changes

2005-10-11 Thread Josselin Mouette
On Monday 10 October 2005 at 17:01 +0100, Donovan Baarda wrote:
 In 2.2.2, I would remove the "only" from "only supports python versions
 different from the current default one"... You can use this for
 packages that support the current default one as well as other versions.

The next section deals with such a scheme.

 In 2.2.3, part 1, I would recommend (mandate?) that the python-foo dummy
 wrapper package should have its own tiny source package, separate from
 the pythonX.Y-foo source package. This way, only the small python-foo
 wrapper package needs to be updated and re-built when the default python
 changes.

I think the ftp-masters and/or release team would object to having so
many more source packages to deal with; I've heard it would cause
problems when we came up with this kind of idea for GNOME metapackages.

 For 2.2.3 part 2, I don't know what to do... there has been no progress
 in making this work for a long time, and I don't know if there ever will
 be. It's probably still worth documenting as a wish list, but I can't
 help but think it's overly complicated somehow.

I agree, maybe we should remove documentation about things that don't work.

 Another alternative for 2.2.3 part 2 is the practice adopted by some
 pure-python packages of putting modules in /usr/lib/site-python, which
 can be found on the default sys.path of all versions of python. The
 ugliness of this is that the .pyc files get recompiled for different
 versions of python, and will be re-written if the user has write access
 there. For all its drawbacks, it is a much lighter alternative to the
 mass-of-symlinks approach...

I wouldn't recommend this method, because of that possible .pyc
autogeneration and the stale files it leaves in the filesystem.
-- 
 .''`.   Josselin Mouette        /\./\
: :' :   [EMAIL PROTECTED]
`. `'    [EMAIL PROTECTED]
  `-     Debian GNU/Linux -- The power of freedom




Re: Python policy proposed changes

2005-10-10 Thread Donovan Baarda
On Sun, 2005-10-09 at 13:36, Josselin Mouette wrote:
 On Sunday 9 October 2005 at 14:30 +0200, Matthias Klose wrote:
[...]
  I don't like the idea that maintainers of depending
  applications have to fight with maintainers of library packages
  over which versions they should provide.
 
 Maybe we could make it easier; for example only modules with more than
 $threshold (say 5) depending applications should provide several
 versions, while less widely used modules shouldn't.

I think it is up to the package maintainers... fights are how you
convince package maintainers to support older versions :-)

The best people to decide which packages need to support which old
versions of python are the package maintainers. They know this based on
the requests and bug reports from the people that need them. It is up to
them to balance the hassle of packaging them against the hassle of
dealing with complaints.

You could institute some degree of mandatory legacy support, but exactly
what that degree should be would be kind of hard to pin down.

By recommending 2.2.1 "Support Only The Default Version", we are
recommending no legacy support. It would be interesting to see just how
many packages end up providing legacy support anyway after a period of
time. I have a feeling Zope and Mailman dependencies will end up
defining the extent of legacy support.

-- 
Donovan Baarda [EMAIL PROTECTED]





Re: Python policy proposed changes

2005-10-09 Thread Matthias Klose
Josselin Mouette writes:
 Apart from a typo and the FSF address, the changes are about which
 packaging variants are mandated, recommending to provide only one
 python-foo package for each module, except when depending applications
 mandate another python version.
 
 This way, we could enforce that policy during the transition, removing
 hundreds of cruft python2.X-foo packages.

We should deprecate building python2.1-foo and python2.2-foo versions
of the packages. I don't like the idea that maintainers of depending
applications have to fight with maintainers of library packages over
which versions they should provide.

  Matthias

