Re: Python policy proposed changes
On Tuesday 18 October 2005 at 02:57 +0200, Martin v. Löwis wrote:
> Josselin Mouette wrote:
> > Apart from a typo and the FSF address, the changes are about which
> > packaging variants are mandated, recommending to provide only one
> > python-foo package for each module, except when depending
> > applications mandate another Python version. This way, we could
> > enforce that policy during the transition, removing hundreds of
> > cruft python2.X-foo packages.
>
> I don't like this policy. The python2.X-foo packages are not at all
> cruft; they provide a useful feature.

At what cost? How many gigabytes of mirror space and bandwidth are we
wasting with python2.X-libprout stuff nobody ever uses?

> With the multi-version build, you can support Python versions more
> recent than the default python version, which is useful for people
> who would like to use more recent versions: they don't have to
> rebuild all their extension modules themselves.

To avoid this, as the policy states, the latest upstream Python should
be the version in unstable. Furthermore, is this needed that badly? It
isn't possible to use a more recent Perl version, but I don't see that
many complaints from Perl users.

Even in a situation like the current one, where we're stuck with 2.3
as the default while 2.4 is available, there are only a few Python
packages which actually need the 2.4 version. In this case, the policy
states they should be built as python2.4-foo until python2.4 becomes
the default. That's also why modules needed by a lot of binary
packages should be built as multi-binary packages, as there is a
probability a script requires both modules. But I'm not talking about
python-gtk here; I'm talking about those hundreds of modules actually
used by zero or one binary packages. Do we need multi-binary packages
for them? Compared to the waste of human and computer resources this
implies, I'm pretty sure it's not worth the deal.
> It also simplifies the transition from one Python version to the
> next: people can build and test their packages against newer versions
> long before Debian updates the default. That way, when the default
> version changes, they just have to turn a switch in the default
> package. This reduces the amount of work necessary in the transition
> phase itself.

You're completely wrong. Based on the 2.2-to-2.3 experience,
multi-binary packages complicate the transition a lot. Instead of just
a rebuild, they require adding the new version and removing the older
ones, on a case-by-case basis, before going through the NEW queue.
Believe me, it's more work for everyone.

> Of course, supporting versions older than the default version is
> rarely needed, except when there are applications that require such
> older versions. So when 2.4 becomes the default, only 2.4 (and
> perhaps 2.5) packages should be built.

Don't you understand that it's even more added work to remove the
legacy Python 2.1 to 2.3 packages in hundreds of packages?
-- 
 .''`.   Josselin Mouette              /\./\
: :' :   [EMAIL PROTECTED]
`. `'    [EMAIL PROTECTED]
  `-     Debian GNU/Linux -- The power of freedom
Re: Python policy proposed changes
> At what cost? How many gigabytes of mirror space and bandwidth are we
> wasting with python2.X-libprout stuff nobody ever uses?

I don't know. What is the answer to this question? I wouldn't expect
it to be more than 1 GiB per mirror, though, likely much less. On
i386, for example, the "useless" python2.[124]-* packages seem to add
up to 59 MiB, if I counted correctly.

> Even in a situation like the current one, where we're stuck with 2.3
> as the default while 2.4 is available, there are only a few Python
> packages which actually need the 2.4 version.

What do you mean, "actually need"? Every python2.3-foo package
actually needs a python2.4 counterpart. If you have only python2.3-foo
installed, and do

~$ python2.4
Python 2.4.1 (#2, May  5 2005, 11:32:06)
[GCC 3.3.5 (Debian 1:3.3.5-12)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import foo
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
ImportError: No module named foo

This is because python2.3-foo installed into python2.3's
site-packages, so it won't be available in python2.4. You really need
a separate package for 2.4.

> In this case, the policy states they should be built as python2.4-foo
> until python2.4 becomes the default. That's also why modules needed
> by a lot of binary packages should be built as multi-binary packages,
> as there is a probability a script requires both modules.

This I don't understand. You mean, a script might require both
python2.3-foo and python2.4-foo if foo contains an extension module?

> But I'm not talking about python-gtk here; I'm talking about those
> hundreds of modules actually used by zero or one binary packages. Do
> we need multi-binary packages for them? Compared to the waste of
> human and computer resources this implies, I'm pretty sure it's not
> worth the deal.

It's a policy decision, obviously. I wonder how many users you have
interviewed, or what other criteria you have used, to decide what is
best for the users.
IOW, even if this policy is chosen, it lacks rationale, IMO.

> > Of course, supporting versions older than the default version is
> > rarely needed, except when there are applications that require such
> > older versions. So when 2.4 becomes the default, only 2.4 (and
> > perhaps 2.5) packages should be built.
>
> Don't you understand that it's even more added work to remove the
> legacy Python 2.1 to 2.3 packages in hundreds of packages?

It is more work, but I don't understand why it is significantly more
work. Maintainers just have to remove all traces of 2.1, 2.2, and 2.3
from their debian directory, and rebuild, no?

Anyway, it's hardly "hundreds". I counted 194 python2.3-* packages, 82
python2.2-* packages, and 46 python2.1-* packages. There are also 125
python2.4-* packages, so the majority of packages are already prepared
for the transition.

Regards,
Martin

-- 
To UNSUBSCRIBE, email to [EMAIL PROTECTED] with a subject of
"unsubscribe". Trouble? Contact [EMAIL PROTECTED]
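Martin's site-packages point is mechanical: each interpreter derives
its own module directory from its version number, so a module installed
for 2.3 is invisible to 2.4. A minimal sketch using the modern
sysconfig module (which postdates this thread; in the 2005-era stdlib
the equivalent call was distutils.sysconfig.get_python_lib()):

```python
# Each interpreter computes its own pure-Python install directory
# ("purelib") from its own version.  On typical Debian/Unix layouts the
# path embeds the version, e.g. .../lib/python2.3/site-packages vs
# .../lib/python2.4/site-packages, so files shipped by python2.3-foo
# are simply never on python2.4's sys.path.
import sys
import sysconfig

purelib = sysconfig.get_paths()["purelib"]
version = "%d.%d" % sys.version_info[:2]

print(purelib)             # this interpreter's install directory
print(version in purelib)  # version-specific on typical Unix layouts
```

Run the same two lines under two different interpreters and the paths
differ, which is exactly why a second binary package per version is
needed under this scheme.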
Re: Python policy proposed changes
On Tuesday 18 October 2005 at 10:24 +0200, Martin v. Löwis wrote:
> > Even in a situation like the current one, where we're stuck with
> > 2.3 as the default while 2.4 is available, there are only a few
> > Python packages which actually need the 2.4 version.
>
> What do you mean, "actually need"? Every python2.3-foo package
> actually needs a python2.4 counterpart. If you have only
> python2.3-foo installed, and do
>
> ~$ python2.4
> Python 2.4.1 (#2, May  5 2005, 11:32:06)
> [GCC 3.3.5 (Debian 1:3.3.5-12)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import foo
> Traceback (most recent call last):
>   File "<stdin>", line 1, in ?
> ImportError: No module named foo
>
> This is because python2.3-foo installed into python2.3's
> site-packages, so it won't be available in python2.4. You really need
> a separate package for 2.4.

I don't understand your approach to the issue. The default Python
version is available as the python command, and all modules should be
available when launching python, not python2.4.

> > In this case, the policy states they should be built as
> > python2.4-foo until python2.4 becomes the default. That's also why
> > modules needed by a lot of binary packages should be built as
> > multi-binary packages, as there is a probability a script requires
> > both modules.
>
> This I don't understand. You mean, a script might require both
> python2.3-foo and python2.4-foo if foo contains an extension module?

No, I mean a script using the foo module, which is only available for
Python 2.4, might also require the gtk module, as the latter is widely
used. That's a reason to build a module like gtk in a multi-binary
fashion.

> It's a policy decision, obviously. I wonder how many users you have
> interviewed, or what other criteria you have used, to decide what is
> best for the users. IOW, even if this policy is chosen, it lacks
> rationale, IMO.

The rationale is to save mirror size, bandwidth and Packages size, and
to make things clearer to our users. It is your rationale which I
wonder about.
You want all modules to be available for all Python versions? What
does that bring to our users? When using Python, you're using the
python binary; you don't want to have to ask yourself which version to
use.

> > Don't you understand that it's even more added work to remove the
> > legacy Python 2.1 to 2.3 packages in hundreds of packages?
>
> It is more work, but I don't understand why it is significantly more
> work. Maintainers just have to remove all traces of 2.1, 2.2, and 2.3
> from their debian directory, and rebuild, no?

It's the "just" that bothers me. It's not a "just" when you're talking
about rebuilding 200 packages.

> Anyway, it's hardly "hundreds". I counted 194 python2.3-* packages,
> 82 python2.2-* packages, and 46 python2.1-* packages. There are also
> 125 python2.4-* packages, so the majority of packages are already
> prepared for the transition.

You mean that only 3 percent of the total number of packages could be
saved?

Regards,
-- 
 .''`.   Josselin Mouette              /\./\
: :' :   [EMAIL PROTECTED]
`. `'    [EMAIL PROTECTED]
  `-     Debian GNU/Linux -- The power of freedom
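Counts like Martin's can be reproduced by tallying pythonX.Y- prefixes
over the package names in a Packages index. A minimal sketch; the
names below are made-up sample data, not the actual 2005 archive
contents:

```python
# Tally binary packages per pythonX.Y- version prefix.
import re
from collections import Counter

# Illustrative sample names only (not real archive data).
package_names = [
    "python2.1-mysqldb",
    "python2.3-gtk",
    "python2.3-foo",
    "python2.4-foo",
    "python-gtk",   # default-version package: no X.Y prefix, not counted
]

counts = Counter()
for name in package_names:
    m = re.match(r"python(2\.\d)-", name)
    if m:
        counts[m.group(1)] += 1

print(dict(counts))  # {'2.1': 1, '2.3': 2, '2.4': 1}
```

Fed the real Packages file instead of the sample list, the same loop
would yield the 194/82/46/125 split Martin quotes.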
Re: Python policy proposed changes
On Tue, 2005-10-18 at 08:23, Josselin Mouette wrote:
> On Tuesday 18 October 2005 at 02:57 +0200, Martin v. Löwis wrote:
> > Josselin Mouette wrote:
> > > [...]
>
> But I'm not talking about python-gtk here; I'm talking about those
> hundreds of modules actually used by zero or one binary packages. Do
> we need multi-binary packages for them? Compared to the waste of
> human and computer resources this implies, I'm pretty sure it's not
> worth the deal.

Just because there are no Debian packages that depend on them doesn't
mean they aren't used. A simple example is python2.1-mysqldb. I'm
pretty sure there are no Debian packages that use it, but anyone
developing or testing for python2.1 who needs to talk to a MySQL
database will need it. The fact that Ubuntu has it made my life much
easier...

> > It also simplifies the transition from one Python version to the
> > next: people can build and test their packages against newer
> > versions long before Debian updates the default. That way, when the
> > default version changes, they just have to turn a switch in the
> > default package. This reduces the amount of work necessary in the
> > transition phase itself.
>
> You're completely wrong. Based on the 2.2-to-2.3 experience,
> multi-binary packages complicate the transition a lot. Instead of
> just a rebuild, they require adding the new version and removing the
> older ones, on a case-by-case basis, before going through the NEW
> queue. Believe me, it's more work for everyone.

This is why I suggested that the policy recommend separate python-foo
and pythonX.Y-foo source packages for multi-version support. Have the
pythonX.Y-foo source package build all the pythonX.Y-foo binary
packages, and have a simple python-foo source package build the small
python-foo binary wrapper package. This way, when Python transitions,
only the small python-foo source package needs to have its
dependencies updated, so only the small python-foo binary wrapper
package gets rebuilt.
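The wrapper half of this scheme might look roughly like the following
debian/control fragment. This is a hypothetical sketch: the module
name foo, the version bounds, and the dependency line are illustrative,
not taken from any real package:

```
Source: python-foo
Section: python
Priority: optional

Package: python-foo
Architecture: all
Depends: python (>= 2.3), python (<< 2.4), python2.3-foo
Description: foo module for the default Debian Python version
 Empty wrapper package that pulls in the pythonX.Y-foo build matching
 the current default python.  When the default moves to 2.4, only the
 Depends line above changes (to python2.4-foo); the pythonX.Y-foo
 packages themselves are left alone.
```

Only this small source package would need an upload at transition time.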
The pythonX.Y-foo source package and the pythonX.Y-foo binary packages
don't need to be touched.

> > Of course, supporting versions older than the default version is
> > rarely needed, except when there are applications that require such
> > older versions. So when 2.4 becomes the default, only 2.4 (and
> > perhaps 2.5) packages should be built.
>
> Don't you understand that it's even more added work to remove the
> legacy Python 2.1 to 2.3 packages in hundreds of packages?

So why remove them? If they aren't being rebuilt, they are not being
updated, and hence are not being re-mirrored. Sure, they use disk
space on the mirrors, but it's not that much, and they are not eating
bandwidth, which is the bit that really hurts.

Rather than adding the burden of removing legacy packages to the
transition process, by de-coupling them Python can transition first,
and the legacy packages can then be kept or removed at the individual
package maintainer's discretion later. This only applies to
multi-version supporting packages...

-- 
Donovan Baarda [EMAIL PROTECTED]