[Python-Dev] [GSoC] Porting on RPM3

2011-03-21 Thread Prashant Kumar
Hello,
My name is Prashant Kumar and I've worked on porting a few Python
libraries (distutils2, configobj), and I've been looking at the GSoC
ideas list for a project related to porting.

I came across [1] and found it interesting. It mentions that some of
the work has already been done; I would like to look at the code
repository for that work. Could someone send me a link to it?


Regards,
pkumar

[1] http://wiki.python.org/moin/RPMOnPython3


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread R. David Murray
On Mon, 21 Mar 2011 14:07:46 +0900, Stephen J. Turnbull step...@xemacs.org 
wrote:
 No, at best the DVCS workflow forces the developer on a branch to
 merge and test the revisions that will actually be added to the
 repository, and perhaps notice system-level anomalies before pushing.

hg does not force the developer to test, it only forces the merge.

As far as I can see, the only difference between hg and svn in this
regard is that svn merging was easier because, as you say, it was done
behind the scenes when one did a conflict-free commit.  If there were
conflicts, though, you had the same need to merge to tip as with hg,
and the same lack of any enforcement that the tests be run.

--
R. David Murray   http://www.bitdance.com


Re: [Python-Dev] Draft PEP and reference implementation of a Python launcher for Windows

2011-03-21 Thread Mark Hammond

On 21/03/2011 1:04 PM, Martin v. Löwis wrote:

Can you please add a summary of this discussion to
the PEP? (also, can you please check in the PEP, and

 give it a number?)

OK, I'll check it in once I get a PEP number allocated as per PEP 1,
updated to reflect some of the discussions in this thread.


Should I also check the reference implementation in?  Maybe next to the 
PEP text as pep--reference.py?


Thanks,

Mark


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Tim Golden

On 21/03/2011 02:49, Éric Araujo wrote:

I have been avoiding hg import because my understanding is that it
defaults to commit, and I don't see that it has any advantage over patch
itself.

“hg import” understands the extended diff format, which patch does not.
  (That format has been described a number of times already, see
http://mercurial.selenic.com/wiki/GitExtendedDiffFormat.)

“hg import --no-commit” is basically a patch command that understands
the extended format.  (Pro tip: it can be abbreviated to “hg import
--no-c”, as Mercurial accepts unambiguous abbreviations of commands and
options.)


A further tip in case it helps anyone: hg import (and its mq
counterpart hg qimport) can patch directly from a URL. This
is handy when I want to try out someone's patch directly from
the issue page on bugs.python.org. [Maybe everyone else knew
this, but I found it out by accident!]
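
For instance (an untested sketch; the URL is only a placeholder for a
patch attached to an issue):

  # apply a patch straight from the tracker without recording a changeset
  $ hg import --no-commit http://bugs.python.org/file12345/example.patch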

TJG


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Stephen J. Turnbull
R. David Murray writes:
  On Mon, 21 Mar 2011 14:07:46 +0900, Stephen J. Turnbull 
  step...@xemacs.org wrote:
   No, at best the DVCS workflow forces the developer on a branch to
   merge and test the revisions that will actually be added to the
   repository, and perhaps notice system-level anomalies before pushing.
  
  hg does not force the developer to test, it only forces the merge.

I didn't say any VCS forces the test; I said that the workflow can (in
the best case).  That's also inaccurate, of course.  I should have
said require, not force.

My understanding is that svn does not detect fast forwards, only lack
of conflicts, and therefore in case of concurrent development it is
possible that the repository contains a version that never existed in
any developer's workspace.  If that is not true, I apologize for the
misinformation.

If it is true, by definition developers cannot test or review what
hasn't existed in their workspace; that testing and review is
therefore imposed on the project as a whole, and perhaps not done
until more concurrent commits have been made.  On the other hand, in a
DVCS this can't happen under normal circumstances.

  As far as I can see, the only difference between hg and svn in this
  regard is that svn merging was easier, because, as you say, it was done
  behind the scenes when one did a conflict-free commit.

That's true from the point of view of the individual developer; the
DVCS requires more effort of her.  That is not true from the point of
view of the whole project however.

It would be possible for the svn-based workflow to require that after
testing in one's workspace, one does an svn update, and if any changes
are made to files in the workspace, the whole build and test procedure
must be repeated.  I don't see that that has advantages over the hg
workflow, though -- it should cause an additional build-test cycle in
exactly the same revision sequences that the hg workflow does.



Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Martin v. Löwis
 My understanding is that svn does not detect fast forwards, only lack
 of conflicts, and therefore in case of concurrent development it is
 possible that the repository contains a version that never existed in
 any developer's workspace.  

I can't understand how you draw this conclusion (therefore).

If you do an svn up, it merges local changes with remote changes;
if that works without conflicts, it tells you what files it merged,
but lets you commit.

Still, in this case, the merge result did exist in the sandbox
of the developer performing the merge. Subversion never ever creates
versions in the repository that didn't before exist in some working
copy. The notion that it may have done a server-side merge or some
such is absurd.

 If it is true, by definition developers cannot test or review what
 hasn't existed in their workspace; that testing and review is
 therefore imposed on the project as a whole, and perhaps not done
 until more concurrent commits have been made.

You make it sound as if you have never used subversion.

Regards,
Martin


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Martin v. Löwis
 A further tip in case it helps anyone: hg import (and its mq
 counterpart hg qimport) can patch directly from a URL. This
 is handy when I want to try out someone's patch directly from
 the issue page on bugs.python.org. [Maybe everyone else knew
 this, but I found it out by accident!]

Thanks - I didn't know, and it sounds useful.

Regards,
Martin


Re: [Python-Dev] [GSoC] Porting on RPM3

2011-03-21 Thread Martin v. Löwis
On 21.03.2011 07:37, Prashant Kumar wrote:
 Hello,
 My name is  Prashant Kumar and I've worked on porting few python
 libraries(distutils2, configobj) and I've been looking at the ideas
 list for GSoC for a project related to porting.
 
 I came across [1]  and found it interesting. It mentions that some
 of the work has already been done; I would like to look at the code
 repository for the same, could someone provide me the link for the
 same?

Not so much the code but the person who did the porting. This was Dave
Malcolm (CC'ed); please get in touch with him. Please familiarize
yourself with the existing Python bindings (in the latest RPM 4 release
from rpm.org). You'll notice that this already has Python 3 support;
not sure whether that's the most recent code, though.

Regards,
Martin


Re: [Python-Dev] Draft PEP and reference implementation of a Python launcher for Windows

2011-03-21 Thread Nick Coghlan
On Mon, Mar 21, 2011 at 5:16 PM, Mark Hammond mhamm...@skippinet.com.au wrote:
 On 21/03/2011 1:04 PM, Martin v. Löwis wrote:

 Can you please add a summary of this discussion to
 the PEP? (also, can you please check in the PEP, and

 give it a number?)

 OK, I'll check it in once I get a PEP number allocated as per PEP1, updated
 to reflect some of the discussions in this thread.

We should really update PEP 1 at some point to say that people with
commit rights are allowed to just grab the next number in the sequence
(the source repository effectively prevents conflicts if two people
happen to start PEPs at the same time). I've asked the PEP editors
about that in the past, and they were fine with the practice.

 Should I also check the reference implementation in?  Maybe next to the PEP
 text as pep--reference.py?

Generally the PEP directory is just for the PEPs themselves. Attaching
scripts to tracker items is a fairly common way of publishing
reference implementations, as is sticking them in an alternate repo
somewhere.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] VM and Language summit info for those not at Pycon (and those that are!)

2011-03-21 Thread Stefan Behnel

[long post ahead, again]

Guido van Rossum, 21.03.2011 03:46:

Thanks for the clarifications. I now have a much better understanding
of what Cython is. But I'm not sold. For one, your attitude about
strict language compatibility worries me when it comes to the stdlib.


Not sure what you mean exactly. Given our large user base, we do worry a 
lot about things like backwards compatibility, for example.


If you are referring to compatibility with Python, I don't think anyone in 
the project really targets Cython as a drop-in replacement for a Python 
runtime. We aim to compile Python code, yes, and there's a hand-wavy idea 
in the back of our head that we may want a plain Python compatibility mode 
at some point that will disable several important optimisations. But 
there's no real drive for that, simply because Cython users usually care a 
lot more about speed than about strict Python language compliance in 
dangerous areas like overridden builtins (such as 'range'). And Cython 
users know that they also have CPython available, which allows them to 
easily get 100% compatibility if they need it, be it through an import or 
by calling exec.


That being said, we do consider any deviation from Python language 
semantics a bug, and try to fix at least those with a user impact. 
Compatibility has improved a lot since the early days.




Also, I don't know how big it is


It's not small. The compiler is getting close to some 50,000 lines of 
Python code.




but it seems putting the cart before
the horse to use it to optimize the stdlib. Cython feels much less
mature than CPython;


It certainly is not completely stable, neither the language nor the
compiler, but it has been used for production code ever since the project
started (building on its original inheritance from Pyrex).


There are parts of the language that we are still fleshing out, but we try hard 
to keep the user impact low and to adhere to the expected Python 
semantics as closely as possible whenever we design new language features. 
Much of what we need to fix these days is actually due to different 
language semantics that originally appeared in Pyrex, or to differences 
between Python 2 and Python 3 that make it tricky for users to write 
portable code.




but the latter should only have dependencies that
themselves change even slower than CPython.


I understand that. C is certainly evolving a *lot* slower than Cython.

Personally, I wouldn't consider Cython a dependency even if CPython started 
using code written in Cython. It's more like a development tool, as users 
won't have to care if the generated C sources ship with the distribution. 
Only those who want to build from hg sources and distributors that patch 
impacted release sources will have to take care to install the 
corresponding Cython version. Shipping tested C sources is certainly the 
recommended way of using Cython.




I also am unclear on how
exactly you're supporting the different semantics in Python 2 vs. 3
without recompiling.


We try to make it easy for users to write portable code by keeping the code 
semantics fixed as much as possible once it's compiled. However, there are 
things that we don't currently fix. For example, we only try to keep 
builtins compatible as far as we consider reasonable. If you write


   x = range(5)

in your Cython code, you will get a list in Py2 and an iterator in Py3. If 
you write xrange(5), however, you will get an xrange object in Py2 and a 
range object in Py3. Same for unicode etc. We also don't change the API 
of the bytes type (returning integers on indexing in Python 3), even though 
it represents a major portability hassle for our users and also prevents 
several optimisations (and language features) that Cython could otherwise 
provide.


String semantics are actually quite complex inside of the Cython compiler 
(as the cross-Python/C/C++ type system in general) and were subject to 
major design/usability discussions in the past. We basically have three 
Python string types: bytes (Py2/3 bytes), unicode (Py2 unicode, Py3 str) 
and str (Py2/3 str), as well as C types like char/char* or Py_UCS4. The 
'str' type is needed because parts of CPython, its stdlib and external 
libraries actually require bytes in Python 2 (and it's sort-of the native 
string type for ASCII text there), but require unicode text in Python 3. To 
write portable code, you can use unprefixed string constants in Cython 
code, which will become the respective 'str' type in each of the runtime 
environments. That's a feature that our users appreciate a great deal, and
it is obviously modelled after 2to3.


However, since the API of 'str' isn't portable, you will only get a 
performance boost when you use the unicode (and, for portable operations, 
bytes) type, especially for looping, 'in' tests, etc. That will basically 
allow Cython to 'unbox' the strings into a C array, with the obvious 
optimisations like unboxed Unicode characters etc. As I said, quite a 
complex 

Re: [Python-Dev] cpython (3.2): Issue 7391: Remove questionable and outdated HOWTO document with permission

2011-03-21 Thread Senthil Kumaran
This push caught me by surprise too. So, +1 on having content to
similar effect.

On Sat, Mar 19, 2011 at 07:05:59PM +0100, Georg Brandl wrote:
 +1.  (Also I don't understand why we'd need permission from an author to
 *remove* content.)

And hypothetically, if the author refuses, what do we do? :)

-- 
Senthil


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Ned Deily
In article 4d871440.2000...@timgolden.me.uk,
 Tim Golden m...@timgolden.me.uk wrote:
 A further tip in case it helps anyone: hg import (and its mq
 counterpart hg qimport) can patch directly from a URL. This
 is handy when I want to try out someone's patch directly from
 the issue page on bugs.python.org. [Maybe everyone else knew
 this, but I found it out by accident!]

Using a URL can be useful, but be aware that for hg import, just as with
the patch utility, you need to know what strip value to use (the -p
parameter), if any.  So you generally need to examine the patch file
first.

hg qimport currently does not provide a way to specify the strip count; 
it requires the patch to have been generated with a strip count of 1.
 http://mercurial.selenic.com/bts/issue311 

hg import defaults to 1 but can be changed with -p.

Svn-generated diffs/patches generally need a strip count of 0.
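
For example (a sketch only, with placeholder URLs):

  # svn-style diff, paths without a/ b/ prefixes: strip nothing
  $ hg import --no-commit -p 0 http://bugs.python.org/file12345/fix-svn.diff
  # hg- or git-style diff: the default strip count of 1 is what you want
  $ hg import --no-commit http://bugs.python.org/file12345/fix-hg.diff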

-- 
 Ned Deily,
 n...@acm.org



Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread John Arbash Meinel

On 3/21/2011 10:44 AM, Martin v. Löwis wrote:
 My understanding is that svn does not detect fast forwards, only lack
 of conflicts, and therefore in case of concurrent development it is
 possible that the repository contains a version that never existed in
 any developer's workspace.  
 
 I can't understand how you draw this conclusion (therefore).
 
 If you do an svn up, it merges local changes with remote changes;
 if that works without conflicts, it tells you what files it merged,
 but lets you commit.
 
 Still, in this case, the merge result did exist in the sandbox
 of the developer performing the merge. Subversion never ever creates
 versions in the repository that didn't before exist in some working
 copy. The notion that it may have done a server-side merge or some
 such is absurd.

It does exactly that at the *tree* level, though not at the individual file level.

1) svn doesn't require you to run 'svn up' unless there is a direct
change to the file you are committing. So there is plenty of opportunity
to have cross-file failures.

 The standard example is I change foo.py's foo() function to add a new
 mandatory parameter. I 'svn up' and run the test suite, updating all
 callers to supply that parameter. You update bar.py to call foo.foo()
 not realizing I'm updating it. You 'svn up' and run the test suite.
 Both my test suite and your test suite were perfect. So we both 'svn
 commit'.
 There is now a race condition. Both of our commits will succeed.
 (revisions 10 and 11, let's say). Revision 11 is now broken (it will fail
 to pass the test suite.)

2) svn's default lack of tree-wide synchronization means that no matter
   how diligent we are, there is a race condition. (As I'm pretty sure
   both 'svn commit' operations can run concurrently, since they don't officially
   modify the same file.)

3) IIRC, there is a way to tell svn commit "this is my base revno;
   if the server is newer, abort the commit". It is used by bzr-svn to
   ensure we always commit tree-wide state safely. However, I don't
   think svn itself makes much use of it, or makes it easy to use.



Blindly merging in trunk / rebasing your changes has the same hole.
Though you at least can be aware that it is there, rather than the
system hiding the fact that you were out of date.

John
=:-


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread skip
Stephen It would be possible for the svn-based workflow to require that
Stephen after testing in one's workspace, one does an svn update, and
Stephen if any changes are made to files in the workspace, the whole
Stephen build and test procedure must be repeated.  I don't see that
Stephen that has advantages over the hg workflow, though -- it should
Stephen cause an additional build-test cycle in exactly the same revision
Stephen sequences that the hg workflow does.

It, however, requires every developer to become facile, if not expert, with
the ins and outs of the Python/Mercurial workflow.  This discourages casual
or intermittent contributions.  My main contribution to the Python codebase
over the past couple years has been to intercept trivial bug reports sent
to the webmaster address calling out typos in the documentation or the
website.  Handling such reports was trivial with Subversion.  Update, edit,
check in.  That is no longer the case with Mercurial.  (And for the website
will no longer be the case in the fairly near future if I understand
correctly.)

I believe it runs counter to the professed intention of the switch away from
a centralized version control system, namely to make it easier for more
people to contribute to Python.  It certainly seems harder for this old dog.

Skip



Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Nick Coghlan
On Mon, Mar 21, 2011 at 7:44 PM, Martin v. Löwis mar...@v.loewis.de wrote:
 If you do an svn up, it merges local changes with remote changes;
 if that works without conflicts, it tells you what files it merged,
 but lets you commit.

 Still, in this case, the merge result did exist in the sandbox
 of the developer performing the merge. Subversion never ever creates
 versions in the repository that didn't before exist in some working
 copy. The notion that it may have done a server-side merge or some
 such is absurd.

If you do an svn commit, and there are no files in conflict (or that
require merging), svn will let the commit go through. It doesn't care
if someone else may have updated *other* files in the meantime, so
long as none of the files in the current commit were touched. Thus, if
you don't do an svn up immediately before committing, you may get an
implicit merge of orthogonal changesets on the server. svn will only
complain if the changesets aren't orthogonal (i.e. file x.y is not up
to date). This may break the buildbots if, for example, one commit
changed an internal API in a backwards incompatible way, while the
later commit used that API (or vice versa).

hg broadens the check and complains if *any* files are not up to date
on any of the branches being pushed, thus making it a requirement to
do a hg pull and merge on all affected branches before the hg push can
succeed. In theory, this provides an opportunity for the developer
doing the merge to double check that it didn't break anything, in
practice (at least in the near term) we're more likely to see an
SVN-like practice of pushing the merged changes without rerunning the
full test suite.
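
In practice the required sequence looks roughly like this (a sketch; the
test step is exactly the part that is unlikely to be rerun every time):

  $ hg pull                  # fetch whatever arrived in the meantime
  $ hg merge                 # merge it with the local head, per affected branch
  $ make test                # ideally: rerun the test suite on the merged state
  $ hg commit -m "Merge"
  $ hg push                  # now succeeds, as no new remote head is created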

Just accepting that, and filtering python-checkins to make it easier
to skip over merge commits seems to cover the important points here,
though.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] [GSoC] Porting on RPM3

2011-03-21 Thread Nick Coghlan
On Mon, Mar 21, 2011 at 7:50 PM, Martin v. Löwis mar...@v.loewis.de wrote:
 On 21.03.2011 07:37, Prashant Kumar wrote:
 Hello,
     My name is  Prashant Kumar and I've worked on porting few python
 libraries(distutils2, configobj) and I've been looking at the ideas
 list for GSoC for a project related to porting.

     I came across [1]  and found it interesting. It mentions that some
 of the work has already been done; I would like to look at the code
 repository for the same, could someone provide me the link for the
 same?

 Not so much the code but the person who did the porting. This was Dave
 Malcolm (CC'ed); please get in touch with him. Please familiarize
 yourself with the existing Python bindings (in the latest RPM 4 release
 from rpm.org). You'll notice that this already has Python 3 support;
 not sure whether that's the most recent code, though.

Also, if you're interested in any other work on porting (possibly even
in the context of GSoC if you can find a mentor), I believe the
majority of folks working on that kind of thing are now coordinating
their efforts on the python-porting [1] mailing list.

Cheers,
Nick.

[1] http://mail.python.org/mailman/listinfo/python-porting

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Nick Coghlan
On Mon, Mar 21, 2011 at 9:14 PM,  s...@pobox.com wrote:
 I believe it runs counter to the professed intention of the switch away from
 a centralized version control system, to make it easier for more people to
 contribute to Python.  It certainly seems harder for this old dog.

I agree it's harder *now*, but I don't think it will stay that way. As
best practices like installing the whitespace hook client-side evolve
and are codified, the devguide can become a lot more prescriptive
for new (and current) users.

Remember, we knew from the beginning that core devs were going to see
the least benefit from the switch, and were at the most risk of having
to relearn things. The principal benefit of a DVCS (i.e. making it
far, far easier to keep a clone repository up to date so that
externally maintained patches are less likely to go stale) doesn't
really apply to us (although we can certainly take advantage of it if
we choose to - I did that myself when creating my own sandbox
repository).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Copyright notices

2011-03-21 Thread M.-A. Lemburg
Nadeem Vawda wrote:
 I was wondering what the policy is regarding copyright notices and license
 boilerplate text at the top of source files.
 
 I am currently rewriting the bz2 module (see 
 http://bugs.python.org/issue5863),
 splitting the existing Modules/bz2module.c into Modules/_bz2module.c and
 Lib/bz2.py.
 
 Are new files expected to include a copyright notice and/or license 
 boilerplate
 text? 

Since you'll be adding new IP to Python, the new code you write should
contain your copyright and the standard PSF contributor agreement
notice, e.g.


(c) Copyright 2011 by Nadeem Vawda. Licensed to PSF under a Contributor 
Agreement.


(please also make sure you have sent the signed agreement to the PSF;
see http://www.python.org/psf/contrib/)

We don't have a general copyright or license boilerplate for Python
source files.

 Also, is it necessary for _bz2module.c (new) to retain the copyright
 notices from bz2module.c (old)? In the tracker issue, Antoine said he didn't
 think so, but suggested that I get some additional opinions.

If the file copies significant code parts from older files, the
copyright notices from those files will have to be added to the
file comment as well - ideally with a note explaining to which parts
those copyrights apply and where they originated.

If you are replacing the old implementation with a new one,
you don't need to copy over the old copyright statements.

Thanks,
-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Mar 21 2011)
 Python/Zope Consulting and Support ...http://www.egenix.com/
 mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/
 mxODBC, mxDateTime, mxTextTools ...http://python.egenix.com/


::: Try our new mxODBC.Connect Python Database Interface for free ! 


   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
   Registered at Amtsgericht Duesseldorf: HRB 46611
   http://www.egenix.com/company/contact/


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Stephen J. Turnbull
Martin v. Löwis writes:
   My understanding is that svn does not detect fast forwards, only lack
   of conflicts, and therefore in case of concurrent development it is
   possible that the repository contains a version that never existed in
   any developer's workspace.
  
  I can't understand how you draw this conclusion (therefore).

A fast forward is a case where some ancestor of the workspace is the
tip of the repository.  When the tip is not an ancestor, it must
contain changes not yet in the workspace.  If a VCS does not check for
fast-forward, then if those changes are in files not changed in the
workspace, there will be no conflict, and in theory there could indeed
be a silent server-side merge.  QED, therefore.

This seems especially plausible for VCSes that allow only a subset of
files to be committed/pushed.

  Subversion never ever creates versions in the repository that
  didn't before exist in some working copy.

John Arbash-Meinel disagrees with you, so I think I'll go with his
opinion absent a really convincing argument otherwise.  No disrespect
to you intended, but John is an expert I've known for years.

  The notion that it may have done a server-side merge or some
  such is absurd.

False, quite possibly; I'm not an expert on Subversion internals.
Absurd, definitely not.  CVS does it (and much worse, but it certainly
does this too).

  You make it sound as if you have never used subversion.

These days, it's awfully hard to avoid using Subversion.  However, I
have no experience with committing in Python, I don't have enough
experience to claim to be authoritative, and I have never managed a
multiuser Subversion repository.



Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 04:09:35 +0100
Martin v. Löwis mar...@v.loewis.de wrote:
 Since Python 2.5, we maintain two versions of PyArg_ParseTuple:
 one outputting int; the other one outputting Py_ssize_t.
 
 The former should have been removed in 3.0, but this was forgotten.
 
 Still, I would like people to move over to the new version, so
 that extension modules will typically support 64-bit collections
 well. Therefore, I'd like to propose that the int version is deprecated
 in 3.3.

+1 !

 Given the recent discussion about backwards compatibility: what's
 the best approach? What warning should be emitted, if any?
 (the warning would only be generated if an s# or similar format
  was actually encountered - not just if merely PyArg_ParseTuple is
  called).

I'd say a DeprecationWarning. They are quiet by default anyway...

Regards

Antoine.




Re: [Python-Dev] Draft PEP and reference implementation of a Python launcher for Windows

2011-03-21 Thread Paul Moore
On 21 March 2011 01:54, Mark Hammond mhamm...@skippinet.com.au wrote:
 ie, let's say we are forced to choose between the following 3 options:

 * No launcher at all (the status-quo), causing demonstrable breakage in
 Windows file associations whenever Python 2.x and Python 3.x scripts exist
 on the same box.

 * An in-process launcher which caused breakage in a number of reasonably
 common scenarios for Python programmers, and such breakage could easily be
 demonstrated.

 * An out-of-process launcher which caused breakage for the hypothetical
 program mentioned above, of which no instance can be found and no breakage
 actually demonstrated.

 I personally would conclude that the last option is the least worst scenario
 by a wide margin.

I haven't had time to read the PEP yet, so my apologies if this is
made explicit there, but is the launcher expected to be solely for
implementing file associations? I thought there had been discussions
of using it to start the interactive interpreter, and for it having
command line arguments (implying direct command line usage). If it can
be used directly, there are many other scenarios that might be
impacted. Consider a service implemented using SRVANY which uses the
launcher. Stopping the service kills the launcher, leaving Python (and
the script, i.e. the service) running...

Paul.


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread R. David Murray
On Mon, 21 Mar 2011 18:33:00 +0900, Stephen J. Turnbull step...@xemacs.org 
wrote:
 R. David Murray writes:
   On Mon, 21 Mar 2011 14:07:46 +0900, Stephen J. Turnbull 
 step...@xemacs.org wrote:
No, at best the DVCS workflow forces the developer on a branch to
merge and test the revisions that will actually be added to the
    repository, and perhaps notice system-level anomalies before pushing.
   
   hg does not force the developer to test, it only forces the merge.
 
 I didn't say any VCS forces the test; I said that the workflow can (in
 the best case).  That's also inaccurate, of course.  I should have
 said require, not force.

The workflow in svn can require this same thing:  before committing,
you do an svn up and run the test suite.

The hg workflow can equally fail to require it:  before committing,
do an hg pull -u, merge heads, and *don't* run the test suite.

HG the tool does *NOT* change this aspect of things.  If this change
is to be made (tip should always be a repository state over which the
full regrtest suite has been run), then that is a *cultural* change that
we would need to make.  And could have made with svn.

We didn't.  We probably won't with hg, because the test suite takes
too long to run and the buildbots will do it anyway.

It's a discussion we could have, but as far as I can see it is completely
independent of the choice of tool.

Your point seems to boil down to the fact that many developers may
not have thought about the fact that committing to svn without an svn up
could mean they hadn't run regrtest over the *current* state of the repo.
This may be true, I don't know.  I did think about it.  When we moved
to hg, I did not re-think about it; nothing about the hg workflow
change forced me to re-think about it.  I did not change my habits,
and that includes not re-running the test suite after merging heads,
unless there's a conflict (just like svn).

That is, not only did the change in the tool and consequent change in
the workflow have *zero* impact on this aspect of the way I work with
CPython, it didn't even trigger me to *think* about it.

--
R. David Murray   http://www.bitdance.com


Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Eric Smith
 On Mon, 21 Mar 2011 04:09:35 +0100
 Martin v. Löwis mar...@v.loewis.de wrote:
 Since Python 2.5, we maintain two versions of PyArg_ParseTuple:
 one outputting int; the other one outputting Py_ssize_t.

 The former should have been removed in 3.0, but this was forgotten.

 Still, I would like people to move over to the new version, so
 that extension modules will typically support 64-bit collections
 well. Therefore, I'd like to propose that the int version is deprecated
 in 3.3.

 +1 !

Agreed.

 Given the recent discussion about backwards compatibility: what's
 the best approach? What warning should be emitted, if any?
 (the warning would only be generated if an s# or similar format
  was actually encountered - not just if merely PyArg_ParseTuple is
  called).

 I'd say a DeprecationWarning. They are quiet by default anyway...

Why not a PendingDeprecationWarning?

Eric.



Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Antoine Pitrou
 
  Given the recent discussion about backwards compatibility: what's
  the best approach? What warning should be emitted, if any?
  (the warning would only be generated if an s# or similar format
   was actually encountered - not just if merely PyArg_ParseTuple is
   called).
 
  I'd say a DeprecationWarning. They are quiet by default anyway...
 
 Why not a PendingDeprecationWarning?

Is there still a difference?




Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Victor Stinner
On Monday 21 March 2011 at 04:09 +0100, Martin v. Löwis wrote:
 Since Python 2.5, we maintain two versions of PyArg_ParseTuple:
 one outputting int; the other one outputting Py_ssize_t.
 
 The former should have been removed in 3.0, but this was forgotten.
 
 Still, I would like people to move over to the new version, so
 that extension modules will typically support 64-bit collections
 well. Therefore, I'd like to propose that the int version is deprecated
 in 3.3.

By the way, what is the status of the migration of CPython extensions to
Py_ssize_t? I suppose that adding a warning will quickly give an answer.
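
A rough way to check (sketched with hypothetical paths): the Py_ssize_t
variants are selected by defining PY_SSIZE_T_CLEAN before including
Python.h, so C files that call PyArg_Parse*() without defining it are
still using the int-based formats.

  # list extension sources that call PyArg_Parse* but never define PY_SSIZE_T_CLEAN
  $ grep -RL PY_SSIZE_T_CLEAN $(grep -Rl PyArg_Parse Modules/ --include='*.c')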

Victor



Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Stefan Behnel

Victor Stinner, 21.03.2011 15:21:

On Monday 21 March 2011 at 04:09 +0100, Martin v. Löwis wrote:

Since Python 2.5, we maintain two versions of PyArg_ParseTuple:
one outputting int; the other one outputting Py_ssize_t.

The former should have been removed in 3.0, but this was forgotten.

Still, I would like people to move over to the new version, so
that extension modules will typically support 64-bit collections
well. Therefore, I'd like to propose that the int version is deprecated
in 3.3.


By the way, what is the status of migration to Py_ssize_t of CPython
extensions? I suppose that adding a warning will quickly give an answer.


You'll get a series of very visible warnings from the C compiler when 
compiling a non-Py_ssize_t extension on a 64bit platform, which is a rather 
common platform type these days. So I'd doubt that there are any 
still-in-use extensions left that have not migrated.


Stefan



Re: [Python-Dev] Copyright notices

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 13:20:59 +0100
M.-A. Lemburg m...@egenix.com wrote:
 Nadeem Vawda wrote:
  I was wondering what the policy is regarding copyright notices and license
  boilerplate text at the top of source files.
  
  I am currently rewriting the bz2 module (see 
  http://bugs.python.org/issue5863),
  splitting the existing Modules/bz2module.c into Modules/_bz2module.c and
  Lib/bz2.py.
  
  Are new files expected to include a copyright notice and/or license 
  boilerplate
  text? 
 
 Since you'll be adding new IP to Python, the new code you write should
 contain your copyright and the standard PSF contributor agreement
 notice, e.g.

I agree with Raymond's argument that we shouldn't add *new* copyright
boilerplate:
http://mail.python.org/pipermail/python-dev/2009-January/085267.html

(the original question was about whether to keep the old one)

Authorship (and therefore IP) is much better documented by version
control than by static chunks of text. These often become hopelessly
outdated, and therefore give wrong information about who actually
authored the code.

Regards

Antoine.




Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Stephen J. Turnbull
s...@pobox.com writes:

  I believe it runs counter to the professed intention of the switch
  away from a centralized version control system, to make it easier
  for more people to contribute to Python.  It certainly seems harder
  for this old dog.

Well, you may be an old dog, but you're also an early adopter.  That
means both that you get to pay for our mistakes (our = authors of PEPs
374 and 385), and that it's going to take a while to disentangle
implementation issues from the real long-run costs and benefits.

Costs of transition were admitted up front.  The professed intention
was to make things *harder* in the short run (but as little as
possible!), while making contribution to Python significantly more
attractive (but not necessarily less work!) in the long run.  I don't
think anybody tried to hide the fact that changing habits would be
required, or to claim that it would be costless.  There were a few
people with a Pollyanna "try it, you'll like it" attitude, but
certainly those of us involved in PEP 374 knew better than that -- we
knew there were people like you whose patterns of contribution worked
just fine with the svn-based workflow and didn't need or want to
change.  That's why PEP 374 was necessary!

Yes, based on the description you give of your principal contribution
pattern, you take a complexity/effort hit in the transition.  I think
it can be alleviated quite a bit with the help of your reports, but
that will take some time.  All I can say about that time is "Sorry!"
and "Thank you for trying the system while it's still in beta."

I hope you will give it some more time to shake down.


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Hrvoje Niksic

On 03/21/2011 01:34 PM, Stephen J. Turnbull wrote:

Subversion never ever creates versions in the repository that
didn't before exist in some working copy.

John Arbash-Meinel disagrees with you, so I think I'll go with his
opinion


Besides, it's easy to confirm:

# create a repository and two checkouts:
[~/work]$ svnadmin create repo
[~/work]$ svn co file:///home/hniksic/work/repo checkout1
Checked out revision 0.
[~/work]$ svn co file:///home/hniksic/work/repo checkout2
Checked out revision 0.

# add a file to checkout 1
[~/work]$ cd checkout1
[~/work/checkout1]$ touch a && svn add a && svn commit -m c1
A a
Adding a
Transmitting file data .
Committed revision 1.

# now add a file to the second checkout without ever seeing
# the new file added to the first one
[~/work/checkout1]$ cd ../checkout2
[~/work/checkout2]$ touch b && svn add b && svn commit -m c2
A b
Adding b
Transmitting file data .
Committed revision 2.

The second commit would be rejected by a DVCS on the grounds of a merge 
with revision 1 never having happened.  What svn calls revision two is 
in reality based on revision 0, a fact the DVCS is aware of.
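
For comparison, a minimal sketch of the same situation in Mercurial
(assuming the shared repository already holds one common commit; hg
refuses the second push until the new head has been merged):

  $ hg clone repo checkout1 ; hg clone repo checkout2
  $ cd checkout1 && touch a && hg add a && hg commit -m c1 && hg push
  $ cd ../checkout2 && touch b && hg add b && hg commit -m c2
  $ hg push    # refused without -f: it would create a new remote head,
               # so checkout2 has to pull, merge (and ideally re-test) first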


The message "Committed revision 2.", while technically accurate, is 
misleading if you believe the revision numbers to apply to the entire 
tree (as the svn manual will happily point out).  It doesn't indicate 
that what you have in your tree when the message is displayed can be 
very different from the state of a freshly-checked-out revision 2.  In 
this case, it's missing the file a:


[~/work/checkout2]$ ls
b

This automatic merging often causes people who migrate to a DVCS to feel 
that they have to go through an unnecessary extra step in their 
workflows.  But once you grasp the hole in the svn workflow, what svn 
does (and what one used to take for granted) tends to become 
unacceptable, to put it mildly.


Hrvoje


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Adrian Buehlmann
On 2011-03-21 14:40, R. David Murray wrote:
 On Mon, 21 Mar 2011 18:33:00 +0900, Stephen J. Turnbull 
 step...@xemacs.org wrote:
 R. David Murray writes:
   On Mon, 21 Mar 2011 14:07:46 +0900, Stephen J. Turnbull 
 step...@xemacs.org wrote:
No, at best the DVCS workflow forces the developer on a branch to
merge and test the revisions that will actually be added to the
repository, and perhaps notice system-level anomolies before pushing.
   
   hg does not force the developer to test, it only forces the merge.

 I didn't say any VCS forces the test; I said that the workflow can (in
 the best case).  That's also inaccurate, of course.  I should have
 said require, not force.
 
 The workflow in svn can require this same thing:  before committing,
 you do an svn up and run the test suite.

But with svn you have to redo the test after the commit *if* someone
else committed just before you in the meantime, thereby changing the
preconditions behind your back and creating a different state of the
tree than the one you had when you ran your test.

With a DVCS, you can't push in that situation. At least not without
creating a new head (which would require --force in Mercurial).


Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Eric Smith

  Given the recent discussion about backwards compatibility: what's
  the best approach? What warning should be emitted, if any?
  (the warning would only be generated if an s# or similar format
   was actually encountered - not just if merely PyArg_ParseTuple is
   called).
 
  I'd say a DeprecationWarning. They are quiet by default anyway...

 Why not a PendingDeprecationWarning?

 Is there still a difference?

Only that it takes one extra release to go from pending deprecation to
deprecated to removed, as opposed to deprecated to removed.

Eric.



Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 06:14 AM, s...@pobox.com wrote:

It, however requires every developer to become facile, if not expert, with
the ins and outs of the Python/Mercurial workflow.  This discourages casual
or intermittent contributions.  My main contribution to the Python codebase
over the past couple years has been to intercept trivial bug reports sent
to the webmaster address calling out typos in the documentation or the
website.  Handling such reports was trivial with Subversion.  Update, edit,
check in.  That is no longer the case with Mercurial.  (And for the website
will no longer be the case in the fairly near future if I understand
correctly.)

I believe it runs counter to the professed intention of the switch away from
a centralized version control system, to make it easier for more people to
contribute to Python.  It certainly seems harder for this old dog.

Does Mercurial have a way of acting like a centralized vcs to the end user,
the way Bazaar does?  IOW, if Skip or others were more comfortable with a
centralized workflow (which is entirely valid imo), can they set up their
local workspace to enable that mode of operation?

If so, the devguide could describe that as a transitional step from the old
svn way of doing things.

-Barry




Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Victor Stinner
On Monday 21 March 2011 at 15:35 +0100, Stefan Behnel wrote:
 Victor Stinner, 21.03.2011 15:21:
  On Monday 21 March 2011 at 04:09 +0100, Martin v. Löwis wrote:
  Since Python 2.5, we maintain two versions of PyArg_ParseTuple:
  one outputting int; the other one outputting Py_ssize_t.
 
  The former should have been removed in 3.0, but this was forgotten.
 
  Still, I would like people to move over to the new version, so
  that extension modules will typically support 64-bit collections
  well. Therefore, I'd like to propose that the int version is deprecated
  in 3.3.
 
  By the way, what is the status of migration to Py_ssize_t of CPython
  extensions? I suppose that adding a warning will quickly give an answer.
 
 You'll get a series of very visible warnings from the C compiler when 
 compiling a non-Py_ssize_t extension on a 64bit platform, which is a rather 
 common platform type these days. So I'd doubt that there are any 
 still-in-use extensions left that have not migrated.

Which instruction emits a warning? If a module still uses int, the
compiler does not emit any warning on calls to PyArg_Parse*(), because the
size arguments are passed as pointers through the magical ... argument.

But when you switch to Py_ssize_t, you may get errors because Py_ssize_t
may be cast to a narrower type (like int).

Victor



Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 11:25:31 -0400
Barry Warsaw ba...@python.org wrote:
 
 Does Mercurial have a way of acting like a centralized vcs to the end user,
 the way Bazaar does?  IOW, if Skip or others were more comfortable with a
 centralized workflow (which is entirely valid imo), can they set up their
 local workspace to enable that mode of operation?

I believe something like "hg pull -u && hg ci && hg push" would
emulate such behaviour: it would first put your working copy in sync
with remote, then let you commit, then push your new change.
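
For someone who wants that as a single step, a shell alias would do (just
a sketch; the name is made up):

  # emulate an svn-style "update, commit, push" cycle in one command
  alias hgci='hg pull -u && hg ci && hg push'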

We cannot emulate svnmerge for porting between branches, though - and
I doubt bzr can do it. That's because merges in common DVCSes are based
on the DAG, while svnmerge is a pretty ad-hoc, free-form thing.

 If so, the devguide could describe that as a transitional step from the old
 svn way of doing things.

I think we should let things settle a few weeks before starting to
describe more workflows in the devguide.

Regards

Antoine.




Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 11:24:24 -0400 (EDT)
Eric Smith e...@trueblade.com wrote:
 
   Given the recent discussion about backwards compatibility: what's
   the best approach? What warning should be emitted, if any?
   (the warning would only be generated if an s# or similar format
was actually encountered - not just if merely PyArg_ParseTuple is
called).
  
   I'd say a DeprecationWarning. They are quiet by default anyway...
 
  Why not a PendingDeprecationWarning?
 
  Is there still a difference?
 
 Only that it takes one extra release to go from
 pending-deprecated-removed, as opposed to deprecated-removed.

Well, some things have been issuing deprecation warnings for several
releases, I think. And given Guido's latest stance on the subject we
may see more of them in the future.

Regards

Antoine.




Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 04:38 PM, Antoine Pitrou wrote:

On Mon, 21 Mar 2011 11:25:31 -0400
Barry Warsaw ba...@python.org wrote:
 
 Does Mercurial have a way of acting like a centralized vcs to the end user,
 the way Bazaar does?  IOW, if Skip or others were more comfortable with a
 centralized workflow (which is entirely valid imo), can they set up their
 local workspace to enable that mode of operation?

I believe something like "hg pull -u && hg ci && hg push" would
emulate such behaviour: it would first put your working copy in sync
with remote, then let you commit, then push your new change.

Actually, I meant something like 'bzr checkout':

http://doc.bazaar.canonical.com/bzr.2.3/en/user-reference/checkouts-help.html

This would allow individual developers to treat the repository in a
centralized way, like they did with svn, while still allowing other
developers to work in a distributed way.

We cannot emulate svnmerge for porting between branches, though - and
I doubt bzr can do it. That's because merges in common DVCSes are based
on the DAG, while svnmerge is a prettily ad-hoc free-form thing.

Sure.

 If so, the devguide could describe that as a transitional step from the old
 svn way of doing things.

I think we should let things settle a few weeks before starting to
describe more workflows in the devguide.

Okay.  I wonder if the merge dance will get easier now that the rush of
changes during Pycon settles down.

-Barry




Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 20, 2011, at 04:39 PM, Georg Brandl wrote:

On 20.03.2011 16:21, Guido van Rossum wrote:
 What is rebase? Why does everyone want it and hate it at the same time?

Basically, rebase is a way to avoid having pointless merge commits on the
same branch.

There's something I don't understand about rebase.  It seems like most git and
hg users I hear from advocate rebase, while (ISTM) few Bazaar users do.

I'd like to understand whether that's a cultural thing or whether it's a
byproduct of some aspect of the respective tools.

It could be cultural in that communities using git and hg don't want those
local commits to ever show up in their shared repository, even though they are
mostly harmless.  In this graph:

          A ----------------- B
         /                     \
... --- X --- C --- D --- E --- M

A and B do exist, but I shouldn't care or notice them unless I explicitly
drill down.  The mainline still goes from X to C to D to E to M, and looking
at the differences between E and M should tell me all I need to know.
I.e. specifically, I can ignore A and B for most purposes.

It could be that some aspect of the tools causes A and B to not be hidden as
well as they should, so that when looking at the history for example, the fact
that A and B exist is a jarring or annoying artifact that would be better if
they didn't exist.

I'm asking because I don't know hg and git well enough to answer the
question.  In my own use of Bazaar over the last 4+ years, I've almost never
rebased or even been asked to.  Not that some Bazaar users don't use rebase,
but I just don't think it's that common (John can correct me if I'm wrong).

I'm not trolling, I really want to understand.

-Barry


signature.asc
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Deprecating non-Py_ssize_t use of PyArg_ParseTuple

2011-03-21 Thread Stefan Behnel

Victor Stinner, 21.03.2011 16:26:

On Monday 21 March 2011 at 15:35 +0100, Stefan Behnel wrote:

Victor Stinner, 21.03.2011 15:21:

On Monday 21 March 2011 at 04:09 +0100, Martin v. Löwis wrote:

Since Python 2.5, we maintain two versions of PyArg_ParseTuple:
one outputting int; the other one outputting Py_ssize_t.

The former should have been removed in 3.0, but this was forgotten.

Still, I would like people to move over to the new version, so
that extension modules will typically support 64-bit collections
well. Therefore, I'd like to propose that the int version is deprecated
in 3.3.


By the way, what is the status of migration to Py_ssize_t of CPython
extensions? I suppose that adding a warning will quickly give an answer.


You'll get a series of very visible warnings from the C compiler when
compiling a non-Py_ssize_t extension on a 64bit platform, which is a rather
common platform type these days. So I'd doubt that there are any
still-in-use extensions left that have not migrated.


Which instruction emits a warning? If a module still uses int, the
compiler does not emit any warning on calls to PyArg_Parse*(), because
the size arguments are passed as pointers through the magical "..." argument.


Ah, I thought you were talking about Py_ssize_t migration in general, not 
specific to the PyArg_Parse*() functions. I faintly remember seeing lots of 
Py_ssize_t related warnings in Pyrex code ages ago (I think my first Pyrex 
patches date back to the time when Py2.5 came out). They were at least 
obvious to me at the time.


But now that I think about it, I guess it's a lot easier to miss a place 
like PyArg_Parse*() when manually migrating code. The conversion of the 
type string is really not obvious and the C compiler can't provide any 
help. I dropped their usage in Cython years ago for performance reasons, 
but I think Pyrex still uses them, and they are extremely common in hand 
written C code as well as code generated by several wrapper generators, 
which tend to have their own ways of declaring input types. So it's 
actually hard to tell how well extensions are prepared here.
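
For reference, the mechanical part of the conversion is small once you have
found the call sites. A minimal sketch (the function and format string are
just an example):

    #define PY_SSIZE_T_CLEAN      /* must come before Python.h */
    #include <Python.h>

    static PyObject *
    example(PyObject *self, PyObject *args)
    {
        const char *buf;
        Py_ssize_t len;           /* was: int len; */

        if (!PyArg_ParseTuple(args, "s#", &buf, &len))
            return NULL;
        return PyLong_FromSsize_t(len);
    }

The hard part is noticing which "#"-style format units are in use and which
C variables receive them.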


Stefan

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread skip

Thanks for the example, Hrvoje.

Hrvoje This automatic merging often causes people who migrate to a DVCS
Hrvoje to feel that they have to go through an unnecessary extra step
Hrvoje in their workflows.  But once you grasp the hole in the svn
Hrvoje workflow, what svn does (and what one used to take for granted)
Hrvoje tends to become unacceptable, to put it mildly.

In the run-up to a release when there is lots of activity happening, do you
find yourself in a race with other developers to push your changes cleanly?
Suppose I am ready to check in some changes.  I pull, merge, update.  Run
the unit tests again.  That takes awhile.  Then I go to push only to find
someone else has already pushed a changeset.  I then have to lather, rinse,
repeat.  This might happen more than once, especially in the last few days
before an alpha (or the first beta) release.  Historically, most of the
churn in Python's code base occurs at that time.
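
Concretely, the loop I have in mind is something like this (commands are
only a sketch):

    hg pull
    hg merge                  # if the pull brought in a new head
    make test                 # re-run the whole suite
    hg commit -m "Merge."
    hg push                   # refused if someone pushed first; start over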

I don't know how likely this would be to happen, though I can imagine if it
turns out to be a PITA w/ Mercurial that this is where most checkin problems
(fast-forward in Stephen's terminology) would happen with Subversion.

Skip
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread David Cournapeau
On Tue, Mar 22, 2011 at 1:20 AM, Barry Warsaw ba...@python.org wrote:
 On Mar 20, 2011, at 04:39 PM, Georg Brandl wrote:

On 20.03.2011 16:21, Guido van Rossum wrote:
 What is rebase? Why does everyone want it and hate it at the same time?

Basically, rebase is a way to avoid having pointless merge commits on the
same branch.

 There's something I don't understand about rebase.  It seems like most git and
 hg users I hear from advocate rebase, while (ISTM) few Bazaar users do.

I cannot talk for the communities in general, but that's mostly
cultural from what I have seen, although the tools reflect the
cultural aspect (like fast forward being the default for merge in
git). Hg up to recently did not have rebase at all, so it is less
ingrained it seems.

The reason why I like using rebase in general is because it fits the
way I like thinking about git: as a patch management system when I
work alone on a branch. I don't think private history is that useful
in general, but that depends on the projects of course. Another aspect
is that some people learnt git first through git-svn, where rebase is
the de facto way of working.

The issue of committing something that does not correspond to a tested
working tree is moot IMO: the only way to do it correctly and reliably
is to have some kind of gateway to actually test the commit before
adding it for good to the reference repo. In my experience, the most
common way to commit a broken working tree is to forget adding a new
file or removing an old file (e.g. the code depends on some old file
which has been removed while the old .pyc is still there), and none of
svn/bzr/hg/git prevents that.

David
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Stephen J. Turnbull
Barry Warsaw writes:

  Actually, I meant something like 'bzr checkout':

No.  Of the DVCSes, only bzr has that.

  This would allow individual developers to treat the repository in a
  centralized way like they did for svn, but still allowing other
  developers to work in a distributed way.

I don't think that is true for Python, because of the forward
porting workflow when a patch is relevant to several branches.  They
are still going to have to do a merge, and do it in the right
direction.  I think checkout only makes emulation of centralized
workflow trivial in a single-active-branch development model.  I could
be wrong, but offhand I don't see how it would work.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Paul Moore
On 21 March 2011 16:20, Barry Warsaw ba...@python.org wrote:
 It could be that some aspect of the tools causes A and B to not be hidden as
 well as they should, so that when looking at the history for example, the fact
 that A and B exist is a jarring or annoying artifact that would be better if
 they didn't exist.

My understanding is that Bazaar has a strong concept of the mainline
in the DAG, whereas hg (and presumably git, although I know little of
git) expose all parts of the DAG as equal.  So yes, A and B are not
hidden as well as they should be in hg and git.

There's a cultural argument over that "should", of course...

Paul.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 12:20:15 -0400
Barry Warsaw ba...@python.org wrote:
 On Mar 20, 2011, at 04:39 PM, Georg Brandl wrote:
 
 On 20.03.2011 16:21, Guido van Rossum wrote:
  What is rebase? Why does everyone want it and hate it at the same time?
 
 Basically, rebase is a way to avoid having pointless merge commits on the
 same branch.
 
 There's something I don't understand about rebase.  It seems like most git and
 hg users I hear from advocate rebase, while (ISTM) few Bazaar users do.

I don't think many hg users advocate rebase, really. AFAICT the
Mercurial developers themselves don't seem to use it (they do use mq,
OTOH).

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Raymond Hettinger

On Mar 21, 2011, at 8:25 AM, Barry Warsaw wrote:
 
 Does Mercurial have a way of acting like a centralized vcs to the end user,
 the way Bazaar does?  IOW, if Skip or others were more comfortable with a
 centralized workflow (which is entirely valid imo), can they set up their
 local workspace to enable that mode of operation?

I don't think that is the main source of complexity.

The more difficult and fragile parts of the workflow are:
* requiring commits to be cross-linked between branches
* and wanting changesets to be collapsed or rebased
  (two operations that destroy and rewrite history).


Raymond___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread R. David Murray
On Mon, 21 Mar 2011 15:59:51 +0100, Adrian Buehlmann adr...@cadifra.com wrote:
 On 2011-03-21 14:40, R. David Murray wrote:
  On Mon, 21 Mar 2011 18:33:00 +0900, Stephen J. Turnbull 
  step...@xemacs.org wrote:
  R. David Murray writes:
On Mon, 21 Mar 2011 14:07:46 +0900, Stephen J. Turnbull 
  step...@xemacs.org wrote:
 No, at best the DVCS workflow forces the developer on a branch to
 merge and test the revisions that will actually be added to the
 repository, and perhaps notice system-level anomolies before pushing.

hg does not force the developer to test, it only forces the merge.
 
  I didn't say any VCS forces the test; I said that the workflow can (in
  the best case).  That's also inaccurate, of course.  I should have
  said require, not force.
  
  The workflow in svn can require this same thing:  before committing,
  you do an svn up and run the test suite.
 
 But with svn you have to redo the test after the commit *if* someone
else committed just before you in the meantime, thereby changing the
 preconditions behind your back, thus creating a different state of the
 tree compared to the state in which it was at the time you ran your test.
 
 With a DVCS, you can't push in that situation. At least not without
 creating a new head (which would require --force in Mercurial).

So you are worried about the small window between me doing an 'svn up',
seeing no changes, and doing an 'svn ci'?  I suppose that is a legitimate
concern, but considering the fact that if the same thing happens in hg,
the only difference is that I know about it and have to do more work[*],
I don't think it really changes anything.  Well, it means that if your
culture uses the "always test" workflow you can't be *perfect* about it
if you use svn[**], which I must suppose has been your (and Stephen's)
point from the beginning.

[*] that is, I'm *not* going to rerun the test suite even if I have to
pull/up/merge, unless there are conflicts.

[**] Possibly you could do it using svn locking, but even if you
could it wouldn't be worth it.

--
R. David Murray   http://www.bitdance.com
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Tres Seaver

On 03/21/2011 10:55 AM, Stephen J. Turnbull wrote:
 s...@pobox.com writes:
 
   I believe it runs counter to the professed intention of the switch
   away from a centralized version control system, to make it easier
   for more people to contribute to Python.  It certainly seems harder
   for this old dog.
 
 Well, you may be an old dog, but you're also an early adopter.  That
 means both that you get to pay for our mistakes (our = authors of PEPs
 374 and 385), and that it's going to take a while to disentangle
 implementation issues from the real long-run costs and benefits.
 
 Costs of transition were admitted up front.  The professed intention
 was to make things *harder* in the short run (but as little as
 possible!), while making contribution to Python significantly more
 attractive (but not necessarily less work!) in the long run.

In our experience with migrating pyramid and repoze components from SVN
to github, the real wins come not from merging changes inside a
repository but from making it dirt-simple for people *without commit
privileges* to give me trivial-to-merge patches (via a public fork).  I
pooh-poohed that advantage at last year's PyCon when people were urging
us to move to github / bitbucket;  by this year's conference, I have
cheerfully eaten my words.

As the developer working inside the main repository, I don't find git
/ hg / bzr much easier to use than svn.  Because I care about projects
using all three, I have to know all three, while still keeping up svn
chops:  I find that mental overhead annoying.

  I don't
 think anybody tried to hide the fact that changing habits would be
 required, or to claim that it would be costless.  There were a few
people with a Pollyanna "try it, you'll like it" attitude, but
 certainly those of us involved in PEP 374 knew better than that -- we
 knew there were people like you whose patterns of contribution worked
 just fine with the svn-based workflow and didn't need or want to
 change.  That's why PEP 374 was necessary!
 
 Yes, based on the description you give of your principal contribution
 pattern, you take a complexity/effort hit in the transition.  I think
 it can be alleviated quite a bit with the help of your reports, but
 that will take some time.  All I can say about that time is Sorry!
 and Thank you for trying the system while it's still in beta.
 
 I hope you will give it some more time to shake down.

The push race problem in hg definitely bit us at PyCon this year while
sprinting to get WebOb's test coverage to 100% (preparing for the
Python3 port).  I haven't seen that issue so much with git or bzr, but I
may not have used them in such a race-friendly environment (ten or
eleven developers working in parallel on the WebOb test modules).


Tres.
- -- 
===
Tres Seaver  +1 540-429-0999  tsea...@palladion.com
Palladion Software   Excellence by Designhttp://palladion.com

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] VM and Language summit info for those not at Pycon (and those that are!)

2011-03-21 Thread Stefan Behnel

Stefan Behnel, 21.03.2011 11:58:

Guido van Rossum, 21.03.2011 03:46:

Have you tried replacing selected stdlib modules with their
Cython-optimized equivalents in some of the NumPy/SciPy distros? (E.g.
what about Enthought's Python distros?) Depending on how well that
goes I might warm up to Cython more!


Hmm, I hadn't heard about that before. I'll ask on our mailing list if
anyone's aware of them. I doubt that the stdlib participates in the
critical parts of scientific computation code. Maybe alternative CSV
parsers or something like that, but I'd be surprised if they were
compatible with what's in the stdlib.


Sorry, I misread your statement above. You were actually proposing to let 
us work on the existing stdlib modules, and then to get other, already 
Cython-enabled distributions to ship Cython compiled stdlib modules for 
testing.


Yes, I think this is a good idea. Their entry level for using Cython at 
build time will be much lower than for general CPython.


Would you say it's worth a GSoC project to get some of the Python stdlib 
modules compiled and/or some of the C modules rewritten in Cython? The 
Cython project would surely provide collective help through its user 
mailing list, code review, etc.


Stefan

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 01:19 PM, R. David Murray wrote:

So you are worried about the small window between me doing an 'svn up',
seeing no changes, and doing an 'svn ci'?  I suppose that is a legitimate
concern, but considering the fact that if the same thing happens in hg,
the only difference is that I know about it and have to do more work[*],
I don't think it really changes anything.  Well, it means that if your
culture uses the always test workflow you can't be *perfect* about it
if you use svn[**], which I must suppose has been your (and Stephen's)
point from the beginning.

[*] that is, I'm *not* going to rerun the test suite even if I have to
pull/up/merge, unless there are conflicts.

I think if we really want full testing of all changesets landing on
hg.python.org/cpython we're going to need a submit robot like PQM or Tarmac,
although the latter is probably too tightly wedded to the Launchpad API, and I
don't know if the former supports Mercurial.

With the benefits such robots bring, it's also important to understand the
downsides.  There are more moving parts to maintain, and because landings are
serialized, long test suites can sometimes cause significant backlogs.  Other
than during the Pycon sprints, the backlog probably wouldn't be that big.

Another complication we'd have is running the test suite cross-platform, but I
suspect that almost nobody does that today anyway.  So the buildbot farm would
still be important.

-Barry


signature.asc
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] VM and Language summit info for those not at Pycon (and those that are!)

2011-03-21 Thread Michael Foord

On 21/03/2011 17:47, Stefan Behnel wrote:

Stefan Behnel, 21.03.2011 11:58:

Guido van Rossum, 21.03.2011 03:46:

Have you tried replacing selected stdlib modules with their
Cython-optimized equivalents in some of the NumPy/SciPy distros? (E.g.
what about Enthought's Python distros?) Depending on how well that
goes I might warm up to Cython more!


Hmm, I hadn't heard about that before. I'll ask on our mailing list if
anyone's aware of them. I doubt that the stdlib participates in the
critical parts of scientific computation code. Maybe alternative CSV
parsers or something like that, but I'd be surprised if they were
compatible with what's in the stdlib.


Sorry, I misread your statement above. You were actually proposing to 
let us work on the existing stdlib modules, and then to get other, 
already Cython-enabled distributions to ship Cython compiled stdlib 
modules for testing.


Yes, I think this is a good idea. Their entry level for using Cython 
at build time will be much lower than for general CPython.


Would you say it's worth a GSoC project to get some of the Python 
stdlib modules compiled and/or some of the C modules rewritten in 
Cython? The Cython project would surely provide collective help 
through its user mailing list, code review, etc.


Assuming its possible it sounds like an awesome project.

Michael



Stefan

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/fuzzyman%40voidspace.org.uk



--
http://www.voidspace.org.uk/

May you do good and not evil
May you find forgiveness for yourself and forgive others
May you share freely, never taking more than you give.
-- the sqlite blessing http://www.sqlite.org/different.html

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] VM and Language summit info for those not at Pycon (and those that are!)

2011-03-21 Thread Maciej Fijalkowski
[skipping the whole long discussion]


 Cython is meant to compile Python code. A cython version would just be a
 pure Python module, usable with all other implementations, but with type
 annotations that make it compile to more optimal C code. Type annotations
 can be provided in an external file (.pxd), using decorators and/or using
 Python 3 annotation syntax. There's also a port of Cython to IronPython
 being written. Additionally, other implementations (such as PyPy) could
 start using the available type declarations in order to improve their
 optimisation capabilities as well.


No, PyPy won't be any faster by using Cython type annotations. That
would be missing the whole point: the JIT finds the types actually used,
without any need to tell it to do so. If you really want to, assert
isinstance(x, SomeType) would be good enough, but that won't
make anything faster. The places where PyPy is slower than Cython-compiled
code on numeric benchmarks are precisely where those type annotations
actually break Python semantics - for example overflow checking.

Also, what we (and the Unladen Swallow team) have found out so far is that
people are actually interested in benchmarks that are real world,
which in the Python world usually means using Twisted or Django
or another very large library, in which it is a bit hard to find a
hotspot. That usually means execution time is spread across a
million calls, which makes it infeasible (if not incorrect) to apply
type annotations. That's where (I think) speed.python.org will go,
because that's what most of the people who contributed benchmarks are
interested in.

We would be happy to run those benchmarks against any VM that can
reasonably run them. Also feel free to contribute a way to run (a
subset of) the benchmarks under Cython, but what we won't do is
provide a custom version of those benchmarks to run with Cython
(type annotations).

Cheers,
fijal
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 06:07 PM, Antoine Pitrou wrote:

On Mon, 21 Mar 2011 12:20:15 -0400
Barry Warsaw ba...@python.org wrote:
 On Mar 20, 2011, at 04:39 PM, Georg Brandl wrote:
 
 On 20.03.2011 16:21, Guido van Rossum wrote:
  What is rebase? Why does everyone want it and hate it at the same time?
 
 Basically, rebase is a way to avoid having pointless merge commits on the
 same branch.
 
 There's something I don't understand about rebase.  It seems like most git 
 and
 hg users I hear from advocate rebase, while (ISTM) few Bazaar users do.

I don't think many hg users advocate rebase, really. AFAICT the
Mercurial developers themselves don't seem to use it (they do use mq,
OTOH).

I guess that begs the question then. ;)  

What harm would there be in relaxing the SHOULD in this paragraph to MAY?

You should collapse changesets of a single feature or bugfix before pushing
the result to the main repository. The reason is that we don’t want the
history to be full of intermediate commits recording the private history of
the person working on a patch. If you are using the rebase extension, consider
adding the --collapse option to hg rebase. The collapse extension is another
choice.

-Barry


signature.asc
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Daniel Stutzbach
On Mon, Mar 21, 2011 at 8:38 AM, Antoine Pitrou solip...@pitrou.net wrote:

 We cannot emulate svnmerge for porting between branches, though - and
 I doubt bzr can do it. That's because merges in common DVCSes are based
 on the DAG, while svnmerge is a prettily ad-hoc free-form thing.


The equivalent way to how we had been using svnmerge would be to use hg
transplant to move patches between branches (and never merging the
branches).
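
With the transplant extension enabled, that would look roughly like this
(paths and revision are illustrative):

    cd cpython-3.3
    hg transplant --source ../cpython-3.2 REV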

Conversely, the current hg workflow would be similar to committing changes
to the earliest applicable svn branch, then doing a full svnmerge to later
branches.

-- 
Daniel Stutzbach
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] GSoC: speed.python.org

2011-03-21 Thread DasIch
Hello Guys,
I'm interested in participating in the Google Summer of Code this year
and I've been looking at projects in the Wiki, particularly
speed.pypy.org[1] as I'm very interested in the current VM
development. However given my knowledge that project raised several
questions:

1. Up until now the only 3.x implementation is CPython. IronPython,
Jython and PyPy don't support it - and to my knowledge won't get
support for it during or before GSoC - and could not benefit from it.
It seems that a comparison between a Python 2.x implementation and a
3.x implementation is rather pointless; so is this intended to be
rather an additional feature to have 3.x there as well?

2. As a follow-up to 1: It is not specified whether the benchmarks
should be ported using a tool such as 2to3, whether this should not
happen, or whether this is up to the student; this needs clarification. It
might be clearer if it were decided under which umbrella this project
is actually supposed to happen; will those ported benchmarks end up in
CPython or will there be a separate repository for all VMs?

3. Several benchmarks (at least the Django and Twisted ones) have
dependencies which are not (yet) ported to 3.x and porting those
dependencies during GSoC as part of this project is an unrealistic
goal. Should those benchmarks, at least for now, be ignored?

[1]: http://wiki.python.org/moin/SpeedDotPythonDotOrg
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 14:29:54 -0400
Barry Warsaw ba...@python.org wrote:
 
 I don't think many hg users advocate rebase, really. AFAICT the
 Mercurial developers themselves don't seem to use it (they do use mq,
 OTOH).
 
 I guess that begs the question then. ;)  
 
 What harm would there be in relaxing the SHOULD in this paragraph to MAY?
 
 You should collapse changesets of a single feature or bugfix before pushing
 the result to the main repository.

Because it's really SHOULD.
Apparently some people misunderstand this statement. "Collapse
changesets of a single feature or bugfix" doesn't mean you must avoid
merges. If that's the impression it gives, then the wording SHOULD (;-))
be changed.

The paragraph is aimed at the temptation people may have to commit many
changesets for a single feature/bugfix and push them all even though
some of them don't leave the source tree in a stable state. What it
says is that we don't want work-in-progress changesets in the public
history.

Again, a better wording is welcome.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Maciej Fijalkowski
On Mon, Mar 21, 2011 at 12:33 PM, DasIch dasdas...@googlemail.com wrote:
 Hello Guys,
 I'm interested in participating in the Google Summer of Code this year
 and I've been looking at projects in the Wiki, particularly
 speed.pypy.org[1] as I'm very interested in the current VM
 development. However given my knowledge that project raised several
 questions:

 1. Up until now the only 3.x Implementation is CPyhon. IronPython,
 Jython and PyPy don't support it - and to my knowledge won't get
 support for it during or before GSoC - and could not benefit from it.
 It seems that a comparison between a Python 2.x implementation and a
 3.x implementation is rather pointless; so is this intended to be
 rather an additional feature to have 3.x there as well?

 2. As a follow-up to 1: It is not specified whether the benchmarks
 should be ported using a tool such as 2to3, if this should not happen
 or if this is up to the student, this needs clarification. This may be
 more clear if it were considered under which umbrella this project
 is actually supposed to happen; will those ported benchmarks end up in
 CPython or will there be a separate repository for all VMs?

 3. Several benchmarks (at least the Django and Twisted ones) have
 dependencies which are not (yet) ported to 3.x and porting those
 dependencies during GSoC as part of this project is an unrealistic
 goal. Should those benchmarks, at least for now, be ignored?

 [1]: http://wiki.python.org/moin/SpeedDotPythonDotOrg
 ___
 Python-Dev mailing list
 Python-Dev@python.org
 http://mail.python.org/mailman/listinfo/python-dev
 Unsubscribe: 
 http://mail.python.org/mailman/options/python-dev/fijall%40gmail.com


Hi.

There might be two independent SoCs but as far as I know, a very
interesting SoC on its own (and one I'm willing to mentor) would be
to:

1. Get this stuff running on speed.python.org
2. Improve backend infrastructure so it actually *can* easily run
multiple VMs, especially that some benchmarks won't run on all.
3. Fix some bugs in frontend, improve things that don't look quite as
good if that should be for the general Python community.

Note that this is a bit orthogonal to porting the benchmarks to Python 3.
Also, if no one ran those benchmarks, just porting them to Python 3
would make little sense.

Cheers,
fijal
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 19:33:55 +0100
DasIch dasdas...@googlemail.com wrote:
 
 3. Several benchmarks (at least the Django and Twisted ones) have
 dependencies which are not (yet) ported to 3.x and porting those
 dependencies during GSoC as part of this project is an unrealistic
 goal. Should those benchmarks, at least for now, be ignored?

Why not reuse the benchmarks in http://hg.python.org/benchmarks/ ?
Many of them are 3.x-compatible.
I don't understand why people are working on multiple benchmark suites
without cooperating these days.
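
Running them is just a matter of cloning the repo and pointing the runner
at two interpreters, e.g. (a sketch; paths are illustrative):

    hg clone http://hg.python.org/benchmarks
    cd benchmarks
    python perf.py ../baseline/python ../patched/python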

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Jesse Noller
Some remarks below.

On Mon, Mar 21, 2011 at 2:33 PM, DasIch dasdas...@googlemail.com wrote:
 Hello Guys,
 I'm interested in participating in the Google Summer of Code this year
 and I've been looking at projects in the Wiki, particularly
 speed.pypy.org[1] as I'm very interested in the current VM
 development. However given my knowledge that project raised several
 questions:

 1. Up until now the only 3.x Implementation is CPyhon. IronPython,
 Jython and PyPy don't support it - and to my knowledge won't get
 support for it during or before GSoC - and could not benefit from it.
 It seems that a comparison between a Python 2.x implementation and a
 3.x implementation is rather pointless; so is this intended to be
 rather an additional feature to have 3.x there as well?


You are correct: But the point of this GSOC project is to do the
porting, the initial deployment of speed.python.org will be on the 2.x
implementation with 3.x to follow as other VMs migrate.

 2. As a follow-up to 1: It is not specified whether the benchmarks
 should be ported using a tool such as 2to3, if this should not happen
 or if this is up to the student, this needs clarification. This may be
 more clear if it were considered under which umbrella this project
 is actually supposed to happen; will those ported benchmarks end up in
 CPython or will there be a separate repository for all VMs?


We will have a common repository for all benchmarks, for all of the
implementations.

 3. Several benchmarks (at least the Django and Twisted ones) have
 dependencies which are not (yet) ported to 3.x and porting those
 dependencies during GSoC as part of this project is an unrealistic
 goal. Should those benchmarks, at least for now, be ignored?


IMHO: Yes. I think MvL can expand on this as well.

 [1]: http://wiki.python.org/moin/SpeedDotPythonDotOrg

FYI, I recommend coordinating with Miquel Torres (cc'ed) and Maciej
Fijalkowski from PyPy on these questions / this project as well. I am
currently coordinating getting the hardware set up for this project's
hosting.

jesse
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Jesse Noller
On Mon, Mar 21, 2011 at 2:48 PM, Antoine Pitrou solip...@pitrou.net wrote:
 On Mon, 21 Mar 2011 19:33:55 +0100
 DasIch dasdas...@googlemail.com wrote:

 3. Several benchmarks (at least the Django and Twisted ones) have
 dependencies which are not (yet) ported to 3.x and porting those
 dependencies during GSoC as part of this project is an unrealistic
 goal. Should those benchmarks, at least for now, be ignored?

 Why not reuse the benchmarks in http://hg.python.org/benchmarks/ ?
 Many of them are 3.x-compatible.
 I don't understand why people are working on multiple benchmark suites
 without cooperating these days.

 Regards

 Antoine.

Antoine: The goal is to *get* PyPy, Jython, IronPython, etc all using
a common set of benchmarks. The idea stems from http://speed.pypy.org/
- those benchmarks grew from the unladen swallow benchmarks, which I
think are the ones in http://hg.python.org/benchmarks/.

So, yeah - the goal is to get us all reading off the same page. The
PyPy people can speak to the changes they made/had to make.

jesse
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Antoine Pitrou
On Monday 21 March 2011 at 11:33 -0700, Daniel Stutzbach wrote:
 On Mon, Mar 21, 2011 at 8:38 AM, Antoine Pitrou solip...@pitrou.net
 wrote:
 We cannot emulate svnmerge for porting between branches,
 though - and
 I doubt bzr can do it. That's because merges in common DVCSes
 are based
 on the DAG, while svnmerge is a prettily ad-hoc free-form
 thing.
 
 
 The equivalent way to how we had been using svnmerge would be to use
 hg transplant to move patches between branches (and never merging the
 branches).

Yes, this has been discussed several times before the migration.
Using hg transplant is seriously suboptimal compared to the model
promoted by a DAG of changesets-based DVCS. Also, transplant has its
own weaknesses.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Daniel Stutzbach
On Sun, Mar 20, 2011 at 9:39 AM, Georg Brandl g.bra...@gmx.net wrote:

 Now, hg pull --rebase prevents that by re-basing the A-B history
 line onto the latest remote head.  After rebasing, the history looks
 like this:

 ... --- X --- C --- D --- E --- A' --- B'


Rebasing also allows you to collapse the commits, so you can make the tree
look like this:

... --- X --- C --- D --- E --- AB'

Collapsing commits is useful for keeping the repository clean.  For example,
I make commit A and send it for review.  Someone points out some flaws and I
make commit B.  I don't want the flawed version checked in to the main
repository, so I collapse the commits into one commit AB'.
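
Mechanically, with the rebase extension enabled, that is something like
the following (an untested sketch; the exact revisions depend on your
history):

    hg pull
    hg rebase --dest default --collapse   # my A and B become one changeset AB'
    hg push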

Keeping the repository clean makes it easier to use a bisection search to
hunt down the introduction of a bug.  If  every developer's intermediate
commits make it into the main repository, it's hard to go back to an older
revision to test something, because many of the older revisions will be
broken in some way.
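
That's where hg bisect earns its keep; a sketch (revision names are
illustrative):

    hg bisect --reset
    hg bisect --bad             # the working copy shows the bug
    hg bisect --good v3.2       # a revision known to be fine
    # hg updates to a revision halfway in between; test it, then repeat:
    hg bisect --good            # or --bad, until the culprit is reported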

In reality
 it works fine if you know the limits: rebasing really only should be
 applied if the changesets are not already known somewhere else,
 only in the local repo you're working with.


The changesets must only be in the local repo *and* they must not have been
merged into another branch yet.

On Sun, Mar 20, 2011 at 9:21 AM, Guido van Rossum gu...@python.org wrote:

 Why does everyone want it and hate it at the same time?


People love it because it's a very powerful tool.  People hate it because it
allows you to shoot yourself in the foot.

-- 
Daniel Stutzbach
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Antoine Pitrou
On Monday 21 March 2011 at 14:51 -0400, Jesse Noller wrote:
 On Mon, Mar 21, 2011 at 2:48 PM, Antoine Pitrou solip...@pitrou.net wrote:
  On Mon, 21 Mar 2011 19:33:55 +0100
  DasIch dasdas...@googlemail.com wrote:
 
  3. Several benchmarks (at least the Django and Twisted ones) have
  dependencies which are not (yet) ported to 3.x and porting those
  dependencies during GSoC as part of this project is an unrealistic
  goal. Should those benchmarks, at least for now, be ignored?
 
  Why not reuse the benchmarks in http://hg.python.org/benchmarks/ ?
  Many of them are 3.x-compatible.
  I don't understand why people are working on multiple benchmark suites
  without cooperating these days.
 
  Regards
 
  Antoine.
 
 Antoine: The goal is to *get* PyPy, Jython, IronPython, etc all using
 a common set of benchmarks. The idea stems from http://speed.pypy.org/
 - those benchmarks grew from the unladen swallow benchmarks, which I
 think are the ones in http://hg.python.org/benchmarks/.

Ok, I can stand corrected. Last time I looked, speed.pypy.org had an
entirely disjoint set of benchmarks.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread skip

Antoine Why not reuse the benchmarks in
Antoine http://hg.python.org/benchmarks/ ?

These looks like basically the same benchmarks as the Unladen Swallow folks
put together, right?  Is there any value in them as regression tests (maybe
with more elaborate inputs and/or longer runtimes)?  If so, I think there is
value it having both 2.x and 3.x versions, even if the 3.x stuff won't be
useful for speed comparisons.

Skip
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread skip

Daniel If every developer's intermediate commits make it into the main
Daniel repository, it's hard to go back to an older revision to test
Daniel something, because many of the older revisions will be broken in
Daniel some way.

This is what I discovered with my trivial doc patch last week.  I was quite
surprised to see all my local checkins turn up on the python-checkins
mailing list.  Is there not some way to automatically collapse a series of
local commits into one large commit when you do the push?

Skip
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Antoine Pitrou
On Monday 21 March 2011 at 14:06 -0500, s...@pobox.com wrote:
 Antoine Why not reuse the benchmarks in
 Antoine http://hg.python.org/benchmarks/ ?
 
 These looks like basically the same benchmarks as the Unladen Swallow folks
 put together, right?

Yes, it's basically the continuation of their work. Most changes since
then have been relatively minor.

 Is there any value in them as regression tests (maybe
 with more elaborate inputs and/or longer runtimes)?

You mean to check behaviour or to check for performance regressions?
IMHO the test suite is long enough to run already and performance tests
should be kept in a separate suite. Whether that suite should be
separate from the main distribution is another question.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread skip

 Is there any value in them as regression tests (maybe with more
 elaborate inputs and/or longer runtimes)?

Antoine You mean to check behaviour or to check for performance
Antoine regressions?

Both.  Semantic regressions, and secondarily, performance regressions.
I can understand the need to stabilize the code and inputs for measuring
performance.  As bad a benchmark as pystone is, one of its saving graces is
that it hasn't changed in so long.

When looking for semantic problems I can see that you would add new packages
to the suite where they expose problems, just as we do today for unit tests.

Skip
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 14:30:45 -0500
s...@pobox.com wrote:
 
  Is there any value in them as regression tests (maybe with more
  elaborate inputs and/or longer runtimes)?
 
 Antoine You mean to check behaviour or to check for performance
 Antoine regressions?
 
 Both.  Semantic regressions, and secondarily, performance regressions.
 I can understand the need to stabilize the code and inputs for measuring
 performance.  As bad a benchmark as pystone is, one of its saving graces is
 that it hasn't changed in so long.

I think that semantic regressions would be better checked by running
the test suites of these third-party libraries, not benchmarks.
Ideally a separate suite of buildbots or builders could do that. We had
the community buildbots but it seems they ran out of steam.

Regards

Antoine.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Raymond Hettinger

On Mar 21, 2011, at 11:56 AM, Daniel Stutzbach wrote:
 
 People love it because it's a very powerful tool.  People hate it because it 
 allows you to shoot yourself in the foot.

There's a certain irony in this.   The original motivation for version control
was to be a safety rope, to serve as a productivity tool to make sure
that work never got lost.

Now we seem to be advocating a complex, fragile workflow that
is hard to learn, hard to get right, that lets you shoot yourself in 
the foot, and that has rebasing/collapsing steps that destroy and 
rewrite history (and possibly muck up your repo if there was an
intervening push).

If we gave up on the "svnmerge on steroids" workflow,
the use of Hg would become dirt simple.   I've used it that way
in personal projects for a couple months and it is 
remarkably easy, taking only minutes to learn.
It also works with Hg right out of the box; no need
for extensions, customizations, or a slew of advanced Hg
features.

If someone has to completely master nuances of Hg
to follow the required workflow, then we're doing it wrong.

ISTM, there has been substantial mission creep from 
the workflow described in the PEP.  If the current workflow
had been described there, I don't think it would have been
readily accepted.


Raymond




___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 12:40:08 -0700
Raymond Hettinger raymond.hettin...@gmail.com wrote:
 
 Now we seem to be advocating a complex, fragile workflow that
 is hard to learn, hard to get right, that let's you shoot yourself in 
 the foot, and that has rebasing/collapsing steps that destroy and 
 rewrite history (an possibly muck-up your repo if there was an
 intervening push).

FWIW, rebase is *not* advocated in the devguide. It's not even a
suggestion. The only sentence which mentions it starts with

   If you are using the rebase_ extension

which is a pretty clear hint that it's an individual choice and not
something we recommend.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Georg Brandl
On 21.03.2011 20:40, Raymond Hettinger wrote:
 
 On Mar 21, 2011, at 11:56 AM, Daniel Stutzbach wrote:


 People love it because it's a very powerful tool.  People hate it because it
 allows you to shoot yourself in the foot.
 
 There's a certain irony in this.   The original motivation for version control
 was to be a safety rope, to serve as a productivity tool to make sure
 that work never got lost.

 Now we seem to be advocating a complex, fragile workflow that
 is hard to learn, hard to get right, that let's you shoot yourself in 
 the foot, and that has rebasing/collapsing steps that destroy and 
 rewrite history (an possibly muck-up your repo if there was an
 intervening push).

That last one is not true.

 If we gave-up on the svnmerge on steroids workflow,
 the use of Hg would become dirt simple.   I've used it that way
 in personal projects for a couple months and it is 
 remarkably easy, taking only minutes to learn.
 It also works with Hg right out of the box; no need
 for extensions, customizations, or a slew of advanced Hg
 features.

Whether you use rebase or not is completely unrelated to the branch
workflow.  (Indeed, without branch merging rebase works even better,
since as we saw it doesn't handle branch merges.)

Georg

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Ben Finney
Barry Warsaw ba...@python.org writes:

 There's something I don't understand about rebase. It seems like most
 git and hg users I hear from advocate rebase, while (ISTM) few Bazaar
 users do.

 I'd like to understand whether that's a cultural thing or whether it's
 a byproduct of some aspect of the respective tools.

As I understand it, the justification usually given for rewriting
history is so that others get a clean view of what one has done.

As a user of Bazaar primarily, that's addressing the problem in the
wrong place: why rewrite *my* history, which is useful to me as is, when
the other person is using Bazaar and so doesn't see revisions they don't
care about?

The advantages given for rewriting history (“don't show individual
commits that went into a merge”) are null for a Bazaar user. Bazaar
doesn't show me those commits within merges anyway, unless I ask for
them.

That is, when showing the log of a branch, each merge appears as a
single entry, unless I ask to expand levels when viewing them. The
detailed revision data is always there, but it doesn't get in the way
unless I ask for it.
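
For example (a sketch):

    bzr log        # each merge shows up as a single entry
    bzr log -n0    # expand every level to see the individual revisions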

That seems to me the ideal: preserve all revision history for those
cases when some user will care about it, but *present* history cleanly
by default.

Whether adding support in Mercurial or Git for similar
clean-presentation-by-default would obviate the need for rewriting
history, I can't tell.

-- 
 \  “The best mind-altering drug is truth.” —Jane Wagner, via Lily |
  `\Tomlin |
_o__)  |
Ben Finney


pgpJwi4AZxO8c.pgp
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Georg Brandl
On 21.03.2011 20:09, s...@pobox.com wrote:
 
 Daniel If every developer's intermediate commits make it into the main
 Daniel repository, it's hard to go back to an older revision to test
 Daniel something, because many of the older revisions will be broken in
 Daniel some way.
 
 This is what I discovered with my trivial doc patch last week.  I was quite
 surprised to see all my local checkins turn up on the python-checkins
 mailing list.  Is there not some way to automatically collapse a series of
 local commits into one large commit when you do the push?

There is, but this is again changing history, with all the possible
benefits and caveats that have been shown in this thread.

Georg


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 11:56 AM, Daniel Stutzbach wrote:

Keeping the repository clean makes it easier to use a bisection search to
hunt down the introduction of a bug.  If  every developer's intermediate
commits make it into the main repository, it's hard to go back to an older
revision to test something, because many of the older revisions will be
broken in some way.

So maybe this gets at my earlier question about rebase being cultural
vs. technology, and the response about bzr having a strong sense of mainline
where hg doesn't.

I don't use the bzr-bisect plugin too much, but I think by default it only
follows commits on the main line, unless a bisect point is identified within a
merge (i.e. side) line.  So again, those merged intermediate changes are
mostly ignored until they're needed.

-Barry



signature.asc
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 08:58 PM, Georg Brandl wrote:

On 21.03.2011 20:09, s...@pobox.com wrote:
 
 Daniel If every developer's intermediate commits make it into the main
 Daniel repository, it's hard to go back to an older revision to test
 Daniel something, because many of the older revisions will be broken in
 Daniel some way.
 
 This is what I discovered with my trivial doc patch last week.  I was quite
 surprised to see all my local checkins turn up on the python-checkins
 mailing list.  Is there not some way to automatically collapse a series of
 local commits into one large commit when you do the push?

There is, but this is again changing history, with all the possible
benefits and caveats that have been shown in this thread.

I think Ben Finney hit the nail on the head here.

-Barry


signature.asc
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 22, 2011, at 06:57 AM, Ben Finney wrote:

Barry Warsaw ba...@python.org writes:

 There's something I don't understand about rebase. It seems like most
 git and hg users I hear from advocate rebase, while (ISTM) few Bazaar
 users do.

 I'd like to understand whether that's a cultural thing or whether it's
 a byproduct of some aspect of the respective tools.

As I understand it, the justification usually given for rewriting
history is so that others get a clean view of what one has done.

As a user of Bazaar primarily, that's addressing the problem in the
wrong place: why rewrite *my* history, which is useful to me as is, when
the other person is using Bazaar and so doesn't see revisions they don't
care about?

The advantages given for rewriting history (“don't show individual
commits that went into a merge”) are null for a Bazaar user. Bazaar
doesn't show me those commits within merges anyway, unless I ask for
them.

That is, when showing the log of a branch, each merge appears as a
single entry, unless I ask to expand levels when viewing them. The
detailed revision data is always there, but it doesn't get in the way
unless I ask for it.

That seems to me the ideal: preserve all revision history for those
cases when some user will care about it, but *present* history cleanly
by default.

Whether adding support in Mercurial or Git for similar
clean-presentation-by-default would obviate the need for rewriting
history, I can't tell.

Thanks, Ben, for such a clear description.  This jibes with my observations
exactly.

Cheers,
-Barry



signature.asc
Description: PGP signature
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread DasIch
On Mon, Mar 21, 2011 at 7:48 PM, Antoine Pitrou solip...@pitrou.net wrote:
 On Mon, 21 Mar 2011 19:33:55 +0100
 DasIch dasdas...@googlemail.com wrote:

 3. Several benchmarks (at least the Django and Twisted ones) have
 dependencies which are not (yet) ported to 3.x and porting those
 dependencies during GSoC as part of this project is an unrealistic
 goal. Should those benchmarks, at least for now, be ignored?

 Why not reuse the benchmarks in http://hg.python.org/benchmarks/ ?
 Many of them are 3.x-compatible.
 I don't understand why people are working on multiple benchmark suites
 without cooperating these days.

I haven't looked too closely, but those benchmarks appear to be the ones
developed by the unladen swallow guys and those are used by PyPy among
others.

The difference is that PyPy has more benchmarks, particularly ones that
measure the performance of real-world applications. As good benchmarks are
very hard to come by, those appear to be a better starting point.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 07:38 PM, Antoine Pitrou wrote:

On Mon, 21 Mar 2011 14:29:54 -0400
Barry Warsaw ba...@python.org wrote:
 
 I don't think many hg users advocate rebase, really. AFAICT the
 Mercurial developers themselves don't seem to use it (they do use mq,
 OTOH).
 
 I guess that begs the question then. ;)  
 
 What harm would there be in relaxing the SHOULD in this paragraph to MAY?
 
 You should collapse changesets of a single feature or bugfix before pushing
 the result to the main repository.

Because it really is a SHOULD.
Apparently some people misunderstand this statement: "collapse
changesets of a single feature or bugfix" doesn't mean you must avoid
merges. If that's the impression it gives, then the wording SHOULD (;-))
be changed.

The paragraph is aimed at the temptation people may have to commit many
changesets for a single feature/bugfix and push them all even though
some of them don't leave the source tree in a stable state. What it
says is that we don't want work-in-progress changesets in the public
history.

Again, a better wording is welcome.

I guess it depends on what "work-in-progress changesets" means. ;)

If I'm working on a new feature, I am going to make lots of local commits, any
one of which may not actually be stable.  However, when my work on that
feature branch completes, I will have a fully functional, stable branch that's
ready to merge into the default branch.

As Ben described clearly, with Bazaar, I'd just merge my work-in-progress
branch to default and be done.  Tools such as bisect and log would ignore all
my intermediate changes by default, although you *can* drill down into them if
you want.  But I take it that with our Mercurial workflow, we'd rather all
those intermediate commits in my local branch were manually collapsed before I
merge to default.

My discomfort with this is not just that it changes history, but that it
throws away valuable information.  Sure, you're not going to care if I fixed a
typo in NEWS, but you might indeed care that I've addressed the issues you
raised in your first, second, and third reviews.  Each of those would be
represented by a changeset in my local line of development, and by a side
branch in the mainline DAG once my merge is completed.  You might want to dig
into that sideline to see if indeed I addressed the issues in your second
review of my code.  If we have to manually collapse changesets at feature
branch merge time, you can't do that.

Cheers,
-Barry


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Tres Seaver
On 03/21/2011 04:33 PM, Barry Warsaw wrote:
 On Mar 21, 2011, at 07:38 PM, Antoine Pitrou wrote:
 
 On Mon, 21 Mar 2011 14:29:54 -0400
 Barry Warsaw ba...@python.org wrote:

 I don't think many hg users advocate rebase, really. AFAICT the
 Mercurial developers themselves don't seem to use it (they do use mq,
 OTOH).

 I guess that begs the question then. ;)  

 What harm would there be in relaxing the SHOULD in this paragraph to MAY?

 You should collapse changesets of a single feature or bugfix before pushing
 the result to the main repository.

 Because it's really SHOULD.
 Apparently some people misunderstand this statement. collapse
 changesets of a single feature or bugfix doesn't mean you must avoid
 merges. If that's the impression it gives then the wording SHOULD (;-))
 be changed.

 The paragraph is aimed at the temptation people may have to commit many
 changesets for a single feature/bugfix and push them all even though
 some of them don't leave the source tree in a stable state. What it
 says is that we don't want work-in-progress changesets in the public
 history.

 Again, a better wording is welcome.
 
 I guess it depends on what work-in-progress changesets means. ;)
 
 If I'm working on a new feature, I am going to make lots of local commits, any
 one of which may not actually be stable.  However, when my work on that
 feature branch completes, I will have a fully functional, stable branch that's
 ready to merge into the default branch.
 
 As Ben described clearly, with Bazaar, I'd just merge my work-in-progress
 branch to default and be done.  Tools such as bisect and log would ignore all
 my intermediate changes by default, although you *can* drill down into them if
 you want.  But I take it that with our Mercurial workflow, we'd rather all
 those intermediate commits in my local branch were manually collapsed before I
 merge to default.
 
 My discomfort with this is not just that it changes history, but that it
 throws away valuable information.  Sure, you're not going to care if I fixed a
 typo in NEWS, but you might indeed care that I've addressed the issues you
 raised in your first, second, and third reviews.  Each of those would be
 represented by a changeset in my local line of development, and by a side
 branch in the mainline DAG once my merge is completed.  You might want to dig
 into that sideline to see if indeed I addressed the issues in your second
 review of my code.  If we have to manually collapse changesets at feature
 branch merge time, you can't do that.

Having bisect able to use the intermediate changesets (as an option)
could even make it easier to pinpoint the correct fix for a bug
(assuming people don't routinely make local commits that are complete
guts-on-the-table messes).
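
(For reference, hg bisect will already walk those intermediate changesets by
default; a rough session, with v3.2 standing in for a known-good revision:

    $ hg bisect --reset
    $ hg bisect --bad              # the current revision exhibits the bug
    $ hg bisect --good v3.2        # an older revision known to be fine
    ... hg checks out a candidate; run the tests ...
    $ hg bisect --good             # or --bad, repeated until the culprit is reported
)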


Tres.
-- 
===
Tres Seaver  +1 540-429-0999  tsea...@palladion.com
Palladion Software   Excellence by Design   http://palladion.com

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Have we lost changeset info in the buildbots

2011-03-21 Thread David Bolen
Victor Stinner victor.stin...@haypocalc.com writes:

 On Monday 14 March 2011 at 15:36 -0400, David Bolen wrote:

 Speaking of bbreport, I sometimes use the published page on that site
 (http://code.google.com/p/bbreport/wiki/PythonBuildbotReport) to check
 over things, but looking at it today, it seems to most recently have been
 generated back in January.  Or is the generated date line wrong?

 I ran a cron task to regenerate this page each hour, but I didn't get
 any feedback and I stopped using it, so I just removed the cron task.

 Do you want me to restart this cron task?

If it's a hassle to maintain, it's probably not worth it.  I can't say
it's critical to me, but I do find it helpful to review periodically
given how it summarizes issues that may aggregate up to a builder or
slave.  I just keep it open on a tab along with the build slave
summary page which is my primary check, so it's a bit simpler than
having to re-run a command line tool for quick checks.

I did submit one feature request (issue 16) early on when I first
started using it about seeing if it was possible to include slave (and
not just builder) information.

-- David

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 16:33:31 -0400
Barry Warsaw ba...@python.org wrote:
 Each of those would be
 represented by a changeset in my local line of development, and by a side
 branch in the mainline DAG once my merge is completed.  You might want to dig
 into that sideline to see if indeed I addressed the issues in your second
 review of my code.  If we have to manually collapse changesets at feature
 branch merge time, you can't do that.

I'd rather take a look at the final aggregate patch to see if it looks
correct, actually. It's easy to have incremental changes which look
good but lead to a questionable patch in the end. Better to review it
in aggregate, IMO.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Module version variable

2011-03-21 Thread Barry Warsaw
On Mar 18, 2011, at 07:40 PM, Guido van Rossum wrote:

On Fri, Mar 18, 2011 at 7:28 PM, Greg Ewing greg.ew...@canterbury.ac.nz 
wrote:
 Tres Seaver wrote:

 I'm not even sure why you would want __version__ in 99% of modules:  in
 the ordinary cases, a module's version should be either the Python
 version (for a module shipped in the stdlib), or the release of the
 distribution which shipped it.

 It's useful to be able to find out the version of a module
 you're using at run time so you can cope with API changes.

 I had a case just recently where the behaviour of something
 in pywin32 changed between one release and the next. I looked
 for an attribute called 'version' or something similar to
 test, but couldn't find anything.

 +1 on having a standard place to look for version info.

I believe __version__ *is* the standard (like __author__). IIRC it was
proposed by Ping. I think this convention is so old that there isn't a
PEP for it. So yes, we might as well write it down. But it's really
nothing new.

I started an Informational PEP on this at Pycon, and will try to finish a
draft of it this week.  (I'm claiming 396 for it.)

-Barry


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Module version variable

2011-03-21 Thread Barry Warsaw
On Mar 19, 2011, at 01:51 PM, Antoine Pitrou wrote:

On Fri, 18 Mar 2011 20:12:19 -0700
Toshio Kuratomi a.bad...@gmail.com wrote:
 There is a section in PEP8 about __version__ but it serves a slightly
 different purpose there:
 
 
 Version Bookkeeping
 
 If you have to have Subversion, CVS, or RCS crud in your source file, do
 it as follows.
 
 __version__ = $Revision: 88433 $
 # $Source$

This should be updated (or rather, removed) now that we use Mercurial...

I've made a note about it and will address it in PEP 396.

-Barry


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Benjamin Peterson
2011/3/21 Raymond Hettinger raymond.hettin...@gmail.com:

 On Mar 21, 2011, at 11:56 AM, Daniel Stutzbach wrote:

 People love it because it's a very powerful tool.  People hate it because it
 allows you to shoot yourself in the foot.

 There's a certain irony in this.  The original motivation for version
 control was to be a safety rope, to serve as a productivity tool to make
 sure that work never got lost.
 Now we seem to be advocating a complex, fragile workflow that is hard to
 learn, hard to get right, that lets you shoot yourself in the foot, and
 that has rebasing/collapsing steps that destroy and rewrite history (and
 possibly muck up your repo if there was an intervening push).
 If we gave up on the "svnmerge on steroids" workflow, the use of Hg would
 become dirt simple.  I've used it that way in personal projects for a
 couple of months and it is remarkably easy, taking only minutes to learn.
 It also works with Hg right out of the box; no need for extensions,
 customizations, or a slew of advanced Hg features.

Python, though, is not your run-of-the-mill pet project. There's
always going to be a learning curve to its development.



-- 
Regards,
Benjamin
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Tim Delaney
On 2011-03-22, Ben Finney ben+pyt...@benfinney.id.au wrote:

 That seems to me the ideal: preserve all revision history for those
 cases when some user will care about it, but *present* history cleanly
 by default.

 Whether adding support in Mercurial or Git for similar
 clean-presentation-by-default would obviate the need for rewriting
 history, I can't tell.

That's my thought as well - it's the presentation that makes things
difficult for people. I'm used to it (having used ClearCase for many
years before Mercurial) but I do find the presentation suboptimal.

I've recently been thinking about prototyping a mainline option for
hgrc that the various hg commands would follow (starting with hg log
and glog). Something like:

mainline = default, 3.3, 3.2, 2.7, 3.1, 3.0, 2.6, 2.5

defaulting to:

mainline = default

All hg commands would acquire an "operate on all branches" option.

The algorithm for hg log would be fairly trivial to change, but hg
glog would be a significant departure (and so would the hgweb log view
- I've played with this before and it's non-trivial).

The idea for glog and hgweb log would be to draw straight lines for the
mainlines wherever possible (multiple heads on the same mainline
branch would obviously cause deviations). The order the branches are
listed in the mainline option would be the order in which to display the
branches (so you could ensure that your current version was displayed
first). Merges would be indicated with a separate symbol and the name
of the branch that was merged. Likewise, when viewing all branches,
keeping the mainlines straight would be important.

You'd end up using more horizontal space, but we all seem to have
widescreen monitors these days.
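
(Until something like that exists, the closest approximations with stock
Mercurial are probably along these lines — illustrative only:

    $ hg log -b default    # changesets made on the default branch itself; work merged
                           # in from other branches appears only as the merge changeset
    $ hg log -M            # or: hide merge changesets from the listing entirely
)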
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Barry Warsaw
On Mar 21, 2011, at 09:53 PM, Antoine Pitrou wrote:

I'd rather take a look at the final aggregate patch to see if it looks
correct, actually. It's easy to have incremental changes which look
good but lead to a questionable patch in the end. Better to review it
in aggregate, IMO.

I think it would be good to have the option to do either.

-Barry


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 17:25:05 -0400
Barry Warsaw ba...@python.org wrote:
 On Mar 21, 2011, at 09:53 PM, Antoine Pitrou wrote:
 
 I'd rather take a look at the final aggregate patch to see if it looks
 correct, actually. It's easy to have incremental changes which look
 good but lead to a questionable patch in the end. Better to review it
 in aggregate, IMO.
 
 I think it would be good to have the option to do either.

Technically, nothing prevents anyone from committing many small
changesets representing incomplete development. It's just that it
pollutes history, similarly as it polluted history with SVN.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Nick Coghlan
On Tue, Mar 22, 2011 at 3:16 AM, Raymond Hettinger
raymond.hettin...@gmail.com wrote:
 I don't think that is the main source of complexity.
 The more difficult and fragile part of the workflows are:
 * requiring commits to be cross-linked between branches
 * and wanting changesets to be collapsed or rebased
   (two operations that destroy and rewrite history).

Yep, that sounds about right. I think in the long run the first one
*will* turn out to be a better work flow, but it's definitely quite a
shift from our historical way of doing things.

As far as the second point goes, I'm coming to the view that we should
avoid rebase/strip/rollback when intending to push to the main
repository, and do long term work in *separate* cloned repositories.
Then an rdiff with the relevant cpython branch will create a nice
collapsed patch ready for application to the main repository (I have
yet to succeed in generating a nice patch without using rdiff, but I
still have some more experimentation to do with MvL's last proposed
command for that before giving up on the idea).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Martin v. Löwis
 ISTM, there has been substantial mission creep from 
 the workflow described in the PEP.  If the current workflow
 had been described there, I don't think it would have been
 readily accepted.

I don't think PEP 385 actually *was* accepted at all (PEP 374
was, selecting Mercurial). I had meant to insist on a formal
review of the PEP, but gave up on that due to the time pressure
to complete the conversion before PyCon.

Some of the open issues of the PEP are actually still open.

Regards,
Martin
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Martin v. Löwis
 It does so at the *tree* level, not at an individual file level.

Thanks - I stand corrected. I was thinking about the file level only (at
which it doesn't do server-side merging - right?).

Regards,
Martin
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Guido van Rossum
On Mon, Mar 21, 2011 at 2:31 PM, Antoine Pitrou solip...@pitrou.net wrote:
 On Mon, 21 Mar 2011 17:25:05 -0400
 Barry Warsaw ba...@python.org wrote:
 On Mar 21, 2011, at 09:53 PM, Antoine Pitrou wrote:

 I'd rather take a look at the final aggregate patch to see if it looks
 correct, actually. It's easy to have incremental changes which look
 good but lead to a questionable patch in the end. Better to review it
 in aggregate, IMO.

 I think it would be good to have the option to do either.

 Technically, nothing prevents anyone from committing many small
 changesets representing incomplete development. It's just that it
 pollutes history, similarly as it polluted history with SVN.

But what the best/right thing to do? Consider the following very
common scenario.

Let's say I'm working on a fairly substantial feature that may take
weeks to complete. My way of working is to explore different
approaches until I'm happy. I like to make checkpoints while I'm
exploring so that I can easily backtrack from experiments. I'm not
pushing any of this to the central repo; I'm just using a local repo.
Over a few weeks this can easily lead to 100+ commits. Occasionally I
push patches to Rietveld for review. When my reviewer and I are happy
we want to push my work to the core repo. But do you really want my
100 commits (many of which represent dead ends) in the core repo? Many
of them probably have checkin messages that make no sense to anyone.

I know I would be sorely tempted to use hg export + hg import (and
extensive testing after the latter of course) so that the approved
changes can land with a single thud in the core repo. But maybe I'm a
dinosaur?
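
(One way to get the single thud — a sketch only, with placeholder paths,
revision and issue number, assuming the exploratory work lives in its own
clone:

    $ cd ~/feature-clone
    $ hg diff -r BASEREV > feature.patch      # one aggregate diff of the 100+ commits
    $ cd ~/cpython
    $ hg import --no-commit ../feature-clone/feature.patch
    ... run the test suite ...
    $ hg commit -m 'Issue #NNNN: add the new feature (reviewed on Rietveld).'
)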

-- 
--Guido van Rossum (python.org/~guido)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Nick Coghlan
On Tue, Mar 22, 2011 at 7:25 AM, Barry Warsaw ba...@python.org wrote:
 On Mar 21, 2011, at 09:53 PM, Antoine Pitrou wrote:

I'd rather take a look at the final aggregate patch to see if it looks
correct, actually. It's easy to have incremental changes which look
good but lead to a questionable patch in the end. Better to review it
in aggregate, IMO.

 I think it would be good to have the option to do either.

One of the key elements here is the way we use python-checkins for
after-the-fact review. That works a *lot* better when changes land in
cohesive chunks. Maybe that's a low-tech technique which isn't up with
the latest snazzy DVCS features, but it's certainly served us well for
a long time and should be preserved if possible.

However, keeping the history clean should come a distant second to
keeping it *correct*, so I now believe we should actively discourage
use of the history editing extensions when working on changes intended
to be pushed to the main repository.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Antoine Pitrou
On Tue, 22 Mar 2011 07:32:33 +1000
Nick Coghlan ncogh...@gmail.com wrote:
 
 As far as the second point goes, I'm coming to the view that we should
 avoid rebase/strip/rollback when intending to push to the main
 repository, and do long term work in *separate* cloned repositories.
 Then an rdiff with the relevant cpython branch will create a nice
 collapsed patch ready for application to the main repository (I have
 yet to succeed in generating a nice patch without using rdiff, but I
 still have some more experimentation to do with MvL's last proposed
 command for that before giving up on the idea).

If you use named branches it's very easy, as explained in the devguide:
http://docs.python.org/devguide/committing.html#long-term-development-of-features
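
(Roughly what that section describes, from memory — a sketch with a
hypothetical branch name:

    $ hg clone http://hg.python.org/cpython myfeature   # separate clone for the long-term work
    $ cd myfeature
    $ hg branch myfeature                                # named branch to hold the work
    ... commit freely; merge default into the branch from time to time ...
    $ hg diff -r default > myfeature.patch    # aggregate patch against default
                                               # (merge default in first so the diff is clean)
)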

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou
On Mon, 21 Mar 2011 22:47:19 +0100
Martin v. Löwis mar...@v.loewis.de wrote:
  ISTM, there has been substantial mission creep from 
  the workflow described in the PEP.  If the current workflow
  had been described there, I don't think it would have been
  readily accepted.
 
 I don't think PEP 385 actually *was* accepted at all (PEP 374
 was, selecting Mercurial). I had meant to insist on a formal
 review of the PEP, but gave up on that due to the time pressure
 to complete the conversion before PyCon.

I don't think a formal review of the PEP would have brought up any of the
points which have been made after (or even during) the conversion.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Nick Coghlan
On Tue, Mar 22, 2011 at 7:51 AM, Guido van Rossum gu...@python.org wrote:
 Let's say I'm working on a fairly substantial feature that may take
 weeks to complete. My way of working is to explore different
 approaches until I'm happy. I like to make checkpoints while I'm
 exploring so that I can easily backtrack from experiments. I'm not
 pushing any of this to the central repo; I'm just using a local repo.
 Over a few weeks this can easily lead to 100+ commits. Occasionally I
 push patches to Rietveld for review. When my reviewer and me are happy
 we want to push my work to the core repo. But do you really want my
 100 commits (many of which represent dead ends) in the core repo? Many
 of them probably have checkin messages that make no sense to anyone.

 I know I would be sorely tempted to use hg export + hg import (and
 extensive testing after the latter of course) so that the approved
 changes can land with a single thud in the core repo. But maybe I'm a
 dinosaur?

I don't think so. That line of reasoning is why one of the first
things I did after the transition was complete was to create a
personal sandbox repository on hg.python.org (using the server side
clone feature in the web interface). Any long term work will be done
on feature branches there (e.g. that's where the LHS precedence work
currently lives), with the main repository used only for applying
completed patches.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] I am now lost - committed, pulled, merged, what is collapse?

2011-03-21 Thread Martin v. Löwis
 I don't think that is the main source of complexity.
 
 The more difficult and fragile part of the workflows are:
 * requiring commits to be cross-linked between branches
 * and wanting changesets to be collapsed or rebased
   (two operations that destroy and rewrite history).

I think there would be no technical problems with
giving up the latter - it's just an expression of personal
taste that the devguide says what it says.

As for the former, I think it objectively improves the quality of the
maintenance releases to have managed backports, i.e. tracking that
fixes are actually similar and correlated across branches.

If you find this too complex to manage, one option would be to opt
out of backporting, i.e. apply changes only to the most recent
branches. In many cases, the harm done by not backporting a fix
is rather small - the bug may only affect only infrequent cases,
or many users may be using a different branch, anyway. Others could
then still backport the change if they consider it important (and
do a subsequent null merge to properly link the backport).
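
(A null merge, for reference — sketched for the case where a change
backported to 3.2 is already present on default:

    $ hg update default
    $ hg merge 3.2
    $ hg revert --all --rev default    # discard the merged-in changes
    $ hg resolve --all --mark          # mark any conflicts as resolved
    $ hg commit -m 'Null merge: change already present on default.'
)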

Regards,
Martin
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Guido van Rossum
On Mon, Mar 21, 2011 at 3:00 PM, Nick Coghlan ncogh...@gmail.com wrote:
 On Tue, Mar 22, 2011 at 7:51 AM, Guido van Rossum gu...@python.org wrote:
 Let's say I'm working on a fairly substantial feature that may take
 weeks to complete. My way of working is to explore different
 approaches until I'm happy. I like to make checkpoints while I'm
 exploring so that I can easily backtrack from experiments. I'm not
 pushing any of this to the central repo; I'm just using a local repo.
 Over a few weeks this can easily lead to 100+ commits. Occasionally I
 push patches to Rietveld for review. When my reviewer and me are happy
 we want to push my work to the core repo. But do you really want my
 100 commits (many of which represent dead ends) in the core repo? Many
 of them probably have checkin messages that make no sense to anyone.

 I know I would be sorely tempted to use hg export + hg import (and
 extensive testing after the latter of course) so that the approved
 changes can land with a single thud in the core repo. But maybe I'm a
 dinosaur?

 I don't think so. That line of reasoning is why one of the first
 things I did after the transition was complete was to create a
 personal sandbox repository on hg.python.org (using the server side
 clone feature in the web interface). Any long term work will be done
 on feature branches there (e.g. that's where the LHS precedence work
 currently lives), with the main repository used only for applying
 completed patches.

Ah. I just discovered
http://docs.python.org/devguide/committing.html#long-term-development-of-features
which explains how to do this (it came up in the other thread :-). So
my use case is perfectly covered. Great!

-- 
--Guido van Rossum (python.org/~guido)
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Copyright notices

2011-03-21 Thread Nadeem Vawda
On Mon, Mar 21, 2011 at 2:20 PM, M.-A. Lemburg m...@egenix.com wrote:
 Nadeem Vawda wrote:
 [snip]

 Since you'll be adding new IP to Python, the new code you write should
 contain your copyright and the standard PSF contributor agreement
 notice, e.g.

 
 (c) Copyright 2011 by Nadeem Vawda. Licensed to PSF under a Contributor 
 Agreement.
 

 (please also make sure you have sent the signed agreement to the PSF;
 see http://www.python.org/psf/contrib/)

 We don't have a general copyright or license boiler plate for Python
 source files.

 [snip]

 If the file copies significant code parts from older files, the
 copyright notices from those files will have to added to the
 file comment as well - ideally with a note explaining to which parts
 those copyrights apply and where they originated.

 If you are replacing the old implementation with a new one,
 you don't need to copy over the old copyright statements.

Thanks for the information. I still need to submit a Contributor Agreement, so
I'll do that as soon as possible.

(As an aside, it might be useful to include this info more explicitly in the
devguide, to make it easier for newbies to find. I'll put together a patch
when I get a chance.)

On Mon, Mar 21, 2011 at 4:48 PM, Antoine Pitrou solip...@pitrou.net wrote:
 On Mon, 21 Mar 2011 13:20:59 +0100
 M.-A. Lemburg m...@egenix.com wrote:
 Since you'll be adding new IP to Python, the new code you write should
 contain your copyright and the standard PSF contributor agreement
 notice, e.g.

 I agree with Raymond's argument that we shouldn't add *new* copyright
 boilerplate:
 http://mail.python.org/pipermail/python-dev/2009-January/085267.html

I agree that it would be preferable not to clutter the code with copyright
notices unnecessarily, especially since you can get the same information from
the version control history. However, as things stand, the CA requires their
inclusion, and changing it would (I imagine) involve a lot of work.

Regards,
Nadeem
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Martin v. Löwis
 I know I would be sorely tempted to use hg export + hg import (and
 extensive testing after the latter of course) so that the approved
 changes can land with a single thud in the core repo. But maybe I'm a
 dinosaur?

I certainly agree that there are cases where collapsing changes is
desirable - in particular when the author of the changes thinks it is.

However, what some of us are requesting is that the SHOULD collapse
in the devguide be changed to a MAY collapse, making it strictly
an option of the committer. If there is one substantial change,
a typo change, and three merges, asking for a collapse of the typo
change is IMO complicating things too much.

OTOH, collapsing weeks of work isn't that much overhead, relatively.
Plus you may want to push the feature branch to hg.python.org/guido,
so that people can still look at it if they want to.

Regards,
Martin

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Antoine Pitrou

 However, what some of us are requesting is that the SHOULD collapse
 in the devguide be changed to a MAY collapse, making it strictly
 an option of the committer. If there is one substantial change,
 a typo change, and three merges, asking for a collapse of the typo
 change is IMO complicating things too much.

Well, it's should, not must ;)
When writing this, I had in mind that other projects have different
workflows, where indeed people never collapse and many tiny changesets
(which are only significant as part of a bigger work) end up in the main
history. The point is to signal that it's not how we work.

Regards

Antoine.


___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Hg: inter-branch workflow

2011-03-21 Thread Martin v. Löwis
 One of the key elements here is the way we use python-checkins for
 after-the-fact review.

I think this can be achieved with a better email hook. I would propose
that there be one email message per push per branch (rather than
one per changeset). For each branch, it should report what changesets
contributed, and what the final diff between the old head of the branch
and the new head of the branch is.

Or perhaps put the diffs into attachments, one per branch.

Or perhaps suppress the diffs for the branches if the only changesets
on the branches are merges.

In the rare case that a single push includes unrelated changes, people
can still look at the individual changesets in the repo to figure it all
out.
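
(Very roughly, and purely as a sketch — the hook name and script path are
hypothetical; $HG_NODE is the first incoming changeset that Mercurial passes
to changegroup hooks:

    # server-side .hg/hgrc
    [hooks]
    changegroup.notify = /path/to/mail-per-branch-hook

    # inside the hypothetical hook, for each branch touched by the push:
    hg log -r "$HG_NODE:tip" --template '{branch}: {node|short} {desc|firstline}\n'
    hg diff -r OLD_BRANCH_HEAD -r NEW_BRANCH_HEAD
)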

Regards,
Martin
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] GSoC: speed.python.org

2011-03-21 Thread Maciej Fijalkowski
On Mon, Mar 21, 2011 at 2:24 PM, DasIch dasdas...@googlemail.com wrote:
 On Mon, Mar 21, 2011 at 7:48 PM, Antoine Pitrou solip...@pitrou.net wrote:
 On Mon, 21 Mar 2011 19:33:55 +0100
 DasIch dasdas...@googlemail.com wrote:

 3. Several benchmarks (at least the Django and Twisted ones) have
 dependencies which are not (yet) ported to 3.x and porting those
 dependencies during GSoC as part of this project is an unrealistic
 goal. Should those benchmarks, at least for now, be ignored?

 Why not reuse the benchmarks in http://hg.python.org/benchmarks/ ?
 Many of them are 3.x-compatible.
 I don't understand why people are working on multiple benchmark suites
 without cooperating these days.

 I haven't looked to closely but those benchmarks appear to be the ones
 developed by the unladen swallow guys and those are used by PyPy among
 others.

 The difference is that PyPy has more benchmarks particularly ones that
 measure performance of real world applications. As good benchmarks are
 very hard to come by those appear to be a better starting point.

The benchmark running code didn't change significantly. The actual
benchmarks, however, did. US's micro-benchmarks were removed and new
things were added (the biggest addition is probably the whole set of
Twisted benchmarks). The original idea was to converge and have the
common repo on hg.python.org, but since Unladen Swallow ran out of steam,
nobody bothered to update the hg.python.org one, so we continued on our
own.

Cheers,
fijal
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com

