Re: [Python-Dev] Re: Zen of Python

2005-01-20 Thread Alex Martelli
On 2005 Jan 20, at 02:47, Skip Montanaro wrote:
Phillip Actually, this is one of those rare cases where optimization
Phillip and clarity go hand in hand.  Human brains just don't handle
Phillip nesting that well.  It's easy to visualize two levels of nested
Phillip structure, but three is a stretch unless you can abstract at
Phillip least one of the layers.

Also, if you think about nesting in a class/instance context, something like

    self.attr.foo.xyz()

says you are noodling around in the implementation details of self.attr
(you know it has a data attribute called foo).  This provides for some very
tight coupling between the implementation of whatever self.attr is and your
code.  If there is a reason for you to get at whatever xyz() returns, it's
probably best to publish a method as part of the api for self.attr.
Good point: this is also known as the Law of Demeter; relevant summaries
and links can be found, for example, at
http://www.ccs.neu.edu/home/lieber/LoD.html .
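
For illustration, the kind of refactoring Skip describes might look like
this (a minimal sketch; Engine, _sensors, thermo and read_temp are all
made-up names):

    class Engine(object):
        def __init__(self, sensors):
            self._sensors = sensors   # the "self.attr" implementation detail

        def temperature(self):
            # Published API: callers write engine.temperature() instead of
            # reaching through engine._sensors.thermo.read_temp()
            return self._sensors.thermo.read_temp()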

Alex


[Python-Dev] Re: Unix line endings required for PyRun* breaking embedded Python

2005-01-20 Thread Fredrik Lundh
Just van Rossum wrote:

 I don't think that in general you want to fold multiple empty lines into
 one. This would be my preferred regex:

     s = re.sub(r"\r\n?", "\n", s)

 Catches both DOS and old-style Mac line endings. Alternatively, you can
 use s.splitlines():

     s = "\n".join(s.splitlines()) + "\n"

 This also makes sure the string ends with a "\n", which may or may not be
 a good thing, depending on your application.

s = s.replace("\r", "\n"["\n" in s:])
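
For readers decoding the one-liner: "\n"["\n" in s:] evaluates to "" when s
already contains a "\n" (DOS "\r\n" text, so the bare "\r" is simply dropped)
and to "\n" otherwise (old-style Mac endings, so each "\r" becomes "\n").
An illustrative check:

    >>> s = "abc\rdef\r"                     # old-style Mac endings
    >>> s.replace("\r", "\n"["\n" in s:])
    'abc\ndef\n'
    >>> s = "abc\r\ndef\r\n"                 # DOS endings
    >>> s.replace("\r", "\n"["\n" in s:])
    'abc\ndef\n'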

/F 





Re: [Python-Dev] Updated Monkey Typing pre-PEP

2005-01-20 Thread Guido van Rossum
[Phillip J. Eby]
 I've revised the draft today to simplify the terminology, discussing only
 two broad classes of adapters.  Since Clark's pending proposals for PEP 246
 align well with the concept of extenders vs. independent adapters, I've
 refocused my PEP to focus exclusively on adding support for extenders,
 since PEP 246 already provides everything needed for independent adapters.
 
 The new draft is here:
 http://peak.telecommunity.com/DevCenter/MonkeyTyping

On the plane to the Amazon.com internal developers conference in
Seattle (a cool crowd BTW) I finally got to read this. I didn't see a
way to attach comments to Phillip's draft, so here's my response. (And
no, it hasn't affected my ideas about optional typing. :)

The Monkey Typing proposal is trying to do too much, I believe. There
are two or three separate problems, and I think it would be better to
deal with each separately.

The first problem is what I'd call incomplete duck typing. There is a
function that takes a sequence argument, and you have an object that
partially implements the sequence protocol. What do you do? In current
Python, you just pass the object and pray -- if the function only uses
the methods that your object implements, it works, otherwise you'll
get a relatively clean AttributeError (e.g. "Foo instance has no
attribute '__setitem__'").

Phillip worries that solving this with interfaces would cause a
proliferation of partial sequence interfaces representing the needs
of various libraries. Part of his proposal comes down to having a way
to declare that some class C implements some interface I, even if C
doesn't implement all of I's methods (as long as it implements at least
one). I like having this ability, but I think this fits in the
existing proposals for declaring interface conformance: there's no
reason why C couldn't have a __conform__ method that claims it
conforms to I even if it doesn't implement all methods. Or if you
don't want to modify C, you can do the same thing using the external
adapter registry.
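
For concreteness, a minimal sketch of such a __conform__ claim in the PEP
246 style (ISequence and PartialSequence are made-up names; ISequence is
only a placeholder for whatever interface object is in play):

    class ISequence:
        """Placeholder for the interface object I."""

    class PartialSequence(object):
        # Implements only part of the protocol.
        def __getitem__(self, index):
            return index * 2

        def __conform__(self, protocol):
            # PEP 246 hook: claim conformance to ISequence even though the
            # implementation is incomplete; adapt(obj, ISequence) would then
            # hand back obj unchanged.
            if protocol is ISequence:
                return self
            return None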

I'd also like to explore ways of creating partial interfaces on the
fly. For example, if we need only the read() and readlines() methods
of the file protocol, maybe we could declare that as follows::

  def foo(f: file['read', 'readlines']): ...

I find the quoting inelegant, so maybe this would be better::

  file[file.read, file.readlines]

Yet another  idea (which places a bigger burden on the typecheck()
function presumed by the type declaration notation, see my blog on
Artima.com) would be to just use a list of the needed methods::

  [file.read, file.readlines]

All this would work better if file weren't a concrete type but an interface.
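
A sketch of the kind of typecheck() helper that last notation presumes (the
name typecheck comes from the text above; the signature and body here are
purely illustrative):

    import sys

    def typecheck(obj, declared):
        # A declaration given as a list of methods just means "the argument
        # must have attributes with those names".
        if isinstance(declared, list):
            missing = [m.__name__ for m in declared
                       if not hasattr(obj, m.__name__)]
            if missing:
                raise TypeError("argument lacks required methods: %s"
                                % ", ".join(missing))
            return obj
        # Other kinds of declarations (concrete types, interfaces, ...)
        # would be handled elsewhere.
        return obj

    typecheck(sys.stdin, [file.read, file.readlines])   # passes
    # typecheck(42, [file.read, file.readlines]) would raise TypeError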

Now on to the other problems Phillip is trying to solve with his
proposal. He says, sometimes there's a class that has the
functionality that you need, but it's packaged differently. I'm not
happy with his proposal for solving this by declaring various adapting
functions one at a time, and I'd much rather see this done without
adding new machinery or declarations: when you're using adaptation,
just write an adapter class and register it; without adaptation, you
can still write the adapter class and explicitly instantiate it.

I have to admit that I totally lost track of the proposal when it
started to talk about JetPacks. I believe that this is trying to deal
with stateful adapters. I hope that Phillip can write something up
about these separately from all the other issues, maybe then it's
clearer.

There's one other problem that Phillip tries to tackle in his
proposal: how to implement the rich version of an interface if all
you've got is a partial implementation (e.g. you might have readline()
but you need readlines()). I think this problem is worthy of a
solution, but I think the solution could be found, again, in a
traditional adapter class. Here's a sketch::

    class RichFile:
        def __init__(self, ref):
            self.__ref = ref
            if not hasattr(ref, 'readlines'):
                # Other forms of this magic are conceivable
                self.readlines = self.__readlines
        def __readlines(self):  # Ignoring the rarely used optional argument
            # It's tempting to use [line for line in self.__ref] here,
            # but that doesn't use readline()
            lines = []
            while True:
                line = self.__ref.readline()
                if not line:
                    break
                lines.append(line)
            return lines
        def __getattr__(self, name):
            # Delegate all other attributes to the underlying object
            return getattr(self.__ref, name)
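
A quick usage sketch of the above (OnlyReadline is a made-up object that
exposes only readline(), to exercise the fallback path):

    class OnlyReadline:
        def __init__(self, lines):
            self._lines = list(lines)
        def readline(self):
            # Hand out one stored line per call; "" signals end-of-file.
            if self._lines:
                return self._lines.pop(0)
            return ""

    f = RichFile(OnlyReadline(["one\n", "two\n"]))
    print f.readlines()   # ['one\n', 'two\n'] -- synthesized via readline()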

Phillip's proposal reduces the amount of boilerplate in this class
somewhat (mostly the constructor and the __getattr__() method), but
apart from that it doesn't really seem to do a lot except let you put
pieces of the adapter in different places, which doesn't strike me as
such a great idea.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

Re: [Python-Dev] Updated Monkey Typing pre-PEP

2005-01-20 Thread Mark Russell
On Thu, 2005-01-20 at 11:07, Guido van Rossum wrote:
 I'd also like to explore ways of creating partial interfaces on the
 fly. For example, if we need only the read() and readlines() methods
 of the file protocol, maybe we could declare that as follows::
 
   def foo(f: file['read', 'readlines']): ...
 
 I find the quoting inelegant, so maybe this would be better::
 
   file[file.read, file.readlines]

Could you not just have a builtin which constructs an interface on the
fly, so you could write:

def foo(f: interface(file.read, file.readlines)): ...

For commonly used subsets of course you'd do something like:

IInputStream = interface(file.read, file.readlines)

def foo(f: IInputStream): ...

I can't see that interface() would need much magic - I would guess you
could implement it in python with ordinary introspection.
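
A rough sketch of that idea (provided_by is a made-up checker, not part of
any existing proposal; the point is just that plain introspection suffices):

    import sys

    def interface(*methods):
        # Record just the names of the required methods.
        required = [m.__name__ for m in methods]

        class _Interface(object):
            methods = required
            def provided_by(cls, obj):
                # An object "provides" the interface if it has every method.
                for name in cls.methods:
                    if not hasattr(obj, name):
                        return False
                return True
            provided_by = classmethod(provided_by)

        return _Interface

    IInputStream = interface(file.read, file.readlines)
    print IInputStream.provided_by(sys.stdin)   # True -- real files qualify
    print IInputStream.provided_by(42)          # False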

Mark Russell



Re: [Python-Dev] Getting rid of unbound methods: patch available

2005-01-20 Thread Armin Rigo
Hi,

Removing unbound methods also breaks the 'py' lib quite a bit.  The 'py.test'
framework handles function and bound/unbound method objects all over the
place, and uses introspection on them, as they are the objects defining the
tests to run.
  
It's nothing that can't be repaired, and in places the fix even looks nicer
than the original code, but I thought it worth noting because it points to
large-scale breakage.  I'm expecting any code that relies on introspection to
break at least here or there.  My bet is that, even if the fixes are only a
couple of lines long each, everyone will have to upgrade a number of their
packages when switching to Python 2.5 -- unheard of!

For reference, the issues I got with the py lib are described at
 http://codespeak.net/pipermail/py-dev/2005-January/000159.html


Armin


Re: [Python-Dev] Re: Unix line endings required for PyRun* breaking embedded Python

2005-01-20 Thread Skip Montanaro

Fredrik s = s.replace("\r", "\n"["\n" in s:])

This fails on admittedly weird strings that mix line endings:

    >>> s = "abc\rdef\r\n"
    >>> s = s.replace("\r", "\n"["\n" in s:])
    >>> s
    'abcdef\n'

where universal newline mode or Just's re.sub() gadget would work.
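
For comparison, Just's regex handles the same mixed-endings string:

    >>> import re
    >>> re.sub(r"\r\n?", "\n", "abc\rdef\r\n")
    'abc\ndef\n'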

Skip


Re: [Python-Dev] Unix line endings required for PyRun* breaking embedded Python

2005-01-20 Thread Skip Montanaro

Just Skip Montanaro wrote:
Just re.sub("[\r\n]+", "\n", s) and I think you're good to go.

Just I don't think that in general you want to fold multiple empty
Just lines into one.

Whoops.  Yes.

Skip


Re: [Python-Dev] Updated Monkey Typing pre-PEP

2005-01-20 Thread Phillip J. Eby
At 03:07 AM 1/20/05 -0800, Guido van Rossum wrote:
Phillip worries that solving this with interfaces would cause a
proliferation of partial sequence interfaces representing the needs
of various libraries. Part of his proposal comes down to having a way
to declare that some class C implements some interface I, even if C
doesn't implement all of I's methods (as long as implements at least
one). I like having this ability, but I think this fits in the
existing proposals for declaring interface conformance: there's no
reason why C couldn't have a __conform__ method that claims it
conforms to I even if it doesn't implement all methods. Or if you
don't want to modify C, you can do the same thing using the external
adapter registry.
There are some additional things that it does in this area:
1. Avoids namespace collisions when an object has a method with the same 
name as one in an interface, but which doesn't do the same thing.  (A 
common criticism of duck typing by static typing advocates; i.e. how do you 
know that 'read()' has the same semantics as what this routine expects?)

2. Provides a way to say that you conform, without writing a custom 
__conform__ method

3. Syntax for declaring conformance is the same as for adaptation
4. Allows *external* (third-party) code to declare a type's conformance, 
which is important for integrating existing code with code with type 
declarations


I'd also like to explore ways of creating partial interfaces on the
fly. For example, if we need only the read() and readlines() methods
of the file protocol, maybe we could declare that as follows::
  def foo(f: file['read', 'readlines']): ...
FYI, this is similar to the suggestion from Samuele Pedroni that led to
PyProtocols having a:

protocols.protocolForType(file, ['read','readlines'])
capability, that implements this idea.  However, the problem with
implementing it by actually having distinct protocols is that declaring as
few as seven methods results in 127 different protocol objects (one per
non-empty subset of the methods: 2**7 - 1 = 127) with conformance
relationships to manage.

In practice, I've also personally never used this feature, and probably 
never would unless it had meaning for type declarations.  Also, your 
proposal as shown would be tedious for the declarer compared to just saying 
'file' and letting the chips fall where they may.


Now on to the other problems Phillip is trying to solve with his
proposal. He says, sometimes there's a class that has the
functionality that you need, but it's packaged differently. I'm not
happy with his proposal for solving this by declaring various adapting
functions one at a time, and I'd much rather see this done without
adding new machinery or declarations: when you're using adaptation,
just write an adapter class and register it; without adaptation, you
can still write the adapter class and explicitly instantiate it.
In the common case (at least for my code) an adapter class has only one or 
two methods, but the additional code and declarations needed to make it an 
adapter can increase the code size by 20-50%.  Using @like directly on an 
adapting method would result in a more compact expression in the common case.


I have to admit that I totally lost track of the proposal when it
started to talk about JetPacks. I believe that this is trying to deal
with stateful adapters. I hope that Phillip can write something up
about these separately from all the other issues, maybe then it's
clearer.
Yes, it was for per-object ("as a") adapter state, rather than per-adapter
("has a") state, however.  The PEP didn't try to tackle "has a" adapters at
all.


Phillip's proposal reduces the amount of boilerplate in this class
somewhat (mostly the constructor and the __getattr__() method),
Actually, it wouldn't implement the __getattr__; a major point of the 
proposal is that when adapting to an interface, you get *only* the 
attributes from the interface, and of those only the ones that the adaptee 
has implementations for.  So, arbitrary __getattr__ doesn't pass down to 
the adapted item.


 but
apart from that it doesn't really seem to do a lot except let you put
pieces of the adapter in different places, which doesn't strike me as
such a great idea.
The use case for that is that you are writing a package which extends an 
interface IA to create interface IB, and there already exist numerous 
adapters to IA.  As long as IB's additional methods can be defined in terms 
of IA, then you can extend all of those adapters at one stroke.

In other words, external abstract operations are exactly equivalent to 
stateless, lossless, interface-to-interface adapters applied 
transitively.  But the point of the proposal was to avoid having to explain 
to somebody what all those terms mean, while making it easier to do such an 
adaptation correctly and succinctly.

One problem with using concrete adapter classes to full interfaces rather 
than partial interfaces is that it leads to situations like Alex's adapter 
diamond examples, because you 

Re: [Python-Dev] a bunch of Patch reviews

2005-01-20 Thread Martin v. Löwis
Irmen de Jong wrote:
 That sounds very convenient, thanks.

Ok, welcome to the project! Please let me know whether
it works.

 Does the status of 'python project member' come with
 certain expectations that must be complied with? ;-)

There are a few conventions that are followed more
or less stringently. You should be aware of the
things in the developer FAQ,
http://www.python.org/dev/devfaq.html
Initially, new developers should follow a
write-after-approval procedure, i.e. they should not
commit anything until they got somebody's approval.
Later, we commit things which we feel confident about,
and post other things to SF.
For CVS, I'm following a few more conventions which
I think are not documented anywhere.
- Always add a CVS commit message
- Add an entry to Misc/NEWS, if there is a new feature,
  or if it is a bug fix for a maintenance branch
  (I personally don't list bug fixes in the HEAD revision,
  but others apparently do)
- When committing configure.in, always remember to commit
  configure also (and pyconfig.h.in if it changed; remember
  to run autoheader)
- Always run the test suite before committing
- If you are committing a bug fix, consider to backport
  it to maintenance branches right away. If you don't
  backport it immediately, it likely won't appear in the
  next release. At the moment, backports to 2.4 are
  encouraged; backports to 2.3 are still possible for
  a few more days.
  If you choose not to backport for some reason, document
  that reason in the commit message. If you plan to
  backport, document that intention in the commit message
  (I usually say "Will backport to 2.x")
- In the commit message, always refer to the SF tracker
  id. In the tracker item, always refer to CVS version
  numbers. I use the script attached to extract those
  numbers from the CVS commit message, to paste them
  into the SF tracker.
I probably forgot to mention a few things; you'll notice
few enough :-)
HTH,
Martin


Re: [Python-Dev] a bunch of Patch reviews

2005-01-20 Thread Tim Peters
[Martin v. Löwis]
...
 - Add an entry to Misc/NEWS, if there is a new feature,
   or if it is a bug fix for a maintenance branch
   (I personally don't list bug fixes in the HEAD revision,
   but others apparently do)

You should.  In part this is to comply with license requirements: 
we're a derivative work from CNRI and BeOpen's Python releases, and
their licenses require that we include a brief summary of the changes
made to Python.  That certainly includes changes made to repair bugs.

It's also extremely useful in practice to have a list of repaired bugs
in NEWS!  That saved me hours just yesterday, when trying to account
for a Zope3 test that fails under Python 2.4 but works under 2.3.4. 
2.4 NEWS pointed out that tuple hashing changed to close bug 942952,
which I can't imagine how I would have remembered otherwise.


RE: [Python-Dev] Updated Monkey Typing pre-PEP

2005-01-20 Thread Raymond Hettinger
[Guido van Rossum]
 There's one other problem that Phillip tries to tackle in his
 proposal: how to implement the rich version of an interface if all
 you've got is a partial implementation (e.g. you might have readline()
 but you need readlines()). I think this problem is worthy of a
 solution, but I think the solution could be found, again, in a
 traditional adapter class. Here's a sketch::
 
 class RichFile:
     def __init__(self, ref):
         self.__ref = ref
         if not hasattr(ref, 'readlines'):
             # Other forms of this magic are conceivable
             self.readlines = self.__readlines
     def __readlines(self):  # Ignoring the rarely used optional argument
         # It's tempting to use [line for line in self.__ref] here,
         # but that doesn't use readline()
         lines = []
         while True:
             line = self.__ref.readline()
             if not line:
                 break
             lines.append(line)
         return lines
     def __getattr__(self, name):
         # Delegate all other attributes to the underlying object
         return getattr(self.__ref, name)

Instead of a __getattr__ solution, I recommend subclassing from a mixin:

class RichMap(SomePartialMapping, UserDict.DictMixin): pass

class RichFile(SomePartialFileClass, Mixins.FileMixin): pass
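
A minimal sketch of the mapping variant, using the stdlib UserDict.DictMixin
(the class names follow Raymond's hypothetical ones; the bodies are made up):

    import UserDict

    class SomePartialMapping:
        # Only the primitive mapping methods are written by hand.
        def __init__(self):
            self._data = {}
        def __getitem__(self, key):
            return self._data[key]
        def __setitem__(self, key, value):
            self._data[key] = value
        def __delitem__(self, key):
            del self._data[key]
        def keys(self):
            return self._data.keys()

    class RichMap(SomePartialMapping, UserDict.DictMixin):
        pass

    m = RichMap()
    m['a'] = 1
    print m.items(), 'a' in m   # DictMixin supplies items(), __contains__, ...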



Raymond



Re: [Python-Dev] Strange segfault in Python threads and linux kernel 2.6

2005-01-20 Thread Donovan Baarda
On Thu, 2005-01-20 at 14:12 +, Michael Hudson wrote:
 Donovan Baarda [EMAIL PROTECTED] writes:
 
  On Wed, 2005-01-19 at 13:37 +, Michael Hudson wrote:
  Donovan Baarda [EMAIL PROTECTED] writes:
[...]
  The main oddness about python threads (before 2.3) is that they run
  with all signals masked.  You could play with a C wrapper (call
  setprocmask, then exec fop) to see if this is what is causing the
  problem.  But please try 2.4.
 
  Python 2.4 does indeed fix the problem. 
 
 That's good to hear.
[...]

I still don't understand what Linux 2.4 vs Linux 2.6 had to do with it.
Reading the man pages for execve(), pthread_sigmask() and sigprocmask(),
I can see some ambiguities, but mostly only if you do things they warn
against (ie, use sigprocmask() instead of pthread_sigmask() in a
multi-threaded app).

The man page for execve() says that the new process will inherit the
Process signal mask (see sigprocmask() ). This implies to me it will
inherit the mask from the main process, not the thread's signal mask.

It looks like Linux 2.4 uses the signal mask of the main thread or
process for the execve(), whereas Linux 2.6 uses the thread's signal
mask. Given that execve() replaces the whole process, including all
threads, I dunno if using the thread's mask is right. Could this be a
Linux 2.6 kernel bug?

  I'm not sure what the correct behaviour should be. The fact that it
  works in python2.4 feels more like a byproduct of the thread mask change
  than correct behaviour. 
 
 Well, getting rid of the thread mask changes was one of the goals of
 the change.

I gathered that... which kinda means the fact that it fixed execvp in
threads is a side effect...(though I also guess it fixed a lot of other
things like this too).

  To me it seems like execvp() should be setting the signal mask back
  to defaults or at least the mask of the main process before doing
  the exec.
 
 Possibly.  I think the 2.4 change -- not fiddling the process mask at
 all -- is the Right Thing, but that doesn't help 2.3 users.  This has
 all been discussed before at some length, on python-dev and in various
 bug reports on SF.

Would a simple bug-fix for 2.3 be to have os.execvp() set the mask to
something sane before executing C execvp()? Given that Python does not
have any visibility of the procmask...

This might be a good idea regardless as it will protect against this bug
resurfacing in the future if someone decides fiddling with the mask for
threads is a good idea again.

 In your situation, I think the simplest thing you can do is dig out an
 old patch of mine that exposes sigprocmask + co to Python and either
 make a custom Python incorporating the patch and use that, or put the
 code from the patch into an extension module.  Then before execing
 fop, use the new code to set the signal mask to something sane.  Not
 pretty, particularly, but it should work.

The extension module that exposes sigprocmask() is probably best for
now...

-- 
Donovan Baarda [EMAIL PROTECTED]
http://minkirri.apana.org.au/~abo/



Re: [Python-Dev] Getting rid of unbound methods: patch available

2005-01-20 Thread Noam Raphael
Hello,

I would like to add here another small thing which I encountered this
week, and seems to follow the same logic as does Guido's proposal.

It's about staticmethods. I was writing a class, and its
pretty-printing method got a function for converting a value to a
string as an argument. I wanted to supply a default function. I
thought that it should be in the namespace of the class, since its
main use lies there. So I made it a staticmethod.

But - alas! After I declared the function a staticmethod, I couldn't
make it a default argument for the method, since there's nothing to do
with staticmethod instances.

The minor solution for this is to make staticmethod objects callable.
This would solve my problem. But I suggest a further step: I suggest
that if this is done, it would be nice if classname.staticmethodname
would return the staticmethod instance, instead of the function itself.
I know that this seems to contradict Guido's proposal, since he
suggests to return the function instead of a strange object, and I
suggest to return a strange object instead of a function. But this is
not true; both follow the idea that class attributes should
be, when possible, the same objects that were created when defining
the class. This is more consistent with the behaviour of modules
(module attributes are the objects that were created when the code was
run), and is more consistent with the general convention, that running
A = B
causes
A == B
to be true. Currently, Class.func = staticmethod(func), and Class.func
= func, don't behave by this rule. If the suggestions are accepted,
both will.

I just think it's simpler and cleaner that way. Just making
staticmethods callable would solve my practical problem too.

Noam Raphael


Re: [Python-Dev] Getting rid of unbound methods: patch available

2005-01-20 Thread Guido van Rossum
 It's about staticmethods. I was writing a class, and its
 pretty-printing method got a function for converting a value to a
 string as an argument. I wanted to supply a default function. I
 thought that it should be in the namespace of the class, since its
 main use lies there. So I made it a staticmethod.
 
 But - alas! After I declared the function a staticmethod, I couldn't
 make it a default argument for the method, since there's nothing to do
 with staticmethod instances.
 
 The minor solution for this is to make staticmethod objects callable.
 This would solve my problem. But I suggest a further step: I suggest
 that if this is done, it would be nice if classname.staticmethodname
 would return the staticmethod instance, instead of the function itself.
 I know that this seems to contradict Guido's proposal, since he
 suggests to return the function instead of a strange object, and I
 suggest to return a strange object instead of a function. But this is
 not true; both follow the idea that class attributes should
 be, when possible, the same objects that were created when defining
 the class. This is more consistent with the behaviour of modules
 (module attributes are the objects that were created when the code was
 run), and is more consistent with the general convention, that running
 A = B
 causes
 A == B
 to be true. Currently, Class.func = staticmethod(func), and Class.func
 = func, don't behave by this rule. If the suggestions are accepted,
 both will.

Well, given that attribute assignment can be overloaded, you can't
depend on that requirement all the time.

 I just think it's simpler and cleaner that way. Just making
 staticmethods callable would solve my practical problem too.

The use case is fairly uncommon (though not invalid!), and making
staticmethod callable would add more code without much benefit. I
recommend that you work around it by setting the default to None and
substituting the real default in the function.
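
A sketch of that workaround (class and method names are made up):

    class Pretty(object):
        def default_to_str(value):
            return repr(value)
        default_to_str = staticmethod(default_to_str)

        def pprint(self, value, to_str=None):
            # The default can't be the staticmethod object itself (it isn't
            # callable), so substitute it here; going through the class
            # triggers the descriptor and yields the plain function.
            if to_str is None:
                to_str = Pretty.default_to_str
            return to_str(value)

    print Pretty().pprint([1, 2])   # prints: [1, 2]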

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Updated Monkey Typing pre-PEP

2005-01-20 Thread Guido van Rossum
Phillip, it looks like you're not going to give up. :) I really don't
want to accept your proposal into core Python, but I think you ought
to be able to implement everything you propose as part of PEAK (or
whatever other framework).

Therefore, rather than continuing to argue over the merits of your
proposal, I'd like to focus on what needs to be done so you can
implement it. The basic environment you can assume: an adaptation
module according to PEP 246, type declarations according to my latest
blog (configurable per module or per class by defining __typecheck__,
but defaulting to something conservative that either returns the
original object or raises an exception).

What do you need then?

[My plane is about to leave, gotta run!]

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


[Python-Dev] Patch review [ 1093585 ] sanity check for readline remove/replace

2005-01-20 Thread Michiel Jan Laurens de Hoon
Patch review [ 1093585 ] sanity check for readline remove/replace
The functions remove_history_item and replace_history_item in the readline
module respectively remove and replace an item in the history of commands. As
outlined in bug [ 1086603 ], both functions cause a segmentation fault if the
item index is negative. This is actually a bug in the corresponding functions in
readline, which return a NULL pointer if the item index is larger than the size
of the history, but do not check for the item index being negative. I sent a
patch to [EMAIL PROTECTED], so this will probably be fixed in future versions
of readline. But for now, we need a workaround in Python.
The patched code checks if the item index is negative, and issues an error
message if so. I have run the test suite after applying this patch, and I
found no problems with it.
Note that there is one more way to fix this bug, which is to interpret negative
indices as counting from the end (same as lists and strings, for example). So
remove_history_item(-1) removes the last item added to the history etc. In that 
case, get_history_item should change as well. Right now get_history_item(-1) 
returns None, so the patch introduces a small (and probably insignificant) 
inconsistency: get_history_item(-1) returns None but remove_history_item(-1) 
raises an error.

--Michiel.
--
Michiel de Hoon, Assistant Professor
University of Tokyo, Institute of Medical Science
Human Genome Center
4-6-1 Shirokane-dai, Minato-ku
Tokyo 108-8639
Japan
http://bonsai.ims.u-tokyo.ac.jp/~mdehoon



Re: [Python-Dev] Unix line endings required for PyRun* breaking embedded Python

2005-01-20 Thread Stuart Bishop
Just van Rossum wrote:
Skip Montanaro wrote:

Just re.sub("[\r\n]+", "\n", s) and I think you're good to go.

I don't think that in general you want to fold multiple empty lines into
one. This would be my preferred regex:

    s = re.sub(r"\r\n?", "\n", s)

Catches both DOS and old-style Mac line endings. Alternatively, you can
use s.splitlines():

    s = "\n".join(s.splitlines()) + "\n"

This also makes sure the string ends with a "\n", which may or may not be
a good thing, depending on your application.

Do people consider this a bug that should be fixed in Python 2.4.1 and
Python 2.3.6 (if it ever exists), or is the responsibility for doing this
transformation on the application that embeds Python?

--
Stuart Bishop [EMAIL PROTECTED]
http://www.stuartbishop.net/


[Python-Dev] Re: Unix line endings required for PyRun* breaking embedded Python

2005-01-20 Thread Fredrik Lundh
Stuart Bishop wrote:

 Do people consider this a bug that should be fixed in Python 2.4.1 and
 Python 2.3.6 (if it ever exists), or is the responsibility for doing this
 transformation on the application that embeds Python?

the text you quoted is pretty clear on this:

    It is envisioned that such strings always have the
    standard \n line feed, if the strings come from a file that file can
    be read with universal newlines.

just add the fix, already  (you don't want plpythonu to depend on a future
release anyway)

/F





Re: [Python-Dev] Getting rid of unbound methods: patch available

2005-01-20 Thread Noam Raphael
  and is more consistent with the general convention, that running
  A = B
  causes
  A == B
  to be true. Currently, Class.func = staticmethod(func), and Class.func
  = func, don't behave by this rule. If the suggestions are accepted,
  both will.
 
 Well, given that attribute assignment can be overloaded, you can't
 depend on that requirement all the time.
 
Yes, I know. For example, I don't know how you can make this work for
classmethods. (Although I have the idea that if nested scopes included
classes, and there was a way to assign names to a different
scope, then there would be no need for them. But I have no idea how
this can be done, so never mind.)

I just think of it as a very common convention, and I don't find the
exceptions aesthetically pleasing. But of course, I accept practical
reasons for not making it that way.

 I recommend that you work around it by setting the default to None and
 substituting the real default in the function.

That's a good idea, I will probably use it. (I thought of a different
way: don't use decorators, and wrap the function in a staticmethod
after defining the function that uses it. But this is really ugly.)

Thanks for your reply,
Noam