Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Nick Coghlan
Phillip J. Eby wrote:
 This is still rather rough, but I figured it's easier to let everybody fill 
 in the remaining gaps by arguments than it is for me to pick a position I 
 like and try to convince everybody else that it's right.  :)  Your feedback 
 is requested and welcome.

I think you're actually highlighting a bigger issue with the behaviour of 
yield inside a with block, and working around it rather than fixing the 
fundamental problem.

The issue with yield causing changes to leak to outer scopes isn't limited 
to coroutine style usage - it can happen with generator-iterators, too.

What's missing is a general way of saying "suspend this context temporarily, 
and resume it when done".

An example use-case not involving 'yield' at all is asynchronous 
functionality. A generator-iterator that works in a high precision 
decimal.Context(), but wants to return values from inside a loop using normal 
precision, is another example not involving coroutines.
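In modern terms (a sketch, not from the original post), such a generator can keep the high-precision context from ever spanning a yield by scoping it with decimal.localcontext(), so nothing leaks to the caller:

```python
from decimal import Decimal, localcontext

def partial_sums(terms, internal_prec=50):
    # Accumulate at high precision; the inner context never spans a
    # yield, so the caller's context is unaffected.
    total = Decimal(0)
    for t in terms:
        with localcontext() as ctx:
            ctx.prec = internal_prec
            total = total + Decimal(t)
        yield +total   # unary plus rounds in the *caller's* context

sums = list(partial_sums(["1", "1e-40", "1"]))
```

The tiny 1e-40 term survives in the internal running total but is rounded away in each yielded value.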

The basic idea would be to provide syntax that allows a with statement to be 
suspended, along the lines of:

   with EXPR as VAR:
       for VAR2 in EXPR2:
           without:
               BLOCK

To mean:

   abc = (EXPR).__with__()
   exc = (None, None, None)
   VAR = abc.__enter__()
   try:
       for VAR2 in EXPR2:
           try:
               abc.__suspend__()
               try:
                   BLOCK
               finally:
                   abc.__resume__()
           except:
               exc = sys.exc_info()
               raise
   finally:
       abc.__exit__(*exc)


To keep things simple, just as 'break' and 'continue' work only on the 
innermost loop, 'without' would only apply to the innermost 'with' statement.

Locks, for example, could support this via:

   class Lock(object):
       def __with__(self):
           return self

       def __enter__(self):
           self.acquire()
           return self

       def __resume__(self):
           self.acquire()

       def __suspend__(self):
           self.release()

       def __exit__(self, *exc_info):
           self.release()


(Note that there's a potential problem if the call to acquire() in __resume__ 
fails, but that's no different than if this same dance is done manually).
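That same dance, done manually in today's Python, looks like the following sketch (names hypothetical); it is exactly the release-then-reacquire pattern the proposed __suspend__/__resume__ would automate:

```python
import threading

lock = threading.Lock()
log = []

def do_work_outside_lock():
    log.append(lock.locked())   # False: the lock is released here

with lock:
    log.append(lock.locked())   # True: inside the "with"
    lock.release()              # hand-written equivalent of "without:"
    try:
        do_work_outside_lock()
    finally:
        lock.acquire()          # reacquire before the with block exits
    log.append(lock.locked())   # True again
```

If acquire() fails in the finally clause, the with statement's own exit will release an unheld lock, which is the manual version of the __resume__ problem noted above.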

Cheers,
Nick.

P.S. Here's a different generator wrapper that could be used to create a 
generator-based suspendable context that can be invoked multiple times 
through use of the without keyword. If applied to the PEP 343 
decimal.Context() __with__ method example, it would automatically restore the 
original context for the duration of the without block:

   class SuspendableGeneratorContext(object):

       def __init__(self, func, args, kwds):
           self.gen = None
           self.func = func
           self.args = args
           self.kwds = kwds

       def __with__(self):
           return self

       def __enter__(self):
           if self.gen is not None:
               raise RuntimeError("context already in use")
           gen = self.func(*self.args, **self.kwds)
           try:
               result = gen.next()
           except StopIteration:
               raise RuntimeError("generator didn't yield")
           self.gen = gen
           return result

       def __resume__(self):
           if self.gen is None:
               raise RuntimeError("context not suspended")
           gen = self.func(*self.args, **self.kwds)
           try:
               gen.next()
           except StopIteration:
               raise RuntimeError("generator didn't yield")
           self.gen = gen

       def __suspend__(self):
           try:
               self.gen.next()
           except StopIteration:
               return
           else:
               raise RuntimeError("generator didn't stop")

       def __exit__(self, type, value, traceback):
           gen = self.gen
           self.gen = None
           if type is None:
               try:
                   gen.next()
               except StopIteration:
                   return
               else:
                   raise RuntimeError("generator didn't stop")
           else:
               try:
                   gen.throw(type, value, traceback)
               except (type, StopIteration):
                   return
               else:
                   raise RuntimeError("generator caught exception")

   def suspendable_context(func):
       def helper(*args, **kwds):
           return SuspendableGeneratorContext(func, args, kwds)
       return helper
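For a feel of the enter/suspend/resume/exit protocol, here is a trimmed Python 3 port (gen.next() is spelled next(gen) today; __exit__ is simplified to ignore exceptions) driven by a trivial generator context:

```python
class SuspendableGeneratorContext:
    # Python 3 sketch of the idea: __suspend__ runs the current
    # generator to completion, __resume__ starts a fresh one.
    def __init__(self, func, args, kwds):
        self.gen = None
        self.func, self.args, self.kwds = func, args, kwds

    def _start(self):
        gen = self.func(*self.args, **self.kwds)
        result = next(gen)          # run up to the yield
        self.gen = gen
        return result

    def _finish(self):
        gen, self.gen = self.gen, None
        try:
            next(gen)               # run the code after the yield
        except StopIteration:
            return
        raise RuntimeError("generator didn't stop")

    __enter__ = _start
    __resume__ = _start
    __suspend__ = _finish

    def __exit__(self, *exc_info):  # simplified: ignores exceptions
        self._finish()

events = []

def ctx():
    events.append("enter")
    yield
    events.append("exit")

c = SuspendableGeneratorContext(ctx, (), {})
with c:
    c.__suspend__()   # temporarily leave the context
    c.__resume__()    # and come back
```

Each suspend/resume pair replays the generator's setup and teardown, which is what restores the original decimal context for the duration of the without block.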

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://boredomandlaziness.blogspot.com
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] Early PEP draft (For Python 3000?)

2005-10-20 Thread Jim Jewett
I'll try to be more explicit; if Josiah and I are talking past each
other, then the explanation was clearly not yet mature.

(In http://mail.python.org/pipermail/python-dev/2005-October/057251.html)
Eyal Lotem suggested:

 Name: Attribute access for all namespaces ...

   global x ; x = 1
 Replaced by:
   module.x = 1

I responded:
 Attribute access as an option would be nice, but might be slower.

 Also note that one common use for a __dict__ is that you don't
 know what keys are available; meeting this use case with
 attribute access would require some extra machinery, such as
 an iterator over attributes.

Josiah Carlson responded
(http://mail.python.org/pipermail/python-dev/2005-October/057451.html)

 This particular use case is easily handled.  Put the following
 once at the top of the module...

 module = __import__(__name__)

 Then one can access (though perhaps not quickly) the module-level
 variables for that module.  To access attributes, it is a quick scan
 through module.__dict__, dir(), or vars().

My understanding of the request was that all namespaces --
including those returned by globals() and locals() -- should
be used with attribute access *instead* of __dict__ access.

module.x is certainly nicer than module.__dict__['x']

Even with globals() and locals(), I usually *wish* I could
use attribute access, to avoid creating a string when what
I really want is a name.

The catch is that sometimes I don't know the names in
advance, and have to iterate over the dict -- as you
suggested.  That works fine today; my question is what
to do instead if __dict__ is unavailable.

Note that vars(obj) itself conceptually returns a NameSpace
rather than a dict, so that isn't the answer.

My inclination is to add an __iterattr__ that returns
(attribute name, attribute value) pairs, and to make this the
default iterator for NameSpace objects.
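A rough sketch of that default iterator, using today's types.SimpleNamespace as a stand-in for the proposed NameSpace (the __iterattr__ behaviour itself is hypothetical):

```python
from types import SimpleNamespace

class IterableNamespace(SimpleNamespace):
    # Hypothetical: make (name, value) pairs the default iterator,
    # as the proposed __iterattr__ would.
    def __iter__(self):
        return iter(vars(self).items())

ns = IterableNamespace(x=1, y=2)
pairs = sorted(ns)   # iterate attributes without touching __dict__
```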

Whether the good of
  (1)  not needing to mess with __dict__, and
  (2)  not needing to pretend that strings are names
is enough to justify an extra magic method ... I'm not as sure.

-jJ


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Guido van Rossum
Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?

PEP 343 is still (back) on the drawing table, PEP 342 has barely been
implemented (did it survive the AST-branch merge?), and already you
are talking about adding more stuff. Please put on the brakes!

If there's anything this discussion shows me, it's that implicit
contexts are a dangerous concept, and should be treated with much
skepticism.

I would recommend that if you find yourself needing context data while
programming an asynchronous application using generator trampolines
simulating coroutines, you ought to refactor the app so that the
context is explicitly passed along rather than grabbed implicitly.
Zope doesn't *require* you to get the context from a thread-local, and
I presume that SQLObject also has a way to explicitly use a specific
connection (I'm assuming cursors and similar data structures have an
explicit reference to the connection used to create them). Heck, even
Decimal allows you to invoke every operation as a method on a
decimal.Context object!

I'd rather not tie implicit contexts to the with statement,
conceptually. Most uses of the with-statement are purely local (e.g.
"with open(fn) as f"), or don't apply to coroutines (e.g. "with
my_lock"). I'd say that "with redirect_stdout(f)" also doesn't apply
-- we already know it doesn't work in threaded applications, and that
restriction is easily and logically extended to coroutines.

If you're writing a trampoline for an app that needs to modify decimal
contexts, the decimal module already provides the APIs for explicitly
saving and restoring contexts.
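Concretely, that explicit save-and-restore is just getcontext()/setcontext() (this sketch narrows precision temporarily and restores it by hand):

```python
import decimal

orig_prec = decimal.getcontext().prec    # remember the ambient precision

saved = decimal.getcontext().copy()      # explicit save
try:
    ctx = decimal.getcontext()
    ctx.prec = 6                         # temporarily narrow precision
    narrowed = +decimal.Decimal("1.23456789")   # rounds to 6 digits
finally:
    decimal.setcontext(saved)            # explicit restore

restored_prec = decimal.getcontext().prec
```

A trampoline can do exactly this around each step of a coroutine, with no new language machinery.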

I know that somewhere in the proto-PEP Phillip argues that the context
API needs to be made a part of the standard library so that his
trampoline can efficiently swap implicit contexts required by
arbitrary standard and third-party library code. My response to that
is that library code (whether standard or third-party) should not
depend on implicit context unless it assumes it can assume complete
control over the application. (That rules out pretty much everything
except Zope, which is fine with me. :-)

Also, Nick wants the name 'context' for PEP-343 style context
managers. I think it's overloading too much to use the same word for
per-thread or per-coroutine context.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Early PEP draft (For Python 3000?)

2005-10-20 Thread Josiah Carlson

Jim Jewett [EMAIL PROTECTED] wrote:
 I'll try to be more explicit; if Josiah and I are talking past each
 other, then the explanation was clearly not yet mature.
 
 (In http://mail.python.org/pipermail/python-dev/2005-October/057251.html)
 Eyal Lotem suggested:
 
  Name: Attribute access for all namespaces ...
 
global x ; x = 1
  Replaced by:
module.x = 1
 
 I responded:
  Attribute access as an option would be nice, but might be slower.
 
  Also note that one common use for a __dict__ is that you don't
  know what keys are available; meeting this use case with
  attribute access would require some extra machinery, such as
  an iterator over attributes.
 
 Josiah Carlson responded
 (http://mail.python.org/pipermail/python-dev/2005-October/057451.html)
 
  This particular use case is easily handled.  Put the following
  once at the top of the module...
 
  module = __import__(__name__)
 
  Then one can access (though perhaps not quickly) the module-level
  variables for that module.  To access attributes, it is a quick scan
  through module.__dict__, dir(), or vars().
 
 My understanding of the request was that all namespaces --
 including those returned by globals() and locals() -- should
 be used with attribute access *instead* of __dict__ access.

Yeah, I missed the transition from arbitrary stack frame access to
strictly global and local scope attribute access.


 module.x is certainly nicer than module.__dict__['x']
 
 Even with globals() and locals(), I usually *wish* I could
 use attribute access, to avoid creating a string when what
 I really want is a name.

Indeed.

 The catch is that sometimes I don't know the names in
 advance, and have to iterate over the dict -- as you
 suggested.  That works fine today; my question is what
 to do instead if __dict__ is unavailable.
 
 Note that vars(obj) itself conceptually returns a NameSpace
 rather than a dict, so that isn't the answer.

>>> help(vars)
vars(...)
    vars([object]) -> dictionary

    Without arguments, equivalent to locals().
    With an argument, equivalent to object.__dict__.

When an object lacks a dictionary, dir() works just fine.

>>> help(dir)
Help on built-in function dir:

dir(...)
    dir([object]) -> list of strings

    Return an alphabetized list of names comprising (some of) the attributes
    of the given object, and of attributes reachable from it:

    No argument:  the names in the current scope.
    Module object:  the module attributes.
    Type or class object:  its attributes, and recursively the attributes of
        its bases.
    Otherwise:  its attributes, its class's attributes, and recursively the
        attributes of its class's base classes.


 My inclination is to add an __iterattr__ that returns
 (attribute name, attribute value) pairs, and to make this the
 default iterator for NameSpace objects.

def __iterattr__(obj):
    for i in dir(obj):
        yield i, getattr(obj, i)


 Whether the good of
   (1)  not needing to mess with __dict__, and
   (2)  not needing to pretend that strings are names
 is enough to justify an extra magic method ... I'm not as sure.

I don't know, but leaning towards no; dir() works pretty well.  Yeah, you
have to use getattr(), but there are worse things.

 - Josiah



Re: [Python-Dev] enumerate with a start index

2005-10-20 Thread Lisandro Dalcin
On 10/19/05, Martin Blais [EMAIL PROTECTED] wrote:
 Just wondering, would anyone think of it as a good idea if the
 enumerate() builtin could accept a start argument?

And why not an additional step argument? Anyway, perhaps all this
can be done with a 'xrange' object...
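For the record, a sketch of both requests in today's terms: enumerate() grew a start argument (Python 2.6), and a step can indeed be had by pairing the iterable with itertools.count:

```python
from itertools import count

items = ["a", "b", "c"]

with_start = list(enumerate(items, start=10))            # start argument
with_step = list(zip(count(start=10, step=5), items))    # start and step
```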


--
Lisandro Dalcín
---
Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
PTLC - Güemes 3450, (3000) Santa Fe, Argentina
Tel/Fax: +54-(0)342-451.1594


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Phillip J. Eby
At 10:40 PM 10/20/2005 +1000, Nick Coghlan wrote:
Phillip J. Eby wrote:
  This is still rather rough, but I figured it's easier to let everybody fill
  in the remaining gaps by arguments than it is for me to pick a position I
  like and try to convince everybody else that it's right.  :)  Your feedback
  is requested and welcome.

I think you're actually highlighting a bigger issue with the behaviour of
yield inside a with block, and working around it rather than fixing the
fundamental problem.

The issue with yield causing changes to leak to outer scopes isn't limited
to coroutine style usage - it can happen with generator-iterators, too.

What's missing is a general way of saying "suspend this context temporarily,
and resume it when done".

Actually, it's fairly simple to write a generator decorator using 
context.swap() that saves and restores the current execution state around 
next()/send()/throw() calls, if you prefer it to be the generator's 
responsibility to maintain such context.
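A sketch of such a wrapper, using decimal's context as the implicit state since the proposed context.swap() API does not exist; each resumption runs under the generator's own saved context, and the caller's context is restored afterwards:

```python
import decimal

caller_prec = decimal.getcontext().prec   # the caller's ambient precision

class ContextPreservingGen:
    # Run each resumption of a generator under its own saved decimal
    # context, restoring the caller's context around every next() call.
    def __init__(self, gen):
        self.gen = gen
        self.ctx = decimal.getcontext().copy()   # private copy for the generator

    def __iter__(self):
        return self

    def __next__(self):
        outer = decimal.getcontext()
        decimal.setcontext(self.ctx)      # swap the generator's context in
        try:
            return next(self.gen)
        finally:
            self.ctx = decimal.getcontext()
            decimal.setcontext(outer)     # swap the caller's context back

def high_prec_values():
    decimal.getcontext().prec = 50        # only affects the private copy
    while True:
        yield decimal.Decimal(1) / 3

g = ContextPreservingGen(high_prec_values())
inside = next(g)                          # computed at precision 50
outside_prec = decimal.getcontext().prec  # caller's precision untouched
```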



Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Jeremy Hylton
On 10/20/05, Guido van Rossum [EMAIL PROTECTED] wrote:
 Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?

 PEP 343 is still (back) on the drawing table, PEP 342 has barely been
 implemented (did it survive the AST-branch merge?), and already you
 are talking about adding more stuff. Please put on the brakes!

Yes.  PEP 342 survived the merge of the AST branch.  I wonder, though,
if the Grammar for it can be simplified at all.  I haven't read the
PEP closely, but I found the changes a little hard to follow.  That
is, why was the grammar changed the way it was -- or how would you
describe the intent of the changes?  It was hard when doing the
transformation in ast.c to be sure that the intent of the changes was
honored.  On the other hand, it seemed to have extensive tests and
they all pass.

Jeremy


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Guido van Rossum
On 10/20/05, Phillip J. Eby [EMAIL PROTECTED] wrote:
 At 04:04 PM 10/20/2005 -0400, Jeremy Hylton wrote:
 On 10/20/05, Guido van Rossum [EMAIL PROTECTED] wrote:
   Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?
  
   PEP 343 is still (back) on the drawing table, PEP 342 has barely been
   implemented (did it survive the AST-branch merge?), and already you
   are talking about adding more stuff. Please put on the brakes!
 
 Yes.  PEP 342 survived the merge of the AST branch.  I wonder, though,
 if the Grammar for it can be simplified at all.  I haven't read the
 PEP closely, but I found the changes a little hard to follow.  That
 is, why was the grammar changed the way it was -- or how would you
 describe the intent of the changes?

 The intent was to make it so that '(yield optional_expr)' always works, and
 also that '[lvalue =] yield optional_expr' works.  If you can find another
 way to hack the grammar so that both of 'em work, it's certainly okay by
 me.  The changes I made were just the simplest things I could figure out to
 do.

Right.

 I seem to recall that the hard part was the need for 'yield expr,expr' to
 be interpreted as '(yield expr,expr)', not '(yield expr),expr', for
 backward compatibility reasons.

But only at the statement level.

These should be errors IMO:

  foo(yield expr, expr)
  foo(expr, yield expr)
  foo(1 + yield expr)
  x = yield expr, expr
  x = expr, yield expr
  x = 1 + yield expr
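That is indeed how it settled: the bare forms are syntax errors, while an explicitly parenthesized yield expression is unambiguous. A sketch of the parenthesized form in a simple coroutine:

```python
def averager():
    # (yield ...) as an operand requires parentheses, which resolves
    # the 'yield expr, expr' ambiguity discussed in this thread.
    total = count = 0
    avg = None
    while True:
        term = (yield avg)      # parenthesized yield expression
        total += term
        count += 1
        avg = total / count

g = averager()
next(g)                 # prime: run to the first yield
first = g.send(10)
second = g.send(30)
```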

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


[Python-Dev] A solution to the evils of static typing and interfaces?

2005-10-20 Thread Simon Belak
Hi,

I was thinking: why not have a separate file for all the proposed 
optional meta-information (in particular interfaces and static types)? 
Something along the lines of IDLs in CORBA (with pythonic syntax, of 
course). This way most of the benefits are retained without 
contaminating the actual syntax (dare I be so pretentious as to even hope 
to make both sides happy?).

For the sole purpose of illustration, let meta-files have extension .pym 
and linking to source-files be name based:

parrot.py
parrot.pym
(parrot.pyc)

With some utilities like a prototype generator (to and from meta-files) 
and a synchronization tool, time penalty on development for having two 
separate files could be kept within reason.

We could even go as far as introducing a syntax allowing custom 
meta-information to be added.

For example something akin to decorators.

parrot.pym:

@sharedinstance
class Parrot:

# Methods
# note these are only prototypes, so no colon or suite is needed

@cache
def playDead(a : int, b : int) -> None

# Attributes

@const
name : str

where sharedinstance, cache and const are custom meta-information.
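Much of this later became expressible inline via function annotations and decorators rather than a separate file; a sketch of the Parrot example in that style (the @cache tag here is a stand-in built on functools.lru_cache, not the proposed meta-syntax):

```python
import functools

def cache(func):
    # stand-in for the proposed @cache meta-tag (hypothetical)
    return functools.lru_cache(maxsize=None)(func)

class Parrot:
    name: str                # attribute annotation, like "name : str"

    @cache
    def playDead(self, a: int, b: int) -> None:
        return None

# the meta-information stays introspectable at runtime
hints = Parrot.playDead.__wrapped__.__annotations__
```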

This opens up countless possibilities for third-party interpreter 
enhancements and/or optimisations by providing fully portable (as all 
meta-information is optional) language extensions.


P.S. my sincerest apologies if I am reopening a can of worms here


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Guido van Rossum
On 10/20/05, Phillip J. Eby [EMAIL PROTECTED] wrote:
 At 08:57 AM 10/20/2005 -0700, Guido van Rossum wrote:
 Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?
 
 PEP 343 is still (back) on the drawing table, PEP 342 has barely been
 implemented (did it survive the AST-branch merge?), and already you
 are talking about adding more stuff. Please put on the brakes!

 Sorry.  I thought that 343 was just getting a minor tune-up.

Maybe, but the issues on the table are naming issues -- is __with__
the right name, or should it be __context__? Should the decorator be
applied implicitly? Should the decorator be called @context or
@contextmanager?

 In the months
 since the discussion and approval (and implementation; Michael Hudson
 actually had a PEP 343 patch out there),

Which he described previously as a hack and apparently didn't feel
comfortable checking in. At least some of it will have to be redone,
(a) for the AST code, and (b) for the revised PEP.

 I've been doing a lot of thinking
 about how they will be used in applications, and thought that it would be a
 good idea to promote people using task-specific variables in place of
 globals or thread-locals.

That's clear, yes. :-)

I still find it unlikely that a lot of people will be using trampoline
frameworks. You and Twisted, that's all I expect.

 The conventional wisdom is that global variables are bad, but the truth is
 that they're very attractive because they allow you to have one less thing
 to pass around and think about in every line of code.

Which doesn't make them less bad -- they're still there and perhaps
more likely to trip you up when you least expect it. I think there's a
lot of truth in that conventional wisdom.

 Without globals, you
 would sooner or later end up with every function taking twenty arguments to
 pass through states down to other code, or else trying to cram all this
 data into some kind of context object, which then won't work with code
 that doesn't know about *your* definition of what a context is.

Methinks you are exaggerating for effect.

 Globals are thus extremely attractive for practical software
 development.  If they weren't so useful, it wouldn't be necessary to warn
 people not to use them, after all.  :)

 The problem with globals, however, is that sometimes they need to be
 changed in a particular context.  PEP 343 makes it safer to use globals
 because you can always offer a context manager that changes them
 temporarily, without having to hand-write a try-finally block.  This will
 make it even *more* attractive to use globals, which is not a problem as
 long as the code has no multitasking of any sort.

Hm. There are different kinds of globals. Most globals don't need to
be context-managed at all, because they can safely be shared between
threads, tasks or coroutines. Caches usually fall in this category
(e.g. the compiled regex cache). A little locking is all it takes.
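A minimal sketch of that kind of safely shareable global, a compiled-regex cache guarded by a lock:

```python
import re
import threading

_cache = {}
_cache_lock = threading.Lock()

def cached_compile(pattern):
    # The lock protects the shared dict; the compiled pattern objects
    # themselves are immutable and safe to share between threads/tasks.
    with _cache_lock:
        rx = _cache.get(pattern)
        if rx is None:
            rx = _cache[pattern] = re.compile(pattern)
    return rx

a = cached_compile(r"\d+")
b = cached_compile(r"\d+")   # second call hits the cache
```

No context management is needed because every caller wants the same value for a given key.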

The globals that need to be context-managed are the pernicious kind of
which you can never have too few. :-)

They aren't just accumulating global state, they are implicit
parameters, thereby truly invoking the reasons why globals are frowned
upon.

 Of course, the multithreading scenario is usually fixed by using
 thread-locals.  All I'm proposing is that we replace thread locals with
 task locals, and promote the use of task-local variables for managed
 contexts (such as the decimal context) *that would otherwise be a global or
 a thread-local variable*.  This doesn't seem to me like a very big deal;
 just an encouragement for people to make their stuff easy to use with PEP
 342 and 343.

I'm all for encouraging people to make their stuff easy to use with
these PEPs, and with multi-threading use.

But IMO the best way to accomplish those goals is to refrain from
global (or thread-local or task-local) context as much as possible,
for example by passing along explicit context.

The mere existence of a standard library module to make handling
task-specific contexts easier sends the wrong signal; it suggests that
it's a good pattern to use, which it isn't -- it's a last-resort
pattern, when all other solutions fail.

If it weren't for Python's operator overloading, the decimal module
would have used explicit contexts (like the Java version); but since
it would be really strange to have such a fundamental numeric type
without the ability to use the conventional operator notation, we
resorted to per-thread context. Even that doesn't always do the right
thing -- handling decimal contexts is surprisingly subtle (as Nick can
testify based on his experiences attempting to write a decimal context
manager for the with-statement!).

Yes, coroutines make it even subtler.

But I haven't seen the use case yet for mixing coroutines with changes
to decimal context settings; somehow it doesn't strike me as a likely
use case (not that you can't construct one, so don't bother -- I can
imagine it too, I just think YAGNI).

 By the way, I don't 

Re: [Python-Dev] Coroutines, generators, function calling

2005-10-20 Thread Andrew Koenig
 so the new syntax would
 not be useful, unless it was something that provided access to the index
 item as a variable, like:
 
 yield foo(i) for i in x
 
 which barely saves you anything (a colon, a newline, and an indent).

Not even that, because you can omit the newline and indent:

for i in x: yield foo(i)

There's a bigger difference between

for i in x: yield i

and

yield from x

Moreover, I can imagine optimization opportunities for yield from that
would not make sense in the context of comprehensions.
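Python 3.3 eventually added exactly this spelling as 'yield from' (PEP 380), where the delegation also forwards send()/throw() and captures the subgenerator's return value, neither of which the for-loop form can do:

```python
def inner():
    yield 1
    yield 2
    return "done"        # becomes the value of the yield from expression

def outer():
    result = yield from inner()   # delegates, then captures the return
    yield result

values = list(outer())
```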





[Python-Dev] Questionable AST wibbles

2005-10-20 Thread Neal Norwitz
Jeremy,

There are a bunch of mods from the AST branch that got integrated into
head.  Hopefully, by doing this on python-dev more people will get
involved.  I'll describe high-level things first, but there will be a
ton of details later on.  If people don't want to see this crap on
python-dev, I can take this offline.

High-level overview of code size (rough decrease of ~300 C source lines):
 * Python/compile.c -2733 (was 6822 now 4089)
 * Python/Python-ast.c +2281 (new file)
 * Python/asdl.c +92 (new file)
 * plus other minor mods

symtable.h has lots of changes to structs and APIs.  Not sure what
needs to be doc'ed.

I was very glad to see that ./python compileall.py Lib took virtually
the same time before and after AST.  Yeah!  Unfortunately, I can't say
the same for memory usage for running compileall:

Before AST: [10120 refs]
After AST:  [916096 refs]

I believe there aren't that many true memory leaks from running
valgrind.  Though there are likely some ref leaks.  Most of this is
probably stuff that we are just hanging on to that is not required.  I
will continue to run valgrind to find more problems.

A bunch of APIs changed and there is some additional name pollution. 
Since these are pretty internal APIs, I'm not sure that part is a big
deal.  I will try to find more name pollution and eliminate it by
prefixing with Py.

One API change which I think was a mistake was _Py_Mangle() losing 2
parameters (I think this was how it was a long time ago).  See
typeobject.c, Python.h, compile.c.

pythonrun.h has a bunch of changes.  I think a lot of the APIs
changed, but there might be backwards compatible macros.  I'm not
sure.  I need to review closely.

symtable.h has lots of changes to structs and APIs.  Not sure what
needs to be doc'ed.  Some #defines are history (I think they are in
the enum now):  TYPE_*.

code.h was added, but it mostly contains stuff from compile.h.  Should
we remove code.h and just put everything in compile.h?  This would
remove lots of little changes.
code.h & compile.h are tightly coupled.  If we keep them separate, I
would like to see some other changes.

This probably is not a big deal, but I was surprised by this change:

+++ test_repr.py	20 Oct 2005 19:59:24 -0000	1.20
@@ -123,7 +123,7 @@

     def test_lambda(self):
         self.failUnless(repr(lambda x: x).startswith(
-            "<function lambda"))
+            "<function <lambda"))

This one may be only marginally worse (names w/parameter unpacking):

test_grammar.py

-verify(f4.func_code.co_varnames == ('two', '.2', 'compound',
-                                    'argument',  'list'))
+vereq(f4.func_code.co_varnames,
+      ('two', '.1', 'compound', 'argument',  'list'))

There are still more things I need to review.  These were the biggest
issues I found.  I don't think most are that big of a deal, just
wanted to point stuff out.

n


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Phillip J. Eby
At 07:57 PM 10/20/2005 -0700, Guido van Rossum wrote:
(Sorry for the long post -- there just wasn't anything you said that I
felt could be left unquoted. :-)

Wow.  You've brought up an awful lot of stuff I want to respond to, about 
the nature of frameworks, AOP, Chandler, PEP 342, software deployment, 
etc.  But I know you're busy, and the draft I was working on in reply to 
this has gotten simply huge and still unfinished, so I think I should just 
turn it all into a blog article on "Why Frameworks Are Evil And What We Can 
Do To Stop Them."  :)

I don't think I've exaggerated anything, though.  I think maybe you're 
perceiving more vehemence than I actually have on the issue.  Context 
variables are a very small thing and I've not been arguing that they're a 
big one.  In the scope of the coming Global War On Frameworks, they are 
pretty small potatoes.  :)
