Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-21 Thread Michael Hudson
Guido van Rossum [EMAIL PROTECTED] writes:

 On 10/20/05, Phillip J. Eby [EMAIL PROTECTED] wrote:
 At 08:57 AM 10/20/2005 -0700, Guido van Rossum wrote:
 Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?
 
 PEP 343 is still (back) on the drawing table, PEP 342 has barely been
 implemented (did it survive the AST-branch merge?), and already you
 are talking about adding more stuff. Please put on the brakes!

 Sorry.  I thought that 343 was just getting a minor tune-up.

 Maybe, but the issues on the table are naming issues -- is __with__
 the right name, or should it be __context__? Should the decorator be
 applied implicitly? Should the decorator be called @context or
 @contextmanager?

 In the months
 since the discussion and approval (and implementation; Michael Hudson
 actually had a PEP 343 patch out there),

 Which he described previously as "a hack"

Err, that was the code I used for my talk at EuroPython.  That really
*was* a hack.  The code on SF is much better.

 and apparently didn't feel comfortable checking in.

Well, I was kind of hoping for a review, or positive comment on the
tracker, or *something* (Phillip posted half a review here a couple of
weeks ago, but I've been stupidly stupidly busy since then).

 At least some of it will have to be redone, (a) for the AST code,

Indeed.  Not much, I hope, the compiler changes were fairly simple.

 and (b) for the revised PEP.

Which I still haven't digested :-/

Cheers,
mwh

-- 
  I'm about to search Google for contract assassins to go to Iomega
  and HP's programming groups and kill everyone there with some kind
  of electrically charged rusty barbed thing.
-- http://bofhcam.org/journal/journal.html, 2002-01-08


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-21 Thread Nick Coghlan
Phillip J. Eby wrote:
 Actually, it's fairly simple to write a generator decorator using 
 context.swap() that saves and restores the current execution state 
 around next()/send()/throw() calls, if you prefer it to be the 
 generator's responsibility to maintain such context.

Yeah, I also realised there's a fairly obvious solution to my decimal.Context 
problem too:

    def iter_sin(iterable):
        # (assumes 'import decimal', a Decimal-aware sin(), and the draft
        #  PEP 343 context manager on decimal contexts)
        orig_ctx = decimal.getcontext()
        with orig_ctx as ctx:
            ctx.prec += 10
            for r in iterable:
                y = sin(r) # Very high precision during calculation
                with orig_ctx:
                    yield +y # Interim results have normal precision
                # We get ctx back here
        # We get orig_ctx back here

That is, if you want to be able to restore the original context just *save* 
the damn thing. . .

Ah well, chalk the __suspend__/__resume__ idea up as another case of me 
getting overly enthusiastic about a complex idea without looking for simpler 
solutions first. It's not like it would be the first time ;)

Cheers,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://boredomandlaziness.blogspot.com


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-21 Thread Nick Coghlan
Guido van Rossum wrote:
 If it weren't for Python's operator overloading, the decimal module
 would have used explicit contexts (like the Java version); but since
 it would be really strange to have such a fundamental numeric type
 without the ability to use the conventional operator notation, we
 resorted to per-thread context. Even that doesn't always do the right
 thing -- handling decimal contexts is surprisingly subtle (as Nick can
 testify based on his experiences attempting to write a decimal context
 manager for the with-statement!).

Indeed. Fortunately it isn't as complicated as I feared last night (it turned 
out to be a problem with me trying to hit a small nail with the new 
sledgehammer I was playing with, forgetting entirely about the trusty old 
normal hammer still in the toolkit).

 But I haven't seen the use case yet for mixing coroutines with changes
 to decimal context settings; somehow it doesn't strike me as a likely
 use case (not that you can't construct one, so don't bother -- I can
 imagine it too, I just think YAGNI).

For Python 2.5, I think the approach of generators explicitly reverting 
altered contexts around yield expressions is a reasonable way to go.

This concept is workable for generators, because they *know* when they're 
going to lose control (i.e., by invoking yield), whereas it's impossible for 
threads to know when the eval loop is going to drop them in favour of a 
different thread.

I think the parallel between __iter__ and __with__ continues to work here, too 
- alternate context managers to handle reversion of the context (e.g., 
Lock.released()) can be provided as separate methods, just as alternative 
iterators are provided (e.g., dict.iteritems(), dict.itervalues()).
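
For the Lock case, I'm imagining something like this rough sketch (untested; 
it assumes the draft __with__ protocol, and the helper classes and names are 
purely illustrative):

    import threading

    class Lock(object):
        # Thin wrapper so one lock object can offer both a "held" context
        # (the default) and a "released" context
        def __init__(self):
            self._lock = threading.Lock()

        def acquire(self):
            self._lock.acquire()

        def release(self):
            self._lock.release()

        def __with__(self):
            return _HeldLock(self)       # "with my_lock:" holds the lock

        def released(self):
            return _ReleasedLock(self)   # "with my_lock.released():" drops it

    class _HeldLock(object):
        def __init__(self, lock):
            self.lock = lock
        def __with__(self):
            return self
        def __enter__(self):
            self.lock.acquire()
            return self.lock
        def __exit__(self, *exc_info):
            self.lock.release()

    class _ReleasedLock(object):
        def __init__(self, lock):
            self.lock = lock
        def __with__(self):
            return self
        def __enter__(self):
            self.lock.release()
            return self.lock
        def __exit__(self, *exc_info):
            self.lock.acquire()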

Also, just as we eventually added itertools to support specific ways of 
working with iterators, I expect to eventually see contexttools to support 
specific ways of working with contexts (e.g. duck-typed contexts like 
'closing', or a 'nested' context that allowed multiple resources to be 
properly managed by a single with statement).
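
Rough sketches of the sort of helpers I mean (untested; they assume the draft 
__with__ protocol, and the details are purely illustrative):

    class closing(object):
        # Duck-typed context: whatever the object is, call its close() at the end
        def __init__(self, obj):
            self.obj = obj
        def __with__(self):
            return self
        def __enter__(self):
            return self.obj
        def __exit__(self, *exc_info):
            self.obj.close()

    class nested(object):
        # Manage several resources with one with statement, entering
        # left-to-right and exiting right-to-left
        def __init__(self, *contexts):
            self.contexts = contexts
        def __with__(self):
            return self
        def __enter__(self):
            # (a real version would need to unwind the contexts already
            #  entered if a later __enter__ call fails)
            self.managers = [ctx.__with__() for ctx in self.contexts]
            return tuple([mgr.__enter__() for mgr in self.managers])
        def __exit__(self, *exc_info):
            for mgr in reversed(self.managers):
                mgr.__exit__(*exc_info)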

contexttools would also be the place for ideas like suspending and resuming a 
context - rather than requiring specific syntax, it could be implemented as a 
context manager:

    ctx = suspendable_context(EXPR)
    with ctx as VAR:
        # VAR would still be the result of (EXPR).__with__().__enter__()
        # It's just that suspendable_context would be taking care of
        # making that happen, rather than it happening the usual way
        with ctx.suspended():
            # Context is suspended here
        # Context is resumed here

I do *not* think we should add contexttools in Python 2.5, because there's far 
too much chance of YAGNI. We need experience with the 'with' statement before 
we can really identify the tools that are appropriate.

Cheers,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://boredomandlaziness.blogspot.com


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Nick Coghlan
Phillip J. Eby wrote:
 This is still rather rough, but I figured it's easier to let everybody fill 
 in the remaining gaps by arguments than it is for me to pick a position I 
 like and try to convince everybody else that it's right.  :)  Your feedback 
 is requested and welcome.

I think you're actually highlighting a bigger issue with the behaviour of 
yield inside a with block, and working around it rather than fixing the 
fundamental problem.

The issue with yield causing changes to "leak" to outer scopes isn't limited 
to coroutine-style usage - it can happen with generator-iterators, too.

What's missing is a general way of saying "suspend this context temporarily, 
and resume it when done".

An example use-case not involving 'yield' at all is the "asynchronise" 
functionality. A generator-iterator that works in a high-precision 
decimal.Context(), but wants to return values from inside a loop using normal 
precision, is another example that doesn't involve coroutines.

The basic idea would be to provide syntax that allows a with statement to be 
suspended, along the lines of:

    with EXPR as VAR:
        for VAR2 in EXPR2:
            without:
                BLOCK

To mean:

    abc = (EXPR).__with__()
    exc = (None, None, None)
    VAR = abc.__enter__()
    try:
        for VAR2 in EXPR2:
            try:
                abc.__suspend__()
                try:
                    BLOCK
                finally:
                    abc.__resume__()
            except:
                exc = sys.exc_info()
                raise
    finally:
        abc.__exit__(*exc)


To keep things simple, just as 'break' and 'continue' work only on the 
innermost loop, 'without' would only apply to the innermost 'with' statement.

Locks, for example, could support this via:

    class Lock(object):
        # (acquire() and release() work as they do for the existing
        #  thread lock objects)

        def __with__(self):
            return self

        def __enter__(self):
            self.acquire()
            return self

        def __resume__(self):
            self.acquire()

        def __suspend__(self):
            self.release()

        def __exit__(self, *exc_info):
            self.release()


(Note that there's a potential problem if the call to acquire() in __resume__ 
fails, but that's no different than if this same dance is done manually).

Cheers,
Nick.

P.S. Here's a different generator wrapper that could be used to create a 
generator-based suspendable context that can be invoked multiple times 
through use of the 'without' keyword. If applied to the PEP 343 
decimal.Context() __with__ method example, it would automatically restore the 
original context for the duration of the 'without' block:

    class SuspendableGeneratorContext(object):

        def __init__(self, func, args, kwds):
            self.gen = None
            self.func = func
            self.args = args
            self.kwds = kwds

        def __with__(self):
            return self

        def __enter__(self):
            if self.gen is not None:
                raise RuntimeError("context already in use")
            gen = self.func(*self.args, **self.kwds)
            try:
                result = gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")
            self.gen = gen
            return result

        def __resume__(self):
            if self.gen is None:
                raise RuntimeError("context not suspended")
            # Restart the generator to re-establish the managed context
            gen = self.func(*self.args, **self.kwds)
            try:
                gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")
            self.gen = gen

        def __suspend__(self):
            # Run the current generator to completion so its cleanup code
            # reverts the managed context
            try:
                self.gen.next()
            except StopIteration:
                return
            else:
                raise RuntimeError("generator didn't stop")

        def __exit__(self, type, value, traceback):
            gen = self.gen
            self.gen = None
            if type is None:
                try:
                    gen.next()
                except StopIteration:
                    return
                else:
                    raise RuntimeError("generator didn't stop")
            else:
                try:
                    gen.throw(type, value, traceback)
                except (type, StopIteration):
                    return
                else:
                    raise RuntimeError("generator caught exception")

    def suspendable_context(func):
        def helper(*args, **kwds):
            return SuspendableGeneratorContext(func, args, kwds)
        return helper

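For example, a suspendable extra-precision context might look something like 
this (untested sketch; a standalone variant rather than the actual PEP 343 
decimal.Context.__with__ example):

    import decimal

    @suspendable_context
    def extra_precision(places=2):
        # Bump the precision of the current thread's context; a 'without'
        # block would run this generator to completion (restoring the saved
        # context) and then restart it on resumption
        ctx = decimal.getcontext()
        saved = ctx.copy()
        ctx.prec += places
        try:
            yield ctx
        finally:
            decimal.setcontext(saved)
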
-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://boredomandlaziness.blogspot.com


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Guido van Rossum
Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?

PEP 343 is still (back) on the drawing table, PEP 342 has barely been
implemented (did it survive the AST-branch merge?), and already you
are talking about adding more stuff. Please put on the brakes!

If there's anything this discussion shows me, it's that implicit
contexts are a dangerous concept, and should be treated with much
skepticism.

I would recommend that if you find yourself needing context data while
programming an asynchronous application using generator trampolines
simulating coroutines, you ought to refactor the app so that the
context is explicitly passed along rather than grabbed implicitly.
Zope doesn't *require* you to get the context from a thread-local, and
I presume that SQLObject also has a way to explicitly use a specific
connection (I'm assuming cursors and similar data structures have an
explicit reference to the connection used to create them). Heck, even
Decimal allows you to invoke every operation as a method on a
decimal.Context object!
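
For example (purely illustrative), nothing stops you from doing all your 
arithmetic through an explicit context:

    from decimal import Context, Decimal

    ctx = Context(prec=6)                                    # explicit context
    total = ctx.add(Decimal("1.23456789"), Decimal("2.3"))
    product = ctx.multiply(total, Decimal("3"))              # no global state touched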

I'd rather not tie implicit contexts to the with statement,
conceptually. Most uses of the with-statement are purely local (e.g.
"with open(fn) as f"), or don't apply to coroutines (e.g. "with
my_lock"). I'd say that "with redirect_stdout(f)" also doesn't apply
-- we already know it doesn't work in threaded applications, and that
restriction is easily and logically extended to coroutines.

If you're writing a trampoline for an app that needs to modify decimal
contexts, the decimal module already provides the APIs for explicitly
saving and restoring contexts.
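
For example, a trampoline step could do something like this (just a sketch; 
'resume_task' and the explicit task_context argument are made up for 
illustration):

    import decimal

    def resume_task(task, value, task_context):
        # Explicitly install the task's decimal context around each resumption
        saved = decimal.getcontext()
        decimal.setcontext(task_context)
        try:
            return task.send(value)
        finally:
            decimal.setcontext(saved)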

I know that somewhere in the proto-PEP Phillip argues that the context
API needs to be made a part of the standard library so that his
trampoline can efficiently swap implicit contexts required by
arbitrary standard and third-party library code. My response to that
is that library code (whether standard or third-party) should not
depend on implicit context unless it can assume complete control over
the application. (That rules out pretty much everything
except Zope, which is fine with me. :-)

Also, Nick wants the name 'context' for PEP-343 style context
managers. I think it's overloading too much to use the same word for
per-thread or per-coroutine context.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Phillip J. Eby
At 10:40 PM 10/20/2005 +1000, Nick Coghlan wrote:
Phillip J. Eby wrote:
  This is still rather rough, but I figured it's easier to let everybody fill
  in the remaining gaps by arguments than it is for me to pick a position I
  like and try to convince everybody else that it's right.  :)  Your feedback
  is requested and welcome.

I think you're actually highlighting a bigger issue with the behaviour of
yield inside a with block, and working around it rather than fixing the
fundamental problem.

The issue with yield causing changes to "leak" to outer scopes isn't limited
to coroutine-style usage - it can happen with generator-iterators, too.

What's missing is a general way of saying "suspend this context temporarily,
and resume it when done".

Actually, it's fairly simple to write a generator decorator using 
context.swap() that saves and restores the current execution state around 
next()/send()/throw() calls, if you prefer it to be the generator's 
responsibility to maintain such context.
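
Roughly along these lines (untested sketch, assuming the pre-PEP's 
context.snapshot()/context.swap() API, with swap() returning the previously 
active state; the wrapper names are just illustrative):

    def maintains_context(genfunc):
        # Wrap a generator so its own execution context is swapped in
        # around every next()/send()/throw() call
        def wrapper(*args, **kwds):
            return _ContextualGenerator(genfunc(*args, **kwds))
        return wrapper

    class _ContextualGenerator(object):
        def __init__(self, gen):
            self.gen = gen
            self.state = context.snapshot()
        def _call(self, method, *args):
            outer = context.swap(self.state)
            try:
                return method(*args)
            finally:
                self.state = context.swap(outer)
        def __iter__(self):
            return self
        def next(self):
            return self._call(self.gen.next)
        def send(self, value):
            return self._call(self.gen.send, value)
        def throw(self, *exc_info):
            return self._call(self.gen.throw, *exc_info)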



Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Jeremy Hylton
On 10/20/05, Guido van Rossum [EMAIL PROTECTED] wrote:
 Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?

 PEP 343 is still (back) on the drawing table, PEP 342 has barely been
 implemented (did it survive the AST-branch merge?), and already you
 are talking about adding more stuff. Please put on the brakes!

Yes.  PEP 342 survived the merge of the AST branch.  I wonder, though,
if the Grammar for it can be simplified at all.  I haven't read the
PEP closely, but I found the changes a little hard to follow.  That
is, why was the grammar changed the way it was -- or how would you
describe the intent of the changes?  It was hard when doing the
transformation in ast.c to be sure that the intent of the changes was
honored.  On the other hand, it seemed to have extensive tests and
they all pass.

Jeremy


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Guido van Rossum
On 10/20/05, Phillip J. Eby [EMAIL PROTECTED] wrote:
 At 04:04 PM 10/20/2005 -0400, Jeremy Hylton wrote:
 On 10/20/05, Guido van Rossum [EMAIL PROTECTED] wrote:
   Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?
  
   PEP 343 is still (back) on the drawing table, PEP 342 has barely been
   implemented (did it survive the AST-branch merge?), and already you
   are talking about adding more stuff. Please put on the brakes!
 
 Yes.  PEP 342 survived the merge of the AST branch.  I wonder, though,
 if the Grammar for it can be simplified at all.  I haven't read the
 PEP closely, but I found the changes a little hard to follow.  That
 is, why was the grammar changed the way it was -- or how would you
 describe the intent of the changes?

 The intent was to make it so that '(yield optional_expr)' always works, and
 also that '[lvalue =] yield optional_expr' works.  If you can find another
 way to hack the grammar so that both of 'em work, it's certainly okay by
 me.  The changes I made were just the simplest things I could figure out
 to do.

Right.

 I seem to recall that the hard part was the need for 'yield expr,expr' to
 be interpreted as '(yield expr,expr)', not '(yield expr),expr', for
 backward compatibility reasons.

But only at the statement level.

These should be errors IMO:

  foo(yield expr, expr)
  foo(expr, yield expr)
  foo(1 + yield expr)
  x = yield expr, expr
  x = expr, yield expr
  x = 1 + yield expr
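
(Presumably the fully parenthesized spellings, e.g.:

  foo((yield expr), expr)
  x = (yield expr), expr
  x = 1 + (yield expr)

would remain fine, since there's no ambiguity left there.)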

--
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Guido van Rossum
On 10/20/05, Phillip J. Eby [EMAIL PROTECTED] wrote:
 At 08:57 AM 10/20/2005 -0700, Guido van Rossum wrote:
 Whoa, folks! Can I ask the gentlemen to curb their enthusiasm?
 
 PEP 343 is still (back) on the drawing table, PEP 342 has barely been
 implemented (did it survive the AST-branch merge?), and already you
 are talking about adding more stuff. Please put on the brakes!

 Sorry.  I thought that 343 was just getting a minor tune-up.

Maybe, but the issues on the table are naming issues -- is __with__
the right name, or should it be __context__? Should the decorator be
applied implicitly? Should the decorator be called @context or
@contextmanager?

 In the months
 since the discussion and approval (and implementation; Michael Hudson
 actually had a PEP 343 patch out there),

Which he described previously as "a hack" and apparently didn't feel
comfortable checking in. At least some of it will have to be redone,
(a) for the AST code, and (b) for the revised PEP.

 I've been doing a lot of thinking
 about how they will be used in applications, and thought that it would be a
 good idea to promote people using task-specific variables in place of
 globals or thread-locals.

That's clear, yes. :-)

I still find it unlikely that a lot of people will be using trampoline
frameworks. You and Twisted, that's all I expect.

 The conventional wisdom is that global variables are bad, but the truth is
 that they're very attractive because they allow you to have one less thing
 to pass around and think about in every line of code.

Which doesn't make them less bad -- they're still there and perhaps
more likely to trip you up when you least expect it. I think there's a
lot of truth in that conventional wisdom.

 Without globals, you
 would sooner or later end up with every function taking twenty arguments to
 pass through states down to other code, or else trying to cram all this
 data into some kind of context object, which then won't work with code
 that doesn't know about *your* definition of what a context is.

Methinks you are exaggerating for effect.

 Globals are thus extremely attractive for practical software
 development.  If they weren't so useful, it wouldn't be necessary to warn
 people not to use them, after all.  :)

 The problem with globals, however, is that sometimes they need to be
 changed in a particular context.  PEP 343 makes it safer to use globals
 because you can always offer a context manager that changes them
 temporarily, without having to hand-write a try-finally block.  This will
 make it even *more* attractive to use globals, which is not a problem as
 long as the code has no multitasking of any sort.

Hm. There are different kinds of globals. Most globals don't need to
be context-managed at all, because they can safely be shared between
threads, tasks or coroutines. Caches usually fall in this category
(e.g. the compiled regex cache). A little locking is all it takes.
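
Something like this is all such a cache needs (illustrative only -- not how
the actual regex cache is implemented):

    import re
    import threading

    _cache = {}
    _cache_lock = threading.Lock()

    def cached_compile(pattern, flags=0):
        # Shared by all threads, tasks and coroutines alike; the lock is
        # only held long enough to look up or store an entry
        key = (pattern, flags)
        _cache_lock.acquire()
        try:
            try:
                return _cache[key]
            except KeyError:
                result = _cache[key] = re.compile(pattern, flags)
                return result
        finally:
            _cache_lock.release()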

The globals that need to be context-managed are the pernicious kind of
which you can never have too few. :-)

They aren't just accumulating global state, they are implicit
parameters, thereby truly invoking the reasons why globals are frowned
upon.

 Of course, the multithreading scenario is usually fixed by using
 thread-locals.  All I'm proposing is that we replace thread locals with
 task locals, and promote the use of task-local variables for managed
 contexts (such as the decimal context) *that would otherwise be a global or
 a thread-local variable*.  This doesn't seem to me like a very big deal;
 just an encouragement for people to make their stuff easy to use with PEP
 342 and 343.

I'm all for encouraging people to make their stuff easy to use with
these PEPs, and with multi-threading use.

But IMO the best way to accomplish those goals is to refrain from
global (or thread-local or task-local) context as much as possible,
for example by passing along explicit context.

The mere existence of a standard library module to make handling
task-specific contexts easier sends the wrong signal; it suggests that
it's a good pattern to use, which it isn't -- it's a last-resort
pattern, when all other solutions fail.

If it weren't for Python's operator overloading, the decimal module
would have used explicit contexts (like the Java version); but since
it would be really strange to have such a fundamental numeric type
without the ability to use the conventional operator notation, we
resorted to per-thread context. Even that doesn't always do the right
thing -- handling decimal contexts is surprisingly subtle (as Nick can
testify based on his experiences attempting to write a decimal context
manager for the with-statement!).

Yes, coroutines make it even subtler.

But I haven't seen the use case yet for mixing coroutines with changes
to decimal context settings; somehow it doesn't strike me as a likely
use case (not that you can't construct one, so don't bother -- I can
imagine it too, I just think YAGNI).

 By the way, I don't 

Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-20 Thread Phillip J. Eby
At 07:57 PM 10/20/2005 -0700, Guido van Rossum wrote:
(Sorry for the long post -- there just wasn't anything you said that I
felt could be left unquoted. :-)

Wow.  You've brought up an awful lot of stuff I want to respond to, about 
the nature of frameworks, AOP, Chandler, PEP 342, software deployment, 
etc.  But I know you're busy, and the draft I was working on in reply to 
this has gotten simply huge and still unfinished, so I think I should just 
turn it all into a blog article on "Why Frameworks Are Evil And What We Can 
Do To Stop Them".  :)

I don't think I've exaggerated anything, though.  I think maybe you're 
perceiving more vehemence than I actually have on the issue.  Context 
variables are a very small thing and I've not been arguing that they're a 
big one.  In the scope of the coming Global War On Frameworks, they are 
pretty small potatoes.  :)



Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-19 Thread Josiah Carlson

Phillip J. Eby [EMAIL PROTECTED] wrote:
 For efficiency's sake, however, CPython could simply store the
 execution context dictionary in its thread state structure, creating
 an empty dictionary at thread initialization time.  This would make it
 somewhat easier to offer a C API for access to context variables,
 especially where efficiency of access is desirable.  But the proposal
 does not depend on this.

What about a situation in which coroutines are handled by multiple
threads?  Any time a coroutine is passed from one thread to another, it
would lose its state.

While I agree with the obvious "don't do that" response, I don't believe
that the proposal will actually go very far in preventing real problems
when using context managers and generators or coroutines.  Why?  How much
task state is going to be monitored/saved?  Just sys?  Perhaps sys and
the module in which a coroutine was defined?  Eventually you will have
someone who says, "I need Python to be saving and restoring the state of
the entire interpreter so that I can have a per-user execution
environment that cannot be corrupted by another user."  But how much
farther out is that?

 - Josiah



Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-19 Thread Phillip J. Eby
At 07:30 PM 10/19/2005 -0700, Josiah Carlson wrote:
What about a situation in which coroutines are handled by multiple
threads?  Any time a coroutine is passed from one thread to another, it
would lose its state.

It's the responsibility of a coroutine scheduler to take a snapshot() when 
a task is suspended, and to swap() it in when resumed.  So it doesn't 
matter that you've changed what thread you're running in, as long as you 
keep the context with the coroutine that owns it.
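
In other words, something like this sketch (again assuming the proposed 
snapshot()/swap() calls; the scheduler itself is made up):

    def run_all(tasks):
        # tasks: list of (coroutine, saved_state) pairs, where saved_state
        # is whatever context.snapshot() returned when the task was created
        while tasks:
            coroutine, saved = tasks.pop(0)
            previous = context.swap(saved)       # install the task's context
            finished = False
            try:
                try:
                    coroutine.next()             # resume the task
                except StopIteration:
                    finished = True              # task is done
            finally:
                saved = context.swap(previous)   # put the scheduler's context back
            if not finished:
                tasks.append((coroutine, saved))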


While I agree with the obvious "don't do that" response, I don't believe
that the proposal will actually go very far in preventing real problems
when using context managers and generators or coroutines.  Why?  How much
task state is going to be monitored/saved?  Just sys?  Perhaps sys and
the module in which a coroutine was defined?

As I mentioned in the PEP, I don't think that we would bother having 
Python-defined variables be context-specific until Python 3.0.  This is 
mainly intended for the kinds of things described in the proposal: ZODB 
current transaction, current database connection, decimal context, 
etc.  Basically, anything that you'd have a thread-local for now, and 
indeed most anything that you'd use a global variable and 'with:' for.
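
I.e., the kind of state that today gets managed with the usual thread-local 
recipe, something like this sketch (not decimal's actual code; Context() 
stands in for whatever per-task object is being managed):

    import threading

    _local = threading.local()

    def getcontext():
        # One context per thread; a coroutine scheduler has no way to
        # swap this per-task, which is the gap the proposal fills
        try:
            return _local.context
        except AttributeError:
            _local.context = Context()
            return _local.context

    def setcontext(ctx):
        _local.context = ctx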

Eventually you will have someone who says, "I need Python to be saving
and restoring the state of the entire interpreter so that I can have a
per-user execution environment that cannot be corrupted by another
user."  But how much farther out is that?

I don't see how that's even related.  This is simply a replacement for 
thread-local variables that allows you to also be compatible with 
lightweight (coroutine-based) threads.



Re: [Python-Dev] Pre-PEP: Task-local variables

2005-10-19 Thread Josiah Carlson

Phillip J. Eby [EMAIL PROTECTED] wrote:
 It's the responsibility of a coroutine scheduler to take a snapshot() when 
 a task is suspended, and to swap() it in when resumed.  So it doesn't 
 matter that you've changed what thread you're running in, as long as you 
 keep the context with the coroutine that owns it.
 
 As I mentioned in the PEP, I don't think that we would bother having 
 Python-defined variables be context-specific until Python 3.0.  This is 
 mainly intended for the kinds of things described in the proposal: ZODB 
 current transaction, current database connection, decimal context, 
 etc.  Basically, anything that you'd have a thread-local for now, and 
 indeed most anything that you'd use a global variable and 'with:' for.
 
 I don't see how that's even related.  This is simply a replacement for 
 thread-local variables that allows you to also be compatible with 
 lightweight (coroutine-based) threads.

I just re-read the proposal with your clarifications in mind.  Looks
good.  +1

 - Josiah
