Re: [Python-Dev] PEP 548: More Flexible Loop Control

2017-09-07 Thread Guido van Rossum
No worries. We all learned stuff!

On Wed, Sep 6, 2017 at 4:22 PM, R. David Murray 
wrote:

> On Wed, 06 Sep 2017 09:43:53 -0700, Guido van Rossum 
> wrote:
> > I'm actually not in favor of this. It's another way to do the same thing.
> > Sorry to rain on your dream!
>
> So it goes :)  I learned things by going through the process, so it
> wasn't wasted time for me even if (or because) I made several mistakes.
> Sorry for wasting anyone else's time :(
>
> --David



-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] HTTPS on bugs.python.org

2017-09-07 Thread INADA Naoki
Fixed.  Thanks to infra team.
http://psf.upfronthosting.co.za/roundup/meta/issue638

INADA Naoki  


On Fri, Sep 1, 2017 at 9:57 PM, Victor Stinner  wrote:
> Hi,
>
> When I go to http://bugs.python.org/ Firefox warns me that the form on
> the left to login (user, password) sends data in clear text (HTTP).
>
> Ok, I switch manually to HTTPS: add "s" in "http://" of the URL.
>
> I log in.
>
> I go to an issue using HTTPS like https://bugs.python.org/issue31250
>
> I modify an issue using the form and click on [Submit Changes] (or
> just press Enter): I'm back to HTTP. Truncated URL:
>
> http://bugs.python.org/issue31250?@ok_message=msg%20301099%20created%...
>
> Hum, again I switch manually to HTTPS by modifying the URL:
>
> https://bugs.python.org/issue31250?@ok_message=msg%20301099%20created%...
>
> I click on the "clear this message" link: oops, I'm back to the HTTP world...
>
> http://bugs.python.org/issue31250
>
> So, would it be possible to enforce HTTPS on the bug tracker?
>
> The best would be to always generate HTTPS urls and *maybe* redirect
> HTTP to HTTPS.
>
> Sorry, I don't know what are the best practices. For example, should
> we use HTTPS only cookies?
>
> Victor


Re: [Python-Dev] PEP 553 V2 - builtin breakpoint() (was Re: PEP 553: Built-in debug())

2017-09-07 Thread Adrian Petrescu
Would that not be a security concern, if you can get Python to execute
arbitrary code just by setting an environment variable?

On Thu, Sep 7, 2017 at 10:47 PM, Barry Warsaw  wrote:

> On Sep 7, 2017, at 19:34, Nick Coghlan  wrote:
>
> > Now that you put it that way, it occurs to me that CI environments
> > could set "PYTHONBREAKPOINTHOOK=sys:exit" to make breakpoint() an
> > immediate failure rather than halting the CI run waiting for input
> > that will never arrive.
>
> You better watch out Nick.  You’re starting to sway me on adding the
> environment variable.
>
> -Barry
>
>


Re: [Python-Dev] PEP 553 V2 - builtin breakpoint() (was Re: PEP 553: Built-in debug())

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 19:34, Nick Coghlan  wrote:

> Now that you put it that way, it occurs to me that CI environments
> could set "PYTHONBREAKPOINTHOOK=sys:exit" to make breakpoint() an
> immediate failure rather than halting the CI run waiting for input
> that will never arrive.

You better watch out Nick.  You’re starting to sway me on adding the 
environment variable.

-Barry





Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 18:03, Fernando Perez  wrote:

> Ah, perfect! I've subscribed to the PR on github and can pitch in there 
> further if my input is of any use.

Awesome, thanks!
-Barry





Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 16:19, Terry Reedy  wrote:

> I think breakpoint() should have a db= parameter so one can select a debugger 
> in one removable line.  The sys interface is more useful for IDEs to change 
> the default, possible with other args (like breakpoints and colors) bound to 
> the callable.

I’m skeptical about that.  I think any particular user is going to 
overwhelmingly use the same debugger, so having to repeat themselves every time 
they want to enter the debugger is going to get tedious fast.  I know it would 
annoy *me* if I had to tell it to use pdb every time I wrote `breakpoint()`, 
and I’m almost never going to use anything else.

I’m also not sure what useful semantics for `db` would be.  E.g. what 
specifically would you set `db` to in order to invoke idle or tkdb (‘gdb’ would 
be an unfortunate name I think, given the popular existing GNU Debugger ;).  I 
don’t even know what useful thing I’d set `db` to to mean “invoke pdb”.  Please 
don’t say “well, pdb will still be the default so you don’t have to set it to 
anything” because that does kind of miss the point.

I also want to keep breakpoint() as generic as possible.  I think doing so 
allows it to be a nice API into whatever interesting and sophisticated 
implementations folks can think up underneath it.  So *if* you came up with a 
cool thing that interpreted `db` in some special way, there should be a very 
low barrier to providing that to your users, e.g.:

* pip install myfancymetadebugger
* put a snippet which sets sys.breakpointhook in your $PYTHONSTARTUP file
* profit!

That second item could be replaced with

export PYTHONBREAKPOINTHOOK=myfancymetadebugger.invoke

(hmm, does that mean the envar is getting more interesting?)
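For concreteness, either route ends up binding the same thing; the $PYTHONSTARTUP variant might look roughly like this (myfancymetadebugger and its invoke() callable are of course made up):

    # ~/.pythonstartup -- illustrative sketch only
    import sys

    try:
        import myfancymetadebugger   # hypothetical third-party debugger
    except ImportError:
        pass  # keep the default hook (pdb.set_trace)
    else:
        # breakpoint() forwards its args/kwargs to whatever is bound here
        sys.breakpointhook = myfancymetadebugger.invoke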

> Breakpoint() should pass on other args.

I strongly believe it should pass through *all* args.

> A somewhat separate point: the name breakpoint() is slightly misleading, 
> which has consequences if it is (improperly) called more than once. While 
> breakpoint() acts as a breakpoint, what it does (at least in the default pdb 
> case) is *initialize* and start a *new* debugger, possibly after an import.  
> Re-importing a module is no big deal.  Replacing an existing debugger with a 
> *new* one, and tossing away all defined aliases and breakpoints and Bdb's 
> internal caches, is.  It is likely not what most people would want or expect. 
>  I think it more likely that people will call breakpoint() multiple times 
> than they would, for instance, call pdb.set_trace() multiple times.

Multiple calls to pdb.set_trace() is fairly common in practice today, so I’m 
not terribly concerned about it.  There’s nothing fundamentally different with 
multiple calls to breakpoint() today.  If we care, we can provide a more 
efficient/different API and make that the default.  The machinery in PEP 553 
can easily support that, but doing it is outside the scope of the PEP.

> With a gui debugger, having one window go and another appear might be 
> additionally annoying.  If the first window is not immediately GCed, having 
> two windows would be confusing.  Perhaps breakpoint() could be made a no-op 
> after the first call.

Your sys.breakpointhook could easily implement that, with a much better user 
experience than what built-in breakpoint() could do anyway.
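For example, a one-shot hook is only a few lines on top of the PEP's machinery (a sketch, using sys.__breakpointhook__ as the saved default):

    import sys

    def breakpoint_once(*args, **kwargs):
        # Delegate to the original hook the first time, then become a no-op.
        if not getattr(breakpoint_once, "fired", False):
            breakpoint_once.fired = True
            return sys.__breakpointhook__(*args, **kwargs)

    sys.breakpointhook = breakpoint_once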

Cheers,
-Barry





Re: [Python-Dev] PEP 553 V2 - builtin breakpoint() (was Re: PEP 553: Built-in debug())

2017-09-07 Thread Nick Coghlan
On 7 September 2017 at 19:17, Barry Warsaw  wrote:
> On Sep 7, 2017, at 18:12, Nick Coghlan  wrote:
>>
>> Related to this is the suggestion that we make the default
>> sys.breakpointhook() a no-op, so that accidentally checking in calls
>> to breakpoint() won't hang CI systems.
>>
>> Then folks that wanted to use the functionality would set
>> "PYTHONBREAKPOINTHOOK=pdb:set_trace"
>
> I’d rather do it the other way ‘round because I want it to Just Work for the 
> average developer, and maintainers of CI or production systems should be able 
> to fairly easily tweak their environments to noop breakpoint().  Although 
> maybe we want a shortcut for that, e.g. PYTHONBREAKPOINTHOOK=0 or some such.

Now that you put it that way, it occurs to me that CI environments
could set "PYTHONBREAKPOINTHOOK=sys:exit" to make breakpoint() an
immediate failure rather than halting the CI run waiting for input
that will never arrive.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 553 V2 - builtin breakpoint() (was Re: PEP 553: Built-in debug())

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 18:12, Nick Coghlan  wrote:
> 
> Related to this is the suggestion that we make the default
> sys.breakpointhook() a no-op, so that accidentally checking in calls
> to breakpoint() won't hang CI systems.
> 
> Then folks that wanted to use the functionality would set
> "PYTHONBREAKPOINTHOOK=pdb:set_trace"

I’d rather do it the other way ‘round because I want it to Just Work for the 
average developer, and maintainers of CI or production systems should be able 
to fairly easily tweak their environments to noop breakpoint().  Although maybe 
we want a shortcut for that, e.g. PYTHONBREAKPOINTHOOK=0 or some such.

(Note, I’m still not sure it’s worth supporting the environment variable, but I 
am interested in collecting the feedback on it.)

-Barry





Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 16:58, Gregory P. Smith  wrote:

> Input from OS package distributors would be interesting.  Would they use this?

I suspect it won’t be that interesting to the Debian ecosystem, since we 
generate pyc files on package install.  We do that because we can support 
multiple versions of Python installed simultaneously and we don’t know which 
versions are installed on the target machine.  I suppose our stdlib package 
could ship pycs, but we don’t.

Reproducible builds may still be interesting in other situations though, such 
as CI machines, but then SOURCE_DATE_EPOCH is probably good enough.

-Barry






Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 16:58, Gregory P. Smith wrote:
> +1 on this PEP.

Thanks!

> Questions:
> 
> Input from OS package distributors would be interesting.  Would they use
> this?  Which way would it impact their startup time (loading the .py file
> vs just statting it.  does that even matter?  source files are often
> eventually loaded for linecache use in tracebacks anyways)?

I anticipate distributors will use the mode where the pyc is simply
trusted and the source file isn't hashed. That would make the I/O
overhead identical to what it is today.

> 
> Would they benefit from a pyc that can contain _both_ timestamp+length,
> and
> the source_hash?  if both were present, I assume that only one would be
> checked at startup.  i'm not sure what would make the decision of what to
> check.  one fails, check the other?  i personally do not have a use for
> this case so i'd omit the complexity without a demonstrated need.

Yeah, it could act as a multi-tiered cache key. I agree with your
conclusion to pass for now.

> 
> Something to also state in the PEP:
> 
> This is intentionally not a "secure" hash.  Security is explicitly a
> non-goal.

Added a sentence.


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Nick Coghlan
On 7 September 2017 at 16:58, Gregory P. Smith  wrote:
> +1 on this PEP.
>
> The TL;DR summary of this PEP:
>   The pyc date+length metadata check was a convenient hack.  It still works
> well for many people and use cases, it isn't going away.
>   PEP 552 proposes a new alternate hack that relies on file contents instead
> of os and filesystem date metadata.
> Assumption: The hash function is significantly faster than re-parsing
> the source.  (guaranteed to be true)
>
> Questions:
>
> Input from OS package distributors would be interesting.  Would they use
> this?  Which way would it impact their startup time (loading the .py file vs
> just statting it.  does that even matter?  source files are often eventually
> loaded for linecache use in tracebacks anyways)?

Christian and I asked some of our security folks for their personal
wishlists recently, and one of the items that came up was "The
recompile is based on a timestamp. How do you know the pyc file on
disk really is related to the py file that is human readable? Can it
be based on a hash or something like that?"

This is a restating of the reproducible build use case: for a given
version of Python, a given source file should always give the same
source hash and marshaled code object, and once it does, it's easier
to do an independent compilation from the source file and check you
get the same answer.

While you can implement that for timestamp based formats by adjusting
input file metadata (and that's exactly what distros do with
SOURCE_DATE_EPOCH), it's still pretty annoying, and not particularly
build cache friendly, since the same file in different source
artifacts may produce different build outputs.

> Would they benefit from a pyc that can contain _both_ timestamp+length, and
> the source_hash?  if both were present, I assume that only one would be
> checked at startup.  i'm not sure what would make the decision of what to
> check.  one fails, check the other?  i personally do not have a use for this
> case so i'd omit the complexity without a demonstrated need.

I don't see any way we'd benefit from having both items present.

However, I do wonder whether we could encode *all* the mode settings
into the magic number, such that we did something like reserving the
top 3 bits for format flags:

* number & 0x1FFF -> the traditional magic number
* number & 0x8000 -> timestamp or hash?
* number & 0x4000 -> checked or not?
* number & 0x2000 -> reserved for future format changes

By default we'd still produce the checked-timestamp format, but
managed build systems (including Linux distros) could opt-in to the
unchecked-hash format.
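For illustration, a loader could peel those bits apart with something like the following (a sketch of the layout suggested above; none of these masks or names exist anywhere yet):

    MAGIC_MASK      = 0x1FFF  # the traditional magic number
    FLAG_HASH_BASED = 0x8000  # timestamp-based (0) or hash-based (1)?
    FLAG_UNCHECKED  = 0x4000  # checked against the source (0) or trusted (1)?
    FLAG_RESERVED   = 0x2000  # reserved for future format changes

    def decode_magic(number):
        return {
            "magic": number & MAGIC_MASK,
            "hash_based": bool(number & FLAG_HASH_BASED),
            "unchecked": bool(number & FLAG_UNCHECKED),
        }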

> Something to also state in the PEP:
>
> This is intentionally not a "secure" hash.  Security is explicitly a
> non-goal.

I don't think it's so much that security is a non-goal, as that the
(admittedly minor) security improvement comes from making it easier to
reproduce the expected machine-readable output from a given
human-readable input, rather than from the nature of the hashing
function used.

> Rationale behind my support:

+1 from me as well, for the reasons Greg gives (while Fedora doesn't
currently do any per-file build artifact caching, I hope we will in
the future, and output formats based on input artifact hashes will
make that much easier than formats based on input timestamps).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 553 V2 - builtin breakpoint() (was Re: PEP 553: Built-in debug())

2017-09-07 Thread Nick Coghlan
On 7 September 2017 at 11:43, Barry Warsaw  wrote:
> Environment variable
> 
>
> Should we add an environment variable so that ``sys.breakpointhook()``
> can be
> set outside of the Python invocation?  E.g.::
>
> $ export PYTHONBREAKPOINTHOOK=my.debugger:Debugger
>
> This would provide execution environments such as IDEs which run Python code
> inside them, to set an internal breakpoint hook before any Python code
> executes.

Related to this is the suggestion that we make the default
sys.breakpointhook() a no-op, so that accidentally checking in calls
to breakpoint() won't hang CI systems.

Then folks that wanted to use the functionality would set
"PYTHONBREAKPOINTHOOK=pdb:set_trace"

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Fernando Perez

On 2017-09-07 23:00:43, Barry Warsaw said:


> On Sep 7, 2017, at 14:25, Barry Warsaw  wrote:
>
>> I’ll see what it takes to add `header` to pdb.set_trace(), but I’ll do
>> that as a separate PR (i.e. not as part of this PEP).
>
> Turns out to be pretty easy.
>
> https://bugs.python.org/issue31389
> https://github.com/python/cpython/pull/3438


Ah, perfect! I've subscribed to the PR on github and can pitch in there 
further if my input is of any use.


Thanks again,

f




Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Glenn Linderman

On 9/7/2017 4:19 PM, Terry Reedy wrote:
> A somewhat separate point: the name breakpoint() is slightly
> misleading, which has consequences if it is (improperly) called more
> than once. While breakpoint() acts as a breakpoint, what it does (at
> least in the default pdb case) is *initialize* and start a *new*
> debugger, possibly after an import.


So maybe the original debug() call should be renamed debugger() [but 
with the extra optional parameters discussed], and an additional 
breakpoint() call could be added that would be much more like hitting a 
breakpoint in the already initialized debugger.


There seem to be two directions to go here: If breakpoint is called 
without a prior call to debugger, it should (1) call debugger implicitly 
with defaults, or (2) be a noop.


I prefer the latter, but either is workable.  Or, one could add all 
those parameters to every breakpoint() call, which makes (1) more 
workable, but makes the code more wordy.


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Nick Coghlan
On 7 September 2017 at 07:06, Ethan Furman  wrote:
> The concern is *how* PEP 550 provides it:
>
> - explicitly, like threading.local(): has to be set up manually,
>   preferably with a context manager
>
> - implicitly: it just happens under certain conditions

A recurring point of confusion with the threading.local() analogy
seems to be that there are actually *two* pieces to that analogy:

* threading.local() <-> contextvars.ContextVar
* PyThreadState_GetDict() <-> LogicalContext

(See 
https://github.com/python/cpython/blob/a6a4dc816d68df04a7d592e0b6af8c7ecc4d4344/Python/pystate.c#L584
for the definition of PyThreadState_GetDict)

For most practical purposes as a *user* of thread locals, the
involvement of PyThreadState and the state dict is a completely hidden
implementation detail. However, every time you create a new thread,
you're implicitly getting a new Python thread state, and hence a new
thread state dict, and hence a new set of thread local values.

Similarly, as a *user* of context variables, you'll generally be able
to ignore the manipulation of the execution context going on behind
the scenes - you'll just get, set, and delete individual context
variables without worrying too much about exactly where and how
they're stored.

PEP 550 itself doesn't have that luxury, though, since in addition to
defining how users will access and update these values, it *also*
needs to define how the interpreter will implicitly manage the
execution context for threads and generators and how event loops
(including asyncio as the reference implementation) are going to be
expected to manage the execution context explicitly when scheduling
coroutines.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Gregory P. Smith
+1 on this PEP.

The TL;DR summary of this PEP:
  The pyc date+length metadata check was a convenient hack.  It still works
well for many people and use cases, it isn't going away.
  PEP 552 proposes a new alternate hack that relies on file contents
instead of os and filesystem date metadata.
Assumption: The hash function is significantly faster than re-parsing
the source.  (guaranteed to be true)
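In rough pseudo-Python, the two checks compare like this (hashlib stands in for whatever hash function the PEP actually specifies, and the pyc-header fields are assumed to have been read already):

    import hashlib
    import os

    def timestamp_pyc_ok(py_path, recorded_mtime, recorded_size):
        # Today's check: compare stat() metadata against the pyc header.
        st = os.stat(py_path)
        return int(st.st_mtime) == recorded_mtime and st.st_size == recorded_size

    def hash_pyc_ok(py_path, recorded_hash):
        # PEP 552 "checked hash" mode: hash the source contents instead, so
        # the answer no longer depends on filesystem timestamps.  (The PEP
        # specifies a keyed SipHash, not SHA-256; this is just the shape.)
        with open(py_path, "rb") as f:
            return hashlib.sha256(f.read()).digest() == recorded_hash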

Questions:

Input from OS package distributors would be interesting.  Would they use
this?  Which way would it impact their startup time (loading the .py file
vs just statting it.  does that even matter?  source files are often
eventually loaded for linecache use in tracebacks anyways)?

Would they benefit from a pyc that can contain _both_ timestamp+length, and
the source_hash?  if both were present, I assume that only one would be
checked at startup.  i'm not sure what would make the decision of what to
check.  one fails, check the other?  i personally do not have a use for
this case so i'd omit the complexity without a demonstrated need.

Something to also state in the PEP:

This is intentionally not a "secure" hash.  Security is explicitly a
non-goal.

Rationale behind my support:

We use a superset of Bazel at Google (unsurprising) and have had to jump
through a lot of messy hoops to deal with timestamp metadata winding up in
output files vs deterministic builds.  What Benjamin describes here sounds
exactly like what we would want.

It allows deterministic builds in distributed build and cached operation
systems where timestamps are never going to be guaranteed.

It allows the check to work on filesystems which do not preserve timestamps.

Also importantly, it allows the check to be disabled via the check_source
bit.  Today we use a modified importer at work that skips checking
timestamps anyway, because the way we ship applications guarantees at
build time that the entire set of dependencies present is correct, and
modification at runtime is either not possible or not a concern.  This
PEP would avoid the need for an extra importer or modified interpreter
logic to make this happen.

-G

On Thu, Sep 7, 2017 at 3:47 PM Benjamin Peterson 
wrote:

>
>
> On Thu, Sep 7, 2017, at 14:43, Guido van Rossum wrote:
> > On Thu, Sep 7, 2017 at 2:40 PM, Benjamin Peterson 
> > wrote:
> >
> > >
> > >
> > > On Thu, Sep 7, 2017, at 14:19, Guido van Rossum wrote:
> > > > Nice one.
> > > >
> > > > It would be nice to specify the various APIs needed as well.
> > >
> > > The compileall and py_compile ones?
> > >
> >
> > Yes, and the SipHash mod to specify the key you mentioned.
>
> Done.
>
> >
> > >
> > > > Why do you keep the mtime-based format as an option? (Maybe because it's
> > > > faster? Did you measure it?)
> > >
> > > I haven't actually measured anything, but statting a file will definitely
> > > be faster than reading it completely and hashing it. I suppose if the
> > > speed difference between timestamp-based and hash-based pycs turned out
> > > to be small we could feel good about dropping the timestamp format
> > > completely. However, that difference might be hard to determine
> > > definitely as I expect the speed hit will vary widely based on system
> > > parameters such as disk speed and page cache size.
> > >
> > > My goal in this PEP was to preserve the current pyc invalidation
> > > behavior, which works well today for many use cases, as the default. The
> > > hash-based pycs are reserved for distribution and other power use cases.
> > >
> >
> > OK, maybe you can clarify that a bit in the PEP.
>
> I've added a paragraph to the Rationale section.


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Elvis Pranskevichus wrote:
> By default, generators reference an empty LogicalContext object that is
> allocated once (like the None object).  We can do that because LCs are
> immutable.


Ah, I see. That wasn't clear from the implementation, where

gen.__logical_context__ = contextvars.LogicalContext()

looks like it's creating a new one.

However, there's another thing: it looks like every
time a generator is resumed/suspended, an execution
context node is created/discarded.

--
Greg


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Terry Reedy

On 9/7/2017 12:52 PM, Barry Warsaw wrote:

> On Sep 7, 2017, at 07:46, Guido van Rossum  wrote:
>
>> Without following all this (or the PEP, yet) super exactly, does this mean you
>> are satisfied with what the PEP currently proposes or would you like changes?
>> It's a little unclear from what you wrote.
>
> I’m also unsure whether Terry is good with the existing PEP or suggesting
> changes.


It seems to me that simplifying 'import pdb; pdb.set_trace()' to
'breakpoint()' is not, in itself, justification for a new builtin.
Rather, the justification is the possibility of invoking dbs other than 
pdb.  So I have been examining the feasibility of doing that, with 
IDLE's Idb + rpc + Debugger widget as a test case and example.  My 
conclusion: possible, but not as graceful as I would like.


In response to Barry's idea of PYTHONBREAKPOINTHOOK (default 
pdb.set_trace, and is 'HOOK' really needed), I examined further how 
IDLE's debugger is different and how the differences cause problems. 
Then a thought occurred to me: how about rewriting it as a pure tkinter 
GUI debugger, much more like pdb.  Call the module either gdb or tkdb. 
PYTHONBREAKPOINT=[package.].gdb.set_trace would then make more sense.


As with pdb, the UI would run in the same process as the debugger and 
user code.  No rpc setup needed.  No interference between the debug gui 
and the rest of IDLE (as sometimes happens now).  No need to pickle 
objects to display the bindings of global and local names. The reasons 
for separating user code from IDLE mostly do not apply to the debugger.


I would make it a single window with 2 panes, much like turtledemo, but 
with the turtle screen replaced with the GUI display.  (Try IDLE to see 
what that would look like now, or pull PR 2494 first to see it with one 
proposed patch.) Gdb should not be limited to running from IDLE but 
could be launched even when running code from the console.


After an initial version, the entry point could accept a list of 
breakpoint lines and a color mapping for syntax coloring.


I think breakpoint() should have a db= parameter so one can select a 
debugger in one removable line.  The sys interface is more useful for 
IDEs to change the default, possible with other args (like breakpoints 
and colors) bound to the callable.


Breakpoint() should pass on other args. In particular, people who invoke 
gdb from within a tkinter program should be able to pass in the root 
widget, to be the master of the debug window.  This has to be done from 
within the user program, and the addition of breakpoint allows that, and 
makes running the gui with user code more feasible.

---

A somewhat separate point: the name breakpoint() is slightly misleading, 
which has consequences if it is (improperly) called more than once. 
While breakpoint() acts as a breakpoint, what it does (at least in the 
default pdb case) is *initialize* and start a *new* debugger, possibly 
after an import.  Re-importing a module is no big deal.  Replacing an 
existing debugger with a *new* one, and tossing away all defined aliases 
and breakpoints and Bdb's internal caches, is.  It is likely not what 
most people would want or expect.  I think it more likely that people 
will call breakpoint() multiple times than they would, for instance, 
call pdb.set_trace() multiple times.


With a gui debugger, having one window go and another appear might be 
additionally annoying.  If the first window is not immediately GCed, 
having two windows would be confusing.  Perhaps breakpoint() could be 
made a no-op after the first call.



>> I think that's true for any IDE that has existing integrated debug
>> capabilities. However for every IDE I would hope that calling breakpoint() will
>> *also* break into the IDE's debugger. My use case is that sometimes I have a
>> need for a *conditional* breakpoint where it's just much simpler to decide
>> whether to break or not in Python code than it would be to use the IDE's
>> conditional breakpoint facilities.
>
> That certainly aligns with my own experience and expectations.  I guess I’m a
> fellow old dog. :)


--
Terry Jan Reedy




Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 14:25, Barry Warsaw  wrote:
> 
> I’ll see what it takes to add `header` to pdb.set_trace(), but I’ll do that 
> as a separate PR (i.e. not as part of this PEP).

Turns out to be pretty easy.

https://bugs.python.org/issue31389
https://github.com/python/cpython/pull/3438
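Assuming it lands as a keyword-only argument the way the issue describes, usage through the new built-in would be as simple as:

    # illustrative only; relies on PEP 553's pass-through plus the new
    # `header` argument to pdb.set_trace()
    for attempt in range(3):
        if attempt == 2:
            breakpoint(header="third attempt, something is off")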

Cheers,
-Barry





Re: [Python-Dev] PEP 549: Instance Properties (aka: module properties)

2017-09-07 Thread Larry Hastings



On 09/07/2017 03:49 PM, Larry Hastings wrote:
> Excluding Lib/test, there are 375 uses of "@property" in the stdlib in
> trunk, 60 uses of __getattr__, and 34 of __getattribute__.


I spent a minute looking at the output and realized there were a bunch 
of hits inside pydoc_data/topics.py, aka documentation.  And then I 
realized I was only interested in definitions, not docs or calls or 
other hackery.


Removing pydoc_data/topics.py and changing the search terms results in:

   "@property" 375 hits
   "def __getattr__" 28 hits
   "def __getattribute__(" 2 hits

Unless the average use of __getattr__ serves in excess of 13 members
each, property is the most popular technique of the three.
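For the curious, a rough approximation of the kind of scan that produces numbers like these (the exact patterns, exclusions, and therefore counts will differ from whatever was actually run):

    import pathlib

    LIB = pathlib.Path("Lib")   # assumes a CPython checkout
    patterns = ['@property', 'def __getattr__', 'def __getattribute__(']

    for pat in patterns:
        total = sum(
            path.read_text(errors="ignore").count(pat)
            for path in LIB.rglob("*.py")
            if "test" not in path.parts and path.name != "topics.py"
        )
        print(pat, total)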



/arry


Re: [Python-Dev] PEP 549: Instance Properties (aka: module properties)

2017-09-07 Thread Larry Hastings



On 09/06/2017 09:45 AM, Guido van Rossum wrote:
> So we're looking for a competing PEP here. Shouldn't be long, just
> summarize the discussion about use cases and generality here.


I don't think it's necessarily a /competing/ PEP; in my opinion, they 
serve slightly different use cases.  After all, property (and 
__getattribute__) were added long after __getattr__; if __getattr__ was 
a reasonable solution for property's use cases, why did we bother adding 
property?


One guiding principle I use when writing Python: if I need to provide an 
API, but there's conceptually only one of the thing, build it directly 
into a module as opposed to writing a class and making users use an 
instance.  (For example: the random module, although these days it 
provides both.)  So I see the situation as symmetric between modules and 
classes.  What is the use case for property / __getattr__ / 
__getattribute__ on a module?  The same as the use case for property / 
__getattr__ / __getattribute__ on a class.


Excluding Lib/test, there are 375 uses of "@property" in the stdlib in 
trunk, 60 uses of __getattr__, and 34 of __getattribute__.  Of course, 
property is used once per member, whereas each instance of __getattr__ 
and __getattribute__ could be used for arbitrarily-many members.  On the 
other hand, it's also possible that some uses of __getattr__ are legacy 
uses, and if property had been available it would have used that 
instead.  Anyway I assert that property is easily the most popular of 
these three techniques.


TBH I forgot the specific use case that inspired this--it's on a project 
I haven't touched in a while, in favor of devoting time to the 
Gilectomy.  But I can cite at least one place in the standard library 
that would have been better if it'd been implemented as a module 
property: os.stat_float_times().



/arry


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 14:43, Guido van Rossum wrote:
> On Thu, Sep 7, 2017 at 2:40 PM, Benjamin Peterson 
> wrote:
> 
> >
> >
> > On Thu, Sep 7, 2017, at 14:19, Guido van Rossum wrote:
> > > Nice one.
> > >
> > > It would be nice to specify the various APIs needed as well.
> >
> > The compileall and py_compile ones?
> >
> 
> Yes, and the SipHash mod to specify the key you mentioned.

Done.

> 
> >
> > > Why do you keep the mtime-based format as an option? (Maybe because it's
> > > faster? Did you measure it?)
> >
> > I haven't actually measured anything, but statting a file will definitely
> > be faster than reading it completely and hashing it. I suppose if the
> > speed difference between timestamp-based and hash-based pycs turned out
> > to be small we could feel good about dropping the timestamp format
> > completely. However, that difference might be hard to determine
> > definitely as I expect the speed hit will vary widely based on system
> > parameters such as disk speed and page cache size.
> >
> > My goal in this PEP was to preserve the current pyc invalidation
> > behavior, which works well today for many use cases, as the default. The
> > hash-based pycs are reserved for distribution and other power use cases.
> >
> 
> OK, maybe you can clarify that a bit in the PEP.

I've added a paragraph to the Rationale section.


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 14:54, Antoine Pitrou wrote:
> On Thu, 07 Sep 2017 14:32:19 -0700
> Benjamin Peterson  wrote:
> > > 
> > > Not sure how common that situation is (certainly the source tree wasn't
> > > read-only when you checked it out or untar'ed it), but isn't it easily
> > > circumvented by copying the source tree before building?  
> > 
> > Well, yes, in these kinds of "batch" build situations, copying is
> > probably fine. However, I want to be able to have pyc determinism even
> > when developing. Copying the entire source every time I change something
> > isn't nice.
> 
> Hmm... Are you developing from a read-only source tree?

No, but the build system is building from one (at least conceptually).

> 
> > The larger point is that while the SOURCE_EPOCH patch will likely work
> > for Linux distributions, I'm interested in being able to have
> > deterministic pycs in "normal" Python development workflows.
> 
> That's an interesting idea, but is there a concrete motivation or is it
> platonical?  After all, if you're changing something in the source tree
> it's expected that the overall "signature" of the build will be
> modified too.

Yes, I have used Bazel to build pycs. Having pycs be deterministic
allows interesting build system optimizations like Bazel distributed
caching to work well for Python.


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Antoine Pitrou
On Thu, 07 Sep 2017 14:40:33 -0700
Benjamin Peterson  wrote:
> On Thu, Sep 7, 2017, at 14:19, Guido van Rossum wrote:
> > Nice one.
> > 
> > It would be nice to specify the various APIs needed as well.  
> 
> The compileall and py_compile ones?
> 
> > 
> > Why do you keep the mtime-based format as an option? (Maybe because it's
> > faster? Did you measure it?)  
> 
> I haven't actually measured anything, but statting a file will definitely
> be faster than reading it completely and hashing it. I suppose if the
> speed difference between timestamp-based and hash-based pycs turned out
> to be small we could feel good about dropping the timestamp format
> completely. However, that difference might be hard to determine
> definitely as I expect the speed hit will vary widely based on system
> parameters such as disk speed and page cache size.

Also, while some/many of us have fast development machines with
performant SSDs, Python can be used in situations where "disk" I/O is
still slow (imagine a Raspberry Pi system or similar, grinding through
a SD card or USB key to load py and pyc files).

Regards

Antoine.




Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Antoine Pitrou
On Thu, 07 Sep 2017 14:32:19 -0700
Benjamin Peterson  wrote:
> > 
> > Not sure how common that situation is (certainly the source tree wasn't
> > read-only when you checked it out or untar'ed it), but isn't it easily
> > circumvented by copying the source tree before building?  
> 
> Well, yes, in these kinds of "batch" build situations, copying is
> probably fine. However, I want to be able to have pyc determinism even
> when developing. Copying the entire source every time I change something
> isn't nice.

Hmm... Are you developing from a read-only source tree?

> The larger point is that while the SOURCE_EPOCH patch will likely work
> for Linux distributions, I'm interested in being able to have
> deterministic pycs in "normal" Python development workflows.

That's an interesting idea, but is there a concrete motivation or is it
platonical?  After all, if you're changing something in the source tree
it's expected that the overall "signature" of the build will be
modified too.

Regards

Antoine.




Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Guido van Rossum
On Thu, Sep 7, 2017 at 2:40 PM, Benjamin Peterson 
wrote:

>
>
> On Thu, Sep 7, 2017, at 14:19, Guido van Rossum wrote:
> > Nice one.
> >
> > It would be nice to specify the various APIs needed as well.
>
> The compileall and py_compile ones?
>

Yes, and the SipHash mod to specify the key you mentioned.

>
> > Why do you keep the mtime-based format as an option? (Maybe because it's
> > faster? Did you measure it?)
>
> I haven't actually measured anything, but statting a file will definitely
> be faster than reading it completely and hashing it. I suppose if the
> speed difference between timestamp-based and hash-based pycs turned out
> to be small we could feel good about dropping the timestamp format
> completely. However, that difference might be hard to determine
> definitely as I expect the speed hit will vary widely based on system
> parameters such as disk speed and page cache size.
>
> My goal in this PEP was to preserve the current pyc invalidation
> behavior, which works well today for many use cases, as the default. The
> hash-based pycs are reserved for distribution and other power use cases.
>

OK, maybe you can clarify that a bit in the PEP.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 14:19, Guido van Rossum wrote:
> Nice one.
> 
> It would be nice to specify the various APIs needed as well.

The compileall and py_compile ones?

> 
> Why do you keep the mtime-based format as an option? (Maybe because it's
> faster? Did you measure it?)

I haven't actually measured anything, but statting a file will definitely
be faster than reading it completely and hashing it. I suppose if the
speed difference between timestamp-based and hash-based pycs turned out
to be small we could feel good about dropping the timestamp format
completely. However, that difference might be hard to determine
definitely as I expect the speed hit will vary widely based on system
parameters such as disk speed and page cache size.

My goal in this PEP was to preserve the current pyc invalidation
behavior, which works well today for many use cases, as the default. The
hash-based pycs are reserved for distribution and other power use cases.


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 14:21, Antoine Pitrou wrote:
> On Thu, 07 Sep 2017 14:08:58 -0700
> Benjamin Peterson  wrote:
> > On Thu, Sep 7, 2017, at 14:00, Antoine Pitrou wrote:
> > > On Thu, 07 Sep 2017 13:39:21 -0700
> > > Benjamin Peterson  wrote:  
> > > > Hello,
> > > > I've written a short PEP about an import extension to allow pycs to be
> > > > more deterministic by optionally replacing the timestamp with a hash of
> > > > the source file: https://www.python.org/dev/peps/pep-0552/  
> > > 
> > > Why isn't https://github.com/python/cpython/pull/296 a good enough
> > > solution to this problem?  It has a simple implementation, and requires
> > > neither maintaining two different pyc formats nor reading the entire
> > > source file to check whether the pyc file is up to date.  
> > 
> > The main objection to that model is that it requires modifying source
> > timestamps, which isn't possible for builds on read-only source trees.
> 
> Not sure how common that situation is (certainly the source tree wasn't
> read-only when you checked it out or untar'ed it), but isn't it easily
> circumvented by copying the source tree before building?

Well, yes, in these kinds of "batch" build situations, copying is
probably fine. However, I want to be able to have pyc determinism even
when developing. Copying the entire source every time I change something
isn't nice.

> 
> > This proposal also allows reproducible builds even if the files are
> > being modified in an edit-run-tests cycle.
> 
> I don't follow you here.  Could you elaborate?

If you require source timestamps to be fixed and deterministic, Python
won't notice when a file is modified.

The larger point is that while the SOURCE_EPOCH patch will likely work
for Linux distributions, I'm interested in being able to have
deterministic pycs in "normal" Python development workflows.


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 14:19, Freddy Rietdijk wrote:
> > The main objection to that model is that it requires modifying source
> timestamps, which isn't possible for builds on read-only source trees.
> 
> Why not set the source timestamps of the source trees to say 1 first?

If the source tree is read-only (because you don't want your build system
to modify source files on principle), you cannot do that.


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 12:09, Fernando Perez  wrote:
>> The PEP has an open issue regarding breakpoint() taking *args and **kws, 
>> which would just be passed through the call stack.  It sounds like you’d be 
>> in favor of that enhancement.
> 
> If you go witht the `(*a, **k)` pass-through API, would you have a special 
> keyword-only arg called 'header' or similar? That seems like a decent 
> compromise to support the feature with the builtin while allowing other 
> implementations to offer more features. In any case, +1 to a pass-through 
> API, as long as the built-in supports some kind of mechanism to help the user 
> get their bearings with "you're here" type messages.

I don’t think I want to specify what goes in *args or **kws, I just want to 
pass them straight through.  The user will have to have some understanding of 
what debugger they are using and what arguments their breakpoint hook allows.
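Conceptually the default hook is just a thin pass-through to pdb, so whatever you pass to breakpoint() has to make sense to the debugger on the other end (a sketch of the idea, not the literal implementation):

    def default_breakpointhook(*args, **kwargs):
        import pdb
        return pdb.set_trace(*args, **kwargs)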

I’ll see what it takes to add `header` to pdb.set_trace(), but I’ll do that as 
a separate PR (i.e. not as part of this PEP).

Cheers,
-Barry





Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Antoine Pitrou
On Thu, 07 Sep 2017 14:08:58 -0700
Benjamin Peterson  wrote:
> On Thu, Sep 7, 2017, at 14:00, Antoine Pitrou wrote:
> > On Thu, 07 Sep 2017 13:39:21 -0700
> > Benjamin Peterson  wrote:  
> > > Hello,
> > > I've written a short PEP about an import extension to allow pycs to be
> > > more deterministic by optionally replacing the timestamp with a hash of
> > > the source file: https://www.python.org/dev/peps/pep-0552/  
> > 
> > Why isn't https://github.com/python/cpython/pull/296 a good enough
> > solution to this problem?  It has a simple implementation, and requires
> > neither maintaining two different pyc formats nor reading the entire
> > source file to check whether the pyc file is up to date.  
> 
> The main objection to that model is that it requires modifying source
> timestamps, which isn't possible for builds on read-only source trees.

Not sure how common that situation is (certainly the source tree wasn't
read-only when you checked it out or untar'ed it), but isn't it easily
circumvented by copying the source tree before building?

> This proposal also allows reproducible builds even if the files are
> being modified in an edit-run-tests cycle.

I don't follow you here.  Could you elaborate?

Thanks

Antoine.


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 13:52, Terry Reedy  wrote:

> pdb.set_trace is a public and stable interface.  IDLE's is private and likely 
> to be initially unstable.  I can imagine that the function that I would want 
> to bind to sys.__breakpoint__ would be a bound method


To be pedantic, you’re not supposed to touch sys.__breakpointhook__ although 
like sys.__excepthook__ and sys.__displayhook__ they are not enforced to be 
read-only.

Cheers,
-Barry





Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Guido van Rossum
Nice one.

It would be nice to specify the various APIs needed as well.

Why do you keep the mtime-based format as an option? (Maybe because it's
faster? Did you measure it?)


On Thu, Sep 7, 2017 at 1:39 PM, Benjamin Peterson 
wrote:

> Hello,
> I've written a short PEP about an import extension to allow pycs to be
> more deterministic by optionally replacing the timestamp with a hash of
> the source file: https://www.python.org/dev/peps/pep-0552/
>
> Thanks for reading,
> Benjamin
>
> P.S. I came up with the idea for this PEP while awake.



-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Freddy Rietdijk
> The main objection to that model is that it requires modifying source
timestamps, which isn't possible for builds on read-only source trees.

Why not set the source timestamps of the source trees to say 1 first?
That's what is done with the Nix package manager. The Python interpreter is
patched (mostly similar to the referred PR) and checks whether
SOURCE_DATE_EPOCH is set, and if so, sets the mtime to 1.
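Outside of a patched interpreter, the same idea can be approximated by touching the tree before byte-compiling, assuming it is writable (which is exactly the caveat raised earlier in the thread):

    import compileall
    import os

    def normalize_and_compile(tree, epoch=1):
        # Clamp every .py mtime, then compile, so the pycs embed the same
        # timestamp on every build.
        for dirpath, _dirnames, filenames in os.walk(tree):
            for name in filenames:
                if name.endswith(".py"):
                    os.utime(os.path.join(dirpath, name), (epoch, epoch))
        compileall.compile_dir(tree, quiet=1)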

On Thu, Sep 7, 2017 at 11:08 PM, Benjamin Peterson 
wrote:

>
>
> On Thu, Sep 7, 2017, at 14:00, Antoine Pitrou wrote:
> > On Thu, 07 Sep 2017 13:39:21 -0700
> > Benjamin Peterson  wrote:
> > > Hello,
> > > I've written a short PEP about an import extension to allow pycs to be
> > > more deterministic by optionally replacing the timestamp with a hash of
> > > the source file: https://www.python.org/dev/peps/pep-0552/
> >
> > Why isn't https://github.com/python/cpython/pull/296 a good enough
> > solution to this problem?  It has a simple implementation, and requires
> > neither maintaining two different pyc formats nor reading the entire
> > source file to check whether the pyc file is up to date.
>
> The main objection to that model is that it requires modifying source
> timestamps, which isn't possible for builds on read-only source trees.
> This proposal also allows reproducible builds even if the files are
> being modified in an edit-run-tests cycle.


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 14:04, Fred Drake  wrote:
> 
> On Thu, Sep 7, 2017 at 4:52 PM, Terry Reedy  wrote:
>> Environmental variables tend to be a pain on Windows and nigh unusable by
>> beginners.  Leaving that aside, I see these problems with trying to use one
>> for IDLE's *current* debugger.
>> 
>> pdb is universal, in the sense of working with any python run with actual or
>> simulated stdin and stdout.  IDLE's idb is specific to working with IDLE. So
>> one could not set an EV to 'idlelib.idb.start' and leave it while switching
>> between IDLE and console.
> 
> Would it work for IDLE to set the environment variable for the child process?

That’s exactly how I envision the environment variable would be used.  If the 
process being debugged is run in an environment set up by the IDE, this would 
be the way for the IDE to communicate to the subprocess under debug, how it 
should behave in order to communicate properly with the debugger.

> The user certainly should not need to be involved in that.

Right.

> That doesn't address the issue of setting up the communications
> channel before breakpoint() is called, but allows the breakpointhook
> that gets used to work with whatever has been arranged.

Right again!  I think setting up the communication channel is outside the scope 
of this PEP.

Cheers,
-Barry





Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson


On Thu, Sep 7, 2017, at 14:00, Antoine Pitrou wrote:
> On Thu, 07 Sep 2017 13:39:21 -0700
> Benjamin Peterson  wrote:
> > Hello,
> > I've written a short PEP about an import extension to allow pycs to be
> > more deterministic by optionally replacing the timestamp with a hash of
> > the source file: https://www.python.org/dev/peps/pep-0552/
> 
> Why isn't https://github.com/python/cpython/pull/296 a good enough
> solution to this problem?  It has a simple implementation, and requires
> neither maintaining two different pyc formats nor reading the entire
> source file to check whether the pyc file is up to date.

The main objection to that model is that it requires modifying source
timestamps, which isn't possible for builds on read-only source trees.
This proposal also allows reproducible builds even if the files are
being modified in an edit-run-tests cycle.


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Fred Drake
On Thu, Sep 7, 2017 at 4:52 PM, Terry Reedy  wrote:
> Environmental variables tend to be a pain on Windows and nigh unusable by
> beginners.  Leaving that aside, I see these problems with trying to use one
> for IDLE's *current* debugger.
>
> pdb is universal, in the sense of working with any python run with actual or
> simulated stdin and stdout.  IDLE's idb is specific to working with IDLE. So
> one could not set an EV to 'idlelib.idb.start' and leave it while switching
> between IDLE and console.

Would it work for IDLE to set the environment variable for the child process?

The user certainly should not need to be involved in that.

That doesn't address the issue of setting up the communications
channel before breakpoint() is called, but allows the breakpointhook
that gets used to work with whatever has been arranged.


  -Fred

-- 
Fred L. Drake, Jr.
"A storm broke loose in my mind."  --Albert Einstein


Re: [Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Antoine Pitrou
On Thu, 07 Sep 2017 13:39:21 -0700
Benjamin Peterson  wrote:
> Hello,
> I've written a short PEP about an import extension to allow pycs to be
> more deterministic by optional replacing the timestamp with a hash of
> the source file: https://www.python.org/dev/peps/pep-0552/

Why isn't https://github.com/python/cpython/pull/296 a good enough
solution to this problem?  It has a simple implementation, and requires
neither maintaining two different pyc formats nor reading the entire
source file to check whether the pyc file is up to date.

Regards

Antoine.




Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Terry Reedy

On 9/7/2017 12:50 PM, Barry Warsaw wrote:

On Sep 6, 2017, at 23:10, Terry Reedy  wrote:


Environmental variables are set to strings, not objects.  It is not clear how 
you intend to handle the conversion.


The environment variable names a module import path.  Without quibbling about 
the details of the syntax (because honestly, I’m not convinced it’s a useful 
feature), it would work roughly like:

* The default value is equivalent to PYTHONBREAKPOINTHOOK=pdb.set_trace
* breakpoint() splits the value on the rightmost dot
* modules on the LHS are imported, then the RHS is getattr’d out of that
* That’s the callable breakpoint() calls


Environmental variables tend to be a pain on Windows and nigh unusable 
by beginners.  Leaving that aside, I see these problems with trying to 
use one for IDLE's *current* debugger.


pdb is universal, in the sense of working with any python run with 
actual or simulated stdin and stdout.  IDLE's idb is specific to working 
with IDLE. So one could not set an EV to 'idlelib.idb.start' and leave 
it while switching between IDLE and console.


pdb runs in one process, with communication between debugger and text ui 
handled by call and return.  It can be started on the fly.  IDLE runs 
code in a separate process.  The debugger has to run in the user 
process.  IDLE currently runs the GUI in the IDLE process.  So a 
complicated communication process has to be set up with rpc proxies and 
adaptors, and this is best done *before* code runs.  The on-the-fly 
function should not need an import and it might be better to not have one.


pdb.set_trace is a public and stable interface.  IDLE's is private and 
likely to be initially unstable.  I can imagine that the function that I 
would want to bind to sys.__breakpoint__ would be a bound method


Thinking about the above prompted me to rethink IDLE's debugger and 
consider rewriting it as an IDLE-independent gui debugger.  I'll will 
say more in response to Guido and you.


--
Terry Jan Reedy




[Python-Dev] PEP 552: deterministic pycs

2017-09-07 Thread Benjamin Peterson
Hello,
I've written a short PEP about an import extension to allow pycs to be
more deterministic by optional replacing the timestamp with a hash of
the source file: https://www.python.org/dev/peps/pep-0552/

Thanks for reading,
Benjamin

P.S. I came up with the idea for this PEP while awake.


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Fernando Perez

On 2017-09-07 00:20:17 +, Barry Warsaw said:

Thanks Fernando, this is exactly the kind of feedback from other 
debuggers that I’m looking for.  It certainly sounds like a handy 
feature; I’ve found myself wanting something like that from pdb from 
time to time.


Glad it's useful, thanks for the pep!

The PEP has an open issue regarding breakpoint() taking *args and 
**kws, which would just be passed through the call stack.  It sounds 
like you’d be in favor of that enhancement.


If you go with the `(*a, **k)` pass-through API, would you have a 
special keyword-only arg called 'header' or similar? That seems like a 
decent compromise to support the feature with the builtin while 
allowing other implementations to offer more features. In any case, +1 
to a pass-through API, as long as the built-in supports some kind of 
mechanism to help the user get their bearings with "you're here" type 
messages.
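
A sketch of how such a pass-through 'header' argument could be consumed by a
replacement hook (the argument name is only a suggestion from this thread,
not something the PEP specifies):

    import sys

    def hook_with_header(*args, header=None, **kws):
        if header is not None:
            print(header)                # the "you're here" banner
        # Delegate to the stashed default hook (pdb.set_trace); a real
        # hook would take care to break in the caller's frame.
        return sys.__breakpointhook__()

    sys.breakpointhook = hook_with_header
    breakpoint(header="About to retry the request")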


Cheers

f




[Python-Dev] PEP 553 V2 - builtin breakpoint() (was Re: PEP 553: Built-in debug())

2017-09-07 Thread Barry Warsaw
Thanks for all the great feedback folks!  Here then is PEP 553 version
2.  The notable changes are:

* Change the name of the built-in from debug() to breakpoint()
* Modify the signature to be breakpoint(*args, **kws)

https://www.python.org/dev/peps/pep-0553/

Included below for convenience.

Cheers,
-Barry

PEP: 553
Title: Built-in breakpoint()
Author: Barry Warsaw 
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 2017-09-05
Python-Version: 3.7
Post-History: 2017-09-05, 2017-09-07


Abstract
========

This PEP proposes adding a new built-in function called ``breakpoint()``
which enters a Python debugger at the point of the call.  Additionally, two
new names are added to the ``sys`` module to make the debugger pluggable.


Rationale
=========

Python has long had a great debugger in its standard library called ``pdb``.
Setting a break point is commonly written like this::

foo()
import pdb; pdb.set_trace()
bar()

Thus after executing ``foo()`` and before executing ``bar()``, Python will
enter the debugger.  However this idiom has several disadvantages.

* It's a lot to type (27 characters).

* It's easy to typo.  The PEP author often mistypes this line, e.g. omitting
  the semicolon, or typing a dot instead of an underscore.

* It ties debugging directly to the choice of pdb.  There might be other
  debugging options, say if you're using an IDE or some other development
  environment.

* Python linters (e.g. flake8 [1]_) complain about this line because it
  contains two statements.  Breaking the idiom up into two lines further
  complicates the use of the debugger.

These problems can be solved by modeling a solution based on prior art in
other languages, and utilizing a convention that already exists in Python.


Proposal
========

The JavaScript language provides a ``debugger`` statement [2]_ which enters
the debugger at the point where the statement appears.

This PEP proposes a new built-in function called ``breakpoint()``
which enters a Python debugger at the call site.  Thus the example
above would be written like so::

foo()
breakpoint()
bar()

Further, this PEP proposes two new name bindings for the ``sys``
module, called ``sys.breakpointhook()`` and
``sys.__breakpointhook__``.  By default, ``sys.breakpointhook()``
implements the actual importing and entry into ``pdb.set_trace()``,
and it can be set to a different function to change the debugger that
``breakpoint()`` enters.  ``sys.__breakpointhook__`` then stashes the
default value of ``sys.breakpointhook()`` to make it easy to reset.
This exactly models the existing ``sys.displayhook()`` /
``sys.__displayhook__`` and ``sys.excepthook()`` /
``sys.__excepthook__`` hooks [3]_.

The signature of the built-in is ``breakpoint(*args, **kws)``.  The
positional and keyword arguments are passed straight through to
``sys.breakpointhook()`` and the signatures must match or a ``TypeError``
will be raised.  The return from ``sys.breakpointhook()`` is passed back
up to, and returned from, ``breakpoint()``.  Since ``sys.breakpointhook()``
by default calls ``pdb.set_trace()``, it accepts no arguments.
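
A short illustration (a sketch of the semantics described above, not part of
the specification)::

    import sys

    def log_hook(*args, **kws):
        # A trivial replacement hook: record the call instead of
        # dropping into a debugger.
        print("breakpoint() called with", args, kws)

    sys.breakpointhook = log_hook
    breakpoint(42, color="red")      # prints instead of entering pdb
    sys.breakpointhook = sys.__breakpointhook__   # restore the default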


Open issues
===========

Confirmation from other debugger vendors
----------------------------------------

We want to get confirmation from at least one alternative debugger
implementation (e.g. PyCharm) that the hooks provided in this PEP will
be useful to them.

Breakpoint bytecode
-------------------

Related, there has been an idea to add a bytecode that calls
``sys.breakpointhook()``.  Whether built-in ``breakpoint()`` emits
this bytecode (or gets peephole optimized to the bytecode) is an open
issue.  The bytecode is useful for debuggers that actively modify
bytecode streams to trampoline into their own debugger.  Having a
"breakpoint" bytecode might allow them to avoid bytecode modification
in order to invoke this trampoline.  *NOTE*: It probably makes sense to
split this idea into a separate PEP.

Environment variable
--------------------

Should we add an environment variable so that ``sys.breakpointhook()`` can
be set outside of the Python invocation?  E.g.::

$ export PYTHONBREAKPOINTHOOK=my.debugger:Debugger

This would allow execution environments such as IDEs, which run Python code
inside them, to set an internal breakpoint hook before any Python code
executes.

Call a fancier object by default
--------------------------------

Some folks want to be able to use other ``pdb`` interfaces such as
``pdb.pm()``.  Although this is a less commonly used API, it could be
supported by binding ``sys.breakpointhook`` to an object that implements
``__call__()``.  Calling this object would call ``pdb.set_trace()``, but the
object could expose other methods, such as ``pdb.pm()``, making invocation
of it as handy as ``breakpoint.pm()``.


Implementation
==============

A pull request exists with the proposed implementation [4]_.


Rejected alternatives
=====================

A new 

Re: [Python-Dev] Memory bitmaps for the Python cyclic garbage collector

2017-09-07 Thread Antoine Pitrou
On Thu, 7 Sep 2017 11:30:12 -0600
Neil Schemenauer  wrote:
> 
> * The GC process would work nearly the same as it does now.  Rather than
>   only traversing the linked list, we would also have to crawl over the
>   GC object arenas, check blocks of memory that have the tracked bit
>   set.

Small note: the linked lists are also used for distinguishing between
generations. In other words, there is not one linked list but three of
them (one per generation).  This means you probably need two bits per
object, not one.

Other than that, it's an interesting proposal.  I'm looking forward to
the concrete results (performance, and maintenance overhead) :-)

Regards

Antoine.




Re: [Python-Dev] PEP 549: Instance Properties (aka: module properties)

2017-09-07 Thread Neil Schemenauer
Larry Hastings  wrote:
> The TL;DR summary: add support for property objects to modules.
> I've already posted a prototype.

I posted an idea to python-ideas about lazy module loading.  If the
lazy loading idea works, having properties would allow modules to
continue to be "lazy safe" but to easily do init logic when needed,
e.g. getting of the property.

There should be a very clean way to do that, IMHO.  Using __class__
is not clean and it would be unfortunate to have the __class__
song-and-dance in a bunch of modules.  Using property() seems more
Pythonic.
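
For reference, a minimal sketch of the __class__ song-and-dance being
referred to (compute_table() is a hypothetical expensive initializer):

    import sys
    import types

    class _ThisModule(types.ModuleType):
        @property
        def table(self):
            # init logic runs lazily, on first attribute access
            if not hasattr(self, "_table"):
                self._table = compute_table()
            return self._table

    sys.modules[__name__].__class__ = _ThisModule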



Re: [Python-Dev] Consolidate stateful runtime globals

2017-09-07 Thread Neil Schemenauer
Is there any issue with unit-at-a-time optimization?  I would
imagine that a static global would allow optimizations that are not
safe for an exported global (not sure of the C term for it).

I suspect it doesn't matter and I support the idea in general.
Global variables in extension modules kill the idea of a
mark-and-sweep or some other GC mechanism.  That's probably not
going to happen but identifying all of the global state seems like a
step forward.



[Python-Dev] Memory bitmaps for the Python cyclic garbage collector

2017-09-07 Thread Neil Schemenauer
Python objects that participate in cyclic GC (things like lists, dicts,
sets but not strings, ints and floats) have extra memory overhead.  I
think it is possible to mostly eliminate this overhead.  Also, while
the GC is running, this GC state is mutated, which destroys
copy-on-write optimizations.  This change would mostly fix that
issue.

All objects that participate in cyclic GC have the Py_TPFLAGS_HAVE_GC
bit set in their type.  That causes an extra chunk of memory to be
allocated *before* the ob_refcnt struct member.  This is the PyGC_Head
struct.

The whole object looks like this in memory (PyObject pointer is at
arrow):

union __gc_head *gc_next;
union __gc_head *gc_prev;
Py_ssize_t gc_refs;
-->
Py_ssize_t ob_refcnt
struct _typeobject *ob_type;
[rest of PyObject members]


So, 24 bytes of overhead on a 64-bit machine.  The smallest Python
object that can have a pointer to another object (e.g. a single PyObject
* member) is 48 bytes.  Removing PyGC_Head would cut the size of these
objects in half.

Carl Shapiro questioned me today on why we use a doubly linked list and
not a memory bitmap.  I think the answer is that there is no good
reason: we use a doubly linked list only due to historical constraints
that are no longer present.

Long ago, Python objects could be allocated using the system malloc or
other memory allocators.  Since we could not control the memory
location, bitmaps would be inefficient.  Today, we allocate all Python
objects via our own function.  Python objects under a certain size are
allocated using our own malloc, obmalloc, and are stored in memory
blocks known as "arenas".

The PyGC_Head struct performs three functions.  First, it allows the GC
to find all Python objects that will be checked for cycles (i.e. follow
the linked list).  Second, it stores a single bit of information to let
the GC know if it is safe to traverse the object, set with
PyObject_GC_Track().  Finally, it has a scratch area to compute the
effective reference count while tracing refs (gc_refs).

Here is a sketch of how we can remove the PyGC_Head struct for small
objects (say less than 512 bytes).  Large objects or objects created by
a different memory allocator will still have the PyGC_Head overhead.

* Have memory arenas that contain only objects with the
  Py_TPFLAGS_HAVE_GC flag.  Objects like ints, strings, etc will be
  in different arenas, not have bitmaps, not be looked at by the
  cyclic GC.

* For those arenas, add a memory bitmap.  The bitmap is a bit array that
  has a bit for each fixed size object in the arena.  The memory used by
  the bitmap is a fraction of what is needed by PyGC_Head.  E.g. an
  arena that holds up to 1024 objects of 48 bytes in size would have a
  bitmap of 1024 bits.

* The bits will be set and cleared by PyObject_GC_Track/Untrack()

* We also need an array of Py_ssize_t to take over the job of gc_refs.
  That could be allocated only when GC is working and it only needs to
  be the size of the number of true bits in the bitmap.  Or, it could be
  allocated when the arena is allocated and be sized for the full arena.

* Objects that are too large would still get the PyGC_Head struct
  allocated "in front" of the PyObject.  Because they are big, the
  overhead is not so bad.

* The GC process would work nearly the same as it does now.  Rather than
  only traversing the linked list, we would also have to crawl over the
  GC object arenas, check blocks of memory that have the tracked bit
  set.
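
The bit manipulation implied by the Track/Untrack bullet above is cheap.  An
illustrative sketch, in Python for brevity -- the sizes and names are
assumptions, not the proposed C implementation:

    ARENA_OBJECTS = 1024                    # fixed-size objects per arena
    bitmap = bytearray(ARENA_OBJECTS // 8)  # 128 bytes vs. 24 KiB of PyGC_Head

    def track(i):       # PyObject_GC_Track analogue: mark object i as tracked
        bitmap[i // 8] |= 1 << (i % 8)

    def untrack(i):     # PyObject_GC_Untrack analogue: clear the bit
        bitmap[i // 8] &= ~(1 << (i % 8))

    def is_tracked(i):
        return bool(bitmap[i // 8] & (1 << (i % 8)))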

There are a lot of smaller details to work out but I see no reason
why the idea should not work.  It should significantly reduce memory
usage.  Also, because the bitmap and gc_refs are contiguous in
memory, locality will be improved.  Łukasz Langa has mentioned that
the current GC causes issues with copy-on-write memory in big
applications.  This change should solve that issue.

To implement, I think the easiest path is to create a new malloc to be
used by small GC objects, e.g. gcmalloc.c.  It would be similar to
obmalloc but have the features needed to keep track of the bitmap.
obmalloc has some quirks that makes it hard to use for this purpose.
Once the idea is proven, gcmalloc could be merged or made to be a
variation of obmalloc.  Or, maybe just optimized and remain
separate.  obmalloc is complicated and highly optimized.  So, adding
additional functionality to it will be challenging.

I believe this change would be ABI compatible.


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 10:00, Christian Heimes  wrote:

> Setuptools' entry points [1] use colon between import and function, e.g.
> "pdb:set_trace" would import pdb and then execute set_trace. The
> approach can be augmented to allow calling a class method, too.
> 
> So
> 
>  "package.module:myclass.classfunc"
> 
> would do :
> 
>  from package.module import myclass
>  myclass.classfunc

Yep, that’s how it's described in the PEP 553 open issue.  I just didn’t 
include that complication in my response.

-Barry





Re: [Python-Dev] Compiling without multithreading support -- still useful?

2017-09-07 Thread Antoine Pitrou

This is now in git master after being merged by Victor in
https://github.com/python/cpython/pull/3385.

Regards

Antoine.


On Tue, 5 Sep 2017 18:36:51 +0200
Antoine Pitrou  wrote:
> Hello,
> 
> It's 2017 and we are still allowing people to compile CPython without
> threads support.  It adds some complication in several places
> (including delicate parts of our internal C code) without a clear
> benefit.  Do people still need this?
> 
> Regards
> 
> Antoine.
> 
> 





Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Christian Heimes
On 2017-09-07 09:50, Barry Warsaw wrote:
> On Sep 6, 2017, at 23:10, Terry Reedy  wrote:
>>
>> Environmental variables are set to strings, not objects.  It is not clear 
>> how you intend to handle the conversion.
> 
> The environment variable names a module import path.  Without quibbling about 
> the details of the syntax (because honestly, I’m not convinced it’s a useful 
> feature), it would work roughly like:
> 
> * The default value is equivalent to PYTHONBREAKPOINTHOOK=pdb.set_trace
> * breakpoint() splits the value on the rightmost dot
> * modules on the LHS are imported, then the RHS is getattr’d out of that
> * That’s the callable breakpoint() calls

Setuptools' entry points [1] use colon between import and function, e.g.
"pdb:set_trace" would import pdb and then execute set_trace. The
approach can be augmented to allow calling a class method, too.

So

  "package.module:myclass.classfunc"

would do :

  from package.module import myclass
  myclass.classfunc
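
A hedged sketch of resolving that form (an illustration, not setuptools'
own code):

    import importlib

    def resolve(spec):
        modname, _, qualname = spec.partition(":")
        obj = importlib.import_module(modname)
        if qualname:
            for attr in qualname.split("."):
                obj = getattr(obj, attr)
        return obj

    # resolve("pdb:set_trace") -> the pdb.set_trace function
    # resolve("package.module:myclass.classfunc") -> myclass.classfunc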


Regards,
Christian

[1]
https://setuptools.readthedocs.io/en/latest/setuptools.html#automatic-script-creation


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 7, 2017, at 07:46, Guido van Rossum  wrote:

> Without following all this (or the PEP, yet) super exactly, does this mean 
> you are satisfied with what the PEP currently proposes or would you like 
> changes? It's a little unclear from what you wrote.

I’m also unsure whether Terry is good with the existing PEP or suggesting 
changes.

> I think that's true for any IDE that has existing integrated debug 
> capabilities. However for every IDE I would hope that calling breakpoint() 
> will *also* break into the IDE's debugger. My use case is that sometimes I 
> have a need for a *conditional* breakpoint where it's just much simpler to 
> decide whether to break or not in Python code than it would be to use the 
> IDE's conditional breakpoint facilities.

That certainly aligns with my own experience and expectations.  I guess I’m a 
fellow old dog. :)

-Barry





Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Barry Warsaw
On Sep 6, 2017, at 23:10, Terry Reedy  wrote:
> 
> Environmental variables are set to strings, not objects.  It is not clear how 
> you intend to handle the conversion.

The environment variable names a module import path.  Without quibbling about 
the details of the syntax (because honestly, I’m not convinced it’s a useful 
feature), it would work roughly like:

* The default value is equivalent to PYTHONBREAKPOINTHOOK=pdb.set_trace
* breakpoint() splits the value on the rightmost dot
* modules on the LHS are imported, then the RHS is getattr’d out of that
* That’s the callable breakpoint() calls

-Barry





Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Guido van Rossum
On Wed, Sep 6, 2017 at 11:59 PM, Terry Reedy  wrote:

> On 9/6/2017 6:45 PM, Barry Warsaw wrote:
>
>> On Sep 6, 2017, at 14:59, Terry Reedy  wrote:
>>
>>>
>>> Currently, the debugger is started in response to a menu seletion in the
>>> IDLE process while the python process is idle.  One reason for the 'idle'
>>> requirement' is because when code is exec-uting, the loop that reads
>>> commands, executes them, and sends responses is blocked on the exec call.
>>> The IDLE process sets up its debugger window, its ends of the rpc channels,
>>> and commands to python process to set up Idb and the other ends of the
>>> channels.  The challenge would be to initiate setup from the server process
>>> and deal with the blocked loop.
>>>
>>
>> Would the environment variable idea in the latest version of the PEP help
>> you here?
>>
>
> That seems to be a mechanism for users to convey which function to call.
> It does not simplify the function call chosen.
>
> pdb works because a) it is designed to be imported into and run in a user
> module and b) it uses the existing stdin and stdout streams to interact
> with the user.  Idb, on the other hand, is designed to be import in
> idlelib.run, which is (almost) invisible to user code, and use a special
> channel to the GUI window.
>
> Having said that, I see a solution: before running user code, set things
> up as now, except do not start tracing and do not show the debugger window,
> but do set a run.idb_ready flag.   Then breakpoint_handler() can check the
> flag and start tracing either through idb or pdb.
>

Without following all this (or the PEP, yet) super exactly, does this mean
you are satisfied with what the PEP currently proposes or would you like
changes? It's a little unclear from what you wrote.


> What makes breakpoint() less than super useful in IDLE is that IDLE
> already allows setting of persistent breakpoints on a line by
> right-clicking.  Since they are not stored in the file, there is nothing to
> erase when one runs the code in Python directly.
>

I think that's true for any IDE that has existing integrated debug
capabilities. However for every IDE I would hope that calling breakpoint()
will *also* break into the IDE's debugger. My use case is that sometimes I
have a need for a *conditional* breakpoint where it's just much simpler to
decide whether to break or not in Python code than it would be to use the
IDE's conditional breakpoint facilities.
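
For example, under the PEP the conditional case stays plain Python (the
names here are hypothetical):

    for item in items:
        if item.total > THRESHOLD:   # arbitrary condition, easier in code
            breakpoint()             # only enters the debugger here
        process(item)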

(Plus there's something about old dogs and new tricks, but I've forgotten
exactly how that one goes. :-)

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Stefan Krah
On Thu, Sep 07, 2017 at 09:41:10AM -0400, Elvis Pranskevichus wrote:
> threading.local(), the isolation mechanism, is *implicit*.  
> decimal.localcontext() is an  *explicit* resource manager that relies on 
> threading.local() magic.  PEP 550 simply provides a threading.local() 
> alternative that works in tasks and generators.  That's it!

If there only were a name that would make it explicit, like TaskLocalStorage. ;)


Seriously, the problem with 'context' is that it is:

  a) A predefined set of state values like in the Decimal (I think also
 the OpenSSL) context.

 But such a context is put inside another context (the ExecutionContext).

  b) A theoretical concept from typed Lambda calculus (in the context
 'gamma' the variable 'v' has type 't').

 But this concept would be associated with lexical scope and would
 extend to functions (not only tasks and generators).

  c) ``man 3 setcontext``. A replacement for setjmp/longjmp.  Somewhat
 related in that it could be used to implement coroutines.

  d) The .NET flowery language.  I did not fully understand what the
 .NET ExecutionContext and its 2881 implicit flow rules are.

  ...



Stefan Krah





Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Guido van Rossum
I wrote it in a new thread, but I also want to write it here -- I need a
time out in this discussion so I can think about it more.

-- 
--Guido van Rossum (python.org/~guido)


[Python-Dev] PEP 550 discussion time out

2017-09-07 Thread Guido van Rossum
I am declaring a time out for discussing of PEP 550 and its competitor, PEP
555. There has been some good discussion but also some mud-slinging (in
various directions), and I've been frankly overwhelmed by the exchanges,
despite having spent much of Tuesday talking to Yury about it and lying
awake thinking about it for two nights since then.

I need to sit down and think more about the problem we're trying to solve,
what are the constraints (and why), and what would be the minimal solution.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Consolidate stateful runtime globals

2017-09-07 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 8:17 PM, Benjamin Peterson 
wrote:

> On Wed, Sep 6, 2017, at 10:08, Antoine Pitrou wrote:
> > On Wed, 06 Sep 2017 09:42:29 -0700
> > Benjamin Peterson  wrote:
> > > On Wed, Sep 6, 2017, at 03:14, Antoine Pitrou wrote:
> > > >
> > > > Hello,
> > > >
> > > > I'm a bit concerned about
> > > > https://github.com/python/cpython/commit/
> 76d5abc8684bac4f2fc7cccfe2cd940923357351
> > > >
> > > > My main gripe is that makes writing C code more tedious.  Simple C
> > > > global variables such as "_once_registry" are now spelled
> > > > "_PyRuntime.warnings.once_registry".  The most egregious example
> seems
> > > > to be "_PyRuntime.ceval.gil.locked" (used to be simply "gil_locked").
> > > >
> > > > Granted, C is more verbose than Python, but it doesn't have to become
> > > > that verbose.  I don't know about you, but when code becomes annoying
> > > > to type, I tend to try and take shortcuts.
> > >
> > > How often are you actually typing the names of runtime globals, though?
> >
> > Not very often, but if I want to experiment with some low-level
> > implementation details, it is nice to avoid the hassle.
>
> It seems like this could be remediated with some inline functions or
> macros, which would also help safely encapsulate state.
>
> >
> > There's also a readability argument: with very long names, expressions
> > can become less easy to parse.
> >
> > > If you are using a globals, perhaps the typing time will allow you to
> > > fully consider the gravity of the situation.
> >
> > Right, I needed to be reminded of how perilous the use of C globals is.
> > Perhaps I should contact the PSRT the next time I contemplate using a C
> > global.
>
> It's not just you but future readers.


Great. Related to this, there is also discussion on dangers of globals and
other widely-scoped variables in the Rationale section of PEP 555
(Context-local variables), for anyone interested. But if you read the draft
I posted on python-ideas last Monday, you've already seen it.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Elvis Pranskevichus
On Thursday, September 7, 2017 10:06:14 AM EDT Ethan Furman wrote:
> I might be, and I wouldn't be surprised.  :)  On the other hand, one
> can look at isolation as being a resource.
> > threading.local(), the isolation mechanism, is *implicit*.
> 
> I don't think so.  You don't get threading.local() unless you call it
> -- that makes it explicit.
> > decimal.localcontext() is an  *explicit* resource manager that
> > relies on threading.local() magic.  PEP 550 simply provides a
> > threading.local() alternative that works in tasks and generators. 
> > That's it!
> 
> The concern is *how* PEP 550 provides it:
> 
> - explicitly, like threading.local(): has to be set up manually,
>preferably with a context manager
> 
> - implicitly: it just happens under certain conditions
> 

You literally replace threading.local() with contextvars.ContextVar():

import threading

_decimal_context = threading.local()

def set_decimal_context(ctx):
    _decimal_context.context = ctx


Becomes:


import contextvars

_decimal_context = contextvars.ContextVar('decimal.Context')

def set_decimal_context(ctx):
    _decimal_context.set(ctx)
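
The lookup side changes just as mechanically (a sketch, assuming the PEP's
ContextVar.get() API):

    def get_decimal_context():
        return _decimal_context.get()    # was: _decimal_context.context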


   Elvis


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Elvis Pranskevichus
On Thursday, September 7, 2017 6:37:58 AM EDT Greg Ewing wrote:
> 2) You ignore it and always use a context manager, in
> which case it's not strictly necessary for the implicit
> context push to occur, since the relevant context managers
> can take care of it.
> 
> So there doesn't seem to be any great advantage to the
> automatic context push, and it has some disadvantages,
> such as yield-from not quite working as expected in
> some situations.

The advantage is that context managers don't need to *always* 
allocate and push an LC. [1]

> Also, it seems that every generator is going to incur
> the overhead of allocating a logical_context even when
> it doesn't actually change any context vars, which most
> generators won't.

By default, generators reference an empty LogicalContext object that is 
allocated once (like the None object).  We can do that because LCs are 
immutable. 

  Elvis


[1] https://mail.python.org/pipermail/python-dev/2017-September/149265.html


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Elvis Pranskevichus
On Thursday, September 7, 2017 9:05:58 AM EDT Ethan Furman wrote:
> The disagreement seems to be whether a LogicalContext should be
> created implicitly vs explicitly (or opt-out vs opt-in). As a user
> trying to track down a decimal context change not propagating, I
> would not suspect the above code of automatically creating a
> LogicalContext and isolating the change, whereas Greg's context
> manager version is abundantly clear.
> 
> The implicit vs explicit argument comes down, I think, to resource
> management: some resources in Python are automatically managed
> (memory), and some are not (files) -- which type should LCs be?

You are confusing resource management with the isolation mechanism.  PEP 
550 contextvars are analogous to threading.local(), which the PEP makes 
very clear from the outset.

threading.local(), the isolation mechanism, is *implicit*.  
decimal.localcontext() is an  *explicit* resource manager that relies on 
threading.local() magic.  PEP 550 simply provides a threading.local() 
alternative that works in tasks and generators.  That's it!


 Elvis


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Ethan Furman

On 09/07/2017 06:41 AM, Elvis Pranskevichus wrote:

On Thursday, September 7, 2017 9:05:58 AM EDT Ethan Furman wrote:



The disagreement seems to be whether a LogicalContext should be
created implicitly vs explicitly (or opt-out vs opt-in). As a user
trying to track down a decimal context change not propagating, I
would not suspect the above code of automatically creating a
LogicalContext and isolating the change, whereas Greg's context
manager version is abundantly clear.

The implicit vs explicit argument comes down, I think, to resource
management: some resources in Python are automatically managed
(memory), and some are not (files) -- which type should LCs be?


You are confusing resource management with the isolation mechanism.  PEP
550 contextvars are analogous to threading.local(), which the PEP makes
very clear from the outset.


I might be, and I wouldn't be surprised.  :)  On the other hand, one can look 
at isolation as being a resource.


threading.local(), the isolation mechanism, is *implicit*.


I don't think so.  You don't get threading.local() unless you call it -- that 
makes it explicit.


decimal.localcontext() is an  *explicit* resource manager that relies on
threading.local() magic.  PEP 550 simply provides a threading.local()
alternative that works in tasks and generators.  That's it!


The concern is *how* PEP 550 provides it:

- explicitly, like threading.local(): has to be set up manually,
  preferably with a context manager

- implicitly: it just happens under certain conditions

--
~Ethan~


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Elvis Pranskevichus
On Thursday, September 7, 2017 3:54:15 AM EDT Greg Ewing wrote:
> This problem would also not arise if context vars
> simply had names instead of being magic key objects:
> 
> def foo():
>contextvars.set("mymodule.myvar", 1)
> 
> That's another thing I think would be an improvement,
> but it's orthogonal to what we're talking about here
> and would be best discussed separately.

On the contrary, using simple names (PEP 550 V1 was actually doing that) 
is a regression.  It opens up namespace clashing issues.  Imagine you 
have a variable named "foo", and then some library you import also 
decides to use the name "foo", what then?  That's one of the reasons why 
we do `local = threading.local()` instead of 
`threading.set_local("foo", 1)`.
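
A tiny illustration of the clash (the string-keyed API here is the
hypothetical one under discussion, not something PEP 550 provides):

    # string-keyed: two unrelated libraries silently share one slot
    contextvars.set("foo", 1)        # library A
    contextvars.set("foo", "bar")    # library B clobbers A's value

    # object-keyed (PEP 550): each library owns its key, no collision
    _foo_a = contextvars.ContextVar()    # private to library A
    _foo_b = contextvars.ContextVar()    # private to library B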

  Elvis


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Ethan Furman

On 09/06/2017 11:57 PM, Yury Selivanov wrote:

On Wed, Sep 6, 2017 at 11:39 PM, Greg Ewing wrote:



Here's one way that refactoring could trip you up.
Start with this:

async def foo():
   calculate_something()
   #in a coroutine, so we can be lazy and not use a cm


Where exactly does PEP 550 encourage users to be "lazy and not use a
cm"?  PEP 550 provides a mechanism for implementing context managers!
What is this example supposed to show?


That using a CM is not required, and tracking down a bug caused by not using a 
CM can be difficult.


How is PEP 550 is at fault of somebody being lazy and not using a
context manager?


Because PEP 550 makes a CM unnecessary in the simple (common?) case, hiding the need for a CM in not-so-simple cases. 
For comparison: in Python 3 we are now warned about files that have been left open (because explicitly closing files was 
unnecessary in CPython due to an implementation detail) -- the solution?  make files context managers whose __exit__ 
closes the file.



PEP 550 has a hard requirement to make it possible for decimal/other
libraries to start using its APIs and stay backwards compatible, so it
allows `decimal.setcontext(ctx)` function to be implemented.  We are
fixing things here.


I appreciate that the scientific and number-crunching communities have been a major driver of enhancements for Python 
(such as rich comparisons and, more recently, matrix operators), but I don't think an enhancement for them that makes 
life more difficult for the rest is a net win.


--
~Ethan~


Re: [Python-Dev] [python-committers] Cherry picker bot deployed in CPython repo

2017-09-07 Thread Senthil Kumaran
On Tue, Sep 5, 2017 at 6:10 PM, Mariatta Wijaya 
wrote:

> Hi,
>
> The cherry picker bot has just been deployed to CPython repo, codenamed
> miss-islington.
>
> miss-islington made the very first backport PR for CPython and became a
> first time GitHub contributor: https://github.com/python/cpython/pull/3369
>
>
>
Thanks Mariatta.  This is an awesome contribution.  It's going to be
extremely helpful.

-- 
Senthil


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Ethan Furman

On 09/07/2017 03:37 AM, Greg Ewing wrote:


If I understand correctly, instead of using a context
manager, your fractions example could be written like
this:

def fractions(precision, x, y):
 ctx = decimal.getcontext().copy()
 decimal.setcontext(ctx)
 ctx.prec = precision
 yield MyDecimal(x) / MyDecimal(y)
 yield MyDecimal(x) / MyDecimal(y ** 2)

and it would work without leaking changes to the decimal
context, despite the fact that it doesn't use a context
manager or do anything else to explicitly put back the
old context.


The disagreement seems to be whether a LogicalContext should be created implicitly vs explicitly (or opt-out vs opt-in). 
 As a user trying to track down a decimal context change not propagating, I would not suspect the above code of 
automatically creating a LogicalContext and isolating the change, whereas Greg's context manager version is abundantly 
clear.


The implicit vs explicit argument comes down, I think, to resource management: some resources in Python are 
automatically managed (memory), and some are not (files) -- which type should LCs be?


--
~Ethan~



Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Ethan Furman

On 09/07/2017 04:39 AM, Greg Ewing wrote:


1) Under "Generators" it says:

once set in the generator, the context variable is guaranteed
not to change between iterations;

This suggests that you're not allowed to set() a given
context variable more than once in a given generator,
but some of the examples seem to contradict that. So I'm
not sure what this is trying to say.


I believe I can answer this part: the guarantee is that

- the context variable will not be changed while the yield is in effect -- or,
  said another way, while the generator is suspended;

- the context variable will not be changed by subgenerators

- the context variable /may/ be changed by normal functions/class methods
  (since calling them would be part of the iteration)

--
~Ethan~



Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

There are a couple of things in the PEP I'm confused about:

1) Under "Generators" it says:

   once set in the generator, the context variable is guaranteed
   not to change between iterations;

This suggests that you're not allowed to set() a given
context variable more than once in a given generator,
but some of the examples seem to contradict that. So I'm
not sure what this is trying to say.

2) I don't understand why the logical_contexts have to
be immutable. If every task or generator that wants its
own task-local storage has its own logical_context
instance, why can't it be updated in-place?

--
Greg



Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

There is one thing I misunderstood. Since generators and
coroutines are almost exactly the same underneath, I had
thought that the automatic logical_context creation for
generators was also going to apply to coroutines, but
from reading the PEP again it seems that's not the case.
Somehow I missed that the first time. Sorry about that.

So, context vars do behave like "task locals storage"
for asyncio Tasks, which is good.

The only issue is whether a generator should be considered
an "ad-hoc task" for this purpose. I can see your reasons
for thinking that it should be. I can also understand your
thinking that the yield-from issue is such an obscure
corner case that it's not worth worrying about,
especially since there is a workaround available (setting
_logical_context_ to None) if needed.

I'm not sure how I feel about that now. I agree that
it's an obscure case, but the workaround seems even more
obscure, and is unlikely to be found by anyone who
isn't closely familiar with the inner workings.

I think I'd be happier if there were a higher-level way
of applying this workaround, such as a decorator:

@subgenerator
def g():
   ...

Then the docs could say "If you want a generator to *not*
have its own task local storage, wrap it with @subgenerator."

By the way, I think "Task Local Storage" would be a
much better title for this PEP. It instantly conveys the
basic idea in a way that "Execution Context" totally
fails to do.

It might also serve as a source for some better
terminology for parts of the implementation, such as
TaskLocalStorage and TaskLocalStorageStack instead
of logical_context and execution_context. I found
the latter terms almost devoid of useful meaning when
trying to understand the implementation.

--
Greg



Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Koos Zevenhoven
On Thu, Sep 7, 2017 at 10:54 AM, Greg Ewing 
wrote:

> Yury Selivanov wrote:
>
>> def foo():
>>  var = ContextVar()
>>  var.set(1)
>>
>> for _ in range(10**6): foo()
>>
>> If 'var' is strongly referenced, we would have a bunch of them.
>>
>
> Erk. This is not how I envisaged context vars would be
> used. What I thought you would do is this:
>
>my_context_var = ContextVar()
>
>def foo():
>   my_context_var.set(1)
>
> This problem would also not arise if context vars
> simply had names instead of being magic key objects:
>
>def foo():
>   contextvars.set("mymodule.myvar", 1)
>
> That's another thing I think would be an improvement,
> but it's orthogonal to what we're talking about here
> and would be best discussed separately.
>
>
​​​There are lots of things in this discussion that I should have commented
on, but here's one related to this.

PEP 555 does not have the resource-management issue described above and
needs no additional tricks to achieve that:

# using PEP 555

def foo():
    var = contextvars.Var()
    with var.assign(1):
        ...  # do something [*]

for _ in range(10**6):
    foo()


Every time foo is called, a new context variable is created, but that's
perfectly fine, and lightweight. As soon as the context manager exits,
there are no references to the Assignment object returned by var.assign(1),
and as soon as foo() returns, there are no references to var, so everything
should get cleaned up nicely.

And regarding string keys, they have pros and cons, and they can be added
easily, so let's not go there now.

-- Koos


[*] (nit-picking) without closures that would keep the var reference alive


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Yury Selivanov wrote:

I understand what Koos is
talking about, but I really don't follow you.  Using the
"with-statements to be skipped" language is very confusing and doesn't
help to understand you.


If I understand correctly, instead of using a context
manager, your fractions example could be written like
this:

def fractions(precision, x, y):
ctx = decimal.getcontext().copy()
decimal.setcontext(ctx)
ctx.prec = precision
yield MyDecimal(x) / MyDecimal(y)
yield MyDecimal(x) / MyDecimal(y ** 2)

and it would work without leaking changes to the decimal
context, despite the fact that it doesn't use a context
manager or do anything else to explicitly put back the
old context.

Am I right about that?

This is what I mean by "skipping context managers" --
that it's possible in some situations to get by without
using a context manager, by taking advantage of the
implicit local context push that happens whenever a
generator is started up.

Now, there are two possibilities:

1) You take advantage of this, and don't use context
managers in some or all of the places where you don't
need to. You seem to agree that this would be a bad
idea.

2) You ignore it and always use a context manager, in
which case it's not strictly necessary for the implicit
context push to occur, since the relevant context managers
can take care of it.

So there doesn't seem to be any great advantage to the
automatic context push, and it has some disadvantages,
such as yield-from not quite working as expected in
some situations.

Also, it seems that every generator is going to incur
the overhead of allocating a logical_context even when
it doesn't actually change any context vars, which most
generators won't.

--
Greg


Re: [Python-Dev] Compiling without multithreading support -- still useful?

2017-09-07 Thread Antoine Pitrou

I've proposed a PEP 11 update in this PR:
https://github.com/python/peps/pull/394

Regards

Antoine.


On Tue, 5 Sep 2017 18:36:51 +0200
Antoine Pitrou  wrote:
> Hello,
> 
> It's 2017 and we are still allowing people to compile CPython without
> threads support.  It adds some complication in several places
> (including delicate parts of our internal C code) without a clear
> benefit.  Do people still need this?
> 
> Regards
> 
> Antoine.
> 
> 





Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Yury Selivanov wrote:

def foo():
 var = ContextVar()
 var.set(1)

for _ in range(10**6): foo()

If 'var' is strongly referenced, we would have a bunch of them.


Erk. This is not how I envisaged context vars would be
used. What I thought you would do is this:

   my_context_var = ContextVar()

   def foo():
  my_context_var.set(1)

This problem would also not arise if context vars
simply had names instead of being magic key objects:

   def foo():
  contextvars.set("mymodule.myvar", 1)

That's another thing I think would be an improvement,
but it's orthogonal to what we're talking about here
and would be best discussed separately.

--
Greg


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Yury Selivanov wrote:

The PEP gives you a Task Local Storage, where Task is:

1. your single-threaded code
2. a generator
3. an async task

If you correctly use context managers, PEP 550 works intuitively and
similar to how one would think that threading.local() should work.


My version works *more* similarly to thread-local storage,
IMO.

Currently, if you change the decimal context without using
a with-statement or something equivalent, you *don't*
expect the change to be confined to the current function
or sub-generator or async sub-task.

All I'm asking for is one consistent rule: If you want
a context change encapsulated, use a with-statement. If
you don't, don't.

Not only is this rule simpler than yours, it's the
*same* rule that we have now, so there is less for
users to learn.

--
Greg


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Yury Selivanov wrote:

1. So essentially this means that we will have one "local context" per
context manager storing one value.


I can't see that being a major problem. Context vars will
(I hope!) be very rare things, and needing to change a
bunch of them in one function ought to be rarer still.

But if you do, it would be easy to provide a context
manager whose sole effect is to introduce a new context:

   with new_local_context():
  cvar1.set(something)
  cvar2.set(otherthing)
  ...


2. If somebody makes a mistake and calls "push_local_context" without
a corresponding "pop_local_context"


You wouldn't normally call them directly, they would be
encapsulated in carefully-written context managers. If you
do use them, you're taking responsibility for using
them correctly.

If it would make you feel happier, they could be named
_push_local_context and _pop_local_context to emphasise
that they're not intended for everyday use.


3. Users will need to know way more to correctly use the mechanism.


Most users will simply be using already-provided context
managers, which they're *already used to doing*. So they
won't have to know anything more than they already do.

See my last decimal example, which required *no change*
to existing correct user code.


So far, both you and Koos can't give us a realistic example which
illustrates why we should suffer the implications of (1), (2), and
(3).


And you haven't given a realistic example that convinces me
your proposed with-statement-elimination feature would be of
significant benefit.

--
Greg


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Yury Selivanov
On Wed, Sep 6, 2017 at 11:55 PM, Greg Ewing  wrote:
> Guido van Rossum wrote:
>>
>> Yeah, so my claim this is simply a non-problem, and you've pretty much
>> just proved that by failing to come up with pointers to actual code that
>> would suffer from this. Clearly you're not aware of any such code.
>
>
> In response I'd ask Yuri to come up with examples of real
> code that would benefit significantly from being able to
> make context changes without wrapping them in a with
> statement.

A real-code example: make it possible to implement
decimal.setcontext() on top of PEP 550 semantics.

I still feel that there's some huge misunderstanding in the
discussion: PEP 550 does not promote "not using context managers".  It
simply implements a low-level mechanism to make it possible to
implement context managers for generators/coroutines/etc.  Whether
this API is used to write context managers or not is completely
irrelevant to the discussion.

How does threading.local() promote or demote the use of context
managers?  The answer: it doesn't.  The same answer applies to PEP 550,
which is a similar mechanism.

Yury


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Yury Selivanov wrote:

I still think that giving Python programmers one strong rule: "context
mutation is always isolated in generators" makes it easier to reason
about the EC and write maintainable code.


Whereas I think it makes code *harder* to reason about,
because to take advantage of it you need to be acutely
aware of whether the code you're working on is in a
generator/coroutine or not.

It seems simpler to me to have one rule for all kinds
of functions: If you're making a temporary change to
contextual state, always encapsulate it in a with
statement.

--
Greg


Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Terry Reedy

On 9/6/2017 6:45 PM, Barry Warsaw wrote:

On Sep 6, 2017, at 14:59, Terry Reedy  wrote:


Currently, the debugger is started in response to a menu selection in the IDLE 
process while the python process is idle.  One reason for the 'idle' 
requirement is that when code is exec-uting, the loop that reads commands, 
executes them, and sends responses is blocked on the exec call. The IDLE 
process sets up its debugger window, its ends of the rpc channels, and commands 
the python process to set up Idb and the other ends of the channels.  The 
challenge would be to initiate setup from the server process and deal with the 
blocked loop.


Would the environment variable idea in the latest version of the PEP help you 
here?


That seems to be a mechanism for users to convey which function to call. 
 It does not simplify the function call chosen.


pdb works because a) it is designed to be imported into and run in a 
user module and b) it uses the existing stdin and stdout streams to 
interact with the user.  Idb, on the other hand, is designed to be 
imported into idlelib.run, which is (almost) invisible to user code, and 
to use a special channel to the GUI window.


Having said that, I see a solution: before running user code, set things 
up as now, except do not start tracing and do not show the debugger 
window, but do set a run.idb_ready flag.   Then breakpoint_handler() can 
check the flag and start tracing either through idb or pdb.
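
A rough sketch of that dispatch; idb_ready and breakpoint_handler() are
the names used above, while the idb_trace callable is hypothetical and
not an existing idlelib API:

    import pdb

    idb_ready = False   # would be set by run.py before executing user code
    idb_trace = None    # hypothetical callable that starts tracing via Idb

    def breakpoint_handler(*args, **kwargs):
        # Candidate sys.breakpointhook() replacement for IDLE's run process.
        if idb_ready and idb_trace is not None:
            return idb_trace(*args, **kwargs)   # debug through IDLE's Idb
        return pdb.set_trace()                  # otherwise fall back to pdb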


What makes breakpoint() less than super useful in IDLE is that IDLE 
already allows setting of persistent breakpoints on a line by 
right-clicking.  Since they are not stored in the file, there is nothing 
to erase when one runs the code in Python directly.



--
Terry Jan Reedy



Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Yury Selivanov
On Wed, Sep 6, 2017 at 11:39 PM, Greg Ewing  wrote:
> Yury Selivanov wrote:
>>
>> It would be great if you or Greg could show a couple of real-world
>> examples showing the "issue" (with the current PEP 550
>> APIs/semantics).
>
>
> Here's one way that refactoring could trip you up.
> Start with this:
>
>    async def foo():
>        calculate_something()
>        # in a coroutine, so we can be lazy and not use a cm

Where exactly does PEP 550 encourage users to be "lazy and not use a
cm"?  PEP 550 provides a mechanism for implementing context managers!
What is this example supposed to show?

>        ctx = decimal.getcontext().copy()
>        ctx.prec = 5
>        decimal.setcontext(ctx)
>        calculate_something_else()
>
> And factor part of it out (into an *ordinary* function!)
>
>    async def foo():
>        calculate_something()
>        calculate_something_else_with_5_digits()
>
>    def calculate_something_else_with_5_digits():
>        ctx = decimal.getcontext().copy()
>        ctx.prec = 5
>        decimal.setcontext(ctx)
>        calculate_something_else()
>
> Now we add some more calculation to the end of foo():
>
>    async def foo():
>        calculate_something()
>        calculate_something_else_with_5_digits()
>        calculate_more_stuff()
>
> Here we didn't intend calculate_more_stuff() to be done
> with prec=5, but we forgot that calculate_something_else_
> with_5_digits() changes the precision and *doesn't restore
> it* because we didn't add a context manager to it.
>
> If we hadn't been lazy and had used a context manager in the
> first place, that wouldn't have happened.

How is PEP 550 at fault for somebody being lazy and not using a
context manager?

PEP 550 has a hard requirement to make it possible for decimal/other
libraries to start using its APIs and stay backwards compatible, so it
allows the `decimal.setcontext(ctx)` function to be implemented.  We are
fixing things here.

When you are designing a new library/API, you can use CMs and only
CMs.  It's up to you as a library author; PEP 550 does not limit you.

And when you use CMs, there's no "problems" with 'yield from' or
anything in PEP 550.

>
> Summary: I think that skipping context managers in some
> circumstances is a bad habit that shouldn't be encouraged.

PEP 550 does not encourage coding without context managers.  It does,
in fact, solve the problem of reliably storing context to make writing
context managers possible.

To reiterate: it provides a mechanism to set a variable within the
current logical thread, for example to store the current request in an
async HTTP handler or to implement `decimal.setcontext`.  But you are
free to use it only to implement context managers in your library.

Yury


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Guido van Rossum wrote:
Yeah, so my claim this is simply a non-problem, and you've pretty much 
just proved that by failing to come up with pointers to actual code that 
would suffer from this. Clearly you're not aware of any such code.


In response I'd ask Yuri to come up with examples of real
code that would benefit significantly from being able to
make context changes without wrapping them in a with
statement.

--
Greg


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Yury Selivanov
On Wed, Sep 6, 2017 at 11:26 PM, Greg Ewing  wrote:
> Guido van Rossum wrote:
>>
>> This feels like a very abstract argument. I have a feeling that context
>> state propagating out of a call is used relatively rarely -- it  must work
>> for cases where you refactor something that changes context inline into a
>> utility function (e.g. decimal.setcontext()), but I just can't think of a
>> realistic example where coroutines (either of the yield-from variety or of
>> the async/def form) would be used for such a utility function.
>
>
> Yuri has already found one himself, the __aenter__ and __aexit__
> methods of an async context manager.

__aenter__ is not a generator and there's no 'yield from' there.
Coroutines (within an async task) leak state just like regular
functions (within a thread).

Your argument is to allow generators to leak context changes (right?).
AFAIK we don't use generators to implement __enter__ or __aenter__
(generators decorated with @types.coroutine or @asyncio.coroutine are
coroutines, according to PEP 492).  So this is irrelevant.

>
>> A utility function that sets context state but also makes a network call
>> just sounds like asking for trouble!
>
>
> I'm coming from the other direction. It seems to me that it's
> not very useful to allow with-statements to be skipped in
> certain very restricted circumstances.

Can you clarify what you mean by "with-statements to be skipped"?
This language is not used in PEP 550 or in the Python documentation.  I
honestly don't understand what it means.

>
> The only situation in which you will be able to take advantage
> of this is if the context change is being made in a generator
> or coroutine, and it is to apply to the whole body of that
> generator or coroutine.
>
> If you're in an ordinary function, you'll still have to use
> a context manager. If you only want the change to apply to
> part of the body, you'll still have to use a context manager.
>
> It would be simpler to just tell people to always use a
> context manager, wouldn't it?

Yes, PEP 550 wants people to always use context managers!  They will
work as you expect them to for coroutines, generators, and regular
functions.  At this point I suspect you have a wrong idea about some
specification detail of PEP 550.  I understand what Koos is talking
about, but I really don't follow you.  The "with-statements to be
skipped" language is very confusing and doesn't help me understand you.

Yury


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Yury Selivanov wrote:

It would be great if you or Greg could show a couple of real-world
examples showing the "issue" (with the current PEP 550
APIs/semantics).


Here's one way that refactoring could trip you up.
Start with this:

    async def foo():
        calculate_something()
        # in a coroutine, so we can be lazy and not use a cm
        ctx = decimal.getcontext().copy()
        ctx.prec = 5
        decimal.setcontext(ctx)
        calculate_something_else()

And factor part of it out (into an *ordinary* function!)

    async def foo():
        calculate_something()
        calculate_something_else_with_5_digits()

    def calculate_something_else_with_5_digits():
        ctx = decimal.getcontext().copy()
        ctx.prec = 5
        decimal.setcontext(ctx)
        calculate_something_else()

Now we add some more calculation to the end of foo():

    async def foo():
        calculate_something()
        calculate_something_else_with_5_digits()
        calculate_more_stuff()

Here we didn't intend calculate_more_stuff() to be done
with prec=5, but we forgot that calculate_something_else_
with_5_digits() changes the precision and *doesn't restore
it* because we didn't add a context manager to it.

If we hadn't been lazy and had used a context manager in the
first place, that wouldn't have happened.
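
For completeness, the non-lazy version of the helper using the context
manager that decimal already provides; the body of
calculate_something_else() below is a stand-in just so the snippet runs:

    import decimal

    def calculate_something_else():
        # stand-in body; the original example leaves this abstract
        return decimal.Decimal(1) / decimal.Decimal(7)

    def calculate_something_else_with_5_digits():
        with decimal.localcontext() as ctx:
            ctx.prec = 5                       # applies only inside the with
            return calculate_something_else()

    print(calculate_something_else_with_5_digits())   # Decimal('0.14286')
    print(decimal.getcontext().prec)                  # caller's precision untouched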

Summary: I think that skipping context managers in some
circumstances is a bad habit that shouldn't be encouraged.

--
Greg


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Greg Ewing

Guido van Rossum wrote:
This feels like a very abstract argument. I have a feeling that context 
state propagating out of a call is used relatively rarely -- it  must 
work for cases where you refactor something that changes context inline 
into a utility function (e.g. decimal.setcontext()), but I just can't 
think of a realistic example where coroutines (either of the yield-from 
variety or of the async/def form) would be used for such a utility 
function.


Yuri has already found one himself, the __aenter__ and __aexit__
methods of an async context manager.

A utility function that sets context state but also makes a 
network call just sounds like asking for trouble!


I'm coming from the other direction. It seems to me that it's
not very useful to allow with-statements to be skipped in
certain very restricted circumstances.

The only situation in which you will be able to take advantage
of this is if the context change is being made in a generator
or coroutine, and it is to apply to the whole body of that
generator or coroutine.

If you're in an ordinary function, you'll still have to use
a context manager. If you only want the change to apply to
part of the body, you'll still have to use a context manager.

It would be simpler to just tell people to always use a
context manager, wouldn't it?

--
Greg



Re: [Python-Dev] PEP 553: Built-in debug()

2017-09-07 Thread Terry Reedy

On 9/6/2017 6:24 PM, Barry Warsaw wrote:

On Sep 6, 2017, at 10:14, Fabio Zadrozny  wrote:


I think it's a nice idea.


Great!


Related to the name, on the windows c++ there's "DebugBreak":  
https://msdn.microsoft.com/en-us/library/windows/desktop/ms679297(v=vs.85).aspx, which I 
think is a better name (so, it'd be debug_break for Python -- I think it's better than 
plain breakpoint(), and wouldn't clash as debug()).


It’s important to understand that a new built-in is technically never going to 
clash with existing code, regardless of what it’s called.  Given Python’s name 
resolution rules, if your code uses any built-in’s name, it’ll just shadow it.  That’s 
one big reason why the PEP proposed a built-in rather than, say, a keyword.
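
For instance, purely to illustrate the shadowing point with an existing
built-in:

    def len(obj):              # a module-level name shadows the built-in
        return 42

    print(len("abc"))          # 42: the built-in is shadowed, nothing breaks
    del len                    # remove the shadow
    print(len("abc"))          # 3: the built-in is visible again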

That said, while I like the more succinct `debug()` name, Guido prefers 
`breakpoint()` and that works fine for me.


I think I could change the hook on a custom sitecustomize (there's already one 
in place in PyDev) so that the debug_break() would actually read some env var 
to do that work (and provide some utility for users to pre-setup it when not 
launching from inside the IDE).


I had meant to add an open issue about the idea of adding an environment 
variable, such as $PYTHONBREAKPOINTHOOK which could be set to the callable to 
bind to sys.breakpointhook().


Environment variables are set to strings, not objects.  It is not 
clear how you intend to handle the conversion.
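
One plausible way to handle it, purely as an illustration, is to treat
the value as a 'module:callable' string and import it lazily; the actual
format and variable name would be up to the PEP:

    import importlib
    import os
    import pdb

    def hook_from_env(varname='PYTHONBREAKPOINTHOOK'):
        # Resolve e.g. 'mypkg.debug:set_trace' to a callable; illustrative
        # format, not something the PEP currently specifies.
        value = os.environ.get(varname)
        if not value:
            return pdb.set_trace           # default hook
        modname, _, attr = value.partition(':')
        module = importlib.import_module(modname)
        return getattr(module, attr)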


--
Terry Jan Reedy

