[Python-Dev] So many PEPs

2021-12-07 Thread Koos Zevenhoven
Steering Council, g'day, Hawaii?

("copying" BDFL-emeritus, everyone)

I'm sorry you haven't heard from me in a while. Actually, neither have any
of my other "girl friends" heard from me. I was worried that they might be
sad. Then I found out that one of them was at some kind of party at a club
or something, and got in trouble because she couldn't be reached. I learned
from the newspaper and other media that she indeed was really really sad,
because of this reason.

Yesterday was supposed to be a big day for her, as the president was
throwing a party. But something was missing – because of COVID-19, I
believe. So various kinds of things have been going on, about which I
wasn't sure if I could tell you ;)

Anyway, I really needed to wrap something for her yesterday, but another
day has passed. And that is because of *me*.

In Finland, we have this thing "me", but it means "we". Then we also have
"hän", which means "he"/"she" etc. But using "hän" may feel needlessly
formal, so we often instead use "se", which means "it". And that's that.

Lots of these PEPs coming in from everywhere, and it's been going on for
quite some time now!
>>> it = iter(filter(f2, zip(filter(f1, peps), fun(other_peps  # ouch!

I had a laptop with me the other day when going for a swim and somehow the
battery had magically drained while I was swimming, although I thought I
didn't swim that long. So I think my mind (or "taka-raivo", meaning back of
the head; or "back rage"; or perhaps "muscle memory") is playing tricks on
me. Or it could be that I just forgot to charge it. Ok, that's a strange
story. There was another story, which is even stranger, but also quite
embarrassing. So I told you this boring one instead. Oh, and someone
mentioned a frog or prog or something and it seemed it was talking about
me, and someTHING reminded me of krog. Maybe I should have rejected PEPs
like this.

Anyway, today I was reminded of an actual *girl friend* from quite some
time ago, whose early PEP I didn't reject. We once went to India. The food
was good but she thought it was hot ;)

Oh, and there's a girl FRIEND of mine from school. She would call me Koppi,
meaning "knock-π", or "election booth" or something. At some point in
school we somehow ended up in a conversation about God. I had my own
scientific ideas about it, which was very strange in the context so I told
her a simplified version. But still, we keep rejecting each other's PEPs ;)

Hey wait, I'm talking about girls all the time? Ok, music maybe?

I had a two-meter high stack of recyclable paper from over the years. The
stack has fallen over and broken my shoe rack, and now I noticed that, for
some reason, the score of an often requested song was right there:

"I love coffee, I love tea, I love the Java Jive and it loves me",
dismissing the cover solo in the key of D.

(memories)
[.]
(saving the discussions)

This reminds me of someone saying "I hope `it` is not very long". In a
different context some years back, I tried to make someone longer, though
obeying the laws of physics. It was so interesting that even the police got
involved :D. However, there was no significant change in length. Just as
already predicted from special relativity.

There was a PEP about "early binding". That's a bit tricky, but seems to be
big in 日本. It's possible that I have taken an early course on such things.
I have a friend that could be into such things. Anyway, I'm sure there are
special interest groups for "binding" ;)

Recently, we drove with my parents to their summer place, which is not that
far. On the way there, he told me mostly jokes, but was also looking at the
difference between Merkel and Scholz (puntended in here!)

PEPs keep coming in from different places. Even coffee machines are playing
tricks on me.

I can't just reject all the PEPs, can I? So I'm in trouble here! (and I
originally wrote this as one of the first sentences in this email)

I don't even know how many PEPs there are, and whether I yet need yet
another one yet. And whether that's not NaN in some language.

Getting more and more random, and pretty long.

[]

Gooder like this? Yes me can! Let's C!
Started moving my feet.
TAU, I just had to dance a bit or two.

Tread safe.

Santa b*ches, shoe size 45!
Need food, wash my clothes.

Kiitos ja anteeksi.

BTW, tosi hieno E! Ah, yes.


Kind of wishing I could just reject all PEPs at once.

But instead I'm still in big trouble! Eeeek!

Help! Please help! Please, please help me deal with the PEPs!

In the name of loves and doves,
Signed, "sealed", am delivered,
Yours sincerely,
Koos

PS. I'm getting tired, but you are still welcome to rejoice Finland's 104
years of independence (itsenäisyys, technically meaning "being as yourself")

PPS. Currently struggling a bit with it myself; I blame the fact that these
were mostly non-standard PEPs ;)
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to 

[Python-Dev] Re: Questions about the DLS 2020

2020-11-26 Thread Koos Zevenhoven
I've had some things going on, and I'm still trying to catch up with the
discussions here. Can someone tell me what would be the best place to look
at the most recent proposal? Is one of the PEPs up to date?

On Mon, Nov 16, 2020 at 7:02 PM Tobias Kohn  wrote:

> Hi Mark,
>
> Thank you for your interest and the questions.
>
>
> 1.  This really comes down to how you look at it, or how you define
> pattern matching.  The issue here is that the concept of pattern matching
> has grown into a large and somewhat diverse flock of interpretations and
> implementations (as a side note: interestingly enough, some of the only
> universally agreed-upon standards are to use `_` as a wildcard and not to
> mark names that capture/bind values---which are quite exactly the points
> most fiercely debated here).
>
> Anyway, the paper presents the pattern matching structure we are proposing
> as one of three major variants of pattern matching:
> (a)  Matching arguments to parameters in a function call,
> (b)  Matching elements to elements in iterable unpacking,
> (c)  Matching tree-like data to general patterns in a conditional pattern
> matching structure.
>
> The last one is the subject of the PEP and the paper.  Nonetheless, in the
> first two cases (a) and (b), we find that indeed the computer will validate
> that the data matched the pattern and raise an exception if this fails.
> This is where this way of looking at it comes from.
>
>
> 2.  Yes, that is indeed a deliberate simplification.  The idea is to
> abstract away from the details of how exactly Python implements abstract
> syntax trees (which I honestly believe are irrelevant for the sake of the
> entire narrative).  Moreover, using strings here allows us to exemplify the
> literal patterns, rather than showcasing only the constructor/class pattern.
>
> Essentially, this is a question of making the most out of the little space
> available.
>
>
> Since you have addressed this email to me directly, I would like to take
> this opportunity and briefly stress that this paper really grew out of a
> team effort.  While I might have been the one pushing for an academic
> publication, the DLS'20 paper represents the input and ideas of all the
> authors, as well as the long discussions we had.  Of course, I am happy to
> answer any questions about the paper, but it would be wrong to see me as
> the one person behind it.
>
> Cheers,
> Tobias
>
>
>
> Quoting Mark Shannon :
>
> Hi Tobias,
>
> A couple of questions about the DLS 2020 paper.
>
> 1. Why do you use the term "validate" rather than "test" for the process
> of selecting a match?
>
> It seems to me, that this is a test, not a validation, as no exception is
> raised if a case doesn't match.
>
>
> 2. Is the error in the ast matching example, an intentional
> "simplification" or just an oversight?
>
> The example:
>
> ```
> def simplify(node):
>     match node:
>         case BinOp(Num(left), '+', Num(right)):
>             return Num(left + right)
>         case BinOp(left, '+' | '-', Num(0)):
>             return simplify(left)
>         case UnaryOp('-', UnaryOp('-', item)):
>             return simplify(item)
>         case _:
>             return node
> ```
>
> is wrong.
>
> The correct version is
>
> ```
> def simplify(node):
>     match node:
>         case BinOp(Num(left), Add(), Num(right)):
>             return Num(left + right)
>         case BinOp(left, Add() | Sub(), Num(0)):
>             return simplify(left)
>         case UnaryOp(USub(), UnaryOp(USub(), item)):
>             return simplify(item)
>         case _:
>             return node
> ```
>
> Cheers,
> Mark.


[Python-Dev] Pattern matching (alternative to PEP 622)

2020-08-04 Thread Koos Zevenhoven
Hi everyone,

In what I write below, I've aimed at a design that meets needs resembling
those that PEP 622 deals with, although with some differences, especially
in emphasis.

I'm not writing a full introduction here; the intended audience of this
email is people somewhat familiar with PEP 622 and its discussions. This
email doesn't have much structure or anything, but I hope it's sufficiently
clear.

Things that this design aims to address:

* Check whether an object's structure matches a given pattern
* Optionally extract desired values from within that structure
* Learning: patterns are understandable in a logical and consistent manner
* Matching can be easily tinkered with interactively
* Names used in patterns behave in a clear and intuitive way for Python
programmers
* ...

The most difficult thing is perhaps to understand what names mean: what is
being bound to, and what is a value defined elsewhere and so on. For
example, if a (somewhat unrealistic) pattern looks like

Point3D(x=3.14, y=6, z=_)

, there are four names that refer to something: `Point3D`, `x`, `y` and
`z`. To understand this, it is useful to think of the pattern as
corresponding to an expression, although it is not treated as quite the
same in the end. (So, here x, y, z refer to internals/arguments of Point3D)

The situation becomes more difficult when the values to compare with are in
variables, and/or if one wishes to extract a value from the structure.

Point3D(x=pi, y=SIX, z=value)

Now there is no way to tell, from this, which names refer to existing
objects and which should be bound to by the operation, except by guessing.
Here, `value` is supposed to be a binding target.

Python already has destructuring assignment (`a, b, *rest = values`), which
is similar to what happens in function calls: `def func(a, b, *rest): ...`.
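For concreteness, both sides of that parallel can be run directly in today's Python (no new syntax involved):

```python
# Destructuring assignment: names on the left receive pieces of the data.
a, b, *rest = [1, 2, 3, 4]
assert (a, b, rest) == (1, 2, [3, 4])

# The same kind of binding happens when arguments are matched to parameters.
def func(a, b, *rest):
    return a, b, list(rest)

assert func(1, 2, 3, 4) == (1, 2, [3, 4])
```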

However, it is more useful to see patterns as working backwards compared
to this. For example, the semantics of the names in

lambda value: Point3D(x=pi, y=SIX, z=value)

are exactly as desired. That is, when the pattern matching is interpreted
as "does this object look like what this function would produce, and if so,
what would be the arguments of the function in that case?". Matching the
pattern would bind to `value`.

So, with this, a previous understanding of function definitions already
gives you a mental model for how names work in patterns *and* for what the
pattern is supposed to do.

In theory, the lambda expression could BE the syntax for a pattern.
However, many have wished for a different syntax even for lambdas. A
slightly nicer form would be to omit the keyword `lambda`, but then one
would still have to repeat the names to be bound. To avoid that, we need a
way to explicitly mark "not-yet-bound names":

Point3D(x=pi, y=SIX, z=value?)

Before going any further, how would one invoke the matching machinery? It
could be

    <expression> matches <pattern>

and that would evaluate to a boolean-like value.

With this syntax, the `is_tuple` example from PEP 622 would look something
like this:

def is_tuple(node: Node) -> bool:
    if node matches Node(children=[LParen(), RParen()]):
        return True
    elif node matches Node(children=[Leaf(value="("), Node(), Leaf(value=")")]):
        return True
    return False

Or something like:

def is_tuple(node: Node) -> bool:
    if node matches tuple_pattern:
        return True
    return False

Here, `tuple_pattern` would be a pattern pre-defined with a function-like
syntax. Also, if there is a `tuple_pattern`, then `is_tuple` is probably
not needed at all.

Note that in PEP 622, is_tuple uses a match statement with three cases. So,
in effect, the full tuple_pattern had been split into subpatterns. This was
only possible because it was an OR pattern. In general, splitting a longer
pattern into cases like that is not possible. Longer function-like
patterns, on the other hand, can be expressed using more pattern functions
– just like regular functions can use helper functions.

Here, tuple_pattern isn't passed any arguments. It also doesn't have any
parameters. However, if it did, those would be considered wildcards, as I
believe we'd want both the programmer and the compiler/optimizers to
explicitly see, from the match expression, which names will/would be bound.
If something other than wildcard behavior is desired, that should be
explicitly specified in a "pattern function call".

Let's take another example pattern:

pdef main_diagonal_point(pos):
    return Point3D(pos, pos, pos)

Now, `main_diagonal_point(0)` would refer to the origin, and `point matches
main_diagonal_point` would be true for every point with `x == y == z`.
Similarly, also `point matches main_diagonal_point(pos?)` should only match
if  `x == y == z` — and then bind that value to `pos`. However, one might
expect to be able to write the same thing inline as

point matches Point3D(pos?, pos?, pos?)

, so based on that, multiple occurrences of the same binding target should
ensure 

[Python-Dev] Re: PEP 622 aspects

2020-07-19 Thread Koos Zevenhoven
On Sun, Jul 19, 2020 at 3:00 PM Tobias Kohn  wrote:

> Quoting Koos Zevenhoven :
>
> > (1) Class pattern that does isinstance and nothing else.
> >
> > If I understand the proposed semantics correctly, `Class()` is
> equivalent to checking `isinstance(obj, Class)`, also when `__match_args__`
> is not present. However, if a future match protocol is allowed to override
> this behavior to mean something else, for example `Class() == obj`, then
> the plain isinstance checks won't work anymore! I do find `Class() == obj`
> to be a more intuitive and consistent meaning for `Class()` than plain
> `isinstance` is.
> >
> > Instead, the plain isinstance check would seem to be well described by a
> pattern like `Class(...)`. This would allow isinstance checks for any
> class, and there is even a workaround if you really want to refer to the
> Ellipsis object. This is also related to the following point.
> >
> > (2) The meaning of e.g. `Class(x=1, y=_)` versus `Class(x=1)`
> >
> > In the proposed semantics, cases like this are equivalent. I can see why
> that is desirable in many cases, although `Class(x=1, ...)` would make it
> more clear. A possible improvement might be to add an optional element to
> `__match_args__` that separates optional arguments from required ones
> (although "optional" is not the same as "don't care").
>
>
> Please let me answer these two questions in reverse order, as I think it
> makes more sense to tackle the second one first.
>
Possibly. Although I do find (1) a more serious issue than (2). To not have
isinstance available by default in a consistent manner would definitely be
a problem in my opinion. But the way I proposed to solve (1) may affect the
user interpretations of (2).

> ***2. Attributes***
>
> There actually is an important difference between `Class(x=1, y=_)` and `
> Class(x=1)` and it won't do to just write `Class(x=1,...)` instead.  The
> form `Class(x=1, y=_)` ensures that the object has an attribute `y`.  In
> a way, this is where the "duck typing" is coming in.
>
Ok, that is indeed how the current class pattern match algorithm works
according to the current PEP 622. Let me rephrase the title of problem (2)
slightly to accommodate this:

"(2) The meaning of e.g. `Class(x=1, y=_)` versus `Class(x=1)` (when the
object has attributes x, y and "x", "y" are in __match_args__)"

> The class of an object and its actual shape (i.e. the set of attributes it
> has) are rather loosely coupled in Python: there is usually nothing in the
> class itself that specifies what attributes an object has (other than the
> good sense to add these attributes in `__init__`).
>
Usually, it is bad practice to define classes whose interface is not or
cannot be specified. Python does, however, even allow hacks like tacking
an extra attribute onto an object when it doesn't really "belong" there.

> Conceptually, it therefore makes sense to not only support `isinstance`
> but also `hasattr`/`getattr` as a means to specify the shape/structure of
> an object.
>
Here we agree (although not necessarily regarding "therefore").

> Let me give a very simple example from Python's `AST` module.  We know
> that compound statements have a field `body` (for the suite) and possibly
> even a field `orelse` (for the `else` part).  But there is no common
> superclass for compound statements.  Hence, although it is shared by
> several objects, you cannot detect this structure through `isinstance`
> alone.  By allowing you to explicitly specify attributes in patterns, you
> can still use pattern matching notwithstanding:
> ```
> match node:
>     case ast.stmt(body=suite, orelse=else_suite) if else_suite:
>         # a statement with a non-empty else-part
>         ...
>     case ast.stmt(body=suite):
>         # a compound statement without else-part
>         ...
>     case ast.stmt():
>         # a simple statement
>         ...
> ```
>
So this is an example of a combination of duck-typing and a class type. I
agree it's good to be able to have this type of matching available. I can
only imagine the thought process that led you to bring up this example, but
I feel that we got stuck on whether an attribute is present or not, which
is a side track regarding the issues I pointed out.

Python can be written in many ways, but I'm not sure that the above example
is representative of how duck typing usually works. I see a lot more
situations where you either care about isinstance or about some duck typing
pattern – usually not both.

> The very basic form of class patterns could be described as `C(a_1=P_1,
> a_2=P_2, ...)`, where `C` is a class to be checked through `isinstance`,
> and the `a_

[Python-Dev] PEP 622 aspects

2020-07-18 Thread Koos Zevenhoven
PEP 622 authors,

Overall, the PEP describes the proposal quite nicely. However, I do indeed
have concerns and questions, some of which I describe in this email.

(1) Class pattern that does isinstance and nothing else.

If I understand the proposed semantics correctly, `Class()` is equivalent
to checking `isinstance(obj, Class)`, also when `__match_args__` is not
present. However, if a future match protocol is allowed to override this
behavior to mean something else, for example `Class() == obj`, then the
plain isinstance checks won't work anymore! I do find `Class() == obj` to
be a more intuitive and consistent meaning for `Class()` than plain
`isinstance` is.

Instead, the plain isinstance check would seem to be well described by a
pattern like `Class(...)`. This would allow isinstance checks for any
class, and there is even a workaround if you really want to refer to the
Ellipsis object. This is also related to the following point.

(2) The meaning of e.g. `Class(x=1, y=_)` versus `Class(x=1)`

In the proposed semantics, cases like this are equivalent. I can see why
that is desirable in many cases, although `Class(x=1, ...)` would make it
more clear. A possible improvement might be to add an optional element to
`__match_args__` that separates optional arguments from required ones
(although "optional" is not the same as "don't care").

(3) Check for exhaustiveness at runtime

The PEP states:

Check exhaustiveness at runtime
> The question is what to do if no case clause has a matching pattern, and
> there is no default case. An earlier version of the proposal specified that
> the behavior in this case would be to throw an exception rather than
> silently falling through.
> The arguments back and forth were many, but in the end the EIBTI (Explicit
> Is Better Than Implicit) argument won out: it's better to have the
> programmer explicitly throw an exception if that is the behavior they want.
> For cases such as sealed classes and enums, where the patterns are all
> known to be members of a discrete set, static checkers can warn about
> missing patterns.


I don't understand this argument. Would it not be more explicit to have an
`else` or `case _` branch to say what should happen in that case?

(4) Check for exhaustiveness by static checkers

About this, the PEP states:

> From a reliability perspective, experience shows that missing a case when
> dealing with a set of possible data values leads to hard to debug issues,
> thus forcing people to add safety asserts like this:
> def get_first(data: Union[int, list[int]]) -> int:
>     if isinstance(data, list) and data:
>         return data[0]
>     elif isinstance(data, int):
>         return data
>     else:
>         assert False, "should never get here"
> PEP 484 specifies that static type checkers should support exhaustiveness
> in conditional checks with respect to enum values. PEP 586 later
> generalized this requirement to literal types.
> This PEP further generalizes this requirement to arbitrary patterns.


This seems reasonable. However, why is the standard for static and runtime
different? The corresponding runtime check is extremely easy and efficient
to do, so if this is an error according to static analysis, why not make it
an error at runtime too?

—Koos


[Python-Dev] Re: PEP 622 version 2 (Structural Pattern Matching)

2020-07-18 Thread Koos Zevenhoven
On Sat, Jul 18, 2020 at 3:46 AM Terry Reedy  wrote:

>
> A major point of Kohn's post is that 'case' is analogous to 'def' and
> match lists are analogous to parameter lists.  In parameter lists,
> untagged simple names ('parameter names') are binding targets.
> Therefore, untagged simple names in match lists, let us call them 'match
> names', should be also.  I elaborated on this in my response to Tobias.
>
>
There are indeed analogous aspects, although not in the most
straightforward/obvious ways. Still, perhaps even more so than there is
analogy with assignment targets.

This is related to one of my concerns regarding PEP 622. It may be tempting
to see pattern matching as a form of assignment. However, that is quite a
stretch, both conceptually and as a future direction. There is no way these
'match expressions' could be allowed in regular assignments – the way names
are treated just needs to be different. And allowing them in walrus
assignments doesn't make much sense either.

Conceptually, it is strange to call this match operation an assignment.
Most of the added power comes from checking that the object has a certain
structure or contents – and in many cases, that is the only thing it does!
As a (not always) handy side product, it is also able to assign things to
specified targets. Even then, the whole pattern is not assigned to, only
parts of it are.

In mathematics, assignment (definition) and re-assignment are often denoted
with the same sign as equality/identity, because it is usually clear from
the context which one is meant. Usually, however, it matters which one is
in question. Therefore, as we well know, we have = for assignment, == for
equality, and := to emphasize assignment. Matching is closer to ==, or
almost :==.

So, in many ways, it is the assignment that is special, not the matching. It
is also the main thing that differentiates this from the traditional
switch–case construct, which the proposed syntax certainly resembles.

—Koos


Re: [Python-Dev] Intention to accept PEP 567 (Context Variables)

2018-01-23 Thread Koos Zevenhoven
On Tue, Jan 23, 2018 at 2:23 AM, Victor Stinner 
wrote:

> PEP 555 looks like a competitor PEP to PEP 567. Since Yury's
> PEP 567 was approved, I understand that Koos's PEP 555 should be
> rejected, no?
>
>
If Guido prefers to reject it, I assume he'll say so. Anyway, it's still
waiting for me to add references to earlier discussions and perhaps
summaries of some discussions.

Personally, I need to find some time to properly catch up with the latest
discussion to figure out why PEP 567 is better than PEP 555 (or similar
with .set(..), or PEP 550), despite problems of reasoning about the scopes
of variables and unset tokens.

In any case, congrats, Yury! This hasn't been an easy one for any of us,
and it seems like the implementation required quite a beastly patch too in
the end.

—Koos
​​


Re: [Python-Dev] PEP 567 v3

2018-01-21 Thread Koos Zevenhoven
On Thu, Jan 18, 2018 at 3:53 AM, Yury Selivanov <yselivanov...@gmail.com>
wrote:

​[]


>
> Given the time frame of the Python 3.7 release schedule it was decided
> to defer this proposal to Python 3.8.
>

​It occurs to me that I had misread this to refer to the whole PEP.
Although I thought it's kind of sad that after all this, contextvars still
would not make it into 3.7, I also thought that it might be the right
decision. As you may already know, I think there are several problems with
this PEP. Would it be worth it to write down some thoughts on this PEP in
the morning?

-- Koos​



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Thoughts on "contexts". PEPs 550, 555, 567, 568

2018-01-14 Thread Koos Zevenhoven
I'll quickly add a few things below just in case there's anyone that cares.

On Wed, Jan 10, 2018 at 2:06 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

>
> The idea was to always explicitly define the scope of contextvar values. A
> context manager / with statement determined the scope of .set(..)
> operations inside the with statement:
>
> # Version A:
> cvar.set(1)
> with context_scope():
>     cvar.set(2)
>     assert cvar.get() == 2
>
> assert cvar.get() == 1
>
> Then I added the ability to define scopes for different variables
> separately:
>
> # Version B
> cvar1.set(1)
> cvar2.set(2)
> with context_scope(cvar1):
>     cvar1.set(11)
>     cvar2.set(22)
>
> assert cvar1.get() == 1
> assert cvar2.get() == 22
>
>
> However, in practice, most libraries would wrap __enter__, set and
> __exit__ into another context manager. So maybe one might want to allow
> something like
>
> # Version C:
> assert cvar.get() == something
> with context_scope(cvar, 2):
>     assert cvar.get() == 2
>
> assert cvar.get() == something
>
>
Note here that the point is to get a natural way to "undo" changes made to
variables when exiting the scope. Undoing everything that is done within
the defined scope is a very natural way to do it. Undoing individual
.set(..) operations is more problematic.
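For contrast, the token-based approach that PEP 567 eventually shipped in Python 3.7 undoes exactly one individual set() call:

```python
import contextvars

cvar = contextvars.ContextVar("cvar", default=1)

token = cvar.set(2)    # each set() returns a Token
assert cvar.get() == 2

cvar.reset(token)      # undoes that one set() call
assert cvar.get() == 1
```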

Features B+C could be essentially implemented as described in PEP 555,
except with context_scope(cvar) being essentially the same as pushing and
popping an empty Assignment object onto the reverse-linked stack. By empty,
I mean a "key-value pair with a missing value". Then any set operations
would replace the topmost assignment object for that variable with a new
key-value pair (or push a new Assignment if there isn't one).

However, to also get feature A, the stack may have to contain full
mappings instead of assignment objects with just one key-value pair.

I hope that clarifies some parts. Otherwise, in terms of semantics, the
same things apply as for PEP 555 when it comes to generator function calls
and next(..) etc., so we'd need to make sure it works well enough for all
use cases. For instance, I'm not quite sure if I have a good enough
understanding of the timeout example that Nathaniel wrote in the PEP 550
discussion to tell what would be required in terms of semantics, but I
suppose it should be fine.

-- Koos


> But this then led to combining "__enter__" and ".set(..)" into
> Assignment.__enter__ -- and "__exit__" into Assignment.__exit__ like this:
>
> # PEP 555 draft version:
> assert cvar.value == something
> with cvar.assign(1):
>     assert cvar.value == 1
>
> assert cvar.value == something
>
>
> Anyway, given the schedule, I'm not really sure about the best thing to do
> here. In principle, something like in versions A, B and C above could be
> done (I hope the proposal was roughly self-explanatory based on earlier
> discussions). However, at this point, I'd probably need a lot of help to
> make that happen for 3.7.
>
> -- Koos
>
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Thoughts on "contexts". PEPs 550, 555, 567, 568

2018-01-14 Thread Koos Zevenhoven
The timing of all of this is unfortunate. I'm sorry that my participation
in the discussion has been a bit "on-off" lately. But my recent
contributions have involved studying things like the interaction of
threading/concurrency aspects of signal handling, as well as investigating
subtleties of various proposals for context variables, including my own.
Those are not exactly low-hanging fruit, and I'm sorry about not being able
to eat them.

It is also unfortunate that I haven't written down this proposal
("versions" A-C) with anywhere near the precision that I did for
PEP 555, which wasn't 100% specified in the first draft either. For
consideration, I just thought it's better to at least mention it, so that
those that now have a good understanding of the issues involved could
perhaps understand it. I can add more detail, but to make it a full
proposal now, I would probably need to join forces with a coauthor (with a
good understanding of these issues) to figure out missing parts. I could
tune in later to finish the PEP and write docs in case the approach gets
implemented.

-- Koos


On Wed, Jan 10, 2018 at 7:17 PM, Guido van Rossum <gu...@python.org> wrote:

> I'm sorry, Koos, but based on your past contributions I am not interested
> in discussing this topic with you.
>
> On Wed, Jan 10, 2018 at 8:58 AM, Koos Zevenhoven <k7ho...@gmail.com>
> wrote:
>
>> The status of PEP 555 is just a side track. Here, I took a step back
>> compared to what went into PEP 555.
>>
>> —Koos
>>
>>
>> On Wed, Jan 10, 2018 at 6:21 PM, Guido van Rossum <gu...@python.org>
>> wrote:
>>
>>> The current status of PEP 555 is "Withdrawn". I have no interest in
>>> considering it any more, so if you'd rather see a decision from me I'll be
>>> happy to change it to "Rejected".
>>>
>>> On Tue, Jan 9, 2018 at 10:29 PM, Koos Zevenhoven <k7ho...@gmail.com>
>>> wrote:
>>>
>>>> On Jan 10, 2018 07:17, "Yury Selivanov" <yselivanov...@gmail.com>
>>>> wrote:
>>>>
>>>> Wasn't PEP 555 rejected by Guido? What's the point of this post?
>>>>
>>>>
>>>> I sure hope there is a point. I don't think mentioning PEP 555 in the
>>>> discussions should hurt.
>>>>
>>>> A typo in my post btw: should be "PEP 567 (+568 ?)" in the second
>>>> paragraph of course.
>>>>
>>>> -- Koos (mobile)
>>>>
>>>>
>>>> Yury
>>>>
>>>> On Wed, Jan 10, 2018 at 4:08 AM Koos Zevenhoven <k7ho...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi all,
>>>>>
>>>>> I feel like I should write some thoughts regarding the "context"
>>>>> discussion, related to the various PEPs.
>>>>>
>>>>> I like PEP 567 (+ 567 ?) better than PEP 550. However, besides
>>>>> providing cvar.set(), I'm not really sure about the gain compared to PEP
>>>>> 555 (which could easily have e.g. a dict-like interface to the context).
>>>>> I'm still not a big fan of "get"/"set" here, but the idea was indeed to
>>>>> provide those on top of a PEP 555 type thing too.
>>>>>
>>>>> "Tokens" in PEP 567, seems to resemble assignment context managers in
>>>>> PEP 555. However, they feel a bit messy to me, because they make it look
>>>>> like one could just set a variable and then revert the change at any point
>>>>> in time after that.
>>>>>
>>>>> PEP 555 is in fact a simplification of my previous sketch that had a
>>>>> .set(..) in it, but was somewhat different from PEP 550. The idea was to
>>>>> always explicitly define the scope of contextvar values. A context manager
>>>>> / with statement determined the scope of .set(..) operations inside the
>>>>> with statement:
>>>>>
>>>>> # Version A:
>>>>> cvar.set(1)
>>>>> with context_scope():
>>>>>     cvar.set(2)
>>>>>
>>>>>     assert cvar.get() == 2
>>>>>
>>>>> assert cvar.get() == 1
>>>>>
>>>>> Then I added the ability to define scopes for different variables
>>>>> separately:
>>>>>
>>>>> # Version B
>>>>> cvar1.set(1)
>>>>> cvar2.set(2)
>>>>> with context_scope(cvar1):
>>>>>     cvar1.set(11)

Re: [Python-Dev] Thoughts on "contexts". PEPs 550, 555, 567, 568

2018-01-10 Thread Koos Zevenhoven
The status of PEP 555 is just a side track. Here, I took a step back
compared to what went into PEP 555.

—Koos


On Wed, Jan 10, 2018 at 6:21 PM, Guido van Rossum <gu...@python.org> wrote:

> The current status of PEP 555 is "Withdrawn". I have no interest in
> considering it any more, so if you'd rather see a decision from me I'll be
> happy to change it to "Rejected".
>
> On Tue, Jan 9, 2018 at 10:29 PM, Koos Zevenhoven <k7ho...@gmail.com>
> wrote:
>
>> On Jan 10, 2018 07:17, "Yury Selivanov" <yselivanov...@gmail.com> wrote:
>>
>> Wasn't PEP 555 rejected by Guido? What's the point of this post?
>>
>>
>> I sure hope there is a point. I don't think mentioning PEP 555 in the
>> discussions should hurt.
>>
>> A typo in my post btw: should be "PEP 567 (+568 ?)" in the second
>> paragraph of course.
>>
>> -- Koos (mobile)
>>
>>
>> Yury
>>
>> On Wed, Jan 10, 2018 at 4:08 AM Koos Zevenhoven <k7ho...@gmail.com>
>> wrote:
>>
>>> Hi all,
>>>
>>> I feel like I should write some thoughts regarding the "context"
>>> discussion, related to the various PEPs.
>>>
>>> I like PEP 567 (+ 567 ?) better than PEP 550. However, besides providing
>>> cvar.set(), I'm not really sure about the gain compared to PEP 555 (which
>>> could easily have e.g. a dict-like interface to the context). I'm still not
>>> a big fan of "get"/"set" here, but the idea was indeed to provide those on
>>> top of a PEP 555 type thing too.
>>>
>>> "Tokens" in PEP 567, seems to resemble assignment context managers in
>>> PEP 555. However, they feel a bit messy to me, because they make it look
>>> like one could just set a variable and then revert the change at any point
>>> in time after that.
>>>
>>> PEP 555 is in fact a simplification of my previous sketch that had a
>>> .set(..) in it, but was somewhat different from PEP 550. The idea was to
>>> always explicitly define the scope of contextvar values. A context manager
>>> / with statement determined the scope of .set(..) operations inside the
>>> with statement:
>>>
>>> # Version A:
>>> cvar.set(1)
>>> with context_scope():
>>>     cvar.set(2)
>>>
>>>     assert cvar.get() == 2
>>>
>>> assert cvar.get() == 1
>>>
>>> Then I added the ability to define scopes for different variables
>>> separately:
>>>
>>> # Version B
>>> cvar1.set(1)
>>> cvar2.set(2)
>>> with context_scope(cvar1):
>>>     cvar1.set(11)
>>>     cvar2.set(22)
>>>
>>> assert cvar1.get() == 1
>>> assert cvar2.get() == 22
>>>
>>>
>>> However, in practice, most libraries would wrap __enter__, set and
>>> __exit__ into another context manager. So maybe one might want to allow
>>> something like
>>>
>>> # Version C:
>>> assert cvar.get() == something
>>> with context_scope(cvar, 2):
>>>     assert cvar.get() == 2
>>>
>>> assert cvar.get() == something
>>>
>>>
>>> But this then led to combining "__enter__" and ".set(..)" into
>>> Assignment.__enter__ -- and "__exit__" into Assignment.__exit__ like this:
>>>
>>> # PEP 555 draft version:
>>> assert cvar.value == something
>>> with cvar.assign(1):
>>>     assert cvar.value == 1
>>>
>>> assert cvar.value == something
>>>
>>>
>>> Anyway, given the schedule, I'm not really sure about the best thing to
>>> do here. In principle, something like in versions A, B and C above could be
>>> done (I hope the proposal was roughly self-explanatory based on earlier
>>> discussions). However, at this point, I'd probably need a lot of help to
>>> make that happen for 3.7.
>>>
>>> -- Koos
>>>
>>> ___
>>> Python-Dev mailing list
>>> Python-Dev@python.org
>>> https://mail.python.org/mailman/listinfo/python-dev
>>> Unsubscribe: https://mail.python.org/mailma
>>> n/options/python-dev/yselivanov.ml%40gmail.com
>>>
>>
>>
>> ___
>> Python-Dev mailing list
>> Python-Dev@python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe: https://mail.python.org/mailman/options/python-dev/guido%
>> 40python.org
>>
>>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Thoughts on "contexts". PEPs 550, 555, 567, 568

2018-01-09 Thread Koos Zevenhoven
On Jan 10, 2018 07:17, "Yury Selivanov" <yselivanov...@gmail.com> wrote:

Wasn't PEP 555 rejected by Guido? What's the point of this post?


I sure hope there is a point. I don't think mentioning PEP 555 in the
discussions should hurt.

A typo in my post btw: should be "PEP 567 (+568 ?)" in the second paragraph
of course.

-- Koos (mobile)


Yury

On Wed, Jan 10, 2018 at 4:08 AM Koos Zevenhoven <k7ho...@gmail.com> wrote:

> Hi all,
>
> I feel like I should write some thoughts regarding the "context"
> discussion, related to the various PEPs.
>
> I like PEP 567 (+ 567 ?) better than PEP 550. However, besides providing
> cvar.set(), I'm not really sure about the gain compared to PEP 555 (which
> could easily have e.g. a dict-like interface to the context). I'm still not
> a big fan of "get"/"set" here, but the idea was indeed to provide those on
> top of a PEP 555 type thing too.
>
> "Tokens" in PEP 567, seems to resemble assignment context managers in PEP
> 555. However, they feel a bit messy to me, because they make it look like
> one could just set a variable and then revert the change at any point in
> time after that.
>
> PEP 555 is in fact a simplification of my previous sketch that had a
> .set(..) in it, but was somewhat different from PEP 550. The idea was to
> always explicitly define the scope of contextvar values. A context manager
> / with statement determined the scope of .set(..) operations inside the
> with statement:
>
> # Version A:
> cvar.set(1)
> with context_scope():
>     cvar.set(2)
>
>     assert cvar.get() == 2
>
> assert cvar.get() == 1
>
> Then I added the ability to define scopes for different variables
> separately:
>
> # Version B
> cvar1.set(1)
> cvar2.set(2)
> with context_scope(cvar1):
>     cvar1.set(11)
>     cvar2.set(22)
>
> assert cvar1.get() == 1
> assert cvar2.get() == 22
>
>
> However, in practice, most libraries would wrap __enter__, set and
> __exit__ into another context manager. So maybe one might want to allow
> something like
>
> # Version C:
> assert cvar.get() == something
> with context_scope(cvar, 2):
>     assert cvar.get() == 2
>
> assert cvar.get() == something
>
>
> But this then led to combining "__enter__" and ".set(..)" into
> Assignment.__enter__ -- and "__exit__" into Assignment.__exit__ like this:
>
> # PEP 555 draft version:
> assert cvar.value == something
> with cvar.assign(1):
>     assert cvar.value == 1
>
> assert cvar.value == something
>
>
> Anyway, given the schedule, I'm not really sure about the best thing to do
> here. In principle, something like in versions A, B and C above could be
> done (I hope the proposal was roughly self-explanatory based on earlier
> discussions). However, at this point, I'd probably need a lot of help to
> make that happen for 3.7.
>
> -- Koos
>
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/
> yselivanov.ml%40gmail.com
>


[Python-Dev] Thoughts on "contexts". PEPs 550, 555, 567, 568

2018-01-09 Thread Koos Zevenhoven
Hi all,

I feel like I should write some thoughts regarding the "context"
discussion, related to the various PEPs.

I like PEP 567 (+ 567 ?) better than PEP 550. However, besides providing
cvar.set(), I'm not really sure about the gain compared to PEP 555 (which
could easily have e.g. a dict-like interface to the context). I'm still not
a big fan of "get"/"set" here, but the idea was indeed to provide those on
top of a PEP 555 type thing too.

"Tokens" in PEP 567 seem to resemble assignment context managers in PEP
555. However, they feel a bit messy to me, because they make it look like
one could just set a variable and then revert the change at any point in
time after that.
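For comparison, this token-based reversion pattern is essentially what later shipped in Python 3.7 as the `contextvars` module (PEP 567); a minimal illustration of the "revert at any later point" shape described above (variable names are mine):

```python
import contextvars

cvar = contextvars.ContextVar("cvar", default=0)

# set() returns a Token recording the variable's previous state...
token = cvar.set(1)
assert cvar.get() == 1

# ...and reset(token) reverts to that recorded state -- at any
# later point, not only at the end of a lexical scope.
cvar.reset(token)
assert cvar.get() == 0
```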

PEP 555 is in fact a simplification of my previous sketch that had a
.set(..) in it, but was somewhat different from PEP 550. The idea was to
always explicitly define the scope of contextvar values. A context manager
/ with statement determined the scope of .set(..) operations inside the
with statement:

# Version A:
cvar.set(1)
with context_scope():
    cvar.set(2)

    assert cvar.get() == 2

assert cvar.get() == 1

Then I added the ability to define scopes for different variables
separately:

# Version B
cvar1.set(1)
cvar2.set(2)
with context_scope(cvar1):
    cvar1.set(11)
    cvar2.set(22)

assert cvar1.get() == 1
assert cvar2.get() == 22


However, in practice, most libraries would wrap __enter__, set and __exit__
into another context manager. So maybe one might want to allow something
like

# Version C:
assert cvar.get() == something
with context_scope(cvar, 2):
    assert cvar.get() == 2

assert cvar.get() == something


But this then led to combining "__enter__" and ".set(..)" into
Assignment.__enter__ -- and "__exit__" into Assignment.__exit__ like this:

# PEP 555 draft version:
assert cvar.value == something
with cvar.assign(1):
    assert cvar.value == 1

assert cvar.value == something
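The assign()-based behavior above can be sketched with an ordinary context manager. This is an illustrative single-threaded toy (class and names are mine), not the PEP 555 reference implementation, which handles generators, threads, and async code:

```python
import contextlib

class CVar:
    """Toy stand-in for a PEP 555-style context variable."""
    def __init__(self, default=None):
        self.value = default

    @contextlib.contextmanager
    def assign(self, new_value):
        # Install new_value for the duration of the with block,
        # then restore whatever was in effect before.
        old = self.value
        self.value = new_value
        try:
            yield
        finally:
            self.value = old

something = "something"
cvar = CVar(default=something)
assert cvar.value == something
with cvar.assign(1):
    assert cvar.value == 1
assert cvar.value == something
```

The key property is that the scope of the assignment is always delimited by the with statement, so there is no token to reset "at any point later".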


Anyway, given the schedule, I'm not really sure about the best thing to do
here. In principle, something like in versions A, B and C above could be
done (I hope the proposal was roughly self-explanatory based on earlier
discussions). However, at this point, I'd probably need a lot of help to
make that happen for 3.7.

-- Koos


[Python-Dev] generator vs iterator etc. (was: How assignment should work with generators?)

2017-11-27 Thread Koos Zevenhoven
On Mon, Nov 27, 2017 at 3:55 PM, Steven D'Aprano <st...@pearwood.info> wrote:

> On Mon, Nov 27, 2017 at 12:17:31PM +0300, Kirill Balunov wrote:
> ​​
>
> > 2. Should this work only for generators or for any iterators?
>
> I don't understand why you are even considering singling out *only*
> generators. A generator is a particular implementation of an iterator. I
> can write:
>
> def gen():
>    yield 1; yield 2; yield 3
>
> it = gen()
>
> or I can write:
>
> it = iter([1, 2, 3])
>
> and the behaviour of `it` should be identical.
>
>
>
I can see where this is coming from. The thing is that "iterator" and
"generator" are mostly synonymous, except for two things:

(1) Generators are iterators that are produced by a generator function

(2) Generator functions are sometimes referred to as just "generators"

The concept of "generator" thus overlaps with both "iterator" and
"generator function".

Then there's also "iterator" and "iterable", which are two different things:

(3) If `obj` is an *iterable*, then `it = iter(obj)` is an *iterator* (over
the contents of `obj`)

(4) Iterators yield values, for example on explicit calls to next(it).

Personally I have leaned towards keeping a clear distinction between
"generator function" and "generator", which leads to the situation that
"generator" and "iterator" are mostly synonymous for me. Sometimes, for
convenience, I use the term "generator" to refer to "iterators" more
generally. This further seems to have a minor benefit that "generators" and
"iterables" are less easily confused with each other than "iterators" and
"iterables".
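The distinctions numbered above can be checked directly against the abstract base classes: every generator is an iterator, every iterator is an iterable, but not conversely. A quick sketch:

```python
from collections.abc import Generator, Iterable, Iterator

def gen():
    yield 1; yield 2; yield 3

g = gen()             # (1) an iterator produced by a generator function
it = iter([1, 2, 3])  # an iterator over a list -- not a generator

assert isinstance(g, Generator) and isinstance(g, Iterator)
assert isinstance(it, Iterator) and not isinstance(it, Generator)

# (3) a list is an iterable, but not itself an iterator
assert isinstance([1, 2, 3], Iterable)
assert not isinstance([1, 2, 3], Iterator)

# (4) both kinds of iterator yield values on next()
assert next(g) == next(it) == 1
```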

I thought about this issue some time ago for the `views` package, which has
a separation between sequences (seq) and other iterables (gen):

https://github.com/k7hoven/views

The functionality provided by `views.gen` is not that interesting—it's
essentially a subset of itertools functionality, but with an API that
parallels `views.seq` which works with sequences (iterable, sliceable,
chainable, etc.). I used the name `gen`, because iterator/iterable variants
of the functionality can be implemented with generator functions (although
also with other kinds of iterators/iterables). Calling the thing `iter`
would have conflicted with the builtin `iter`.

HOWEVER, this naming can be confusing for those that lean more towards
using "generator" to also mean "generator function", and for those that are
comfortable with the term "iterator" despite its resemblance to "iterable".

Now I'm actually seriously considering renaming `views.gen` to
`views.iter` when I have time. After all, there's already `views.range`
which "conflicts" with the builtin range.

Anyway, the point is that the naming is suboptimal.

SOLUTION: Maybe (a) all iterators should be called iterators or (b) all
iterators should be called generators, regardless of whether they are
somehow a result of a generator function having been called in the past.

(I'm not going into the distinction between things that can receive values
via `send` or any other possible distinctions between different types of
iterators and iterables.)

—Koos

(discussion originated from python-ideas, but cross-posted to python-dev
in case there's more interest there)


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Comments on PEP 563 (Postponed Evaluation of Annotations)

2017-11-20 Thread Koos Zevenhoven
On Mon, Nov 20, 2017 at 7:58 PM, Lukasz Langa <luk...@langa.pl> wrote:

> I agree with you. The special handling of outermost strings vs. strings
> embedded inside annotations bugged me a lot. Now you convinced me that this
> functionality should be moved to `get_type_hints()` and the __future__
> import shouldn't try to special-case this one instance, while leaving
> others as is.
>
> ​


That's better. I don't necessarily care if there will be a warning when a
string is given as annotation, but if the idea is to simplify things for
the future and get rid of strings to represent types, then this would be a
good moment to gently "enforce" it.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 563: Postponed Evaluation of Annotations

2017-11-19 Thread Koos Zevenhoven
On Mon, Nov 13, 2017 at 11:59 PM, Brett Cannon <br...@python.org> wrote:
[...]

> On Sun, Nov 12, 2017, 10:22 Koos Zevenhoven, <k7ho...@gmail.com> wrote:
> ​​
>
>>
>> There's two thing I don't understand here:
>>
>> * What does it mean to preserve the string verbatim? No matter how I read
>> it, I can't tell if it's with quotes or without.
>>
>> Maybe I'm missing some context.
>>
>
> I believe the string passes through unchanged (i.e. no quotes). Think of
> the PEP as simply turning all non-string annotations into string ones.
>
>
Ok, maybe that was just wishful thinking on my part ;-).

More info in the other threads, for example:

https://mail.python.org/pipermail/python-dev/2017-November/150642.html
https://mail.python.org/pipermail/python-dev/2017-November/150637.html

-- Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


[Python-Dev] __future__ imports and breaking code (was: PEP 563: Postponed Evaluation of Annotations)

2017-11-19 Thread Koos Zevenhoven
Previously, I expressed some concerns about PEP 563 regarding what should
happen when a string is used as an annotation. Since my point here is more
general, I'm starting yet another thread.

For a lot of existing type-annotated code, adding "from __future__ import
annotations" [1] *doesn't break anything*.

But that doesn't seem right. The whole point of __future__ imports is to
break things. Maybe the __future__ import will not give a 100% equivalent
functionality to what will be in Python 4 by default, but anyway, it's
Python 4 that should break as little as possible. This leaves the breaking
business to the future import, if necessary.
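Concretely, the future import under discussion turns every annotation into a string at function-definition time, which is why forward references stop needing explicit quoting (Python 3.7+ behavior; the names below are mine):

```python
from __future__ import annotations

def greet(name: str) -> list[str]:
    return [f"Hello, {name}"]

# Annotations are never evaluated; their source text is stored as strings.
assert greet.__annotations__ == {"name": "str", "return": "list[str]"}

# A forward reference works without quoting, because nothing is
# evaluated at definition time...
def make() -> Widget:
    return Widget()

class Widget:       # ...even though Widget is defined only here
    pass
```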

If someone cares enough to add the future import that avoids needing string
annotations for forward references, it shouldn't be such a big deal to get
a warning if there's a string annotation left. But the person upgrading to
Python 4 (or whatever they might be upgrading to) will have a lot less
motivation to figure out what went wrong.

Then again, code that works in both Python 3 and 4 could still have the
future import. But that would defeat the purpose of Python 4 as a clean and
high-performance dynamic language.

—Koos


[1] As defined in the PEP 563 draft:
https://mail.python.org/pipermail/python-dev/2017-November/150062.html

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Make the stable API-ABI usable

2017-11-18 Thread Koos Zevenhoven
 to a *function call* rather
> than the existing macro, so the compiled C extension will use a
> function call and so don't rely on the ABI anymore.
>
>
> My plan is to have two main milestones:
>
> (1) Python 3.7: Extend the *existing* opt-in "stable API" which
> requires to compile C extensions in a special mode. Add maybe an
> option in distutils to ease the compilation of a C extension with the
> "stable API"?
>
> (2) In Python 3.8, --if the project is successful and the performance
> overhead is acceptable compared the advantages of having C extensions
> working on multiple Python verisons--, make the "stable API (without
> implementation details)" the default, but add a new opt-in option to
> give access to the "full API (with implementation details)" for
> debuggers and other people who understand what they do (like Cython?).
>
> Note: currently, the "stable API" is accessible using Py_LIMITED_API
> define, and the "full API" is accessible using Py_BUILD_CORE define.
> No define gives the current C API.
>
>
> My problem is more on the concrete implementation:
>
> * Need to provide two different API using the same filenames (like:
> #include "Python.h")
>
> * Need to extend distutils to have a flag to compile a C extension
> with one specific API (define Py_LIMITED_API or Py_BUILD_CORE?)
>
> * Need to test many C extensions and check how many extensions are broken
>
>
> My plan for Python 3.7 is to not touch the current API at all. There
> is no risk of backward incompatibility. You should only get issues if
> you opt-in for the new API without implementation details.
>
>
> Final note: Nothing new under the sun: PyPy already implemented my
> "idea"! Where the idea is a C API without macros; PyTuple_GET_ITEM()
> is already a function call in PyPy ;-)
>
>
> Final question: Is it acceptable to iterate on many small changes on
> the C API implement this idea in Python 3.7? Maybe only write partial
> implementation, and finish it in Python 3.8?
>
> Victor
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/
> k7hoven%40gmail.com
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Python possible vulnerabilities in concurrency

2017-11-17 Thread Koos Zevenhoven
On Fri, Nov 17, 2017 at 3:40 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

> On Thu, Nov 16, 2017 at 6:53 AM, Guido van Rossum <gu...@python.org>
> wrote:
>
>> On Wed, Nov 15, 2017 at 6:50 PM, Guido van Rossum <gu...@python.org>
>> wrote:
>>>
>>>
>>> Actually it linked to http://standards.iso.org/ittf/
>>> PubliclyAvailableStandards/index.html from which I managed to download
>>> what looks like the complete c061457_ISO_IEC_TR_24772_2013.pdf (336
>>> pages) after clicking on an "I accept" button (I didn't read what I
>>> accepted :-). The $200 is for the printed copy I presume.
>>>
>>
>> So far I learned one thing from the report. They use the term
>> "vulnerabilities" liberally, defining it essentially as "bug":
>>
>> All programming languages contain constructs that are incompletely
>>> specified, exhibit undefined behaviour, are implementation-dependent, or
>>> are difficult to use correctly. The use of those constructs may therefore
>>> give rise to *vulnerabilities*, as a result of which, software programs
>>> can execute differently than intended by the writer.
>>>
>>
>> They then go on to explain that sometimes vulnerabilities can be
>> exploited, but I object to calling all bugs vulnerabilities -- that's just
>> using a scary word to get attention for a sleep-inducing document
>> containing such gems as "Use floating-point arithmetic only when absolutely
>> needed" (page 230).
>>
>>
> I don't like such a definition of "vulnerability" either. Some bugs can
> be vulnerabilities (those that can be exploited) and some vulnerabilities
> can be bugs. But there are definitely types of vulnerabilities that are not
> bugs––the DoS vulnerability that is eliminated by hash randomization is one.
>
> There may also be a gray area of bugs that can be vulnerabilities but only
> in some special situation. I think it's ok to call those vulnerabilities
> too.
>
>
Just to clarify the obvious: By the above, I *don't* mean that one could
use the word "vulnerability" for any functionality that can be used in such
a way that it creates a vulnerability. For example, `eval` or `exec` or
`open` by themselves are not vulnerabilities.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Python possible vulnerabilities in concurrency

2017-11-17 Thread Koos Zevenhoven
On Thu, Nov 16, 2017 at 6:53 AM, Guido van Rossum <gu...@python.org> wrote:

> On Wed, Nov 15, 2017 at 6:50 PM, Guido van Rossum <gu...@python.org>
> wrote:
>>
>>
>> Actually it linked to http://standards.iso.org/ittf/
>> PubliclyAvailableStandards/index.html from which I managed to download
>> what looks like the complete c061457_ISO_IEC_TR_24772_2013.pdf (336
>> pages) after clicking on an "I accept" button (I didn't read what I
>> accepted :-). The $200 is for the printed copy I presume.
>>
>
> So far I learned one thing from the report. They use the term
> "vulnerabilities" liberally, defining it essentially as "bug":
>
> All programming languages contain constructs that are incompletely
>> specified, exhibit undefined behaviour, are implementation-dependent, or
>> are difficult to use correctly. The use of those constructs may therefore
>> give rise to *vulnerabilities*, as a result of which, software programs
>> can execute differently than intended by the writer.
>>
>
> They then go on to explain that sometimes vulnerabilities can be
> exploited, but I object to calling all bugs vulnerabilities -- that's just
> using a scary word to get attention for a sleep-inducing document
> containing such gems as "Use floating-point arithmetic only when absolutely
> needed" (page 230).
>
>
I don't like such a definition of "vulnerability" either. Some bugs can be
vulnerabilities (those that can be exploited) and some vulnerabilities can
be bugs. But there are definitely types of vulnerabilities that are not
bugs––the DoS vulnerability that is eliminated by hash randomization is one.

There may also be a gray area of bugs that can be vulnerabilities but only
in some special situation. I think it's ok to call those vulnerabilities
too.

––Koos


PS. How come I haven't seen a proposal to remove the float type from
builtins yet?-)


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 560: bases classes / confusion

2017-11-16 Thread Koos Zevenhoven
On Thu, Nov 16, 2017 at 6:28 PM, brent bejot <brent.be...@gmail.com> wrote:

> Hello all,
>
> Noticed that "MRO" is not actually defined in the PEP and it seems like it
> should be.  Probably in the Performance section where the abbreviation is
> first used outside of a function name.
>
>
I don't think it will hurt if I suggest that __bases__, bases, "original
bases", mro, __orig_bases__, MRO, __mro__ and "concatenated mro entries"
are all defined as synonyms of each other, except with different meanings
:-)

––Koos



> -Brent
>
> On Thu, Nov 16, 2017 at 7:22 AM, Ivan Levkivskyi <levkivs...@gmail.com>
> wrote:
>
>> On 16 November 2017 at 07:56, Nick Coghlan <ncogh...@gmail.com> wrote:
>>
>>> On 16 November 2017 at 04:39, Ivan Levkivskyi <levkivs...@gmail.com>
>>> wrote:
>>>
>>>> Nick is exactly right here. Jim, if you want to propose alternative
>>>> wording, then we could consider it.
>>>>
>>>
>>> Jim also raised an important point that needs clarification at the spec
>>> level: given multiple entries in "orig_bases" with __mro_entries__ methods,
>>> do all such methods get passed the *same* orig_bases tuple? Or do they
>>> receive partially resolved ones, such that bases listed before them have
>>> already been resolved to their MRO entries by the time they run.
>>>
>>>
>>>
>> Yes, they all get the same initial bases tuple as an argument. Passing
>> updated ones will cost a bit more and I don't think it will be needed (in
>> the worst case a base can resolve another base by calling its
>> __mro_entries__ manually).
>> I will clarify this in the PEP.
>>
>> --
>> Ivan
>>
>>
>>
>> ___
>> Python-Dev mailing list
>> Python-Dev@python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe: https://mail.python.org/mailman/options/python-dev/brent.
>> bejot%40gmail.com
>>
>>
>
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/
> k7hoven%40gmail.com
>
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 562

2017-11-15 Thread Koos Zevenhoven
On Wed, Nov 15, 2017 at 8:02 PM, Ethan Furman <et...@stoneleaf.us> wrote:

> On 11/15/2017 04:55 AM, Koos Zevenhoven wrote:
>
>> On Tue, Nov 14, 2017 at 10:34 PM, Ivan Levkivskyi wrote:
>>
>
>
>>> Rationale
>>> =========
>>>
>>> [...] It would be convenient to simplify this
>>> procedure by recognizing ``__getattr__`` defined directly in a module
>>> that would act like a normal ``__getattr__`` method
>>
>> [...]
>>
>>> Specification
>>> =============
>>>
>>> The ``__getattr__`` function at the module level should accept one
>>> argument which is the name of an attribute and return the computed
>>> value or raise an ``AttributeError``::
>>>
>>>    def __getattr__(name: str) -> Any: ...
>>>
>>> This function will be called only if ``name`` is not found in the
>>> module through the normal attribute lookup.
>>
>> The Rationale (quoted in the beginning of this email) easily leaves a
>> different impression of this.
>
> I don't see how.  This is exactly the way normal __getattr__ works.
>
>
>
Oh sorry, I think I put this email together too quickly. I was writing
down a bunch of thoughts I had earlier but hadn't written down. I think I
was mixing this up in my head with overriding __getitem__ for the module
namespace dict and __class_getitem__ from PEP 560, which only gets called
if the metaclass doesn't implement __getitem__ (IIRC).
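For reference, the module-level __getattr__ from the specification quoted earlier works as follows in Python 3.7+. The sketch simulates a module with types.ModuleType so it is self-contained; in real use the function would simply live at the top level of the module's .py file (module and attribute names here are mine):

```python
import types

# Stand-in for a real module; PEP 562 looks __getattr__ up in the
# module's own namespace (its __dict__).
mod = types.ModuleType("mod")
mod.regular = "found normally"

def module_getattr(name):
    # Called only when normal attribute lookup on the module fails.
    if name == "lazy":
        return "computed on demand"
    raise AttributeError(f"module 'mod' has no attribute {name!r}")

mod.__getattr__ = module_getattr

assert mod.regular == "found normally"   # normal lookup; __getattr__ not called
assert mod.lazy == "computed on demand"  # lookup failed; __getattr__ used
```

Note that, as the specification says, the module itself is not passed in as `self`; the function receives only the attribute name.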

But I did have another thought related to this. I was wondering whether the
lack of passing the module to the methods as `self` would harm future
attempts to generalize these ideas.

-- Koos

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 560: bases classes / confusion

2017-11-15 Thread Koos Zevenhoven
On Wed, Nov 15, 2017 at 5:37 PM, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 16 November 2017 at 00:20, Jim J. Jewett <jimjjew...@gmail.com> wrote:
>
>> I *think* the following will happen:
>>
>> "NewList[int]" will be evaluated, and __class_getitem__ called, so
>> that the bases tuple will be (A, GenericAlias(NewList, int), B)
>>
>> # (A)  I *think* __mro_entries__ gets called with the full tuple,
>> # instead of just the object it is found on.
>> # (B) I *think* it is called on the results of evaluating
>> # the terms within the tuple, instead of the original
>> # string representation.
>> _tmp = __mro_entries__(A, GenericAlias(NewList, int), B)
>>
>> # (C)  I *think* __mro_entries__ returns a replacement for
>> # just the single object, even though it was called on
>> # the whole tuple, without knowing which object it
>> # represents.
>> bases = (A, _tmp, B)
>>
>
> My understanding of the method signature:
>
> def __mro_entries__(self, orig_bases):
> ...
> return replacement_for_self
>
> My assumption as to the purpose of the extra complexity was:
>
> - given orig_bases, a method could avoid injecting bases already listed if
> it wanted to
> - allowing multiple items to be returned provides a way to
> programmatically combine mixins without having to define a new subclass for
> each combination
>
>

​Thanks, this might provide an answer to my question about multiple mro
entries here

https://mail.python.org/pipermail/python-ideas/2017-November/047897.html​
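
For the record, a minimal sketch of how I understand the semantics (using
a made-up ``Alias`` stand-in for something like ``GenericAlias``; this is
how PEP 560 eventually shipped in Python 3.7):

```python
# A non-class object that can appear in a class's bases (Python 3.7+).
class Alias:
    def __init__(self, real):
        self.real = real

    def __mro_entries__(self, orig_bases):
        # Called with the *entire* original bases tuple; the returned
        # tuple replaces only this non-class object among the bases.
        return (self.real,)

class Base:
    pass

# Alias(Base) is not a class, so __mro_entries__ resolves it to Base:
class C(Alias(Base)):
    pass

print(C.__mro__)         # Base appears in the MRO in place of the alias
print(C.__orig_bases__)  # the original bases tuple is preserved here
```

Returning a tuple with several entries is what allows one object to inject
multiple bases at once.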

​––Koos​



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 560: bases classes / confusion

2017-11-15 Thread Koos Zevenhoven
For anyone confused about similar things, I expect you to be interested in
my post on python-ideas from today:

https://mail.python.org/pipermail/python-ideas/2017-November/047896.html

––Koos


On Wed, Nov 15, 2017 at 4:20 PM, Jim J. Jewett <jimjjew...@gmail.com> wrote:

> (1)  I found the following (particularly "bases classes") very confusing:
>
> """
> If an object that is not a class object appears in the bases of a class
>
> definition, then ``__mro_entries__`` is searched on it. If found,
> it is called with the original tuple of bases as an argument. The result
> of the call must be a tuple, that is unpacked in the bases classes in place
> of this object. (If the tuple is empty, this means that the original bases
> is
> simply discarded.)
> """
>
> Based on the following GenericAlias/NewList/Tokens example, I think I
> now I understand what you mean, and would have had somewhat less
> difficulty if it were expressed as:
>
> """
> When an object that is not a class object appears in the (tuple of)
> bases of a class
> definition, then attribute ``__mro_entries__`` is searched on that
> non-class object.  If ``__mro_entries__`` found,
> it is called with the entire original tuple of bases as an argument. The
> result
> of the call must be a tuple, which is unpacked and replaces only the
> non-class object in the tuple of bases.  (If the tuple is empty, this
> means that the original bases
> is
> simply discarded.)
> """
>
> Note that this makes some assumptions about the __mro_entries__
> signature that I wasn't quite sure about from the example.  So
> building on that:
>
> class ABList(A, NewList[int], B):
>
> I *think* the following will happen:
>
> "NewList[int]" will be evaluated, and __class_getitem__ called, so
> that the bases tuple will be (A, GenericAlias(NewList, int), B)
>
> # (A)  I *think* __mro_entries__ gets called with the full tuple,
> # instead of just the object it is found on.
> # (B) I *think* it is called on the results of evaluating
> # the terms within the tuple, instead of the original
> # string representation.
> _tmp = __mro_entries__(A, GenericAlias(NewList, int), B)
>
> # (C)  I *think* __mro_entries__ returns a replacement for
> # just the single object, even though it was called on
> # the whole tuple, without knowing which object it
> # represents.
> bases = (A, _tmp, B)
>
> # (D) If there are two non-class objects, I *think* the
> # second one gets the same arguments as the first,
> # rather than an intermediate tuple with the first such
> # object already substituted out.
>
> -jJ



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 562

2017-11-15 Thread Koos Zevenhoven
On Tue, Nov 14, 2017 at 10:34 PM, Ivan Levkivskyi <levkivs...@gmail.com>
wrote:
​[..]​


> Rationale
> =========
>
> It is sometimes convenient to customize or otherwise have control over
> access to module attributes. A typical example is managing deprecation
> warnings. Typical workarounds are assigning ``__class__`` of a module
> object
> to a custom subclass of ``types.ModuleType`` or replacing the
> ``sys.modules``
> item with a custom wrapper instance. It would be convenient to simplify
> this
> procedure by recognizing ``__getattr__`` defined directly in a module that
> would act like a normal ``__getattr__`` method, except that it will be
> defined
> on module *instances*. For example::
>
>
>   # lib.py
>
>   from warnings import warn
>
>   deprecated_names = ["old_function", ...]
>
>   def _deprecated_old_function(arg, other):
>   ...
>
>   def __getattr__(name):
>   if name in deprecated_names:
>   warn(f"{name} is deprecated", DeprecationWarning)
>   return globals()[f"_deprecated_{name}"]
>   raise AttributeError(f"module {__name__} has no attribute {name}")
>
>   # main.py
>
>   from lib import old_function  # Works, but emits the warning
>
>

Deprecating functions is already possible, so I assume the reason for this
would be performance? If so, are you sure this would help performance?

Deprecating module attributes / globals is indeed difficult to do at
present. This PEP would allow deprecation warnings for accessing
attributes, which is nice!  However, as thread-unsafe as it is, many
modules use module attributes to configure the state of the module. In that
case, the user is more likely to *set* the attribute than to *get* it. Is
this outside the scope of the PEP?
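
For comparison, the ``__class__`` replacement that the Rationale mentions
can already intercept *setting* an attribute as well — a rough sketch (the
``timeout`` attribute is a made-up example):

```python
import sys
import types
import warnings

class _DeprecatingModule(types.ModuleType):
    def __setattr__(self, name, value):
        if name == "timeout":
            warnings.warn(f"setting {name!r} is deprecated", DeprecationWarning)
        super().__setattr__(name, value)

# Inside the real module one would write:
#     sys.modules[__name__].__class__ = _DeprecatingModule
# Here we demonstrate on a throwaway module object instead:
mod = types.ModuleType("fakelib")
mod.__class__ = _DeprecatingModule

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    mod.timeout = 5  # triggers the deprecation warning, then stores the value

print(mod.timeout, [w.category.__name__ for w in caught])
```

Plain PEP 562 ``__getattr__`` does not cover this case, which is exactly
why I am asking about the scope.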


​[..]​


> There is a related proposal PEP 549 that proposes to support instance
> properties for a similar functionality. The difference is this PEP proposes
> a faster and simpler mechanism, but provides more basic customization.
>

​I'm not surprised that the comparison is in favor of this PEP ;-).​


​[..]​


> Specification
> =============
>
> The ``__getattr__`` function at the module level should accept one argument
> which is the name of an attribute and return the computed value or raise
> an ``AttributeError``::
>
>   def __getattr__(name: str) -> Any: ...
>
> This function will be called only if ``name`` is not found in the module
> through the normal attribute lookup.
>
>
The Rationale (quoted in the beginning of this email) easily leaves a
different impression of this.​


​[..]
​

>
> Discussion
> ==========
>
> Note that the use of module ``__getattr__`` requires care to keep the
> referred
> objects pickleable. For example, the ``__name__`` attribute of a function
> should correspond to the name with which it is accessible via
> ``__getattr__``::
>
>   def keep_pickleable(func):
>   func.__name__ = func.__name__.replace('_deprecated_', '')
>   func.__qualname__ = func.__qualname__.replace('_deprecated_', '')
>   return func
>
>   @keep_pickleable
>   def _deprecated_old_function(arg, other):
>   ...
>
> One should be also careful to avoid recursion as one would do with
> a class level ``__getattr__``.
>
>
Off-topic: In some sense, I'm happy to hear something about pickleability.
But in some sense not.

I think there are three kinds of people regarding pickleability:

1. Those who don't care about anything being pickleable

2. Those who care about some things being pickleable

3. Those who care about all things being pickleable

Personally, I'd like to belong to group 3, but because group 3 cannot even
attempt to coexist with groups 1 and 2, I actually belong to group 1 most
of the time.

––Koos


> References
> ==========
>
> .. [1] PEP 484 section about ``__getattr__`` in stub files
>(https://www.python.org/dev/peps/pep-0484/#stub-files)
>
> .. [2] The reference implementation
>(https://github.com/ilevkivskyi/cpython/pull/3/files)
>
>
> Copyright
> =========
>
> This document has been placed in the public domain.
>
>
>
> ..
>Local Variables:
>mode: indented-text
>indent-tabs-mode: nil
>    sentence-end-double-space: t
>fill-column: 70
>coding: utf-8
>End:
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 563: Postponed Evaluation of Annotations

2017-11-12 Thread Koos Zevenhoven
On Nov 12, 2017 19:10, "Guido van Rossum" <gu...@python.org> wrote:

On Sun, Nov 12, 2017 at 4:14 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

> So actually my question is: What should happen when the annotation is
> already a string literal?
>

The PEP answers that clearly (under Implementation):

> If an annotation was already a string, this string is preserved
> verbatim.


Oh sorry, I was looking for a spec, so I somehow assumed I can ignore the
gory implementation details just like I routinely ignore things like
headers and footers of emails.

There are two things I don't understand here:

* What does it mean to preserve the string verbatim? No matter how I read
it, I can't tell if it's with quotes or without.

Maybe I'm missing some context.
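
One way to find out is to simply try it on a Python where PEP 563 is
available via the ``__future__`` import (3.7+) and inspect
``__annotations__`` directly:

```python
from __future__ import annotations  # opt in to PEP 563 semantics (3.7+)

def f(x: int, y: "int") -> None:
    pass

# Under PEP 563, every annotation is stored as a string of source text;
# printing the dict shows exactly how a string-literal annotation such as
# the one on `y` is preserved:
print(f.__annotations__)
```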


-- Koos (mobile)


Re: [Python-Dev] PEP 563: Postponed Evaluation of Annotations

2017-11-12 Thread Koos Zevenhoven
On Sun, Nov 12, 2017 at 7:07 AM, Guido van Rossum <gu...@python.org> wrote:

> On Fri, Nov 10, 2017 at 11:02 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
>
>> On 11 November 2017 at 01:48, Guido van Rossum <gu...@python.org> wrote:
>> > I don't mind the long name. Of all the options so far I really only like
>> > 'string_annotations' so let's go with that.
>>
>> +1 from me.
>>
>
> I'd like to reverse my stance on this. We had `from __future__ import
> division` for many years in Python 2, and nobody argued that it implied
> that Python 2 doesn't have division -- it just meant to import the future
> *version* of division. So I think the original idea, `from __future__
> import annotations` is fine. I don't expect there will be *other* things
> related to annotations that we'll be importing from the future.
>
>
Furthermore, *nobody* expects the majority of programmers to look at
__annotations__ either. But those who do need to care about the
'implementation detail' of whether it's a string won't be surprised to find
nested strings like "'ForwardReferencedThing'". But one might fear that
those cases get ruthlessly converted into being equivalent to just
"ForwardReferencedThing".

So actually my question is: What should happen when the annotation is
already a string literal?

-- Koos
-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Analog of PEP 448 for dicts (unpacking in assignment with dict rhs)

2017-11-11 Thread Koos Zevenhoven
Oops, forgot to reply to the list.

On Nov 12, 2017 03:35, "Koos Zevenhoven" <k7ho...@gmail.com> wrote:

On Nov 12, 2017 02:12, "Joao S. O. Bueno" <jsbu...@python.org.br> wrote:

Ben, I have a small package which enables one to do:

with MapGetter(my_dictionary):
from my_dictionary import a, b, parameter3

If this interests you, contributions so it can get hardened for
mainstream acceptance are welcome.
https://github.com/jsbueno/extradict


Your VersionDict in fact has some similarities to what I have thought of
implementing using the PEP 555 machinery, but it is also a bit different.
Interesting...
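
By the way, the closest stdlib idiom for the "unpack several keys" part is
probably ``operator.itemgetter``, which at least avoids repeating the
dictionary name:

```python
from operator import itemgetter

config = {'parameter1': 1, 'parameter2': 2, 'parameter3': 3}

# itemgetter with several keys returns a tuple, which unpacks on the left:
parameter1, parameter3 = itemgetter('parameter1', 'parameter3')(config)
print(parameter1, parameter3)
```

It still mentions each key twice (once as a string, once as a name), which
is the very duplication the proposed syntax tries to remove.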

-- Koos (mobile)



On 11 November 2017 at 04:26, Ben Usman <bigoban...@gmail.com> wrote:
> Got it, thank you. I'll go and check it out!
>
> On Nov 11, 2017 01:22, "Jelle Zijlstra" <jelle.zijls...@gmail.com> wrote:
>>
>>
>>
>> 2017-11-10 19:53 GMT-08:00 Ben Usman <bigoban...@gmail.com>:
>>>
>>> The following works now:
>>>
>>> seq = [1, 2]
>>> d = {'c': 3, 'a': 1, 'b': 2}
>>>
>>> (el1, el2) = seq
>>> el1, el2 = seq
>>> head, *tail = seq
>>>
>>> seq_new = (*seq, *tail)
>>> dict_new = {**d, **{'c': 4}}
>>>
>>> def f(arg1, arg2, a, b, c):
>>> pass
>>>
>>> f(*seq, **d)
>>>
>>> It seems like dict unpacking syntax would not be fully coherent with
>>> list unpacking syntax without something like:
>>>
>>> {b, a, **other} = **d
>>>
>>> Because iterables have both syntax for function call unpacking and
>>> "rhs in assignment unpacking" and dict has only function call
>>> unpacking syntax.
>>>
>>> I was not able to find any PEPs that suggest this (search keywords:
>>> "PEP 445 dicts", "dictionary unpacking assignment", checked PEP-0),
>>> however, let me know if I am wrong.
>>>
>> It was discussed at great length on Python-ideas about a year ago. There
>> is a thread called "Unpacking a dict" from May 2016.
>>
>>>
>>> The main use-case, in my understanding, is getting shortcuts to
>>> elements of a dictionary if they are going to be used more than
>>> once later in the scope. A made-up example is using a config to
>>> initiate a bunch of things with many config arguments with long
>>> names that have overlap in keywords used in initialization.
>>>
>>> One should either write long calls like
>>>
>>> start_a(config['parameter1'], config['parameter2'],
>>> config['parameter3'], config['parameter4'])
>>>
>>> start_b(config['parameter3'], config['parameter2'],
>>> config['parameter3'], config['parameter4'])
>>>
>>> many times or use a list-comprehension solution mentioned above.
>>>
>>> It becomes even worse (in terms of readability) with nested structures.
>>>
>>> start_b(config['group2']['parameter3'], config['parameter2'],
>>> config['parameter3'], config['group2']['parameter3'])
>>>
>>>
>>> ## Rationale
>>>
>>> Right now this problem is often solved using [list] comprehensions,
>>> but this is somewhat verbose:
>>>
>>> a, b = (d[k] for k in ['a', 'b'])
>>>
>>> or direct per-instance assignment (looks simple for with
>>> single-character keys, but often becomes very verbose with
>>> real-world long key names)
>>>
>>> a = d['a']
>>> b = d['b']
>>>
>>> Alternatively one could have a very basic method/function
>>> get_n() or __getitem__() accepting more than a single argument
>>>
>>> a, b = d.get_n('a', 'b')
>>> a, b = get_n(d, 'a', 'b')
>>> a, b = d['a', 'b']
>>>
>>> All these approaches require verbose double-mentioning of same
>>> key. It becomes even worse if you have nested structures
>>> of dictionaries.
>>>
>>> ## Concerns and questions:
>>>
>>> 0. This is the most troubling part,  imho, other questions
>>> are more like common thoughts. It seems (to put it mildly)
>>> weird that execution flow depends on names of local variables.
>>>
>>> For example, one can not easily refactor these variable names. However,
>>> same is true for dictionary keys anyway: you can not suddenly decide
>>> and refactor your code to expect dictionaries with keys 'c' and
>>> 'd' whereas your entire system still expects you to use dictionaries
>>> with keys 'a' and 'b'. A counter-objection 

Re: [Python-Dev] PEP 563: Postponed Evaluation of Annotations

2017-11-10 Thread Koos Zevenhoven
On Fri, Nov 10, 2017 at 7:50 PM, Ethan Furman <et...@stoneleaf.us> wrote:

> On 11/10/2017 07:48 AM, Guido van Rossum wrote:
>
> I don't mind the long name. Of all the options so far I really only like
>> 'string_annotations' so let's go with that.
>>
>
> As someone else mentioned, we have function annotations and variable
> annotations already, which makes string_annotations sound like it's
> annotations for strings.
>
>
> Contrariwise, "annotation_strings" sounds like a different type of
> annotation -- they are now being stored as strings, instead of something
> else.
>
>

Or a step further (longer), with annotations_as_strings.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 563: Postponed Evaluation of Annotations

2017-11-10 Thread Koos Zevenhoven
On Thu, Nov 9, 2017 at 9:51 PM, Guido van Rossum <gu...@python.org> wrote:

> If we have to change the name I'd vote for string_annotations -- "lazy"
> has too many other connotations (e.g. it might cause people to think it's
> the thunks). I find str_annotations too abbreviated, and
> stringify_annotations is too hard to spell.
>
>

I can't say I disagree. And maybe importing string_annotations from the
__future__ doesn't sound quite as sad as importing something from the
__past__.

Anyway, it's not obvious to me that it is the module author that should
decide how the annotations are handled. See also this quote below:

(Quoted from the end of
https://mail.python.org/pipermail/python-ideas/2017-October/047311.html )

On Thu, Oct 12, 2017 at 3:59 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

>
> ​​[*] Maybe somehow make the existing functionality a phantom easter
> egg––a blast from the past which you can import and use, but which is
> otherwise invisible :-). Then later give warnings and finally remove it
> completely.
>
> But we need better smooth upgrade paths anyway, maybe something like:
>
> from __compat__ import unintuitive_decimal_contexts
>
> with unintuitive_decimal_contexts:
> do_stuff()
>
> ​Now code bases can more quickly switch to new python versions and make
> the occasional compatibility adjustments more lazily, while already
> benefiting from other new language features.
>
>
> ––Koos​
>
>
>
-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] What is the design purpose of metaclasses vs code generating decorators? (was Re: PEP 557: Data Classes)

2017-10-13 Thread Koos Zevenhoven
 far as the dataclass interaction with `__slots__` goes, that's a
> problem largely specific to slots (and `__metaclass__` before it), in that
> they're the only characteristics of a class definition that affect how
> CPython allocates memory for the class object itself (the descriptors for
> the slots are stored as a pointer array after the class struct, rather than
> only in the class dict).
>
> Given PEP 526 variable annotations, __slots__ could potentially benefit
> from a __metaclass__ style makeover, allowing an "infer_slots=True" keyword
> argument to type.__new__ to request that the list of slots be inferred from
> __annotations__ (Slot inference would conflict with setting class level
> default values, but that's a real conflict, as you'd be trying to use the
> same name on the class object for both the slot descriptor and the default
> value)
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Investigating time for `import requests`

2017-10-08 Thread Koos Zevenhoven
On Sun, Oct 8, 2017 at 2:44 PM, Chris Angelico <ros...@gmail.com> wrote:

> On Sun, Oct 8, 2017 at 7:02 PM, David Cournapeau <courn...@gmail.com>
> wrote:
> > It is certainly true that for a CLI tool that actually makes any network
> > I/O, especially SSL, import times will quickly be negligible. It becomes
> > tricky for complex tools, because of error management. For example, a
> common
> > pattern I have used in the past is to have a high level "catch all
> > exceptions" function that dispatch the CLI command:
> >
> > try:
> > main_function(...)
> > except ErrorKind1:
> > 
> > except requests.exceptions.SSLError:
> > # gives complete message about options when receiving SSL errors,
> e.g.
> > invalid certificate
> >
> > This pattern requires importing requests every time the command is run,
> even
> > if no network IO is actually done. For complex CLI tools, maybe most
> command
> > don't use network IO (the tool in question was a complete packages
> manager),
> > but you pay ~100 ms because of requests import for every command. It is
> > particularly visible because commands latency starts to be felt around
> > 100-150 ms, and while you can do a lot in python in 100-150 ms, you
> can't do
> > much in 0-50 ms.
>
> This would be a perfect use-case for lazy importing, then. You'd pay
> the price of the import only if you get an error that isn't caught by
> one of the preceding except blocks.
>

I suppose it might be convenient to be able to do something like:

with autoimport:
    try:
        main_function(...)
    except ErrorKind1:
        ...
    except requests.exceptions.SSLError:
        ...


The easiest workaround at the moment is still pretty clumsy:

def import_SSLError():
    from requests.exceptions import SSLError
    return SSLError

...

except import_SSLError():


But what happens if that gives you an ImportError?
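
For completeness: importlib already ships the pieces for a less clumsy lazy
import — this is essentially the recipe from the importlib documentation.
Note that ``find_spec`` runs eagerly, so a misspelled module name fails at
``lazy_import()`` time rather than inside the except clause:

```python
import importlib.util
import sys

def lazy_import(name):
    """Return a module whose actual loading is deferred to first attribute access."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)  # with LazyLoader, this does not run the module yet
    return module

lazy_json = lazy_import("json")   # the module body has not run yet
print(lazy_json.dumps({"a": 1}))  # first attribute access triggers the real load
```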

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Investigating time for `import requests`

2017-10-08 Thread Koos Zevenhoven
On Sun, Oct 8, 2017 at 11:02 AM, David Cournapeau <courn...@gmail.com>
wrote:

>
> On Mon, Oct 2, 2017 at 6:42 PM, Raymond Hettinger <
> raymond.hettin...@gmail.com> wrote:
>
>>
>> > On Oct 2, 2017, at 12:39 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
>> >
>> >  "What requests uses" can identify a useful set of
>> > avoidable imports. A Flask "Hello world" app could likely provide
>> > another such sample, as could some example data analysis notebooks).
>>
>> Right.  It is probably worthwhile to identify which parts of the library
>> are typically imported but are not ever used.  And likewise, identify a
>> core set of commonly used tools that are going to be almost unavoidable in
>> sufficiently interesting applications (like using requests to access a REST
>> API, running a micro-webframework, or invoking mercurial).
>>
>> Presumably, if any of this is going to make a difference to end users, we
>> need to see if there is any avoidable work that takes a significant
>> fraction of the total time from invocation through the point where the user
>> first sees meaningful output.  That would include loading from nonvolatile
>> storage, executing the various imports, and doing the actual application.
>>
>> I don't expect to find anything that would help users of Django, Flask,
>> and Bottle since those are typically long-running apps where we value
>> response time more than startup time.
>>
>> For scripts using the requests module, there will be some fruit because
>> not everything that is imported is used.  However, that may not be
>> significant because scripts using requests tend to be I/O bound.  In the
>> timings below, 6% of the running time is used to load and run python.exe,
>> another 16% is used to import requests, and the remaining 78% is devoted to
>> the actual task of running a simple REST API query. It would be interesting
>> to see how much of the 16% could be avoided without major alterations to
>> requests, to urllib3, and to the standard library.
>>
>
> It is certainly true that for a CLI tool that actually makes any network
> I/O, especially SSL, import times will quickly be negligible. It becomes
> tricky for complex tools, because of error management. For example, a
> common pattern I have used in the past is to have a high level "catch all
> exceptions" function that dispatch the CLI command:
>
> try:
> main_function(...)
> except ErrorKind1:
> 
> except requests.exceptions.SSLError:
> # gives complete message about options when receiving SSL errors, e.g.
> invalid certificate
>
> This pattern requires importing requests every time the command is run,
> even if no network IO is actually done. For complex CLI tools, maybe most
> command don't use network IO (the tool in question was a complete packages
> manager), but you pay ~100 ms because of requests import for every command.
> It is particularly visible because commands latency starts to be felt
> around 100-150 ms, and while you can do a lot in python in 100-150 ms, you
> can't do much in 0-50 ms.
>
>
Yes. OTOH, it can also happen that the *imports* are in fact what use the
network IO. At the office, I usually import from a network drive. For
instance, `import requests` takes a little less than a second, and `import
IPython` usually takes more than a second, with some variation.
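
A crude way to put numbers on that is to time the import directly (since
CPython 3.7 there is also `python -X importtime`, which prints a per-module
breakdown):

```python
import time

t0 = time.perf_counter()
import json  # substitute whichever module is being measured
elapsed = time.perf_counter() - t0

print(f"import took {elapsed * 1000:.1f} ms")
```

Note that a module already present in sys.modules will appear to import
almost instantly, so this should be run in a fresh interpreter.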

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 554 v3 (new interpreters module)

2017-10-06 Thread Koos Zevenhoven
IV-like proxy
>>   + you wrap the object, send() the proxy, and recv() a proxy
>>   + this is entirely compatible with tp_share()
>>
>
> * Allow for multiple channel types, such that MemChannel is merely the
> *first* channel type, rather than the *only* channel type
>   + Allows PEP 554 to be restricted to things we already know can be made
> to work
>   + Doesn't block the introduction of an object-sharing based Channel in
> some future release
>   + Allows for at least some channel types to be adapted for use with
> shared memory and multiprocessing
>
>
>> Here are what I consider the key metrics relative to the utility of a
>> solution (not in any significant order):
>>
>> * how hard to understand as a Python programmer?
>>
>
> Not especially important yet - this is more a criterion for the final API,
> not the initial experimental platform.
>
>
>> * how much extra work (if any) for folks calling Channel.send()?
>> * how much extra work (if any) for folks calling Channel.recv()?
>>
>
> I don't think either are particularly important yet, although we also
> don't want to raise any pointless barriers to experimentation.
>
>
>> * how complex is the CPython implementation?
>>
>
> This is critical, since we want to minimise any potential for undesirable
> side effects on regular single interpreter code.
>
>
>> * how hard to understand as a type author (wanting to add support for
>> their type)?
>> * how hard to add support for a new type?
>> * what variety of types could be supported?
>> * what breadth of experimentation opens up?
>>
>
> You missed the big one: what risk does the initial channel design pose to
> the underlying objective of making the GIL a genuinely per-interpreter lock?
>
> If we don't eventually reach the latter goal, then subinterpreters won't
> really offer much in the way of compelling benefits over just using a
> thread pool and queue.Queue.
>
> MemChannel poses zero additional risk to that, since we wouldn't be
> sharing actual Python objects between interpreters, only C pointers and
> structs.
>
> By contrast, introducing an object channel early poses significant new
> risks to that goal, since it will force you to solve hard protocol design
> and refcount management problems *before* making the switch, rather than
> being able to defer the design of the object channel protocol until *after*
> you've already enabled the ability to run subinterpreters in completely
> independent threads.
>
>
>> The most important thing to me is keeping things simple for Python
>> programmers.  After that is ease-of-use for type authors.  However, I
>> also want to put us in a good position in 3.7 to experiment
>> extensively with subinterpreters, so that's a big consideration.
>>
>> Consequently, for PEP 554 my goal is to find a solution for object
>> sharing that keeps things simple in Python while laying a basic
>> foundation we can build on at the C level, so we don't get locked in
>> but still maximize our opportunities to experiment. :)
>>
>
> I think our priorities are quite different then, as I believe PEP 554
> should be focused on defining a relatively easy to implement API that
> nevertheless makes it possible to write interesting programs while working
> on the goal of making the GIL per-interpreter, without worrying too much
> about whether or not the initial cross-interpreter communication channels
> closely resemble the final ones that will be intended for more general use.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-05 Thread Koos Zevenhoven
On Tue, Oct 3, 2017 at 1:11 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

> On Oct 3, 2017 01:00, "Guido van Rossum" <gu...@python.org> wrote:
>
> On Mon, Oct 2, 2017 at 2:52 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>
> I don't mind this (or Nathaniel ;-) being academic. The backwards
>> incompatibility issue I've just described applies to any extension via
>> composition, if the underlying type/protocol grows new members (like the CM
>> protocol would have gained __suspend__ and __resume__ in PEP521).
>>
>
> Since you seem to have a good grasp on this issue, does PEP 550 suffer
> from the same problem? (Or PEP 555, for that matter? :-)
>
>
>
> Neither has this particular issue, because they don't extend an existing
> protocol. If this thread has any significance, it will most likely be
> elsewhere.
>

​Actually, I realize I should be more precise with terminology regarding
"extending an existing protocol"/"growing new members". Below, I'm still
using PEP 521 as an example (sorry).

In fact, in some sense, "adding" __suspend__ and __resume__ to context
managers *does not* extend the context manager protocol, even though it
kind of looks like it does.

There would instead be two separate protocols:

(A) The traditional PEP 343 context manager:
__enter__
__exit__

(B) The hypothetical PEP 521 context manager:
__enter__
__suspend__
__resume__
__exit__

Protocols A and B are incompatible in both directions:

* It is generally not safe to use a type-A context manager assuming it
implements B.

* It is generally not safe to use a type-B context manager assuming it
implements A.

But if you now have a type-B object, it looks like it's also type-A,
especially for code that is not aware of the existence of B. This is where
the problems come from: a wrapper for type A does the wrong thing when
wrapping a type-B object (except when using inheritance).
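To make the failure mode concrete, here is a minimal sketch (all class names are hypothetical): a wrapper written when only protocol A existed silently hides the type-B members of the object it wraps.

```python
class TypeB:
    """Hypothetical type-B context manager (PEP 521 style)."""
    def __enter__(self):
        return self
    def __suspend__(self):
        print("suspending")
    def __resume__(self):
        print("resuming")
    def __exit__(self, *exc):
        return False

class TypeAWrapper:
    """Wrapper written when only protocol A existed."""
    def __init__(self, wrapped):
        self._wrapped = wrapped
    def __enter__(self):
        return self._wrapped.__enter__()
    def __exit__(self, *exc):
        return self._wrapped.__exit__(*exc)

wrapper = TypeAWrapper(TypeB())
# B-aware code looks for the optional members on the wrapper and,
# finding none, silently skips __suspend__/__resume__:
assert not hasattr(wrapper, "__suspend__")
assert not hasattr(wrapper, "__resume__")
```

No error is raised anywhere; the suspend/resume notifications simply never reach the wrapped object.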


[Side note:

Another interpretation of the situation is that, instead of adding protocol
B, A is removed and is replaced with:

(C) The hypothetical PEP 521 context manager with optional members:
__enter__
__suspend__ (optional)
__resume__  (optional)
__exit__

But now the same problems just come from the fact that A no longer exists
while there is code out there that assumes A. But this is only a useful
interpretation if you are the only user of the protocol or if it's
otherwise ok to remove A. So let's go back to the A-B interpretation.]


Q: Could the problem of protocol conflict be solved?

One way to tell A and B apart would be to always explicitly mark the
protocol with a base class. Obviously this is not the case with existing
uses of context managers.

But there's another way, which is to change the naming:

(A) The traditional PEP 343 context manager:
__enter__
__exit__

(Z) The *modified* hypothetical PEP 521 context manager:
__begin__
__suspend__
__resume__
__end__

Now, A and Z are easy to tell apart. A context manager wrapper designed for
type A immediately fails if used to wrap a type-Z object. But of course the
whole context manager concept now suddenly became a lot more complicated.


It is interesting that, in the A-B scheme, making a general context manager
wrapper using inheritance *just works*, even if A is not a subprotocol of B
and B is not a subprotocol of A.
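A sketch of why inheritance sidesteps the conflict (class names hypothetical): the subclass only overrides what it changes, and automatically exposes whatever optional members the wrapped type defines.

```python
class TypeB:
    """Hypothetical type-B context manager with the optional members."""
    def __enter__(self):
        return self
    def __suspend__(self):
        pass
    def __resume__(self):
        pass
    def __exit__(self, *exc):
        return False

class InheritingWrapper(TypeB):
    """Wrapper via inheritance: overrides __enter__ only."""
    def __enter__(self):
        print("Entering context")
        return super().__enter__()

# The optional members are inherited, so B-aware code still finds them:
assert hasattr(InheritingWrapper(), "__suspend__")
assert hasattr(InheritingWrapper(), "__resume__")
```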

Anyway, a lot of this is amplified by the fact that the methods of the
context manager protocols are not independent functionality. Instead,
calling one of them leads to the requirement that the other methods are
also called at the right moments.

--Koos

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 554 v3 (new interpreters module)

2017-10-04 Thread Koos Zevenhoven
On Wed, Oct 4, 2017 at 4:51 PM, Eric Snow <ericsnowcurren...@gmail.com>
wrote:

> On Tue, Oct 3, 2017 at 11:36 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> > The problem relates to the fact that there aren't any memory barriers
> > around CPython's INCREF operations (they're implemented as an ordinary
> > C post-increment operation), so you can get the following scenario:
> >
> > * thread on CPU A has the sole reference (ob_refcnt=1)
> > * thread on CPU B acquires a new reference, but hasn't pushed the
> > updated ob_refcnt value back to the shared memory cache yet
> > * original thread on CPU A drops its reference, *thinks* the refcnt is
> > now zero, and deletes the object
> > * bad things now happen in CPU B as the thread running there tries to
> > use a deleted object :)
>
> I'm not clear on where we'd run into this problem with channels.
> Mirroring your scenario:
>
> * interpreter A (in thread on CPU A) INCREFs the object (the GIL is still
> held)
> * interp A sends the object to the channel
> * interp B (in thread on CPU B) receives the object from the channel
> * the new reference is held until interp B DECREFs the object
>
> From what I see, at no point do we get a refcount of 0, such that
> there would be a race on the object being deleted.
>
>
So what you're saying is that when Larry finishes the gilectomy,
subinterpreters will work GIL-free too? ;-)
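On the refcounting point above: the danger of an unsynchronized INCREF is the classic lost-update race. It can be simulated deterministically with ordinary threads, using a barrier to force both to read the counter before either writes (a sketch, not how CPython actually implements refcounting):

```python
import threading

counter = 0                      # stands in for ob_refcnt
barrier = threading.Barrier(2)   # forces the bad interleaving

def unsafe_incref():
    global counter
    value = counter      # read the current count
    barrier.wait()       # both threads now hold the same stale value
    counter = value + 1  # both write back value + 1: one increment is lost

threads = [threading.Thread(target=unsafe_incref) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Two increments happened, but the counter only advanced by one.
assert counter == 1
```

This is exactly why either a lock (the GIL), atomic operations, or per-interpreter ownership is needed around reference counts.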

––Koos

The only problem I'm aware of (it dawned on me last night), is in the
> case that the interpreter that created the object gets deleted before
> the object does.  In that case we can't pass the deletion back to the
> original interpreter.  (I don't think this problem is necessarily
> exclusive to the solution I've proposed for Bytes.)
>
> -eric



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-04 Thread Koos Zevenhoven
On Wed, Oct 4, 2017 at 4:04 PM, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 4 October 2017 at 22:45, Koos Zevenhoven <k7ho...@gmail.com> wrote:
> > On Wed, Oct 4, 2017 at 3:33 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> >> That's not a backwards compatibility problem, because the only way to
> >> encounter it is to update your code to rely on the new extended
> >> protocol - your *existing* code will continue to work fine, since
> >> that, by definition, can't be relying on the new protocol extension.
> >>
> >
> > No, not all code is "your" code. Clearly this is not a well-known
> problem.
> > This is a backwards-compatibility problem for the author of the wrappeR,
> not
> > for the author of the wrappeD object.
>
> No, you're misusing the phrase "backwards compatibility", and
> confusing it with "feature enablement".
>
> Preserving backwards compatibility just means "existing code and
> functionality don't break". It has nothing to do with whether or not
> other support libraries and frameworks might need to change in order
> to enable full access to a new language feature.
>
>
It's not about full access to a new language feature. It's about the
wrappeR promising it can wrap any context manager, which it then no longer
can. If the __suspend__ and __resume__ methods are ignored, that is not
about "not having full access to a new feature" — that's broken code. The
error message you get (if any) may not contain any hint of what went wrong.

Take the length hint protocol defined in PEP 424 for example: that
> extended the iterator protocol to include a new optional
> __length_hint__ method, such that container constructors can make a
> more reasonable guess as to how much space they should pre-allocate
> when being initialised from an iterator or iterable rather than
> another container.
>
>
This is slightly similar, but not really. Not using __length_hint__ does
not affect the correctness of code.
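This is visible with `operator.length_hint()`, which consults `__length_hint__` but falls back to a caller-supplied default, so a wrapper that drops the hint only loses the optimisation (iterator classes here are illustrative):

```python
import operator

class PlainIterator:
    """An iterator with no size information at all."""
    def __iter__(self):
        return self
    def __next__(self):
        raise StopIteration

class HintedIterator(PlainIterator):
    """Same iterator, but advertising an expected length."""
    def __length_hint__(self):
        return 10

# No hint available: length_hint() quietly returns the default.
assert operator.length_hint(PlainIterator(), 0) == 0
# Hint available: containers can pre-allocate accordingly.
assert operator.length_hint(HintedIterator()) == 10
```

Either way, iterating the object produces the same (correct) results; only the pre-allocation strategy changes.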


> That protocol means that many container wrappers break the
> optimisation. That's not a compatibility problem, it just means those
> wrappers don't support the feature, and it would potentially be a
> useful enhancement if they did.
>
>
Again, ignoring __length_hint__ does not lead to broken code, so that just
means the wrapper is as slow or as fast as it was before.

So I still think it's an issue for the author of the wrapper to fix––even
if just by documenting that the wrapper does not support the new protocol
members. But that would not be necessary if the wrapper uses inheritance.

(Of course there may be another reason not to use inheritance, but just
overriding two methods seems like a good case for inheritance.)

This discussion seems pretty pointless by now. It's true that *some* code
needs to change for this to be a problem. Updating only the Python version
does not break a codebase if libraries aren't updated, and even then,
breakage is not very likely, I suppose.

It all depends on the kind of change that is made. For __length_hint__, you
only risk not getting the performance improvement. For __suspend__ and
__resume__, there's a small chance of problems. For some other change, it
might be even riskier. But this is definitely not the most dangerous type
of compatibility issue.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-04 Thread Koos Zevenhoven
On Wed, Oct 4, 2017 at 3:33 PM, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 4 October 2017 at 20:22, Koos Zevenhoven <k7ho...@gmail.com> wrote:
> > On Wed, Oct 4, 2017 at 8:07 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
> >>
> >> On 3 October 2017 at 03:13, Koos Zevenhoven <k7ho...@gmail.com> wrote:
> >> > Well, it's not completely unrelated to that. The problem I'm talking
> >> > about
> >> > is perhaps most easily seen from a simple context manager wrapper that
> >> > uses
> >> > composition instead of inheritance:
> >> >
> >> > class Wrapper:
> >> >     def __init__(self):
> >> >         self._wrapped = SomeContextManager()
> >> >
> >> >     def __enter__(self):
> >> >         print("Entering context")
> >> >         return self._wrapped.__enter__()
> >> >
> >> >     def __exit__(self, *exc):
> >> >         result = self._wrapped.__exit__(*exc)
> >> >         print("Exited context")
> >> >         return result
> >> >
> >> >
> >> > Now, if the wrapped contextmanager becomes a PEP 521 one with
> >> > __suspend__
> >> > and __resume__, the Wrapper class is broken, because it does not
> respect
> >> > __suspend__ and __resume__. So actually this is a backwards
> compatibility
> >> > issue.
> >>
> >> This is a known problem, and one of the main reasons that having a
> >> truly transparent object proxy like
> >> https://wrapt.readthedocs.io/en/latest/wrappers.html#object-proxy as
> >> part of the standard library would be highly desirable.
> >>
> >
> > This is barely related to the problem I describe. The wrapper is not
> > supposed to pretend to *be* the underlying object. It's just supposed to
> > extend its functionality.
>
> If a wrapper *isn't* trying to act as a transparent object proxy, and
> is instead adapting it to a particular protocol, then yes, you'll need
> to update the wrapper when the protocol is extended.
>
>
Yes, but it still means that the change in the dependency (in this case a
standard Python protocol) breaks the wrapper code.

Remember that the wrappeR class and the wrappeD class can be implemented in
different libraries.



> That's not a backwards compatibility problem, because the only way to
> encounter it is to update your code to rely on the new extended
> protocol - your *existing* code will continue to work fine, since
> that, by definition, can't be relying on the new protocol extension.
>
>
No, not all code is "your" code. Clearly this is not a well-known problem.
This is a backwards-compatibility problem for the author of the wrappeR,
not for the author of the wrappeD object.

––Koos

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-04 Thread Koos Zevenhoven
On Wed, Oct 4, 2017 at 8:07 AM, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 3 October 2017 at 03:13, Koos Zevenhoven <k7ho...@gmail.com> wrote:
> > Well, it's not completely unrelated to that. The problem I'm talking
> about
> > is perhaps most easily seen from a simple context manager wrapper that
> uses
> > composition instead of inheritance:
> >
> > class Wrapper:
> >     def __init__(self):
> >         self._wrapped = SomeContextManager()
> >
> >     def __enter__(self):
> >         print("Entering context")
> >         return self._wrapped.__enter__()
> >
> >     def __exit__(self, *exc):
> >         result = self._wrapped.__exit__(*exc)
> >         print("Exited context")
> >         return result
> >
> >
> > Now, if the wrapped contextmanager becomes a PEP 521 one with __suspend__
> > and __resume__, the Wrapper class is broken, because it does not respect
> > __suspend__ and __resume__. So actually this is a backwards compatibility
> > issue.
>
> This is a known problem, and one of the main reasons that having a
> truly transparent object proxy like
> https://wrapt.readthedocs.io/en/latest/wrappers.html#object-proxy as
> part of the standard library would be highly desirable.
>
>
This is barely related to the problem I describe. The wrapper is not
supposed to pretend to *be* the underlying object. It's just supposed to
extend its functionality.

Maybe it's just me, but using a transparent object proxy for this sounds
like someone trying to avoid inheritance for no reason and at any cost.
Inheritance probably has faster method access, and makes it more obvious
what's going on:

def Wrapper(contextmanager):
    class Wrapper(type(contextmanager)):
        def __enter__(self):
            print("Entering context")
            return contextmanager.__enter__()

        def __exit__(self, *exc):
            result = contextmanager.__exit__(*exc)
            print("Exited context")
            return result

    return Wrapper()


A wrapper based on a transparent object proxy is just a non-transparent
replacement for inheritance. Its wrapper nature is non-transparent because
it pretends to `be` the original object, while it's actually a wrapper.

But an object cannot `be` another object as long as the `is` operator won't
return True. And any straightforward way to implement that would add
performance overhead for normal objects.

I do remember sometimes wanting a transparent object proxy. But not for
normal wrappers. But I don't think I've gone as far as looking for a
library to do that, because it seems that you can only go half way anyway.

––Koos

--
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Investigating time for `import requests`

2017-10-03 Thread Koos Zevenhoven
I've probably missed a lot of this discussion, but this lazy import
discussion confuses me. We already have both eager import (import at the
top of the file), and lazy import (import right before use).

The former is good when you know you need the module, and the latter is
good when having the overhead at first use is preferable to having the
overhead at startup. But like Raymond was saying, this is of course
especially relevant when that import is likely never used.

Maybe the fact that the latter is not recommended gives people the feeling
that we don't have lazy imports, although we do.

What we *don't* have, however, is *partially* lazy imports and partially
executed code, something like:

on demand:
    class Foo:
        # a lot of stuff here

    def foo_function(my_foo, bar):
        # more stuff here


When executed, the `on demand` block would only keep track of which names
are being bound to (here, "Foo" and "foo_function"), and on the lookup of
those names in the namespace, the code would actually be run.

Then you could also do

on demand:
    import sometimes_needed_module

Or

on demand:
    from . import all, submodules, of, this, package


This would of course drift away from "namespaces are simply dicts". But who
cares, if they still provide the dict interface. See e.g. this example with
automatic lazy imports:

https://gist.github.com/k7hoven/21c5532ce19b306b08bb4e82cfe5a609
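For the plain-import case, the standard library already offers a supported approximation: `importlib.util.LazyLoader` defers executing a module until its first attribute access. A sketch along the lines of the documented importlib recipe:

```python
import importlib.util
import sys

def lazy_import(name):
    """Register a module whose loading is deferred to first attribute access."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)  # sets up lazy loading; module body not yet run
    return module

json = lazy_import("json")   # nothing has actually been executed yet
# First attribute access triggers the real import:
assert json.dumps([1, 2]) == "[1, 2]"
```

This gives lazy *module* loading, though not the more general partially-lazy code blocks sketched above.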


Another thing we *don't* have is unimporting. What if I know that I'm only
going to need some particular module in this one initialization function.
Why should I keep it in memory for the whole lifetime of the program?
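A rough approximation of "unimporting" is possible today by dropping the cached module entry, though the memory is only reclaimed if nothing else still references the module, and C extension modules generally keep their static state either way. A hedged sketch (using `json` as a stand-in for some heavy module):

```python
import sys

def init_once():
    import json  # heavy module needed only during initialization
    config = json.loads('{"debug": false}')
    # Drop the module cache entry so the module *may* be garbage-collected.
    # Anything else holding a reference keeps it alive regardless.
    sys.modules.pop("json", None)
    return config

config = init_once()
assert config == {"debug": False}
assert "json" not in sys.modules
```

A later `import json` would simply re-execute the module, so this is safe but only saves memory in the best case.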

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-02 Thread Koos Zevenhoven
On Oct 3, 2017 01:11, "Koos Zevenhoven" <k7ho...@gmail.com> wrote:

On Oct 3, 2017 01:00, "Guido van Rossum" <gu...@python.org> wrote:

On Mon, Oct 2, 2017 at 2:52 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

I don't mind this (or Nathaniel ;-) being academic. The backwards
> incompatibility issue I've just described applies to any extension via
> composition, if the underlying type/protocol grows new members (like the CM
> protocol would have gained __suspend__ and __resume__ in PEP521).
>

Since you seem to have a good grasp on this issue, does PEP 550 suffer from
the same problem? (Or PEP 555, for that matter? :-)



Neither has this particular issue, because they don't extend an existing
protocol. If this thread has any significance, it will most likely be
elsewhere.


That said, I did come across this thought while trying to find flaws in my
own PEP ;)

-- Koos


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-02 Thread Koos Zevenhoven
On Oct 3, 2017 01:00, "Guido van Rossum" <gu...@python.org> wrote:

On Mon, Oct 2, 2017 at 2:52 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

I don't mind this (or Nathaniel ;-) being academic. The backwards
> incompatibility issue I've just described applies to any extension via
> composition, if the underlying type/protocol grows new members (like the CM
> protocol would have gained __suspend__ and __resume__ in PEP521).
>

Since you seem to have a good grasp on this issue, does PEP 550 suffer from
the same problem? (Or PEP 555, for that matter? :-)



Neither has this particular issue, because they don't extend an existing
protocol. If this thread has any significance, it will most likely be
elsewhere.

-- Koos (mobile)


-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-02 Thread Koos Zevenhoven
On Oct 3, 2017 00:02, "Guido van Rossum" <gu...@python.org> wrote:

On Mon, Oct 2, 2017 at 10:13 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

> Hi all, It was suggested that I start a new thread, because the other
> thread drifted away from its original topic. So here, in case someone is
> interested:
>
> On Oct 2, 2017 17:03, "Koos Zevenhoven <k7ho...@gmail.com> wrote:
>
> On Mon, Oct 2, 2017 at 6:42 AM, Guido van Rossum <gu...@python.org> wrote:
>
> On Sun, Oct 1, 2017 at 1:52 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>
> On Oct 1, 2017 19:26, "Guido van Rossum" <gu...@python.org> wrote:
>
> Your PEP is currently incomplete. If you don't finish it, it is not even a
> contender. But TBH it's not my favorite anyway, so you could also just
> withdraw it.
>
>
> I can withdraw it if you ask me to, but I don't want to withdraw it
> without any reason. I haven't changed my mind about the big picture. OTOH,
> PEP 521 is elegant and could be used to implement PEP 555, but 521 is
> almost certainly less performant and has some problems regarding context
> manager wrappers that use composition instead of inheritance.
>
>
> It is my understanding that PEP 521 (which proposes to add optional
> __suspend__ and __resume__ methods to the context manager protocol, to be
> called whenever a frame is suspended or resumed inside a `with` block) is
> no longer a contender because it would be way too slow. I haven't read it
> recently or thought about it, so I don't know what the second issue you
> mention is about (though it's presumably about the `yield` in a context
> manager implemented using a generator decorated with
> `@contextlib.contextmanager`).
>
>
> Well, it's not completely unrelated to that. The problem I'm talking
> about is perhaps most easily seen from a simple context manager wrapper
> that uses composition instead of inheritance:
>
> class Wrapper:
>     def __init__(self):
>         self._wrapped = SomeContextManager()
>
>     def __enter__(self):
>         print("Entering context")
>         return self._wrapped.__enter__()
>
>     def __exit__(self, *exc):
>         result = self._wrapped.__exit__(*exc)
>         print("Exited context")
>         return result
>
>
> Now, if the wrapped contextmanager becomes a PEP 521 one with __suspend__
> and __resume__, the Wrapper class is broken, because it does not respect
> __suspend__ and __resume__. So actually this is a backwards compatibility
> issue.
>
>
Why is it backwards incompatible? I'd think that without PEP 521 it would
be broken in exactly the same way because there's no __suspend__/__resume__
at all.



The wrapper is (would be) broken because it depends on the internal
implementation of the wrapped CM.

Maybe the author of SomeContextManager wants to upgrade the CM to also work
in coroutines and generators. But it could be a more subtle change in the
CM implementation.

The problem becomes more serious and more obvious if you don't know which
context manager you are wrapping:

class Wrapper:
    def __init__(self, contextmanager):
        self._wrapped = contextmanager

    def __enter__(self):
        print("Entering context")
        return self._wrapped.__enter__()

    def __exit__(self, *exc):
        result = self._wrapped.__exit__(*exc)
        print("Exited context")
        return result


The wrapper is (would be) broken because it does not work for all CMs
anymore.


But if the wrapper is made using inheritance, the problem goes away:
>
>
> class Wrapper(SomeContextManager):
>     def __enter__(self):
>         print("Entering context")
>         return super().__enter__()
>
>     def __exit__(self, *exc):
>         result = super().__exit__(*exc)
>         print("Exited context")
>         return result
>
>
> Now the wrapper cleanly inherits the new optional __suspend__ and
> __resume__ from the wrapped context manager type.
>
>
In any case this is completely academic because PEP 521 is not going to
happen. Nathaniel himself has said so (I think in the context of discussing
PEP 550).


I don't mind this (or Nathaniel ;-) being academic. The backwards
incompatibility issue I've just described applies to any extension via
composition, if the underlying type/protocol grows new members (like the CM
protocol would have gained __suspend__ and __resume__ in PEP521).


-- Koos (mobile)


[Python-Dev] Inheritance vs composition in backcompat (PEP521)

2017-10-02 Thread Koos Zevenhoven
Hi all, It was suggested that I start a new thread, because the other
thread drifted away from its original topic. So here, in case someone is
interested:

On Oct 2, 2017 17:03, "Koos Zevenhoven <k7ho...@gmail.com> wrote:

On Mon, Oct 2, 2017 at 6:42 AM, Guido van Rossum <gu...@python.org> wrote:

On Sun, Oct 1, 2017 at 1:52 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:

On Oct 1, 2017 19:26, "Guido van Rossum" <gu...@python.org> wrote:

Your PEP is currently incomplete. If you don't finish it, it is not even a
contender. But TBH it's not my favorite anyway, so you could also just
withdraw it.


I can withdraw it if you ask me to, but I don't want to withdraw it without
any reason. I haven't changed my mind about the big picture. OTOH, PEP 521
is elegant and could be used to implement PEP 555, but 521 is almost
certainly less performant and has some problems regarding context manager
wrappers that use composition instead of inheritance.


It is my understanding that PEP 521 (which proposes to add optional
__suspend__ and __resume__ methods to the context manager protocol, to be
called whenever a frame is suspended or resumed inside a `with` block) is
no longer a contender because it would be way too slow. I haven't read it
recently or thought about it, so I don't know what the second issue you
mention is about (though it's presumably about the `yield` in a context
manager implemented using a generator decorated with
`@contextlib.contextmanager`).


Well, it's not completely unrelated to that. The problem I'm talking about
is perhaps most easily seen from a simple context manager wrapper that uses
composition instead of inheritance:

class Wrapper:
    def __init__(self):
        self._wrapped = SomeContextManager()

    def __enter__(self):
        print("Entering context")
        return self._wrapped.__enter__()

    def __exit__(self, *exc):
        result = self._wrapped.__exit__(*exc)
        print("Exited context")
        return result


Now, if the wrapped contextmanager becomes a PEP 521 one with __suspend__
and __resume__, the Wrapper class is broken, because it does not respect
__suspend__ and __resume__. So actually this is a backwards compatibility
issue.

But if the wrapper is made using inheritance, the problem goes away:


class Wrapper(SomeContextManager):
    def __enter__(self):
        print("Entering context")
        return super().__enter__()

    def __exit__(self, *exc):
        result = super().__exit__(*exc)
        print("Exited context")
        return result


Now the wrapper cleanly inherits the new optional __suspend__ and
__resume__ from the wrapped context manager type.


––Koos




-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Intention to accept PEP 552 soon (deterministic pyc files)

2017-10-02 Thread Koos Zevenhoven
On Mon, Oct 2, 2017 at 6:42 AM, Guido van Rossum <gu...@python.org> wrote:

> On Sun, Oct 1, 2017 at 1:52 PM, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>
>> On Oct 1, 2017 19:26, "Guido van Rossum" <gu...@python.org> wrote:
>>
>> Your PEP is currently incomplete. If you don't finish it, it is not even
>> a contender. But TBH it's not my favorite anyway, so you could also just
>> withdraw it.
>>
>>
>> I can withdraw it if you ask me to, but I don't want to withdraw it
>> without any reason. I haven't changed my mind about the big picture. OTOH,
>> PEP 521 is elegant and could be used to implement PEP 555, but 521 is
>> almost certainly less performant and has some problems regarding context
>> manager wrappers that use composition instead of inheritance.
>>
>
> It is my understanding that PEP 521 (which proposes to add optional
> __suspend__ and __resume__ methods to the context manager protocol, to be
> called whenever a frame is suspended or resumed inside a `with` block) is
> no longer a contender because it would be way too slow. I haven't read it
> recently or thought about it, so I don't know what the second issue you
> mention is about (though it's presumably about the `yield` in a context
> manager implemented using a generator decorated with
> `@contextlib.contextmanager`).
>
>
Well, it's not completely unrelated to that. The problem I'm talking about
is perhaps most easily seen from a simple context manager wrapper that uses
composition instead of inheritance:

class Wrapper:
    def __init__(self):
        self._wrapped = SomeContextManager()

    def __enter__(self):
        print("Entering context")
        return self._wrapped.__enter__()

    def __exit__(self, *exc):
        result = self._wrapped.__exit__(*exc)
        print("Exited context")
        return result


Now, if the wrapped contextmanager becomes a PEP 521 one with __suspend__
and __resume__, the Wrapper class is broken, because it does not respect
__suspend__ and __resume__. So actually this is a backwards compatiblity
__suspend__ and __resume__. So actually this is a backwards compatibility
issue.

But if the wrapper is made using inheritance, the problem goes away:


class Wrapper(SomeContextManager):
    def __enter__(self):
        print("Entering context")
        return super().__enter__()

    def __exit__(self, *exc):
        result = super().__exit__(*exc)
        print("Exited context")
        return result


Now the wrapper cleanly inherits the new optional __suspend__ and
__resume__ from the wrapped context manager type.


––Koos

>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Intention to accept PEP 552 soon (deterministic pyc files)

2017-10-01 Thread Koos Zevenhoven
On Oct 1, 2017 19:26, "Guido van Rossum" <gu...@python.org> wrote:

Your PEP is currently incomplete. If you don't finish it, it is not even a
contender. But TBH it's not my favorite anyway, so you could also just
withdraw it.


I can withdraw it if you ask me to, but I don't want to withdraw it without
any reason. I haven't changed my mind about the big picture. OTOH, PEP 521
is elegant and could be used to implement PEP 555, but 521 is almost
certainly less performant and has some problems regarding context manager
wrappers that use composition instead of inheritance.

-- Koos



On Oct 1, 2017 9:13 AM, "Koos Zevenhoven" <k7ho...@gmail.com> wrote:

> On Sep 29, 2017 18:21, "Guido van Rossum" <gu...@python.org> wrote:
>
>
> PS. PEP 550 is still unaccepted, awaiting a new revision from Yury and
> Elvis.
>
>
> This is getting really off-topic, but I do have updates to add to PEP 555
> if there is interest in that. IMO, 555 is better and most likely faster
> than 550, but on the other hand, the issues with PEP 550 are most likely
> not going to be a problem for me personally.
>
> -- Koos
>


Re: [Python-Dev] Intention to accept PEP 552 soon (deterministic pyc files)

2017-10-01 Thread Koos Zevenhoven
On Sep 29, 2017 18:21, "Guido van Rossum"  wrote:


PS. PEP 550 is still unaccepted, awaiting a new revision from Yury and
Elvis.


This is getting really off-topic, but I do have updates to add to PEP 555
if there is interest in that. IMO, 555 is better and most likely faster
than 550, but on the other hand, the issues with PEP 550 are most likely
not going to be a problem for me personally.

-- Koos


Re: [Python-Dev] PEP 554 v2 (new "interpreters" module)

2017-09-11 Thread Koos Zevenhoven
On Mon, Sep 11, 2017 at 8:51 AM, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 10 September 2017 at 04:04, Nathaniel Smith <n...@pobox.com> wrote:
> > On Sep 8, 2017 4:06 PM, "Eric Snow" <ericsnowcurren...@gmail.com> wrote:
> >
> >
> >run(code):
> >
> >   Run the provided Python code in the interpreter, in the current
> >   OS thread.  If the interpreter is already running then raise
> >   RuntimeError in the interpreter that called ``run()``.
> >
> >   The current interpreter (which called ``run()``) will block until
> >   the subinterpreter finishes running the requested code.  Any
> >   uncaught exception in that code will bubble up to the current
> >   interpreter.
> >
> >
> > This phrase "bubble up" here is doing a lot of work :-). Can you
> elaborate
> > on what you mean? The text now makes it seem like the exception will just
> > pass from one interpreter into another, but that seems impossible – it'd
> > mean sharing not just arbitrary user defined exception classes but full
> > frame objects...
>
> Indeed, I think this will need to be something more akin to
> https://docs.python.org/3/library/subprocess.html#
> subprocess.CalledProcessError,
> where the child interpreter is able to pass back encoded text data
> (perhaps including a full rendered traceback), but the exception
> itself won't be able to be propagated.
>

It would be helpful if at least the exception type could somehow be
preserved / restored / mimicked. Otherwise you need if-elif statements in
your try-excepts and other annoying stuff.
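A sketch of the kind of mimicking this would need, along the lines of subprocess.CalledProcessError: the child interpreter reduces the exception to plain text, and the parent rebuilds a lookalike. All function names here are hypothetical illustrations, not a proposed API:

```python
import builtins

def encode_exception(exc):
    """Child side: reduce an exception to shareable text."""
    return (type(exc).__name__, str(exc))

def decode_exception(name, message):
    """Parent side: rebuild a lookalike, falling back to RuntimeError
    for exception types that don't exist in this interpreter."""
    exc_type = getattr(builtins, name, None)
    if isinstance(exc_type, type) and issubclass(exc_type, BaseException):
        return exc_type(message)
    return RuntimeError(f"{name}: {message}")

payload = encode_exception(KeyError("missing"))
rebuilt = decode_exception(*payload)
assert type(rebuilt) is KeyError
```

With this, `except KeyError:` in the parent would still work for builtin exception types, even though the rebuilt object shares no state with the original.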

-- Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 559 - built-in noop()

2017-09-10 Thread Koos Zevenhoven
On Sun, Sep 10, 2017 at 8:21 PM, Barry Warsaw <ba...@python.org> wrote:

> On Sep 9, 2017, at 15:12, Guido van Rossum <gu...@python.org> wrote:
> >
> > I can't tell whether this was meant seriously, but I don't think it's
> worth it. People can easily write their own dummy function and give it any
> damn semantics they want. Let's reject the PEP.
>
> Alrighty then!  (Yes, it was serious, but I claim post-sprint
> euphoria/delirium).
>
>
Just for future reference, here's a slightly more serious comment:

I think the "pass" statement wasn't mentioned yet, but clearly noop() would
be duplication of functionality. So maybe the closest thing without
duplication would be to make "pass" an expression which evaluates to a
no-op function, but which the compiler could perhaps optimize away if it's
a statement by itself, or is a builtin.
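For comparison, the no-op that people hand-roll today is a one-liner (the
name ``noop`` is just our local choice here):

```python
# The hand-rolled no-op that PEP 559 would replace with a built-in:
# accepts anything, always returns None.
noop = lambda *args, **kwargs: None

assert noop() is None
assert noop(1, 2, key="x") is None
```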

-- Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 559 - built-in noop()

2017-09-09 Thread Koos Zevenhoven
On Sat, Sep 9, 2017 at 10:54 PM, Victor Stinner <victor.stin...@gmail.com>
wrote:

> I always wanted this feature (no kidding).
>
> Would it be possible to add support for the context manager?
>
> with noop(): ...
>
> Maybe noop can be an instance of:
>
> class Noop:
>   def __enter__(self, *args, **kw): return self
>   def __exit__(self, *args): pass
>   def __call__(self, *args, **kw): return self
>
>

This worries me. Clearly, assuming a None-coercing noop, we must have:

noop(foo) is None
noop[foo] is None
noop * foo is None
foo * noop is None
noop + foo is None
foo + noop is None
noop - noop is None
...
noop / 0 is None
...
(noop == None) is None

which can all sort of be implicitly extrapolated to be in the PEP.

But how are you planning to implement:

(noop is None) is None
(obj in noop) is None
(noop in obj) is None

or

(None or noop) is None
(None and noop) is None

and finally:

foo(noop) is None

?

Sooner or later, someone will need all these features, and the PEP does not
seem to address the issue in any way.
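The point can be made concrete with a sketch (hypothetical, not from the
PEP) of such a None-coercing Noop: operator hooks and ``__call__`` can be
made to return ``None``, but identity, membership tests, and ``and``/``or``
are handled by the language itself before any user hook could answer.

```python
class Noop:
    def __call__(self, *args, **kwargs): return None
    def __getitem__(self, key): return None
    def __add__(self, other): return None
    __radd__ = __sub__ = __mul__ = __rmul__ = __truediv__ = __add__
    def __eq__(self, other): return None   # comparison hooks may return None
    __hash__ = object.__hash__             # restore hashability
    def __contains__(self, item): return None

noop = Noop()
assert noop(42) is None and noop["key"] is None
assert (noop + 1) is None and (noop / 0) is None
assert (noop == None) is None              # __eq__ result passes through as-is
# ...but these are out of any object's control:
assert (noop is None) is False             # identity is not a hook
assert (1 in noop) is False                # __contains__ result coerced to bool
assert (None or noop) is noop              # `or` returns an operand, never None
```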

-- Koos




> Victor
>
> Le 9 sept. 2017 11:48 AM, "Barry Warsaw" <ba...@python.org> a écrit :
>
>> I couldn’t resist one more PEP from the Core sprint.  I won’t reveal
>> where or how this one came to me.
>>
>> -Barry
>>
>> PEP: 559
>> Title: Built-in noop()
>> Author: Barry Warsaw <ba...@python.org>
>> Status: Draft
>> Type: Standards Track
>> Content-Type: text/x-rst
>> Created: 2017-09-08
>> Python-Version: 3.7
>> Post-History: 2017-09-09
>>
>>
>> Abstract
>> ========
>>
>> This PEP proposes adding a new built-in function called ``noop()`` which
>> does
>> nothing but return ``None``.
>>
>>
>> Rationale
>> =========
>>
>> It is trivial to implement a no-op function in Python.  It's so easy in
>> fact
>> that many people do it many times over and over again.  It would be
>> useful in
>> many cases to have a common built-in function that does nothing.
>>
>> One use case would be for PEP 553, where you could set the breakpoint
>> environment variable to the following in order to effectively disable it::
>>
>> $ setenv PYTHONBREAKPOINT=noop
>>
>>
>> Implementation
>> ==============
>>
>> The Python equivalent of the ``noop()`` function is exactly::
>>
>> def noop(*args, **kws):
>>     return None
>>
>> The C built-in implementation is available as a pull request.
>>
>>
>> Rejected alternatives
>> =====================
>>
>> ``noop()`` returns something
>> ----------------------------
>>
>> YAGNI.
>>
>> This is rejected because it complicates the semantics.  For example, if
>> you
>> always return both ``*args`` and ``**kws``, what do you return when none
>> of
>> those are given?  Returning a tuple of ``((), {})`` is kind of ugly, but
>> provides consistency.  But you might also want to just return ``None``
>> since
>> that's also conceptually what the function was passed.
>>
>> Or, what if you pass in exactly one positional argument, e.g.
>> ``noop(7)``.  Do
>> you return ``7`` or ``((7,), {})``?  And so on.
>>
>> The author claims that you won't ever need the return value of ``noop()``
>> so
>> it will always return ``None``.
>>
>> Coghlan's Dialogs (edited for formatting):
>>
>> My counterargument to this would be ``map(noop, iterable)``,
>> ``sorted(iterable, key=noop)``, etc. (``filter``, ``max``, and
>> ``min`` all accept callables that accept a single argument, as do
>> many of the itertools operations).
>>
>> Making ``noop()`` a useful default function in those cases just
>> needs the definition to be::
>>
>>    def noop(*args, **kwds):
>>        return args[0] if args else None
>>
>> The counterargument to the counterargument is that using ``None``
>> as the default in all these cases is going to be faster, since it
>> lets the algorithm skip the callback entirely, rather than calling
>> it and having it do nothing useful.
>>
>>
>> Copyright
>> =========
>>
>> This document has been placed in the public domain.
>>
>>
>> ..
>>Local Variables:
>>mode: indented-text
>>indent-tabs-mode: nil
>>sentence-end-double-space: t
>>fill-column: 70
>>coding: utf-8
>>End:
>>
>>
>> ___
>> Python-Dev mailing list
>> Python-Dev@python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe: https://mail.python.org/mailman/options/python-dev/victor.
>> stinner%40gmail.com
>>
>>
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/
> k7hoven%40gmail.com
>
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Consolidate stateful runtime globals

2017-09-07 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 8:17 PM, Benjamin Peterson <benja...@python.org>
wrote:

> On Wed, Sep 6, 2017, at 10:08, Antoine Pitrou wrote:
> > On Wed, 06 Sep 2017 09:42:29 -0700
> > Benjamin Peterson <benja...@python.org> wrote:
> > > On Wed, Sep 6, 2017, at 03:14, Antoine Pitrou wrote:
> > > >
> > > > Hello,
> > > >
> > > > I'm a bit concerned about
> > > > https://github.com/python/cpython/commit/
> 76d5abc8684bac4f2fc7cccfe2cd940923357351
> > > >
> > > > My main gripe is that it makes writing C code more tedious.  Simple C
> > > > global variables such as "_once_registry" are now spelled
> > > > "_PyRuntime.warnings.once_registry".  The most egregious example
> seems
> > > > to be "_PyRuntime.ceval.gil.locked" (used to be simply "gil_locked").
> > > >
> > > > Granted, C is more verbose than Python, but it doesn't have to become
> > > > that verbose.  I don't know about you, but when code becomes annoying
> > > > to type, I tend to try and take shortcuts.
> > >
> > > How often are you actually typing the names of runtime globals, though?
> >
> > Not very often, but if I want to experiment with some low-level
> > implementation details, it is nice to avoid the hassle.
>
> It seems like this could be remediated with some inline functions or
> macros, which would also help safely encapsulate state.
>
> >
> > There's also a readability argument: with very long names, expressions
> > can become less easy to parse.
> >
> > > If you are using a global, perhaps the typing time will allow you to
> > > fully consider the gravity of the situation.
> >
> > Right, I needed to be reminded of how perilous the use of C globals is.
> > Perhaps I should contact the PSRT the next time I contemplate using a C
> > global.
>
> It's not just you but future readers.


Great. Related to this, there is also discussion on dangers of globals and
other widely-scoped variables in the Rationale section of PEP 555
(Context-local variables), for anyone interested. But if you read the draft
I posted on python-ideas last Monday, you've already seen it.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-09-07 Thread Koos Zevenhoven
On Thu, Sep 7, 2017 at 10:54 AM, Greg Ewing <greg.ew...@canterbury.ac.nz>
wrote:

> Yury Selivanov wrote:
>
>> def foo():
>>  var = ContextVar()
>>  var.set(1)
>>
>> for _ in range(10**6): foo()
>>
>> If 'var' is strongly referenced, we would have a bunch of them.
>>
>
> Erk. This is not how I envisaged context vars would be
> used. What I thought you would do is this:
>
>my_context_var = ContextVar()
>
>def foo():
>   my_context_var.set(1)
>
> This problem would also not arise if context vars
> simply had names instead of being magic key objects:
>
>def foo():
>   contextvars.set("mymodule.myvar", 1)
>
> That's another thing I think would be an improvement,
> but it's orthogonal to what we're talking about here
> and would be best discussed separately.
>
>
There are lots of things in this discussion that I should have commented
on, but here's one related to this.

PEP 555 does not have the resource-management issue described above and
needs no additional tricks to achieve that:

# using PEP 555

def foo():
    var = contextvars.Var()
    with var.assign(1):
        ...  # do something [*]

for _ in range(10**6):
    foo()


Every time foo is called, a new context variable is created, but that's
perfectly fine, and lightweight. As soon as the context manager exits,
there are no references to the Assignment object returned by var.assign(1),
and as soon as foo() returns, there are no references to var, so everything
should get cleaned up nicely.

And regarding string keys, they have pros and cons, and they can be added
easily, so let's not go there now.

-- Koos


[*] (nit-picking) without closures that would keep the var reference alive


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 549: Instance Properties (aka: module properties)

2017-09-06 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 1:52 AM, Nathaniel Smith <n...@pobox.com> wrote:
​[...]​


> import sys, types
>
> class _MyModuleType(types.ModuleType):
>     @property
>     def ...
>
>     @property
>     def ...
>
> sys.modules[__name__].__class__ = _MyModuleType
>
> It's definitely true though that they're not the most obvious lines of
> code :-)
>
>
It would kind of be in line with the present behavior if you could simply
write something like this in the module:

class __class__(types.ModuleType):
    @property
    def hello(self):
        return "hello"

    def __dir__(self):
        return ["hello"]

assuming it would be equivalent to setting __class__ afterwards.
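For reference, the idiom that already works today can be exercised on a
synthetic module. The module name ``demo_mod`` is made up for the sketch:

```python
import types

# Build a throwaway module and give it a property by swapping its class
# to a ModuleType subclass -- the established idiom from Nathaniel's post.
mod = types.ModuleType("demo_mod")

class _DemoModuleType(types.ModuleType):
    @property
    def hello(self):
        return "hello"

mod.__class__ = _DemoModuleType
assert mod.hello == "hello"
```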

​--Koos​




-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-09-06 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 8:22 PM, Yury Selivanov <yselivanov...@gmail.com>
wrote:
​[...]
​


> PEP 550 treats coroutines and generators as objects that support out
> of order execution.


Out of order? More like interleaved.

> PEP 555 still doesn't clearly explain how exactly it is different from
> PEP 550.  Because 555 was posted *after* 550, I think that it's PEP
> 555 that should have that comparison.
>

555 was *posted* as a pep after 550, yes. And yes, there could be a
comparison, especially now that PEP 550 semantics seem to have converged,
so PEP 555 does not have to adapt the comparison to PEP 550 changes.

-- Koos​


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-09-06 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 8:16 PM, Guido van Rossum <gu...@python.org> wrote:

> On Wed, Sep 6, 2017 at 8:07 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>
>> I think yield from should have the same semantics as iterating over the
>> generator with next/send, and PEP 555 has no issues with this.
>>
>
> I think the onus is on you and Greg to show a realistic example that shows
> why this is necessary.
>
>
Well, regarding this part, it's just that things like

for obj in gen:
    yield obj

often get modernized into

yield from gen

And realistic examples of that include pretty much any normal use of yield
from.


So far all the argumentation about this has been of the form "if you have
> code that currently does this (example using foo) and you refactor it in
> using yield from (example using bar), and if you were relying on context
> propagation back out of calls, then it should still propagate out."
>
>
So here's a realistic example, with the semantics of PEP 550 applied to a
decimal.setcontext() kind of thing, but it could be anything using
var.set(value):

def process_data_buffers(buffers):
    setcontext(default_context)
    for buf in buffers:
        for data in buf:
            if data.tag == "NEW_PRECISION":
                setcontext(context_based_on(data))
            else:
                yield compute(data)


Code smells? Yes, but maybe you often see much worse things, so let's say
it's fine.

But then, if you refactor it into a subgenerator like this:

def process_data_buffer(buffer):
    for data in buffer:
        if data.tag == "NEW_PRECISION":
            setcontext(context_based_on(data))
        else:
            yield compute(data)

def process_data_buffers(buffers):
    setcontext(default_context)
    for buf in buffers:
        yield from process_data_buffer(buf)


Now, if setcontext uses PEP 550 semantics, the refactoring broke the code,
because a generator introduces a scope barrier by adding a LogicalContext
on the stack, so the setcontext call is local to the process_data_buffer
subgenerator. But the programmer is puzzled, because with regular functions
it had worked just fine in a similar situation before they learned about
generators:


def process_data_buffer(buffer, output):
    for data in buffer:
        if data.tag == "precision change":
            setcontext(context_based_on(data))
        else:
            output.append(compute(data))

def process_data_buffers(buffers):
    output = []
    setcontext(default_context)
    for buf in buffers:
        process_data_buffer(buf, output)
    return output

In fact, this code had another problem, namely that the context state is
leaked out of process_data_buffers, because PEP 550 leaks context state
out of functions, but not out of generators. But we can easily imagine that
the unit tests for process_data_buffers *do* pass.

But let's look at a user of the functionality:

def get_total():
    return sum(process_data_buffers(get_buffers()))

setcontext(somecontext)
value = get_total() * compute_factor()


Now the code is broken, because setcontext(somecontext) has no effect,
because get_total() leaks out another context. Not to mention that our data
buffer source now has control over the behavior of compute_factor(). But if
one is lucky, the last line was written as

value = compute_factor() * get_total()


And hooray, the code works!

(Except for perhaps the code that is run after this.)


Now this was of course a completely fictional example, and hopefully I
didn't introduce any bugs or syntax errors other than the ones I described.
I haven't seen code like this anywhere, but somehow we caught the problems
anyway.
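The generator-isolation behavior at the heart of this example can be
emulated with a toy context stack. This is a sketch with made-up helpers
(``set_var``, ``get_var``, an ``isolated`` wrapper), not the PEP 550 API:

```python
_stack = [{}]  # chain of logical contexts; innermost last

def set_var(name, value):
    # Writes always go to the innermost logical context.
    _stack[-1][name] = value

def get_var(name, default=None):
    # Lookups walk the chain from innermost to outermost.
    for lc in reversed(_stack):
        if name in lc:
            return lc[name]
    return default

def isolated(gen_func):
    # Wrap a generator so each next() runs with its own LC pushed,
    # mimicking PEP 550's per-generator LogicalContext.
    def wrapper(*args, **kwargs):
        gen = gen_func(*args, **kwargs)
        lc = {}
        while True:
            _stack.append(lc)
            try:
                value = next(gen)
            except StopIteration:
                return
            finally:
                _stack.pop()
            yield value
    return wrapper

@isolated
def gen():
    set_var("ctx", "inner")
    yield get_var("ctx")

def func():
    set_var("ctx", "inner")

set_var("ctx", "outer")
assert list(gen()) == ["inner"]   # visible inside the generator...
assert get_var("ctx") == "outer"  # ...but isolated from the caller
func()
assert get_var("ctx") == "inner"  # a plain function call leaks
```

This is exactly the asymmetry the example exploits: the same ``set_var``
call behaves differently depending on whether it runs in a generator or in
a plain function.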


-- Koos



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-09-06 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 10:07 AM, Greg Ewing <greg.ew...@canterbury.ac.nz>
wrote:

> Yury Selivanov wrote:
>
>> Greg, have you seen this new section:
>> https://www.python.org/dev/peps/pep-0550/#should-yield-from-
>> leak-context-changes
>>
>
> That section seems to be addressing the idea of a generator
> behaving differently depending on whether you use yield-from
> on it.
>

Regarding this, I think yield from should have the same semantics as
iterating over the generator with next/send, and PEP 555 has no issues with
this.


>
> I never suggested that, and I'm still not suggesting it.
>
> The bottomline is that it's easier to
>> reason about context when it's guaranteed that context changes are
>> always isolated in generators no matter what.
>>
>
> I don't see a lot of value in trying to automagically
> isolate changes to global state *only* in generators.
>
> Under PEP 550, if you want to e.g. change the decimal
> context temporarily in a non-generator function, you're
> still going to have to protect those changes using a
> with-statement or something equivalent. I don't see
> why the same thing shouldn't apply to generators.
> ​​
>
> It seems to me that it will be *more* confusing to give
> generators this magical ability to avoid with-statements.
> ​​
>
>
​Exactly. To state it clearly: PEP 555 does not have this issue.


​––Koos​



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-09-06 Thread Koos Zevenhoven
On Wed, Sep 6, 2017 at 12:13 PM, Nathaniel Smith <n...@pobox.com> wrote:

> On Wed, Sep 6, 2017 at 1:49 AM, Ivan Levkivskyi <levkivs...@gmail.com>
> wrote:
> > Another comment from bystander point of view: it looks like the
> discussions
> > of API design and implementation are a bit entangled here.
> > This is much better in the current version of the PEP, but still there
> is a
> > _feelling_ that some design decisions are influenced by the
> implementation
> > strategy.
> >
> > As I currently see the "philosophy" at large is like this:
> > there are different level of coupling between concurrently executing
> code:
> > * processes: practically not coupled, designed to be long running
> > * threads: more tightly coupled, designed to be less long-lived, context
> is
> > managed by threading.local, which is not inherited on "forking"
> > * tasks: tightly coupled, designed to be short-lived, context will be
> > managed by PEP 550, context is inherited on "forking"
> >
> > This seems right to me.
> >
> > Normal generators fall out from this "scheme", and it looks like their
> > behavior is determined by the fact that coroutines are implemented as
> > generators. What I think miht help is to add few more motivational
> examples
> > to the design section of the PEP.
>
> Literally the first motivating example at the beginning of the PEP
> ('def fractions ...') involves only generators, not coroutines, and
> only works correctly if generators get special handling. (In fact, I'd
> be curious to see how Greg's {push,pop}_local_storage could handle
> this case.) The implementation strategy changed radically between v1
> and v2 because of considerations around generator (not coroutine)
> semantics. I'm not sure what more it can do to dispel these feelings
> :-).
>
>
​Just to mention that this is now closely related to the discussion on my
proposal on python-ideas. BTW, that proposal is now submitted as PEP 555 on
the peps repo.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-08-31 Thread Koos Zevenhoven
On Wed, Aug 30, 2017 at 5:36 PM, Yury Selivanov <yselivanov...@gmail.com>
wrote:

> On Wed, Aug 30, 2017 at 9:44 AM, Yury Selivanov <yselivanov...@gmail.com>
> wrote:
> [..]
> >> FYI, I've been sketching an alternative solution that addresses these
> kinds
> >> of things. I've been hesitant to post about it, partly because of the
> >> PEP550-based workarounds that Nick, Nathaniel, Yury etc. have been
> >> describing, and partly because that might be a major distraction from
> other
> >> useful discussions, especially because I wasn't completely sure yet
> about
> >> whether my approach has some fatal flaw compared to PEP 550 ;).
> >
> > We'll never know until you post it. Go ahead.
>
>
Anyway, thanks to these efforts, your proposal has become somewhat more
competitive compared to mine ;). I'll post mine as soon as I find the time
to write everything down. My intention is before next week.


—Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v4

2017-08-30 Thread Koos Zevenhoven
On Wed, Aug 30, 2017 at 2:36 AM, Greg Ewing <greg.ew...@canterbury.ac.nz>
wrote:

> Yury Selivanov wrote:
>
>> While we want "yield from" to have semantics close to a function call,
>>
>
> That's not what I said! I said that "yield from foo()" should
> have semantics close to a function call. If you separate the
> "yield from" from the "foo()", then of course you can get
> different behaviours.
>
> But that's beside the point, because I'm not suggesting
> that generators should behave differently depending on when
> or if you use "yield from" on them.
>
> For (1) we want the context change to be isolated.  For (2) you say
>> that the context change should propagate to the caller.
>>
>
> No, I'm saying that the context change should *always*
> propagate to the caller, unless you do something explicit
> within the generator to prevent it.
>
> I have some ideas on what that something might be, which
> I'll post later.
>
>
​FYI, I've been sketching an alternative solution that addresses these
kinds of things. I've been hesitant to post ​about it, partly because of
the PEP550-based workarounds that Nick, Nathaniel, Yury etc. have been
describing, and partly because that might be a major distraction from other
useful discussions, especially because I wasn't completely sure yet about
whether my approach has some fatal flaw compared to PEP 550 ;).


—Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v3

2017-08-22 Thread Koos Zevenhoven
On Tue, Aug 22, 2017 at 12:44 AM, Yury Selivanov <yselivanov...@gmail.com>
wrote:

> On Mon, Aug 21, 2017 at 5:39 PM, Koos Zevenhoven <k7ho...@gmail.com>
> wrote:
> [..]
> >> In the current version of the PEP, generators are initialized with an
> >> empty LogicalContext.  When they are being iterated (started or
> >> resumed), their LogicalContext is pushed to the EC.  When the
> >> iteration is stopped (or paused), they pop their LC from the EC.
> >>
> >
> > Another quick one before I go: Do we really need to push and pop a LC on
> > each next() call, even if it most likely will never be touched?
>
> Yes, otherwise it will be hard to maintain the consistency of the stack.
>
> There will be an optimization: if the LC is empty, we will push NULL
> to the stack, thus avoiding the cost of allocating an object.
>
But if LCs are immutable, there needs to be only one empty-LC instance.
That would avoid special-casing NULL in code.
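The suggestion can be sketched with a stand-in immutable LogicalContext;
the names here are illustrative, not CPython's internals:

```python
from types import MappingProxyType

class LogicalContext:
    # Immutable mapping: "mutation" returns a new LC, originals never change.
    __slots__ = ("_data",)
    def __init__(self, data=()):
        self._data = MappingProxyType(dict(data))
    def with_value(self, key, value):
        return LogicalContext({**self._data, key: value})
    def get(self, key, default=None):
        return self._data.get(key, default)

EMPTY_LC = LogicalContext()  # the single shared empty instance

lc1 = EMPTY_LC.with_value("var", 1)
assert lc1.get("var") == 1
assert EMPTY_LC.get("var") is None       # the shared empty LC is untouched
assert lc1.with_value("x", 2) is not lc1 # every write yields a fresh LC
```

Because the empty instance can never be mutated, every generator can share
it, and code that walks the stack never needs a NULL check.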

​-- Koos​




> I measured the overhead -- generators will become 0.5-1% slower in
> microbenchmarks, but only when they do pretty much nothing. If a
> generator contains more Python code than a bare "yield" expression,
> the overhead will be harder to detect.




-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v3

2017-08-21 Thread Koos Zevenhoven
On Tue, Aug 22, 2017 at 12:25 AM, Yury Selivanov <yselivanov...@gmail.com>
wrote:

> On Mon, Aug 21, 2017 at 5:14 PM, Koos Zevenhoven <k7ho...@gmail.com>
> wrote:
> [..]
> >> This has consequences for the design in the PEP:
> >>
> >> * what we want to capture at generator creation time is the context
> >> where writes will happen, and we also want that to be the innermost
> >> context used for lookups
> >
> >
> > I don't get it. How is this a consequence of the above two points? And
> why
> > do we need to capture something (a "context") at generator creation time?
> >
>
> We don't need to "capture" anything when a generator is created (it
> was something that PEP 550 version 1 was doing).
>
>
​Ok, good.​



> In the current version of the PEP, generators are initialized with an
> empty LogicalContext.  When they are being iterated (started or
> resumed), their LogicalContext is pushed to the EC.  When the
> iteration is stopped (or paused), they pop their LC from the EC.
>
>
Another quick one before I go: Do we really need to push and pop a LC on
each next() call, even if it most likely will never be touched?

-- Koos

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 550 v3

2017-08-21 Thread Koos Zevenhoven
On Mon, Aug 21, 2017 at 5:12 PM, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 21 August 2017 at 15:03, Guido van Rossum <gu...@python.org> wrote:
> > Honestly I'm not sure we need the distinction between LC and EC. If you
> read
> > carefully some of the given example code seems to confuse them. If we
> could
> > get away with only a single framework-facing concept, I would be happy
> > calling it ExecutionContext.
>
> Unfortunately, I don't think we can, and that's why I tried to reframe
> the discussion in terms of "Where ContextKey.set() writes to" and
> "Where ContextKey.get() looks things up".
>
> Consider the following toy generator:
>
> def tracking_gen():
>     start_tracking_iterations()
>     while True:
>         tally_iteration()
>         yield
>
> task_id = ContextKey("task_id")
> iter_counter = ContextKey("iter_counter")
>
> def start_tracking_iterations():
>     iter_counter.set(collections.Counter())
>
> def tally_iteration():
>     current_task = task_id.get()  # Set elsewhere
>     iter_counter.get()[current_task] += 1
>
> Now, this isn't a very *sensible* generator (since it could just use a
> regular object instance for tracking instead of a context variable),
> but nevertheless, it's one that we would expect to work, and it's one
> that we would expect to exhibit the following properties:
>
> 1. When tally_iteration() calls task_id.get(), we expect that to be
> resolved in the context calling next() on the instance, *not* the
> context where the generator was first created
> 2. When tally_iteration() calls iter_counter.get(), we expect that to
> be resolved in the same context where start_tracking_iterations()
> called iter_counter.set()
>
> This has consequences for the design in the PEP:
>
> * what we want to capture at generator creation time is the context
> where writes will happen, and we also want that to be the innermost
> context used for lookups
>

​I don't get it. How is this a consequence of the above two points? And why
do we need to capture something (a "context") at generator creation time?

​-- Koos​



> * other than that innermost context, we want everything else to be dynamic
> * this means that "mutable context saved on the generator" and "entire
> dynamic context visible when the generator runs" aren't the same thing
>
> And hence the introduction of the LocalContext/LogicalContext
> terminology for the former, and the ExecutionContext terminology for
> the latter.
>
>
> ​[...]​




-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] for...else

2017-07-26 Thread Koos Zevenhoven
On Jul 27, 2017 02:38, "MRAB" <pyt...@mrabarnett.plus.com> wrote:

On 2017-07-26 23:55, Koos Zevenhoven wrote:

>
> IMO,
>
> for item in sequence:
>     # block
> nobreak:   # or perhaps `if not break:`
>     # block
>
> would be clearer (if the syntax is necessary at all).
>

You couldn't have "if not break:" because that would look like the start of
an 'if' statement.


Do you mean as an implementation issue or for human readability?

"nobreak" would introduce a new keyword, but "not break" wouldn't.


Sure :)

-- Koos (mobile)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] for...else

2017-07-26 Thread Koos Zevenhoven
On Mon, Jul 24, 2017 at 7:14 PM, Steven D'Aprano <st...@pearwood.info>
wrote:

> Hello Kiuhnm, and welcome.
>
> On Mon, Jul 24, 2017 at 05:35:03PM +0200, Kiuhnm via Python-Dev wrote:
> > Hello,
> >
> > I think that the expression "for...else" or "while...else" is completely
> > counter-intuitive.
>
>
> You may be right -- this has been discussed many, many times before. In
> my personal opinion, the best (and only accurate!) phrase would have
> been:
>
> for item in sequence:
>     # block
> then:
>     # block
>
>
IMO,

for item in sequence:
    # block
nobreak:   # or perhaps `if not break:`
    # block

would be clearer (if the syntax is necessary at all).


[...]


>
> > Wouldn't it be possible to make it clearer? Maybe
> > something like
>
> At this point, no, it is not practical to change the syntax used. Maybe
> when Python 3.0 was first introduced, but that ship has long sailed. It
> is very, very unlikely that the syntax for this will ever change, but if
> it does, it probably won't be until something in the distant future like
> Python 5.
>

I don't have a strong opinion on this particular case, but if something
like this is changed in Python 5, I think the decision should be made much
earlier (now?) so that the old else syntax could be discouraged (and new
syntax potentially already introduced). The same thing would apply to many
other "possibly in Python 5" changes, where there is no reason to expect
that the situation is somehow different years later.

-- Koos


>
> But not Python 4: Guido has already ruled that Python 4 will not include
> major backwards-incompatible changes. Going from 3 to 4 will not be as
> disruptive as going from 2 to 3.
>
>
​[...]​


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 484 update proposal: annotating decorated declarations

2017-06-02 Thread Koos Zevenhoven
On Fri, Jun 2, 2017 at 8:57 PM, Guido van Rossum <gu...@python.org> wrote:
> On Fri, Jun 2, 2017 at 9:41 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>>
>> I still don't understand what would happen with __annotations__. If
>> the decorator returns a non-function, one would expect the annotations
>> to be in the __annotations__ attribute of the enclosing class or
>> module. If it returns a function, they would be in the __annotations__
>> attribute of the function. And I'm talking about the runtime behavior
> in Python as explained in PEP 484 and PEP 526. I would expect these
>> declarations to behave according to the same principles as other ways
>> to annotate variables/functions. If there is no runtime behavior, a
>> comment-based syntax might be more appropriate. Or have I missed
>> something?
>
>
> So when returning a function, the runtime version of the decorator can
> easily update the function's __annotations__. But when returning a
> non-function, the decorator would have a hard time updating
__annotations__
> of the containing class/module without "cheating" (e.g. sys._getframe()).
I
> think the latter is similar to e.g. attributes defined with @property --
> those don't end up in __annotations__ either. I think this is an
acceptable
> deficiency.
>

I suppose it is, especially because there seems to be nothing that prevents
you from getting runtime annotations in the enclosing class/module:


number: int

@call
def number():
    return 42
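
[Editor's note: the `call` decorator above is not a builtin. A minimal sketch,
shown here at class scope so the runtime effect on __annotations__ is
visible, could be:]

```python
def call(f):
    # Immediately invoke the decorated function and bind its result
    # to the function's name.
    return f()

class Config:
    number: int        # recorded in Config.__annotations__, no value set

    @call
    def number():
        return 42

assert Config.number == 42
assert Config.__annotations__['number'] is int
```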


But for functions one could have (using the context manager example):


def session(url: str) -> ContextManager[DatabaseSession]: ...

@predeclared
@contextmanager
def session(url: str) -> Iterator[DatabaseSession]:
    s = DatabaseSession(url)
    try:
        yield s
    finally:
        s.close()


This makes it clear that the function is declared elsewhere. But the
`predeclared` decorator would need tricks like sys._getframe(1) to set
session.__annotations__ according to the predeclaration.
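
[Editor's note: one possible sketch of such a `predeclared` decorator,
illustrative only; it relies on the CPython-specific sys._getframe() trick
mentioned above, and assumes the stub is still bound to the same name in the
caller's namespace at decoration time.]

```python
import sys

def predeclared(func):
    # Find the earlier stub declaration bound to the same name in the
    # caller's namespace and copy its annotations onto the new object.
    caller = sys._getframe(1)
    stub = caller.f_locals.get(func.__name__,
                               caller.f_globals.get(func.__name__))
    if stub is not None and getattr(stub, '__annotations__', None):
        func.__annotations__ = dict(stub.__annotations__)
    return func

def greet(name: str) -> str: ...    # the predeclaration

@predeclared
def greet(name):                    # the real definition; annotations copied
    return 'hi ' + name

assert greet.__annotations__ == {'name': str, 'return': str}
assert greet('Koos') == 'hi Koos'
```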

-- Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 484 update proposal: annotating decorated declarations

2017-06-02 Thread Koos Zevenhoven
On Fri, Jun 2, 2017 at 6:34 PM, Naomi Seyfer <na...@seyfer.org> wrote:
> Yep, interested in implementing it!  I will put implementation time on my
> schedule and tell y'all when it is, for holding myself accountable -- it
> turns out I never do anything not on my schedule.
>

I still don't understand what would happen with __annotations__. If
the decorator returns a non-function, one would expect the annotations
to be in the __annotations__ attribute of the enclosing class or
module. If it returns a function, they would be in the __annotations__
attribute of the function. And I'm talking about the runtime behavior
in Python as explained in PEP 484 and PEP 526. I would expect these
declarations to behave according to the same principles as other ways
to annotate variables/functions. If there is no runtime behavior, a
comment-based syntax might be more appropriate. Or have I missed
something?


—Koos



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 484 update proposal: annotating decorated declarations

2017-05-15 Thread Koos Zevenhoven
On Tue, May 9, 2017 at 8:19 PM, Guido van Rossum <gu...@python.org> wrote:
> There's a PR to the peps proposal here:
> https://github.com/python/peps/pull/242
>
> The full text of the current proposal is below. The motivation for this is
> that for complex decorators, even if the type checker can figure out what's
> going on (by taking the signature of the decorator into account), it's
> sometimes helpful to the human reader of the code to be reminded of the type
> after applying the decorators (or a stack thereof). Much discussion can be
> found in the PR. Note that we ended up having `Callable` in the type because
> there's no rule that says a decorator returns a function type (e.g.
> `property` doesn't).
>
> This is a small thing but I'd like to run it by a larger audience than the
> core mypy devs who have commented so far. There was a brief discussion on
> python-ideas (my original, favorable reply by Nick, my response).
>
> Credit for the proposal goes to Naomi Seyfer, with discussion by Ivan
> Levkivskyi and Jukka Lehtosalo.
>
> If there's no further debate here I'll merge it into the PEP and an
> implementation will hopefully appear in the next version of the typing
> module (also hopefully to be included in CPython 3.6.2 and 3.5.4).
>

So the change would only affect early adopters of this typing feature,
who are likely to upgrade to newer python versions often? Could this
be called a 3.7 feature with a clearly documented bonus that it also
works in 3.6.2+ and 3.5.4+? I mean, to prevent 3rd-party libraries
tested with 3.5(.4) from being broken in 3.5.3?

> Here's the proposed text (wordsmithing suggestions in the PR please):
>
> +Decorators
> +--
> +
> +Decorators can modify the types of the functions or classes they
> +decorate. Use the ``decorated_type`` decorator to declare the type of
> +the resulting item after all other decorators have been applied::
> +
> +    from typing import ContextManager, Iterator, decorated_type
> +    from contextlib import contextmanager
> +
> +    class DatabaseSession: ...
> +
> +    @decorated_type(Callable[[str], ContextManager[DatabaseSession]])
> +    @contextmanager
> +    def session(url: str) -> Iterator[DatabaseSession]:
> +        s = DatabaseSession(url)
> +        try:
> +            yield s
> +        finally:
> +            s.close()
> +
> +The argument of ``decorated_type`` is a type annotation on the name
> +being declared (``session``, in the example above). If you have
> +multiple decorators, ``decorated_type`` must be topmost. The
> +``decorated_type`` decorator is invalid on a function declaration that
> +is also decorated with ``overload``, but you can annotate the
> +implementation of the overload series with ``decorated_type``.
> +
>

Would __annotations__ be set by the decorator? To me, not setting them
would seem weird, but cases where the result is not a function could be
awkward. I also don't see a mention of this working only in stubs.
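
[Editor's note: as a runtime aside that is easy to verify today — because
contextlib.contextmanager applies functools.wraps, the generator's
__annotations__ are already copied onto the decorated result.]

```python
from contextlib import contextmanager
from typing import Iterator

@contextmanager
def session(url: str) -> Iterator[str]:
    # Toy context manager; just yields its argument back.
    yield url

# functools.wraps copies __annotations__ from the wrapped generator:
assert session.__annotations__ == {'url': str, 'return': Iterator[str]}

with session('db://example') as s:
    assert s == 'db://example'
```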

I like Jukka's version, as it has a clear distinction between
functions and other attributes. But that might require a language
change to provide __annotations__ in a clean manner? Maybe that
language change would be useful elsewhere.

—Koos



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Is adding support for os.PathLike an enhancement or bugfix?

2017-05-05 Thread Koos Zevenhoven
On May 5, 2017 10:39 PM, "Chris Barker"  wrote:

Sorry to come late to the game; it wasn't immediately clear to me what the
implications of the "enhancement or bugfix" distinction were...

On Thu, May 4, 2017 at 9:46 PM, Nick Coghlan  wrote:

> That improved casting mechanism and the implicit support in the low
> level APIs is the main benefit I see in PEP 519, and if we were
> talking about an os module API that was missing os.path support, I'd
> be more likely to be on the "it's a bug" side of things.
>

absolutely.


> It's only higher level APIs like shutil that I put into the same
> category as third party libraries for now: Python 3.6 users shouldn't
> expect implicit use of the fspath protocol to be universal yet,


I think stdlib packages like shutil are very much distinct from third-party
libs, and users certainly expect the entire stdlib to be updated to
support new language features.

We very often make the distinction between third party libs and stdlibs --
in fact, that is one reason we are reluctant to add new packages to the
stdlib...

Indeed, IIRC, that was the entire motivation for PEP 519 -- it started as a
"let's make Path objects work with the stdlib" effort, but when we looked at
how to do that, it became clear that a new protocol was the best way to do
it while also providing flexibility.

And the PEP states:

"""
Changes to Python's standard library are also proposed to utilize this
protocol where appropriate to facilitate the use of path objects where
historically only str and/or bytes file system paths are accepted. The goal
is to facilitate the migration of users towards rich path objects while
providing an easy way to work with code expecting str or bytes .
"""

It doesn't actually say "everywhere possible in the stdlib", but if the
goal is to facilitate migration, as stated, then all but the truly obscure
functions should be covered -- and shutil is certainly not obscure.


Indeed.

So it would be really great if any updates to shutils (and other stdlib
packages) to support the new protocol be back-ported.

Yes, it's "just" adding an extra call, but in my experience, even a small
barrier to entry is enough to discourage adoption -- and a handful of
shutil functions failing will certainly be enough to keep some folks from
adopting the new approach.


I suppose the worst thing to happen is what Eric describes. But technically
speaking, passing os.PathLike to shutil functions might currently be
undefined behavior, so the change is 'legal' if it's not documented ;).
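
[Editor's note: for reference, the PEP 519 protocol itself is tiny — any
object with __fspath__ is accepted by os.fspath() and, where updated, by
stdlib functions. The MyPath class is a made-up minimal example.]

```python
import os

class MyPath:
    """A minimal path-like object implementing the PEP 519 protocol."""
    def __init__(self, path):
        self._path = path
    def __fspath__(self):
        return self._path

# os.fspath() unwraps path-like objects to str (or bytes):
assert os.fspath(MyPath('/tmp/data.txt')) == '/tmp/data.txt'
# Plain str passes through unchanged:
assert os.fspath('/tmp/data.txt') == '/tmp/data.txt'
```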

-- Koos (mobile)


Re: [Python-Dev] Is adding support for os.PathLike an enhancement or bugfix?

2017-05-04 Thread Koos Zevenhoven
On Thu, May 4, 2017 at 8:30 PM, Terry Reedy <tjre...@udel.edu> wrote:
> On 5/4/2017 10:43 AM, Koos Zevenhoven wrote:
>> On Thu, May 4, 2017 at 4:19 AM, Terry Reedy <tjre...@udel.edu> wrote:
>>> Enhancing public APIs in normal (non-provisional) modules in bugfix
>>> releases
>>> has turned out to be a bad thing to do.  Hence the policy to not do that.
>>> The few exceptions have been necessary to fix a bug that needed to be
>>> fixed,
>>> and could not reasonably be fixed otherwise.
>>
>> Such exceptions can of course more easily be made when the adoption of
>> a version is still small, and almost all users will never see X.Y.0 or
>> X.Y.1.
>
> This is not an allowed excuse for breaking the policy.  The x.y language is
> defined when x.y.0 is released.  Please stop.
>

Don't worry, I didn't even start :)

I do think it can cause problems if most of a stdlib module supports
PathLike and some parts do not. People will write code in 3.7
believing it works on 3.6, while it doesn't. Anyway, I'm completely
happy if the policy outweighs this issue, and I have absolutely no
need to argue about the decision.

—Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Is adding support for os.PathLike an enhancement or bugfix?

2017-05-04 Thread Koos Zevenhoven
On Thu, May 4, 2017 at 4:19 AM, Terry Reedy <tjre...@udel.edu> wrote:
> On 5/3/2017 7:13 PM, Koos Zevenhoven wrote:
>>
[...]

>> Shutil was among the most important to be updated, IMO.
>>
>> I had made some sort of list of affected modules elsewhere [1]:
>> ntpath, posixpath, os.scandir, os.[other stuff], DirEntry (tempted to
>> say os.DirEntry, but that is
>> not true), shutil.[stuff], (io.)open, fileinput, filecmp, zipfile,
>> tarfile, tempfile (for the 'dir' keyword arguments), maybe even glob
>> and fnmatch (are the patterns paths?)
>
>
> What did not get done for 3.6 should be proposed for 3.7.
>

Anyone, feel free. The nightmare part is done, so this could be a case
where a PR actually pays off in terms of being able to use the
feature. There's no need for any unnecessary masochism (should there
ever be?).

[...]
>
> Enhancing public APIs in normal (non-provisional) modules in bugfix releases
> has turned out to be a bad thing to do.  Hence the policy to not do that.
> The few exceptions have been necessary to fix a bug that needed to be fixed,
> and could not reasonably be fixed otherwise.

Such exceptions can of course more easily be made when the adoption of
a version is still small, and almost all users will never see X.Y.0 or
X.Y.1. The fraction of 3.6 users is probably super tiny right now, and
even those users are likely to eagerly update to bugfix releases. For
instance, are there any major (LTS?) linux distros that already come
with 3.6.0 or 3.6.1? Well OK, 3.6.2 may be too late for some.

—Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Is adding support for os.PathLike an enhancement or bugfix?

2017-05-03 Thread Koos Zevenhoven
On Thu, May 4, 2017 at 1:07 AM, Terry Reedy <tjre...@udel.edu> wrote:
> On 5/3/2017 2:15 PM, Brett Cannon wrote:
>>
>> My allergies have hit me hard so I'm not thinking at full capacity, but
>> did we ever decide if supporting os.PathLike in the stdlib was viewed as an
>> enhancement or bugfix? Specifically I'm thinking of
>> https://bugs.python.org/issue30218 for adding support to
>> shutil.unpack_archive() and whether it should be backported to 3.6.
>
>
> On the face of it, that particular issue looks like an enhancement that
> should have gone into 3.6

Agreed.

> (if ever), but did not.  I notice that
> https://www.python.org/dev/peps/pep-0519/#implementation
> did not include "Update shutil", so it was not done, at least not
> completely.

Shutil was among the most important to be updated, IMO.

I had made some sort of list of affected modules elsewhere [1]:
ntpath, posixpath, os.scandir, os.[other stuff], DirEntry (tempted to
say os.DirEntry, but that is
not true), shutil.[stuff], (io.)open, fileinput, filecmp, zipfile,
tarfile, tempfile (for the 'dir' keyword arguments), maybe even glob
and fnmatch (are the patterns paths?)

It looks like what made it to PEP519 was mainly this:

"It is expected that most APIs in Python's standard library that
currently accept a file system path will be updated appropriately to
accept path objects (whether that requires code or simply an update to
documentation will vary)."

> Was shutil updated at all?  Is unpack_archive the only shutil function not
> updated?  If so, I could see the omission as a bug.
>
> If the patch for 30218 were applied in 3.6, would the doc
> https://docs.python.org/3/library/shutil.html#shutil.unpack_archive
> need to be changed, with a note "Added in 3.6.2: filename can be any
> pathlike object"?  If so, it is an enhancement.

Regardless of bugfix vs enhancement semantics, that seems like a good
thing to do.

-- Koos

[1] e.g. in this thread somewhere:
https://mail.python.org/pipermail/python-ideas/2016-April/039827.html


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] API design: where to add async variants of existing stdlib APIs?

2017-03-10 Thread Koos Zevenhoven
On Wed, Mar 1, 2017 at 7:42 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
> Short version:
>
> - there are some reasonable requests for async variants of contextlib APIs
> for 3.7
> - prompted by Raymond, I'm thinking it actually makes more sense to add
> these in a new `asyncio.contextlib` module than it does to add them directly
> to the existing module
> - would anyone object strongly to my asking authors of the affected PRs to
> take their changes in that direction?
>

Related to this, here's a post from two years ago in an attempt to tackle
the cause of this problem (of needing async and non-async variants)
and solve it in the long term.

https://mail.python.org/pipermail/python-ideas/2015-May/033267.html

You can read the details in that thread, but in short, the idea is
that all functionality that may have to wait for something (IO etc.)
should be explicitly awaited, regardless of whether the code takes
advantage of concurrency or not. This solution is an attempt to do
this without enforcing a specific async framework.

In the post, I made up the terms "Y end" and "L end", because I did
not know what to call them. This was when the draft of PEP 492 was being
discussed.

L is the end that 'drives' the (chain of) coroutines, usually an event
loop. Y is the other end: the innermost coroutine in the
calling/awaiting chain, the one that does the yields. The L and Y ends
together could hide the need for two variants, as explained in the above link.
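
[Editor's note: a toy sketch of that terminology using plain generators —
an illustration made up for this archive, not code from the linked post.
The innermost generator is the "Y end" that yields requests, middle layers
forward them with `yield from`, and the "L end" drives the chain and answers
each request.]

```python
def y_end():
    # The "Y end": the innermost coroutine that actually yields.
    result = yield 'need-io'
    return result

def middle():
    # Intermediate layer: transparently forwards yields and results.
    value = yield from y_end()
    return value * 2

def l_end(gen):
    # The "L end": the driver (e.g. an event loop) that answers
    # every request the coroutine chain yields.
    try:
        request = next(gen)
        while True:
            request = gen.send('data')   # pretend we performed the IO
    except StopIteration as stop:
        return stop.value

assert l_end(middle()) == 'datadata'
```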

—Koos

> Longer version:
>
> There are a couple of open issues requesting async variants of some
> contextlib APIs (asynccontextmanager and AsyncExitStack). I'm inclined to
> accept both of them, but Raymond raised a good question regarding our
> general design philosophy for these kinds of additions: would it make more
> sense to put these in an "asyncio.contextlib" module than it would to add
> them directly to contextlib itself?
>
> The main advantage I see to the idea is that if someone proposed adding an
> "asyncio" dependency to contextlib, I'd say no. For the existing
> asynccontextmanager PR, I even said no to adding that dependency to the
> standard contextlib test suite, and instead asked that the new tests be
> moved out to a separate file, so the existing tests could continue to run
> even if asyncio was unavailable for some reason.
>
> While rejecting the idea of an asyncio dependency isn't a problem for
> asyncontextmanager specifically (it's low level enough for it not to
> matter), it's likely to be more of a concern for the AsyncExitStack API,
> where the "asyncio.iscoroutinefunction" introspection API is likely to be
> quite helpful, as are other APIs like `asyncio.ensure_future()`.
>
> So would folks be OK with my asking the author of the PR for
> https://bugs.python.org/issue29679 (adding asynccontextmanager) to rewrite
> the patch to add it as asyncio.contextlib.asyncontextmanager (with a
> cross-reference from the synchronous contextlib docs), rather than the
> current approach of adding it directly to contextlib?
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-06 Thread Koos Zevenhoven
On Mon, Sep 5, 2016 at 6:06 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 5 September 2016 at 06:42, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>> On Sun, Sep 4, 2016 at 6:38 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
>>>
>>> There are two self-consistent sets of names:
>>>
>>
>> Let me add a few. I wonder if this is really used so much that
>> bytes.chr is too long to type (and you can do bchr = bytes.chr if you
>> want to)
>>
>> bytes.chr (or bchr in builtins)
>
> The main problem with class method based spellings is that we need to
> either duplicate it on bytearray or else break the bytearray/bytes
> symmetry and propose "bytearray(bytes.chr(x))" as the replacement for
> current cryptic "bytearray([x])"
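
[Editor's note: for reference, the current spellings being contrasted here.]

```python
# Building a single-"character" bytes/bytearray from an int today:
assert bytes([65]) == b'A'
assert bytearray([65]) == bytearray(b'A')
# The str counterpart already has a builtin:
assert chr(65) == 'A'
# Meanwhile, indexing bytes yields ints, not length-1 bytes:
assert b'ABC'[0] == 65
assert b'ABC'[0:1] == b'A'
```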

Warning: some API-design philosophy below:

1. Breaking symmetry in *what* functionality is offered for related
object types (here: str, bytes, bytearray) is less bad than breaking
symmetry in *how* the symmetric functionality is provided. IOW, missing
unnecessary functionality is less bad than exposing the equivalent
functionality under a different name. (This might be roughly how
Random832 was reasoning previously.)

2. Symmetry is more important in object access functionality than it
is in instance creation. IOW, symmetry regarding 'constructors' (here:
bchr, bytes.chr, bytes.byte, ...) across different types is not as
crucial as symmetry in slicing. The reason is that the caller of a
constructor is likely to know which class it is instantiating. A
consumer of bytes/bytearray/str-like objects often does not know which
type is being dealt with.


I might be crying over spilled milk here, but that seems to be the
point of the whole PEP. That chars view thing might collect some of
the milk back into a bottle:

mystr[whatever]  <-> mybytes.chars[whatever] <-> mybytearray.chars[whatever]
iter(mystr) <-> iter(mybytes.chars) <-> iter(mybytearray.chars)

Then introduce 'chars' on str and this becomes

mystr.chars[whatever] <-> mybytes.chars[whatever] <-> mybytearray.chars[whatever]
iter(mystr.chars) <-> iter(mybytes.chars) <-> iter(mybytearray.chars)

If iter(mystr.chars) is recommended and iter(mystr) discouraged, then
after a decade or two, the world may look quite different regarding
how important it is for a str to be iterable.

This would solve multiple problems at once. Well I admit that "at
once" is not really an accurate description of the process :).

[...]
> You also run into a searchability problem as "chr" will get hits for
> both the chr builtin and bytes.chr, similar to the afalg problem that
> recently came up in another thread. While namespaces are a honking
> great idea, the fact that search is non-hierarchical means they still
> don't give API designers complete freedom to reuse names at will.

Oh, I can kind of see a point here, especially if the search hits
aren't related in any way. Why not just forget all symmetry if this is
an issue? But is it really a bad thing if by searching you find that
there's a chr for both str and bytes?

If I think, "I want to turn my int into a bytes 'character' kind of in
the way that chr turns my int into a str". What am I going to search
or google for? I can't speak for others, but I would probably search
for something that contains 'chr' and 'bytes'.

Based on this, I'm unable to see the search disadvantage of bytes.chr.

[...]
>> bytes.char(or bytes.chr or bchr in builtins)
>> bytes.chars, bytearray.chars (sequence views)
>
> The views are already available via memoryview.cast if folks really
> want them, but encouraging their use in general isn't a great idea, as
> it means more function developers now need to ask themselves "What if
> someone passes me a char view rather than a normal bytes object?".

Thanks, I think this is the first real argument I hear against the
char view. In fact, I don't think people should ask themselves that
question, and just not accept bytes views as input. Would it be enough
to discourage storing and passing bytes views?

Anyway, the only error that would pass silently would be that the
passed-in object gets indexed (e.g. obj[0]) and a bytes-char comes out
instead of an int. But it would be a strange thing to do by the caller
to pass a char view into the bytes-consumer. I could imagine someone
wanting to pass a bytes view into a str-consumer. But there are no
significant silently-passing errors there. If str also gets .chars,
then it becomes even easier to support this.
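
[Editor's note: the existing spelling referred to above — casting a
memoryview to format 'c' already yields length-1 bytes objects instead of
ints on indexing.]

```python
mv = memoryview(b'ABC').cast('c')
assert mv[0] == b'A'                   # length-1 bytes, unlike b'ABC'[0] == 65
assert [bytes(x) for x in mv] == [b'A', b'B', b'C']
```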

-- Koos

>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-06 Thread Koos Zevenhoven
On Mon, Sep 5, 2016 at 3:30 AM, Random832 <random...@fastmail.com> wrote:
> On Sun, Sep 4, 2016, at 16:42, Koos Zevenhoven wrote:
>> On Sun, Sep 4, 2016 at 6:38 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
>> >
>> > There are two self-consistent sets of names:
>> >
>>
>> Let me add a few. I wonder if this is really used so much that
>> bytes.chr is too long to type (and you can do bchr = bytes.chr if you
>> want to):
>>
>> bytes.chr (or bchr in builtins)
>> bytes.chr_at, bytearray.chr_at
>
> Ugh, that "at" is too reminiscent of java. And it just feels wrong to
> spell it "chr" rather than "char" when there's a vowel elsewhere in the
> name.
>

Oh, I didn't realize that connection. It's funny that I get a Java
connotation from get* methods ;).


> Hmm... how offensive to the zen of python would it be to have "magic" to
> allow both bytes.chr(65) and b'ABCDE'.chr[0]? (and possibly also
> iter(b'ABCDE'.chr)? That is, a descriptor which is callable on the
> class, but returns a view on instances?

Indeed quite magical, though I really like how easy it is to remember
this *once you realize what is going on*.

I think bytes.char (on class) and data.chars (on instance) would be
quite similar.

-- Koos





-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Do PEP 526 type declarations define the types of variables or not?

2016-09-05 Thread Koos Zevenhoven
On Tue, Sep 6, 2016 at 1:49 AM, Sven R. Kunze <srku...@mail.de> wrote:
> Didn't Koos say this works more like an expression annotation?
>
> IMO, the type of the expression is what is specified but the type of the
> variable can change over time (as you demonstrated).

That's exactly the kind of semantics I'm describing in the
python-ideas thread. And that's exactly how Python works: the type of a
variable can change every time you assign a value to it (but not in
between, unless you're doing funny stuff). So in a sense you annotate
the *value* by annotating the variable at the point in the function
where the value is assigned to it.
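
[Editor's note: concretely, what PEP 526 annotations do at runtime — at
module and class scope they are recorded in __annotations__; annotations on
function locals are not evaluated or stored.]

```python
class C:
    x: int           # bare annotation: recorded, but no attribute created
    y: str = 'hi'    # annotated assignment

assert C.__annotations__ == {'x': int, 'y': str}
assert not hasattr(C, 'x')
assert C.y == 'hi'
```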

There are open questions in this approach of course. But if you're
interested, don't hesitate to discuss or ask questions in the
python-ideas thread. I won't answer before I wake up, though ;).

-- Koos


>
> Sven
>
>
> PS: thinking this way, the new syntax is actually confusing as it annotates
> the variable not the expression. :-/
>
>
>
> On 05.09.2016 17:26, Mark Shannon wrote:
>>
>> Hi,
>>
>> PEP 526 states that "This PEP aims at adding syntax to Python for
>> annotating the types of variables" and Guido seems quite insistent that the
>> declarations are for the types of variables.
>>
>> However, I get the impression that most (all) of the authors and
>> proponents of PEP 526 are quite keen to emphasise that the PEP in no way
>> limits type checkers from doing what they want.
>>
>> This is rather contradictory. The behaviour of a typechecker is defined by
>> the typesystem that it implements. Whether a type annotation determines the
>> type of a variable or an expression alters changes what typesystems are
>> feasible. So, stating that annotations define the type of variables *does*
>> limit what a typechecker can or cannot do.
>>
>> Unless of course, others may have a different idea of what the "type of a
>> variable" means.
>> To me, it means it means that for all assignments `var = expr`
>> the type of `expr` must be a subtype of the variable,
>> and for all uses of var, the type of the use is the same as the type of
>> the variable.
>>
>> In this example:
>>
>> def bar() -> Optional[int]: ...
>>
>> def foo() -> int:
>>     x: Optional[int] = bar()
>>     if x is None:
>>         return -1
>>     return x
>>
>> According to PEP 526 the annotation `x:Optional[int]`
>> means that the *variable* `x` has the type `Optional[int]`.
>> So what is the type of `x` in `return x`?
>> If it is `Optional[int]`, then a type checker is obliged to reject this
>> code. If it is `int` then what does "type of a variable" actually mean,
>> and why aren't the other uses of `x` int as well?
>>
>> Cheers,
>> Mark.



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-05 Thread Koos Zevenhoven
On Mon, Sep 5, 2016 at 5:02 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 5 September 2016 at 23:46, Nick Coghlan <ncogh...@gmail.com> wrote:
>> Under such "parameter annotation like" semantics, uninitialised
>> variable annotations would only make sense as a new form of
>> post-initialisation assertion,

Why not discuss this in the python-ideas thread where I quote myself
from last Friday regarding the notion of annotations as assertions?

>> and perhaps as some form of
>> Eiffel-style class invariant documentation syntax.

I hope this is simpler than it sounds :-)

> Thinking further about the latter half of that comment, I realised
> that the PEP 484 equivalence I'd like to see for variable annotations
> in a class body is how they would relate to a property definition
> using the existing PEP 484 syntax.
>
> For example, consider:
>
> class AnnotatedProperty:
>
>     @property
>     def x(self) -> int:
>         ...
>
>     @x.setter
>     def x(self, value: int) -> None:
>         ...
>
>     @x.deleter
>     def x(self) -> None:
>         ...
>
> It would be rather surprising if that typechecked differently from:
>
> class AnnotatedVariable:
>
>     x: int
>

How about just using the latter way? That's much clearer. I doubt this
needs a change in the PEP.
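For what it's worth, the two quoted class shapes run fine side by side; here is a self-contained sketch (the concrete bodies and the backing `_x` attribute are my additions, not from the PEP):

```python
class AnnotatedProperty:
    """Property-based spelling from the quoted example, with concrete bodies."""

    def __init__(self) -> None:
        self._x = 0

    @property
    def x(self) -> int:
        return self._x

    @x.setter
    def x(self, value: int) -> None:
        self._x = value

    @x.deleter
    def x(self) -> None:
        del self._x


class AnnotatedVariable:
    """Variable-annotation spelling; 'x' is declared but not initialised."""

    x: int


ap = AnnotatedProperty()
ap.x = 3
av = AnnotatedVariable()
av.x = 3
# Both now expose an int-valued attribute "x"; the open question in the
# thread is whether type checkers should treat the two spellings alike.
```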

> For ClassVar, you'd similarly want:
>
>
> class AnnotatedClassVariable:
>
> x: ClassVar[int]
>
> to typecheck like "x" was declared as an annotated property on the metaclass.
>

Sure, there are many things that one may consider equivalent. I doubt
you'll be able to list them all in a way that everyone agrees on. And
I hope you don't take this as a challenge -- I'm in the don't-panic
camp :).


-- Koos


> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-05 Thread Koos Zevenhoven
Sorry, I don't have time to read emails of this length now, and
perhaps I'm interpreting your emails more literally than you write
them, anyway.

If PEP 484 introduces unnecessary restrictions at this point, that's a
separate issue. I see no need to copy those into PEP 526. I'll be
posting my own remaining concerns regarding PEP 526 when I find the
time.

-- Koos


On Mon, Sep 5, 2016 at 4:46 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 5 September 2016 at 21:46, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>> The thing I'm promoting here is to not add anything to PEP 526 that
>> says what a type checker is supposed to do with type annotations.
>
> PEP 526 says it doesn't intend to expand the scope of typechecking
> semantics beyond what PEP 484 already supports. For that to be true,
> it needs to be able to define expected equivalencies between the
> existing semantics of PEP 484 and the new syntax in PEP 526.
>
> If those equivalencies can't be defined, then Mark's concerns are
> valid, and the PEP either needs to be deferred as inadvertently
> introducing new semantics while intending to only introduce new
> syntax, or else the intended semantics need to be spelled out as they
> were in PEP 484 so folks can judge the proposal accurately, rather
> than attempting to judge it based on an invalid premise.
>
> For initialised variables, the equivalence between the two PEPs is
> straightforward: "x: T = expr" is equivalent to "x = expr # type: T"
>
> If PEP 526 always required an initialiser, and didn't introduce
> ClassVar, there'd be no controversy, and we'd already be done.
>
> However, the question of "Does this new syntax necessarily imply the
> introduction of new semantics?" gets a lot murkier for uninitialised
> variables.
>
> A strict "no new semantics beyond PEP 484" interpretation would mean
> that these need to be interpreted the same way as parameter
> annotations: as a type hint on the outcome of the code executed up to
> that point, rather than as a type constraint on assignment statements
> in the code *following* that point.
>
> Consider:
>
> def simple_appender(base: List[T], value: T) -> None:
> base.append(value)
>
> This will typecheck fine - lists have append methods, and the value
> appended conforms to what our list expects.
>
> The parameter annotations mainly act as constraints on how this
> function is *called*, with the following all being problematic:
>
> simple_appender([1, 2, 3], "hello") # Container/value type mismatch
> simple_appender([1, 2, 3], None) # Value is not optional
> simple_appender((1, 2, 3), 4) # A tuple is not a list
>
> However, because of the way name binding in Python works, the
> annotations in *no way* constrain assignments inside the function
> body:
>
> def not_so_simple_appender(base: List[T], value: T) -> None:
> other_ref = base
> base = value
> other_ref.append(base)
>
> From a dynamic typechecking perspective, that's just as valid as the
> original implementation, since the "List[T]" type of "other_ref" is
> inferred from the original type of "base" before it gets rebound to
> value and has its new type inferred as "T".
>
> This freedom to rebind an annotated name without a typechecker
> complaining is what Mark is referring to when he says that PEP 484
> attaches annotations to expressions rather than types.
>
> Under such "parameter annotation like" semantics, uninitialised
> variable annotations would only make sense as a new form of
> post-initialisation assertion, and perhaps as some form of
> Eiffel-style class invariant documentation syntax.
>
> The usage to help ensure code correctness in multi-branch
> initialisation cases would then look something like this:
>
>if case1:
> x = ...
> elif case2:
> x = ...
> else:
> x = ...
> assert x : List[T] # If we get to here without x being List[T],
> something's wrong
>
> The interpreter could then optimise type assertions out entirely at
> function level (even in __debug__ mode), and turn them into
> annotations at module and class level (with typecheckers then deciding
> how to process them).
>
> That's not what the PEP proposes for uninitialised variables though:
> it proposes processing them *before* a series of assignment
> statements, which *only makes sense* if you plan to use them to
> constrain those assignments in some way.
>
> If you wanted to write something like that under a type assertion
> spelling, then you could enlist the aid of the "all" builtin:
>
> assert all(x) : Li
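For readers who want to try it, the two appender variants Nick quotes above run as follows at runtime; this is a self-contained sketch with my own concrete list-of-int instantiation:

```python
from typing import List, TypeVar

T = TypeVar("T")


def simple_appender(base: List[T], value: T) -> None:
    base.append(value)


def not_so_simple_appender(base: List[T], value: T) -> None:
    # Rebinding "base" inside the function body is not prevented by the
    # parameter annotation; "other_ref" still points at the original list.
    other_ref = base
    base = value  # rebinds the local name; annotations don't constrain this
    other_ref.append(base)


data = [1, 2, 3]
simple_appender(data, 4)
not_so_simple_appender(data, 5)
print(data)  # [1, 2, 3, 4, 5]
```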

Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-05 Thread Koos Zevenhoven
It looks like you are trying to make sense of this, but unfortunately
there's some added mess and copy errors regarding who said
what. I think no such errors remain in what I quote below:

On Mon, Sep 5, 2016 at 3:10 PM, Steven D'Aprano <st...@pearwood.info> wrote:
>
> [Koos Zevenhoven]
>> >> How is it going to help that these are equivalent within one checker,
>> >> if the meaning may differ across checkers?
>
> Before I can give an answer to your [Koos'] question, I have to
> understand what you see as the problem here.

The problem was the suggested restrictive addition to PEP 526, which came
with no proper justification, especially since the PEP was not supposed
to restrict the semantics of type checking. I was asking how it would
help to add that restriction. Very simple. Maybe some people got
confused because I did want to *discuss* best practices for type
checking elsewhere.

> I *think* that you are worried that two different checkers will disagree
> on what counts as a type error. That given the same chunk of code:

In the long term, I'm worried about that, but there's nothing that PEP
526 can do about it at this point.

> [Nick Coghlan]
>> > For typechecker developers, it provides absolute clarity that the
>> > semantics of the new annotations should match the behaviour of
>> > existing type comments when there's an initialiser present,
>
> [Koos]
>> I understood that, but what's the benefit?
>
> Are you asking what is the benefit of having three forms of syntax for
> the same thing?

No, still the same thing: What is the benefit of that particular
restriction, when there are no other restrictions? Better just leave
it out.

> The type comment syntax is required for Python 2 and backwards-
> compatibility. That's a given.

Sure, but not all type checkers will have to care about Python 2.

> The variable annotation syntax is required because the type comment
> syntax is (according to the PEP) very much a second-best solution. See
> the PEP:
>
> https://www.python.org/dev/peps/pep-0526/#id4
>
> So this is a proposal to create a *better* syntax for something which
> already exists. The old version, using comments, cannot be deprecated or
> removed, as it is required for Python 3.5 and older.

Right.

> Once we allow
>
> x: T = value
>
> then there is benefit in also allowing:
>
> x: T
> x = value
>
> since this supports some of the use cases that aren't well supported by
> type comments or one-line variable annotations. E.g. very long or deeply
> indented lines, situations where the assignment to x is inside an
> if...else branch, or any other time you wish to declare the type of the
> variable before actually setting the variable.

Sure.
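As a small runnable sketch of that "declare first, assign in a branch" use case (the function and its names are illustrative, not from the PEP):

```python
from typing import List, Optional


def normalise(arg: Optional[List[int]]) -> List[int]:
    # Declare the variable's type up front, then assign in the branches.
    x: Optional[List[int]]
    if arg is None:
        x = None
    else:
        x = list(arg)
    if x is None:
        x = []
    return x


print(normalise([1, 2]))  # [1, 2]
print(normalise(None))    # []
```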

> [Nick]
>> > For folks following along without necessarily keeping up with all the
>> > nuances, it makes it more explicit what Guido means when he says "PEP
>> > 526 does not make a stand on the
>> > behavior of type checkers (other than deferring to PEP 484)."
>
> [Koos]
>> What you are proposing is exactly "making a stand on the behavior of
>> type checkers", and the examples you provide below are all variations
>> of the same situation and provide no justification for a general rule.
>
> I'm sorry, I don't understand this objection. The closest I can get to
> an answer would be:
>
> A general rule is better than a large number of unconnected, arbitrary,
> special cases.

A general rule that does not solve a problem is worse than no rule.


-- Koos


>
> --
> Steve



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-05 Thread Koos Zevenhoven
On Mon, Sep 5, 2016 at 1:04 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 5 September 2016 at 18:19, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>> On Mon, Sep 5, 2016 at 5:21 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
>>> On 5 September 2016 at 04:40, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>>>> On Sun, Sep 4, 2016 at 9:13 PM, Ivan Levkivskyi <levkivs...@gmail.com> 
>>>> wrote:
>>>>> On 4 September 2016 at 19:59, Nick Coghlan <ncogh...@gmail.com> wrote:
>>>> [...]
>>>>>>
>>>>>> Similarly, it would be reasonable to say that these three snippets
>>>>>> should all be equivalent from a typechecking perspective:
>>>>>>
>>>>>> x = None # type: Optional[T]
>>>>>>
>>>>>> x: Optional[T] = None
>>>>>>
>>>>>> x: Optional[T]
>>>>>> x = None
>>>>>
>>>>>
>>>>> Nice idea, explicit is better than implicit.
>>>>
>>>> How is it going to help that these are equivalent within one checker,
>>>> if the meaning may differ across checkers?
>>>
>>> For typechecker developers, it provides absolute clarity that the
>>> semantics of the new annotations should match the behaviour of
>>> existing type comments when there's an initialiser present,
>>
>> I understood that, but what's the benefit? I hope there will be a type
>> checker that breaks this "rule".
>
> Such a typechecker means you're not writing Python anymore, you're
> writing Java/C++/C# in a language that isn't designed to be used that
> way.

I'm glad those are all the languages you accuse me of. The list could
have been a lot worse. I actually have some good memories of Java. It
felt kind of cool at that age, and it taught me many things about
understanding the structure of large and complicated programs after I
had been programming for years in other languages, including C++. It
also taught me to value simplicity instead, so here we are.

> Fortunately, none of the current typecheckers have made that mistake,
> nor does anyone appear to be promoting this mindset outside this
> particular discussion.

The thing I'm promoting here is to not add anything to PEP 526 that
says what a type checker is supposed to do with type annotations.
Quite the opposite of Java/C++/C#, I would say.

We can, of course, speculate about the future of type checkers and the
implications of PEP 526 on it. That's what I'm trying to do on
python-ideas, speculate about the best kind of type checking
(achievable with PEP 526 annotations) [1].

--Koos


[1] https://mail.python.org/pipermail/python-ideas/2016-September/042076.html

>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-05 Thread Koos Zevenhoven
On Mon, Sep 5, 2016 at 5:21 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 5 September 2016 at 04:40, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>> On Sun, Sep 4, 2016 at 9:13 PM, Ivan Levkivskyi <levkivs...@gmail.com> wrote:
>>> On 4 September 2016 at 19:59, Nick Coghlan <ncogh...@gmail.com> wrote:
>> [...]
>>>>
>>>> Similarly, it would be reasonable to say that these three snippets
>>>> should all be equivalent from a typechecking perspective:
>>>>
>>>> x = None # type: Optional[T]
>>>>
>>>> x: Optional[T] = None
>>>>
>>>> x: Optional[T]
>>>> x = None
>>>
>>>
>>> Nice idea, explicit is better than implicit.
>>
>> How is it going to help that these are equivalent within one checker,
>> if the meaning may differ across checkers?
>
> For typechecker developers, it provides absolute clarity that the
> semantics of the new annotations should match the behaviour of
> existing type comments when there's an initialiser present,

I understood that, but what's the benefit? I hope there will be a type
checker that breaks this "rule".

> or of a
> parameter annotation when there's no initialiser present.

No, your suggested addition does not provide any reference to this.
(...luckily, because that would have been worse.)

> For folks following along without necessarily keeping up with all the
> nuances, it makes it more explicit what Guido means when he says "PEP
> 526 does not make a stand on the
> behavior of type checkers (other than deferring to PEP 484)."

What you are proposing is exactly "making a stand on the behavior of
type checkers", and the examples you provide below are all variations
of the same situation and provide no justification for a general rule.

Here's a general rule:

The closer it gets to the end of drafting a PEP [1],
the more carefully you have to justify changes.

Justification is left as an exercise ;-).

--Koos

[1] or any document (or anything, I suppose)

> For example, the formulation of the divergent initialisation case
> where I think the preferred semantics are already implied by PEP 484
> can be looked at this way:
>
> x = None # type: Optional[List[T]]
> if arg is not None:
> x = list(arg)
> if other_arg is not None:
> x.extend(arg)
>
> It would be a strange typechecker indeed that handled that case
> differently from the new spellings made possible by PEP 526:
>
> x: Optional[List[T]] = None
> if arg is not None:
> x = list(arg)
> if other_arg is not None:
> x.extend(arg)
>
> x: Optional[List[T]]
> if arg is None:
> x = None
> else:
> x = list(arg)
> if other_arg is not None:
> x.extend(arg)
>
> x: Optional[List[T]]
> if arg is not None:
> x = list(arg)
> if other_arg is not None:
> x.extend(arg)
> else:
> x = None
>
> Or from the semantics of PEP 484 parameter annotations:
>
> def my_func(arg:Optional[List[T]], other_arg=None):
> # other_arg is implicitly Optional[Any]
> if arg is not None and other_arg is not None:
> # Here, "arg" can be assumed to be List[T]
> # while "other_arg" is Any
> arg.extend(other_arg)
>
> A self-consistent typechecker will either allow all of the above, or
> prohibit all of the above, while a typechecker that *isn't*
> self-consistent would be incredibly hard to use.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-04 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 6:38 PM, Nick Coghlan  wrote:
>
> There are two self-consistent sets of names:
>

Let me add a few. I wonder if this is really used so much that
bytes.chr is too long to type (and you can do bchr = bytes.chr if you
want to):

bytes.chr (or bchr in builtins)
bytes.chr_at, bytearray.chr_at
bytes.iterchr, bytearray.iterchr

bytes.chr (or bchr in builtins)
bytes.chrview, bytearray.chrview (sequence views)

bytes.char(or bytes.chr or bchr in builtins)
bytes.chars, bytearray.chars (sequence views)


> bchr
> bytes.getbchr, bytearray.getbchr
> bytes.iterbchr, bytearray.iterbchr
>
> byte
> bytes.getbyte, bytearray.getbyte
> bytes.iterbytes, bytearray.iterbytes
>
> The former set emphasises the "stringiness" of this behaviour, by
> aligning with the chr() builtin
>
> The latter set emphasises that these APIs are still about working with
> arbitrary binary data rather than text, with a Python "byte"
> subsequently being a length 1 bytes object containing a single integer
> between 0 and 255, rather than "What you get when you index or iterate
> over a bytes instance".
>
> Having noticed the discrepancy, my personal preference is to go with
> the latter option (since it better fits the "executable pseudocode"
> ideal and despite my reservations about "bytes objects contain int
> objects rather than byte objects", that shouldn't be any more
> confusing in the long run than explaining that str instances are
> containers of length-1 str instances). The fact "byte" is much easier
> to pronounce than bchr (bee-cher? bee-char?) also doesn't hurt.
>
> However, I suspect we'll need to put both sets of names in front of
> Guido and ask him to just pick whichever he prefers to get it resolved
> one way or the other.
>
>> And didn't someone recently propose deprecating iterability of str
>> (not indexing, or slicing, just iterability)? Then str would also need
>> a way to provide an iterable or sequence view of the characters. For
>> consistency, the str functionality would probably need to mimic the
>> approach in bytes. IOW, this PEP may in fact ultimately dictate how to
>> get an iterable/sequence from a str object.
>
> Strings are not going to become atomic objects, no matter how many
> times people suggest it.
>

You consider all non-iterable objects atomic? If str.__iter__ raises
an exception, that does not somehow make str atomic. I wouldn't be
surprised by breaking changes of this nature in Python at some point.
The breakage would be quite significant, but easy to fix.

-- Koos


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-04 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 9:13 PM, Ivan Levkivskyi <levkivs...@gmail.com> wrote:
> On 4 September 2016 at 19:59, Nick Coghlan <ncogh...@gmail.com> wrote:
[...]
>>
>> Similarly, it would be reasonable to say that these three snippets
>> should all be equivalent from a typechecking perspective:
>>
>> x = None # type: Optional[T]
>>
>> x: Optional[T] = None
>>
>> x: Optional[T]
>> x = None
>
>
> Nice idea, explicit is better than implicit.
>

How is it going to help that these are equivalent within one checker,
if the meaning may differ across checkers?

-- Koos


> --
> Ivan
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-04 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 7:43 PM, Nick Coghlan  wrote:
> On 4 September 2016 at 21:32, Ivan Levkivskyi  wrote:
>> The first form still could be interpreted by type checkers
>> as annotation for value (a cast to more precise type):
>>
>> variable = cast(annotation, value) # visually also looks similar
>
> I think a function based spelling needs to be discussed further, as it
> seems to me that at least some of the goals of the PEP could be met
> with a suitable definition of "cast" and "declare", with no syntactic
> changes to Python. Specifically, consider:
>
> def cast(value, annotation):
> return value
>

typing.cast already exists.
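For reference, typing.cast is a plain identity function at runtime; only static checkers give it meaning. A minimal sketch (the `half` function is my own illustration):

```python
from typing import cast


def half(value: object) -> float:
    # cast() performs no runtime check; it merely tells the checker
    # to treat "value" as an int from here on.
    return cast(int, value) / 2


print(half(5))  # 2.5
```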

-- Koos


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-04 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 3:43 PM, Steven D'Aprano <st...@pearwood.info> wrote:
[...]
> [steve@ando ~]$ cat test.py
> from typing import Optional
>
> def foo(x:Optional[int])->int:
> if x is None:
> return -1
> return x + 1
>
> def bar(x:Optional[int])->int:
> y = x  # the type of y must be inferred
> if y is None:
> return y + 1
> return len(y)
>
> [steve@ando ~]$ mypy --strict-optional test.py
> test.py: note: In function "bar":
> test.py:11: error: Unsupported operand types for + (None and "int")
> test.py:12: error: Argument 1 to "len" has incompatible type "int"; expected 
> "Sized"
>
>
> foo passes the type check; bar fails.
>

That's great. While mypy has nice features, these examples have little
to do with PEP 526 as they don't have variable annotations, not even
using comments.

For some reason, pip install --upgrade mypy fails for me at the
moment, but at least mypy version 0.4.1 does not allow this:

from typing import Callable

def foo(cond: bool, bar : Callable, baz : Callable) -> float:
if cond:
x = bar() # type: int
else:
x = baz() # type: float
return x / 2

and complains:

test.py:7: error: Name 'x' already defined

Maybe someone can confirm this with a newer version.

Here,

def foo(cond: bool) -> float:
if cond:
x = 1
else:
x = 1.5
return x / 2

you get a different error:

test.py:5: error: Incompatible types in assignment (expression has
type "float", variable has type "int")

Maybe someone can confirm this with a newer version, but IIUC this is
still the case.
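If the example instead annotates the variable once with the eventual type, rather than per branch, both the runtime and (as far as I can tell; checker behaviour may vary by version) mypy are happy:

```python
def foo(cond: bool) -> float:
    # A single annotation covering both branches avoids the per-branch
    # redefinition/incompatible-assignment errors shown above.
    x: float
    if cond:
        x = 1
    else:
        x = 1.5
    return x / 2


print(foo(True))   # 0.5
print(foo(False))  # 0.75
```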

>> I want a checker to check my code and, with minimal annotations, give me
>> confidence that my code is correct
>
> Don't we all.
>

I would add *with minimal restrictions on how the code is supposed to
be written* for type checking to work. It's not at all obvious that
everyone thinks that way. Hence, the "Semantics for type checking"
thread on python-ideas.

-- Koos

>
>
> --
> Steve



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-04 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 1:52 PM, Mark Shannon <m...@hotpy.org> wrote:
[...]
>
> The key difference is in placement.
> PEP 484 style
> variable = value # annotation
>
> Which reads to me as if the annotation refers to the value.
> PEP 526
> variable: annotation = value
>
> Which reads very much as if the annotation refers to the variable.
> That is a change in terms of semantics and a change for the worse, in terms
> of expressibility.
>

You have probably noticed this already, but in the semantics which I
have now explained more precisely on python-ideas

https://mail.python.org/pipermail/python-ideas/2016-September/042076.html

an annotation like

variable: annotation = value

is a little closer to an expression annotation. I.e. it does not say
that 'variable' should *always* have the type given by 'annotation'.

-- Koos

>
> Cheers,
> Mark.
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-04 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 12:51 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 4 September 2016 at 08:11, Random832 <random...@fastmail.com> wrote:
>> On Sat, Sep 3, 2016, at 18:06, Koos Zevenhoven wrote:
>>> I guess one reason I don't like bchr (nor chrb, really) is that they
>>> look just like a random sequence of letters in builtins, but not
>>> recognizable the way asdf would be.
>>>
>>> I guess I have one last pair of suggestions for the name of this
>>> function: bytes.chr or bytes.char.
>
> The PEP started out with a classmethod, and that proved problematic
> due to length and the expectation of API symmetry with bytearray. A
> new builtin paralleling chr avoids both of those problems.
>
>> What about byte? Like, not bytes.byte, just builtins.byte.
>
> The main problem with "byte" as a name is that "bytes" is *not* an
> iterable of these - it's an iterable of ints. That concern doesn't
> arise with chr/str as they're both abbreviated singular nouns rather
> than one being the plural form of the other (it also doesn't hurt that
> str actually is an iterable of chr results).
>

Since you agree with me about this...

[...]
>
> That said, the PEP does propose "getbyte()" and "iterbytes()" for
> bytes-oriented indexing and iteration, so there's a reasonable
> consistency argument in favour of also proposing "byte" as the builtin
> factory function:
>
> * data.getbyte(idx) would be a more efficient alternative to byte(data[idx])
> * data.iterbytes() would be a more efficient alternative to map(byte, data)
>

... I don't understand the argument for having 'byte' in these names.
They should have 'char' or 'chr' in them for exactly the same reason
that the proposed builtin should have 'chr' in it instead of 'byte'.
If 'bytes' is an iterable of ints, then getbyte should probably
return an int.

I'm sorry, but this argument comes across as "we're proposing the
wrong thing here, so for consistency, we might want to do the wrong
thing in this other part too".

And didn't someone recently propose deprecating iterability of str
(not indexing, or slicing, just iterability)? Then str would also need
a way to provide an iterable or sequence view of the characters. For
consistency, the str functionality would probably need to mimic the
approach in bytes. IOW, this PEP may in fact ultimately dictate how to
get an iterable/sequence from a str object.

-- Koos


> With bchr, those mappings aren't as clear (plus there's a potentially
> unwanted "text" connotation arising from the use of the "chr"
> abbreviation).
>

Which mappings?

> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-03 Thread Koos Zevenhoven
On Sun, Sep 4, 2016 at 1:23 AM, Ivan Levkivskyi <levkivs...@gmail.com> wrote:
> On 4 September 2016 at 00:11, Random832 <random...@fastmail.com> wrote:
>>
>> On Sat, Sep 3, 2016, at 18:06, Koos Zevenhoven wrote:
>> > I guess one reason I don't like bchr (nor chrb, really) is that they
>> > look just like a random sequence of letters in builtins, but not
>> > recognizable the way asdf would be.
>> >
>> > I guess I have one last pair of suggestions for the name of this
>> > function: bytes.chr or bytes.char.
>>
>> What about byte? Like, not bytes.byte, just builtins.byte.
>
>
> I like this option, it would be very "symmetric" to have, compare:
>
>>>>chr(42)
> '*'
>>>>str()
> ''
>
> with this:
>
>>>>byte(42)
> b'*'
>>>>bytes()
> b''
>
> It is easy to explain and remember this.

In one way, I like it, but on the other hand, indexing a bytes gives
an integer, so maybe a 'byte' is just an integer in range(256). Also,
having both byte and bytes would be a slight annoyance with
autocomplete.
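The indexing behaviour referred to here is easy to check; today's spelling of the proposed byte()/bchr() is the bytes([...]) constructor:

```python
data = b"ABC"
print(data[0])           # 65 -- indexing bytes yields an int
print(data[0:1])         # b'A' -- slicing yields a length-1 bytes
print(bytes([data[0]]))  # b'A' -- current spelling of the proposed byte(65)
```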

-- Koos


Re: [Python-Dev] PEP 467: last round (?)

2016-09-03 Thread Koos Zevenhoven
On Sat, Sep 3, 2016 at 6:41 PM, Ethan Furman <et...@stoneleaf.us> wrote:
>>>
>>> Open Questions
>>> ==
>>>
>>> Do we add ``iterbytes`` to ``memoryview``, or modify
>>> ``memoryview.cast()`` to accept ``'s'`` as a single-byte interpretation?
>>> Or
>>> do we ignore memory for now and add it later?
>>
>>
>> Apparently memoryview.cast('s') comes from Nick Coghlan:
>>
>> <https://marc.info/?i=CADiSq7e=8ieyew-txf5dims_5nuaos5udv-3g_w3ltwn9wb...@mail.gmail.com>.
>> However, since 3.5 (https://bugs.python.org/issue15944) you can call
>> cast("c") on most memoryviews, which I think already does what you
>> want:
>>
>>>>> tuple(memoryview(b"ABC").cast("c"))
>>
>> (b'A', b'B', b'C')
>
>
> Nice!
>

Indeed! Exposing this as bytes_instance.chars would make porting from
Python 2 really simple. Of course, it would be even better if slicing the
view returned bytes, so the porting rule would be the same for all
bytes subscripting:

py2str[SOMETHING]

becomes

py3bytes.chars[SOMETHING]

With the "c" memoryview there will be a distinction between slicing
and indexing.
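That distinction can be seen in a quick sketch (behaviour as of Python 3.5, per the bug report quoted above):

```python
mv = memoryview(b"ABC").cast("c")
print(mv[0])          # b'A' -- indexing a "c"-format view yields length-1 bytes
print(bytes(mv[1:]))  # slicing yields a memoryview; bytes() of it gives b'BC'
print(tuple(mv))      # (b'A', b'B', b'C')
```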

And Random832 seems to be making some good points.

--- Koos


> --
> ~Ethan~
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-03 Thread Koos Zevenhoven
On Sat, Sep 3, 2016 at 7:59 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 3 September 2016 at 03:54, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>> chrb seems to be more in line with some bytes versions in for instance os
>> than bchr.
>
> The mnemonic for the current name in the PEP is that bchr is to chr as
> b"" is to "". The PEP should probably say that in addition to pointing
> out the 'unichr' Python 2 inspiration, though.

Thanks for explaining. Indeed I hope that unichr does not affect any
naming decisions that will remain in the language for a long time.

> The other big difference between this and the os module case, is that
> the resulting builtin constructor pairs here are str/chr (arbitrary
> text, single code point) and bytes/bchr (arbitrary binary data, single
> binary octet). By contrast, os.getcwd() and os.getcwdb() (and similar
> APIs) are both referring to the same operating system level operation,
> they're just requesting a different return type for the data.

But chr and "bchr" are also requesting a different return type. The
difference is that the data is not coming from an os-level operation
but from an int.

I guess one reason I don't like bchr (nor chrb, really) is that they
look just like a random sequence of letters in builtins, but not
recognizable the way asdf would be.

I guess I have one last pair of suggestions for the name of this
function: bytes.chr or bytes.char.
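For comparison, whatever name wins, the proposed behaviour can be sketched in present-day Python. Here `bchr` is just a stand-in name for illustration, not an existing builtin:

```python
def bchr(i):
    """Return a length-1 bytes object for integer i, as chr does for str."""
    if not 0 <= i <= 255:
        raise ValueError("bytes must be in range(0, 256)")
    return bytes([i])

print(bchr(65))  # b'A'  (compare chr(65) == 'A')
```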

-- Koos


> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] What should a good type checker do? (was: Please reject or postpone PEP 526)

2016-09-03 Thread Koos Zevenhoven
What's up with the weird subthreads, Stephen?!

On Guido's suggestion, I'm working on posting those type-checking thoughts here.

-- Koos

On Sat, Sep 3, 2016 at 6:17 PM, Stephen J. Turnbull
<turnbull.stephen...@u.tsukuba.ac.jp> wrote:
> Please respect Reply-To, set to python-ideas.
>
> Greg Ewing writes:
>  > Chris Angelico wrote:
>  > > Forcing people to write 1.0 just to be compatible with 1.5 will cause
>  > > a lot of annoyance.
>  >
>  > Indeed, this would be unacceptable IMO.
>
> But "forcing" won't happen.  Just ignore the warning.  *All* such
> Python programs will continue to run (or crash) exactly as if the type
> declarations weren't there.  If you don't like the warning, either
> don't run the typechecker, or change your code to placate it.
>
> But allowing escapes from a typechecker means allowing escapes.  All
> of them, not just the ones you or I have preapproved.  I want my
> typechecker to be paranoid, and loud about it.
>
> That doesn't mean I would never use a type like "Floatable" (ie, any
> type subject to implicit conversion to float).  But in the original
> example, I would probably placate the typechecker.  YMMV, of course.
>



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


[Python-Dev] What should a good type checker do? (was: Please reject or postpone PEP 526)

2016-09-02 Thread Koos Zevenhoven
On Fri, Sep 2, 2016 at 9:04 PM, Steven D'Aprano <st...@pearwood.info> wrote:
> On Fri, Sep 02, 2016 at 08:10:24PM +0300, Koos Zevenhoven wrote:
>
>> A good checker should be able to infer that x is a union type at the
>> point that it's passed to spam, even without the type annotation. For
>> example:
>>
>> def eggs(cond:bool):
>>     if cond:
>>         x = 1
>>     else:
>>         x = 1.5
>>     spam(x)   # a good type checker infers that x is of type Union[int, float]
>
> Oh I really hope not. I wouldn't call that a *good* type checker. I
> would call that a type checker that is overly permissive.

I guess it's perfectly fine if we disagree about type checking ideals,
and I can imagine the justification for you thinking that way. There
can also be different type checkers, and which can have different
modes.

But assume (a) that the above function is perfectly working code, and
spam(...) accepts Union[int, float]. Why would I want the type checker
to complain?

Then again, (b) instead of that being working code, it might be an
error and spam only takes float. No problem, the type checker will
catch that.

In case of (b), to get the behavior you want (but in my hypothetical
semantics), this could be annotated as

def eggs(cond:bool):
    x : float
    if cond:
        x = 1  # type checker says error
    else:
        x = 1.5
    spam(x)

So here the programmer thinks the type of x should be more constrained
than what spam(...) accepts.

Or you might have something like this

def eggs(cond:bool):
    if cond:
        x = 1
    else:
        x = 1.5
    # type checker has inferred x to be Union[int, float]
    x : float  # type checker finds an error
    spam(x)

Here, the same error is found, but at a different location.
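To make the disagreement concrete, here is a runnable version of the example. The code is perfectly valid Python; the question is only whether a checker infers x as Union[int, float] and accepts the call:

```python
from typing import Union

def spam(x: Union[int, float]) -> float:
    return 2 * x

def eggs(cond: bool) -> float:
    if cond:
        x = 1       # x is an int on this branch
    else:
        x = 1.5     # and a float on this one
    # a checker inferring x as Union[int, float] accepts this call
    return spam(x)

print(eggs(False))  # 3.0
```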

> Maybe you think that it's okay because ints and floats are somewhat
> compatible. But suppose I wrote:
>
> if cond:
>     x = HTTPServer(*args)
> else:
>     x = 1.5

It might be clear by now, but no, that's not why I wrote that. That
was just a slightly more "realistic" example than this HTTP & 1.5 one.

[...]
> Do you have a better idea for variable
> syntax?

I had one but it turned out it was worse.

-- Koos

>
>
> --
> Steve



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 467: last round (?)

2016-09-02 Thread Koos Zevenhoven
t' object cannot be interpreted as an integer
>
> While this does create some duplication, there are valid reasons for it::
>
> * the ``bchr`` builtin is to recreate the ord/chr/unichr trio from Python
> 2 under a different naming scheme
> * the class method is mainly for the ``bytearray.fromord`` case, with
> ``bytes.fromord`` added for consistency
>
> The documentation of the ``ord`` builtin will be updated to explicitly note
> that ``bchr`` is the primary inverse operation for binary data, while
> ``chr``
> is the inverse operation for text data, and that ``bytes.fromord`` and
> ``bytearray.fromord`` also exist.
>
> Behaviourally, ``bytes.fromord(x)`` will be equivalent to the current
> ``bytes([x])`` (and similarly for ``bytearray``). The new spelling is
> expected to be easier to discover and easier to read (especially when used
> in conjunction with indexing operations on binary sequence types).
>
> As a separate method, the new spelling will also work better with higher
> order functions like ``map``.
>
>
> Addition of "getbyte" method to retrieve a single byte
> ------------------------------------------------------
>
> This PEP proposes that ``bytes`` and ``bytearray`` gain the method
> ``getbyte``
> which will always return ``bytes``::
>
> >>> b'abc'.getbyte(0)
> b'a'
>
> If an index is asked for that doesn't exist, ``IndexError`` is raised::
>
> >>> b'abc'.getbyte(9)
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> IndexError: index out of range
>
>
> Addition of optimised iterator methods that produce ``bytes`` objects
> ---------------------------------------------------------------------
>
> This PEP proposes that ``bytes`` and ``bytearray`` gain an optimised
> ``iterbytes`` method that produces length 1 ``bytes`` objects rather than
> integers::
>
> for x in data.iterbytes():
>     # x is a length 1 ``bytes`` object, rather than an integer
>
> For example::
>
> >>> tuple(b"ABC".iterbytes())
> (b'A', b'B', b'C')
>
>
> Design discussion
> =================
>
> Why not rely on sequence repetition to create zero-initialised sequences?
> --------------------------------------------------------------------------
>
> Zero-initialised sequences can be created via sequence repetition::
>
> >>> b'\x00' * 3
> b'\x00\x00\x00'
> >>> bytearray(b'\x00') * 3
> bytearray(b'\x00\x00\x00')
>
> However, this was also the case when the ``bytearray`` type was originally
> designed, and the decision was made to add explicit support for it in the
> type constructor. The immutable ``bytes`` type then inherited that feature
> when it was introduced in PEP 3137.
>
> This PEP isn't revisiting that original design decision, just changing the
> spelling as users sometimes find the current behaviour of the binary
> sequence
> constructors surprising. In particular, there's a reasonable case to be made
> that ``bytes(x)`` (where ``x`` is an integer) should behave like the
> ``bytes.fromint(x)`` proposal in this PEP. Providing both behaviours as
> separate
> class methods avoids that ambiguity.
>
>
> Open Questions
> ==============
>
> Do we add ``iterbytes`` to ``memoryview``, or modify
> ``memoryview.cast()`` to accept ``'s'`` as a single-byte interpretation? Or
> do we ignore memory for now and add it later?
>
>
> References
> ==========
>
> .. [1] Initial March 2014 discussion thread on python-ideas
> (https://mail.python.org/pipermail/python-ideas/2014-March/027295.html)
> .. [2] Guido's initial feedback in that thread
> (https://mail.python.org/pipermail/python-ideas/2014-March/027376.html)
> .. [3] Issue proposing moving zero-initialised sequences to a dedicated API
> (http://bugs.python.org/issue20895)
> .. [4] Issue proposing to use calloc() for zero-initialised binary sequences
> (http://bugs.python.org/issue21644)
> .. [5] August 2014 discussion thread on python-dev
> (https://mail.python.org/pipermail/python-ideas/2014-March/027295.html)
> .. [6] June 2016 discussion thread on python-dev
> (https://mail.python.org/pipermail/python-dev/2016-June/144875.html)
>
>
> Copyright
> =========
>
> This document has been placed in the public domain.
>
>
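For what it's worth, the proposed ``getbyte`` and ``iterbytes`` methods can be emulated with slicing today; these are stand-in functions sketching the semantics quoted above, not stdlib APIs:

```python
def getbyte(data, index):
    # emulate the proposed bytes.getbyte: returns a length-1 bytes
    # object; data[index] raises IndexError for out-of-range indices
    return bytes([data[index]])

def iterbytes(data):
    # emulate the proposed bytes.iterbytes: yield length-1 bytes
    # objects instead of integers
    return (data[i:i + 1] for i in range(len(data)))

print(getbyte(b'abc', 0))        # b'a'
print(tuple(iterbytes(b"ABC")))  # (b'A', b'B', b'C')
```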

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Please reject or postpone PEP 526

2016-09-02 Thread Koos Zevenhoven
> class Starship:
>
>     captain: str = 'Picard'
>     damage: int
>     stats: ClassVar[Dict[str, int]] = {}
>
>     def __init__(self, damage: int, captain: str = None):
>         self.damage = damage
>         if captain:
>             self.captain = captain  # Else keep the default
>
> With type hints as they currently exist, the same code is shorter and
> doesn't contaminate the class namespace with the 'damage' attribute.

IIUC, 'damage' will not be in the class namespace according to PEP 526.

> class Starship:
>
>     captain = 'Picard'
>     stats = {} # type: Dict[str, int]
>
>     def __init__(self, damage: int, captain: str = None):
>         self.damage = damage # Can infer type as int
>         if captain:
>             self.captain = captain # Can infer type as str
>

And that's one of the reasons why there should be annotations without
setting a type hint (as I wrote in the other thread).

>
> This isn't an argument against adding type syntax for attributes in general,
> just that the form suggested in PEP 526 doesn't seem to follow Python
> semantics.
>
> One could imagine applying minimal PEP 526 style hints, with standard Python
> semantics and relying on type inference, as follows:
>
> class Starship:
>
>     captain = 'Picard'
>     stats: Dict[str, int] = {}
>
>     def __init__(self, damage: int, captain: str = None):
>         self.damage = damage
>         if captain:
>             self.captain = captain

I don't like this, because some of the attributes are introduced at
class level and some inside __init__, so it is easy to miss that there
is such a thing as 'damage' (at least in more complicated examples). I
keep repeating myself, but again this where we need non-type-hinting
attribute declarations.
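As PEP 526 was eventually accepted (Python 3.6+), the behaviour matches this reading: an annotation without a value records the name in __annotations__ but creates no class attribute. A runnable sketch of Mark's example:

```python
from typing import ClassVar, Dict, Optional

class Starship:
    captain: str = 'Picard'               # class attribute with a default
    damage: int                           # annotation only: no class-level value
    stats: ClassVar[Dict[str, int]] = {}  # explicitly a class variable

    def __init__(self, damage: int, captain: Optional[str] = None):
        self.damage = damage
        if captain:
            self.captain = captain

s = Starship(5)
print(s.captain)                             # Picard
print('damage' in vars(Starship))            # False: not in the class namespace
print('damage' in Starship.__annotations__)  # True: only recorded as an annotation
```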

-- Koos

>
> The PEP overstates the existing use of static typing in Python
> ==
>
[...]
> Please don't turn Python into some sort of inferior Java.
> There is potential in this PEP, but in its current form I think it should be
> rejected.
>
> Cheers,
> Mark.
>
>


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 526 ready for review: Syntax for Variable and Attribute Annotations

2016-09-01 Thread Koos Zevenhoven
On Thu, Sep 1, 2016 at 5:46 PM, Guido van Rossum <gu...@python.org> wrote:
> On Thu, Sep 1, 2016 at 6:11 AM, Koos Zevenhoven <k7ho...@gmail.com> wrote:
>> While a large number of Python programmers may not be interested in
>> type hinting local variables inside functions, I can see other
>> potential benefits in this.
>
> IOW, PEP 3157 is not dead yet. Indeed.
>

PEP 3157? Is that a typo or is there such a thing somewhere?

[...]
>> Also, when reading code, it may be hard to tell which (instance)
>> attributes the class implements. To have these listed in the beginning
>> of the class could therefore improve the readability.
>
> Right. That has been my observation using PEP 484's type comments
> extensively for annotating instance variables at the class level. E.g.
> much of mypy's own code is written this way, and it really is a huge
> help. But
>
> foo = None  # type: List[int]
>
> while it gives me the info I'm looking for, is not great
> notation-wise, and that's why I started thinking about an alternative:
>
> foo: List[int]
>
> (in either case, the __init__ contains something like `self.foo = []`).
>
>> In this light, I'm not sure it's a good idea to allow attribute type
>> hints inside methods.
>
> Those are meant for the coding style where all attributes are
> initialized in the method and people just want to add annotations
> there. This is already in heavy use in some PEP-484-annotated code
> bases I know of, using # type comments, and I think it will be easier
> to get people to switch to syntactic annotations if they can
> mechanically translate those uses. (In fact we are planning an
> automatic translator.)

I suppose the translator would be somewhat more complicated if it were
to move the type hints to the beginning of the class suite. Anyway, I
hope there will at least be a recommendation somewhere (PEP 8?) to not
mix the two styles of attribute annotation (beginning of class / in
method). The whole readability benefit turns against itself if there
are some non-ClassVar variables annotated outside __init__ and then
the rest somewhere in __init__ and in whatever initialization helper
methods __init__ happens to call.

[...]
>> Note that we could then also have this:
>>
>> def NAME
>>
>> Which would, again for readability (see above), be a way to express
>> that "there is an instance variable called X, but no type hint for
>> now". I can't think of a *good* way to do this with the keyword-free
>> version for people that don't use type hints.
>>
>> And then there could also be a simple decorator like
>> @slotted_attributes that automatically generates "__slots__" from the
>> annotations.
>
> This I like, or something like it. It can be a follow-up design. (I.e.
> a separate PEP, once we have experiece with PEP 526.)

I think there should be a syntax for this that does not involve type
hints, but I can't seem to come up with anything that works with the
keyword-free version :(.

-- Koos


> --
> --Guido van Rossum (python.org/~guido)

-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] PEP 526 ready for review: Syntax for Variable and Attribute Annotations

2016-09-01 Thread Koos Zevenhoven
On Wed, Aug 31, 2016 at 12:20 AM, Guido van Rossum <gu...@python.org> wrote:
> I'm happy to present PEP 526 for your collective review:
> https://www.python.org/dev/peps/pep-0526/ (HTML)
> https://github.com/python/peps/blob/master/pep-0526.txt (source)
>
> There's also an implementation ready:
> https://github.com/ilevkivskyi/cpython/tree/pep-526
>
> I don't want to post the full text here but I encourage feedback on
> the high-order ideas, including but not limited to
>
> - Whether (given PEP 484's relative success) it's worth adding syntax
> for variable/attribute annotations.

While a large number of Python programmers may not be interested in
type hinting local variables inside functions, I can see other
potential benefits in this.

When I start sketching a new class, I'm often tempted to write down
the names of the attributes first, before starting to implement
``__init__``. Sometimes I even write temporary comments for this
purpose. This syntax would naturally provide a way to sketch the list
of attributes. Yes, there is already __slots__, but I'm not sure that
is a good example of readability.

Also, when reading code, it may be hard to tell which (instance)
attributes the class implements. To have these listed in the beginning
of the class could therefore improve the readability.

In this light, I'm not sure it's a good idea to allow attribute type
hints inside methods.

>
> - Whether the keyword-free syntax idea proposed here is best:
>   NAME: TYPE
>   TARGET: TYPE = VALUE
>

I wonder if this would be better:

def NAME: TYPE
def NAME: TYPE = VALUE

Maybe it's just me, but I've always thought 'def' is Python's least
logically used keyword. It seems to come from 'define', but what is it
about 'define' that makes it relate to functions only? Adding an
optional 'def' for other variables might even be a tiny bit of added
consistency.

Note that we could then also have this:

def NAME

Which would, again for readability (see above), be a way to express
that "there is an instance variable called X, but no type hint for
now". I can't think of a *good* way to do this with the keyword-free
version for people that don't use type hints.

And then there could also be a simple decorator like
@slotted_attributes that automatically generates "__slots__" from the
annotations.
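The decorator idea is straightforward to prototype on top of class annotations. This is only a sketch of the suggestion, using a hypothetical slotted_attributes name; it turns annotation-only names into __slots__:

```python
def slotted_attributes(cls):
    # Hypothetical decorator: names that are annotated but have no
    # class-level value become __slots__ entries.
    anns = getattr(cls, '__annotations__', {})
    slots = tuple(name for name in anns if name not in cls.__dict__)
    ns = {k: v for k, v in cls.__dict__.items()
          if k not in ('__dict__', '__weakref__')}
    ns['__slots__'] = slots
    # rebuild the class so the slot descriptors are created
    return type(cls.__name__, cls.__bases__, ns)

@slotted_attributes
class Point:
    x: int
    y: int

p = Point()
p.x = 1            # fine: 'x' is a slot
try:
    p.z = 3        # AttributeError: no __dict__, 'z' not in __slots__
except AttributeError:
    print("no attribute 'z'")
```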

-- Koos


> Note that there's an extensive list of rejected ideas in the PEP;
> please be so kind to read it before posting here:
> https://www.python.org/dev/peps/pep-0526/#rejected-proposals-and-things-left-out-for-now
>
>
> --
> --Guido van Rossum (python.org/~guido)


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] proposed os.fspath() change

2016-06-15 Thread Koos Zevenhoven
On Wed, Jun 15, 2016 at 11:00 PM, Ethan Furman <et...@stoneleaf.us> wrote:
> On 06/15/2016 12:24 PM, Koos Zevenhoven wrote:
>>
>> And the other question could be turned into whether to make str and
>> bytes also PathLike in __subclasshook__.
>
> No, for two reasons.
>
> - most str's and bytes' are not paths;

True. Well, at least most str and bytes objects are not *meant* to be
used as paths, even if they could be.

> - PathLike indicates a rich-path object, which str's and bytes' are not.

This does not count as a reason.

If this were called pathlib.PathABC, I would definitely agree [1]. But
since this is called os.PathLike, I'm not quite as sure. Anyway,
including str and bytes is more of a type hinting issue. And since
type hints will also act as documentation, the naming of types is
becoming more important.

-- Koos

[1] No, I'm not proposing moving this to pathlib

> --
> ~Ethan~



-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] proposed os.fspath() change

2016-06-15 Thread Koos Zevenhoven
On Wed, Jun 15, 2016 at 10:15 PM, Brett Cannon <br...@python.org> wrote:
>
>
> On Wed, 15 Jun 2016 at 12:12 Koos Zevenhoven <k7ho...@gmail.com> wrote:
>>
>> >> if isinstance(filename, os.PathLike):
>>
>> By the way, regarding the line of code above, is there a convention
>> regarding whether implementing some protocol/interface requires
>> registering with (or inheriting from) the appropriate ABC for it to
>> work in all situations. IOW, in this case, is it sufficient to
>> implement __fspath__ to make your type pathlike? Is there a conscious
>> trend towards requiring the ABC?
>
>
> ABCs like os.PathLike can override __subclasshook__ so that registration
> isn't required (see
> https://hg.python.org/cpython/file/default/Lib/os.py#l1136). So registration
> is definitely good to do to be explicit that you're trying to meet an ABC,
> but it isn't strictly required.

Ok I suppose that's fine, so I propose we update the ABC part in the
PEP with __subclasshook__.

And the other question could be turned into whether to make str and
bytes also PathLike in __subclasshook__.
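As os.PathLike ended up being implemented in Python 3.6, defining __fspath__ is indeed sufficient; __subclasshook__ makes the isinstance check pass without registration. A minimal sketch:

```python
import os

class MyPath:
    # Implementing __fspath__ alone is enough: os.PathLike's
    # __subclasshook__ recognizes it without explicit ABC registration.
    def __init__(self, path):
        self._path = path

    def __fspath__(self):
        return self._path

p = MyPath('/tmp/example.txt')
print(isinstance(p, os.PathLike))  # True, despite no registration
print(os.fspath(p))                # /tmp/example.txt
```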

-- Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] proposed os.fspath() change

2016-06-15 Thread Koos Zevenhoven
>> if isinstance(filename, os.PathLike):

By the way, regarding the line of code above, is there a convention
regarding whether implementing some protocol/interface requires
registering with (or inheriting from) the appropriate ABC for it to
work in all situations. IOW, in this case, is it sufficient to
implement __fspath__ to make your type pathlike? Is there a conscious
trend towards requiring the ABC?

-- Koos


Re: [Python-Dev] proposed os.fspath() change

2016-06-15 Thread Koos Zevenhoven
On Wed, Jun 15, 2016 at 9:29 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 15 June 2016 at 10:59, Brett Cannon <br...@python.org> wrote:
>>
>>
>> On Wed, 15 Jun 2016 at 09:48 Guido van Rossum <gu...@python.org> wrote:
>>>
>>> These are really two separate proposals.
>>>
>>> I'm okay with checking the return value of calling obj.__fspath__; that's
>>> an error in the object anyways, and it doesn't matter much whether we do
>>> this or not (though when approving the PEP I considered this and decided not
>>> to insert a check for this). But it doesn't affect your example, does it? I
>>> guess it's easier to raise now and change the API in the future to avoid
>>> raising in this case (if we find that raising is undesirable) than the other
>>> way around, so I'm +0 on this.
>>
>> +0 from me as well. I know in some code in the stdlib that has been ported
>> which prior to adding support was explicitly checking for str/bytes this
>> will eliminate its own checking (obviously not a motivating factor as it's
>> pretty minor).
>
> I'd like a strong assertion that the return value of os.fspath() is a
> plausible filesystem path representation (so either bytes or str), and
> *not* some other kind of object that can also be used for accessing
> the filesystem (like a file descriptor or an IO stream)

I agree, so I'm -0.5 on passing through any object (at least by default).

>>> The other proposal (passing anything that's not understood right through)
>>> is more interesting and your use case is somewhat compelling. Catching the
>>> exception coming out of os.fspath() would certainly be much messier. The
>>> question remaining is whether, when this behavior is not desired (e.g. when
>>> the caller of os.fspath() just wants a string that it can pass to open()),
>>> the condition of passing that's neither a string not supports __fspath__
>>> still produces an understandable error. I'm not sure that that's the case.
>>> E.g. open() accepts file descriptors in addition to paths, but I'm not sure
>>> that accepting an integer is a good idea in most cases -- it either gives a
>>> mystery "Bad file descriptor" error or starts reading/writing some random
>>> system file, which it then closes once the stream is closed.
>>
>> The FD issue of magically passing through an int was also a concern when
>> Ethan brought this up in an issue on the tracker. My argument is that FDs
>> are not file paths and so shouldn't magically pass through if we're going to
>> type-check anything or claim os.fspath() only works with paths (FDs are
>> already open file objects). So in my view  either we go ahead and type-check
>> the return value of __fspath__() and thus restrict everything coming out of
>> os.fspath() to Union[str, bytes] or we don't type check anything and be
>> consistent that os.fspath() simply does is call __fspath__() if present.
>>
>> And just  because I'm thinking about it, I would special-case the FDs, not
>> os.PathLike (clearer why you care and faster as it skips the override of
>> __subclasshook__):
>>
>> # Can be a single-line ternary operator if preferred.
>> if not isinstance(filename, int):
>>     filename = os.fspath(filename)
>
> Note that the LZMA case Ethan cites is one where the code accepts
> either an already opened file-like object *or* a path-like object, and
> does different things based on which it receives.
>
> In that scenario, rather than introducing an unconditional "filename =
> os.fspath(filename)" before the current logic, it makes more sense to
> me to change the current logic to use the new protocol check rather
> than a strict typecheck on str/bytes:
>
> if isinstance(filename, os.PathLike): # Changed line
>     filename = os.fspath(filename)   # New line

You are making one of my earlier points here, thanks ;). The point is
that the name PathLike sounds like it would mean anything path-like,
except that os.PathLike does not include str and bytes. And I still
think the naming should be a little different.

So that would be (os.PathLike, str, bytes) instead of just os.PathLike.

>     if "b" not in mode:
>         mode += "b"
>     self._fp = builtins.open(filename, mode)
>     self._closefp = True
>     self._mode = mode_code
> elif hasattr(filename, "read") or hasattr(filename, "write"):
>     self._fp = filename
>     self._mode = mode_code
> else:
>     raise TypeError(
>         "filename must be a path-like or file-like object")