Re: [Python-ideas] Vectorization [was Re: Add list.join() please]

2019-01-31 Thread David Allemang
I accidentally replied only to Steven - sorry! - this is what I said, with
a typo corrected:

> a_list_of_strings..lower()
>
> str.lower.(a_list_of_strings)

I much prefer this solution to any of the other things discussed so far. I
wonder, though, would it be general enough to simply have this new '.' operator
interact with __iter__, or would there have to be new magic methods like
__veccall__, __vecgetattr__, etc? Would a single __vectorize__ magic method
be enough?

For example, I would expect   (1, 2, 3) .** 2   to evaluate to a tuple,
[1, 2, 3] .** 2   to evaluate to a list, and   some_generator() .** 2   to
still be a generator.
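
Since the .** operator is only proposed and does not exist, the ordinary
comprehensions below show the results that syntax would be expected to
produce (the variable names are just for illustration):

```python
# Illustration only: these comprehensions produce the results the
# proposed .** operator would be expected to give for each type.
tup = tuple(x ** 2 for x in (1, 2, 3))   # what (1, 2, 3) .** 2 would give
lst = [x ** 2 for x in [1, 2, 3]]        # what [1, 2, 3] .** 2 would give
gen = (x ** 2 for x in iter([1, 2, 3]))  # what some_generator() .** 2 would give

print(tup)  # (1, 4, 9)
print(lst)  # [1, 4, 9]
print(gen)  # a generator object; still lazy
```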

If there were a   __vectorize__(self, func)   which returned the iterable
result of applying func on each element of self:

class list:
    def __vectorize__(self, func):
        return [func(e) for e in self]

some_list .* other   becomes   some_list.__vectorize__(lambda e: e * other)
some_string..lower()  becomes   some_string.__vectorize__(str.lower)
some_list..attr   becomes   some_list.__vectorize__(operator.attrgetter('attr'))

Perhaps there would be a better name for such a magic method, but I believe
it would let existing sequences behave as one might expect without
requiring a separate definition for each operator. I might also be
over-complicating this, but I'm not sure how else to let different
sequence types return results of their own type.
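
To make that concrete: the .* and ..method syntax does not exist, so in
the sketch below a plain helper function vec stands in for what the
operator would desugar to, and VList/VTuple are illustrative subclasses,
not a real protocol.

```python
# Hypothetical __vectorize__ protocol; vec() plays the role the proposed
# .* / ..method syntax would desugar to.
def vec(obj, func):
    return type(obj).__vectorize__(obj, func)

class VList(list):
    def __vectorize__(self, func):
        return VList(func(e) for e in self)   # result keeps the list type

class VTuple(tuple):
    def __vectorize__(self, func):
        return VTuple(func(e) for e in self)  # result keeps the tuple type

words = VList(["Hello", "World"])
print(vec(words, str.lower))                     # ['hello', 'world']
print(vec(VTuple((1, 2, 3)), lambda e: e ** 2))  # (1, 4, 9)
```

Each type decides the type of its own result, which is the point above
about tuples staying tuples, lists staying lists, and generators staying
lazy.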

On Thu, Jan 31, 2019 at 6:24 PM Steven D'Aprano  wrote:

> On Thu, Jan 31, 2019 at 09:51:20AM -0800, Chris Barker via Python-ideas
> wrote:
>
> > I do a lot of numerical programming, and used to use MATLAB and now
> > numpy a lot. So I am very used to "vectorization" -- i.e. having
> > operations that work on a whole collection of items at once.
> [...]
> > You can imagine that for more complex expressions the "vectorized"
> > approach can make for much clearer and easier to parse code. Also much
> > faster, which is what is usually talked about, but I think the
> > readability is the bigger deal.
>
> Julia has a special "dot" vectorization operator that looks like this:
>
>  L .+ 1   # adds 1 to each item in L
>
>  func.(L)   # calls func on each item in L
>
> https://julialang.org/blog/2017/01/moredots
>
> The beauty of this is that you can apply it to any function or operator
> and the compiler will automatically vectorize it. The function doesn't
> have to be written to specifically support vectorization.
>
>
> > So what does this have to do with the topic at hand?
> >
> > I know that when I'm used to working with numpy and then need to do some
> > string processing or some such, I find myself missing this
> > "vectorization" -- if I want to do the same operation on a whole bunch of
> > strings, why do I need to write a loop or comprehension or map? that is:
> >
> > [s.lower() for s in a_list_of_strings]
> >
> > rather than:
> >
> > a_list_of_strings.lower()
>
> Using Julia syntax, that might become a_list_of_strings..lower(). If you
> don't like the double dot, perhaps str.lower.(a_list_of_strings) would
> be less ugly.
>
>
>
> --
> Steven
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/
>


Re: [Python-ideas] Range and slice syntax

2018-11-13 Thread David Allemang
That is not what slice.indices does. Per help(slice.indices) -

"S.indices(len) -> (start, stop, stride)

"Assuming a sequence of length len, calculate the start and stop indices,
and the stride length of the extended slice described by S. Out of bounds
indices are clipped in a manner consistent with handling of normal slices."

Essentially, it returns (S.start, S.stop, S.step) with missing values
filled in (stop defaulting to len) and out-of-bounds indices clipped.
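
The clipping is easy to see interactively (plain Python, nothing
hypothetical here):

```python
# slice.indices resolves defaults and clips out-of-bounds values.
s = slice(1, 100, 2)
print(s.indices(10))                # (1, 10, 2): stop clipped from 100 to len
print(list(range(*s.indices(10))))  # [1, 3, 5, 7, 9]
print(list(range(10))[1:100:2])     # the same elements the slice selects
print(slice(None).indices(5))       # (0, 5, 1): defaults filled in
```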

On Tue, Nov 13, 2018, 12:50 PM Vladimir Filipović wrote:
> On Mon, Nov 12, 2018 at 4:43 PM Nicholas Harrison wrote:
> > Only when this is called (implicitly or explicitly) do checks for valid
> > objects and bounds occur. From my experience using slices, this is how
> > they work in that context too.
>
> On reconsideration, I've found one more argument in favour of (at
> least this aspect of?) the proposal: the slice.indices method, which
> takes a sequence's length and returns an iterable (range) of all
> indices of such a sequence that would be "selected" by the slice. Not
> sure if it's supposed to be documented.
>
> So there is definitely precedent for "though slices in general are
> primarily a syntactic construct and new container-like classes can
> choose any semantics for indexing with them, the semantics
> specifically in the context of sequences have a bit of a privileged
> place in the language with concrete expectations, including strictly
> integer (or None) attributes".


Re: [Python-ideas] Allow Context Managers to Support Suspended Execution

2018-11-01 Thread David Allemang
Yes, so PEP 521 is exactly what I was suggesting. My apologies for not
finding it before sending this.

So, then, PEP 567 solves the issue for coroutines and PEP 568 would solve
it for generators as well?
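
For reference, a minimal illustration of the PEP 567 behavior for
coroutines, using the standard contextvars and asyncio modules
(Python 3.7+):

```python
# Each asyncio task gets its own copy of the context, so a ContextVar
# set in one task is not visible to another, even across suspension.
import asyncio
import contextvars

var = contextvars.ContextVar("var", default="unset")

async def task(value):
    var.set(value)
    await asyncio.sleep(0)  # suspension point: other tasks run here
    return var.get()        # each task still sees its own value

async def main():
    return await asyncio.gather(task("a"), task("b"))

print(asyncio.run(main()))  # ['a', 'b']
```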

On Thu, Nov 1, 2018, 11:40 AM Yury Selivanov wrote:
> Yep, PEP 567 addresses this for coroutines, so David's first example
>
> The proposal to add __suspend__ and __resume__ is very similar to PEP
> 521 which was withdrawn.  PEP 568 (which needs to be properly updated)
> is the way to go if we want to address this issue for generators.
>
> [1]
> https://gist.github.com/allemangD/bba8dc2d059310623f752ebf65bb6cdc#gistcomment-2748803
>
> Yury
> On Thu, Nov 1, 2018 at 11:06 AM Guido van Rossum  wrote:
> >
> > Check out the decimal example here:
> > https://www.python.org/dev/peps/pep-0568/ (PEP 568 is deferred, but
> > PEP 567 is implemented in Python 3.7).
> >
> > Those Contexts aren't context managers, but still there's some thought
> > put into swapping contexts out at the boundaries of generators.
> >
> > On Wed, Oct 31, 2018 at 7:54 PM David Allemang wrote:
> >>
> >> I do not think there is currently a good way for Context Managers to
> >> support suspended execution, as in await or yield. Both of these
> >> instructions cause the interpreter to leave the with block, yet no
> >> indication of this (temporary) exit or subsequent re-entrance is given
> >> to the context manager. If the intent of a Context Manager is to say
> >> "no matter how this block is entered or exited, the context will be
> >> correctly maintained", then this needs to be possible.
> >>
> >> I would propose magic methods __suspend__ and __resume__ as companions
> >> to the existing __enter__ and __exit__ methods (and their async
> >> variants). __suspend__, if present, would be called upon suspending
> >> execution on an await or yield statement, and __resume__, if present,
> >> would be called when execution is resumed. If __suspend__ or
> >> __resume__ are not present then nothing should be done, so that the
> >> behavior of existing context managers is preserved.
> >>
> >> Here is an example demonstrating the issue with await:
> >> https://gist.github.com/allemangD/bba8dc2d059310623f752ebf65bb6cdc
> >> and one with yield:
> >> https://gist.github.com/allemangD/f2534f16d3a0c642c2cdc02c544e854f
> >>
> >> The context manager used is clearly not thread-safe, and I'm not
> >> actually sure how to approach a thread-safe implementation with the
> >> proposed __suspend__ and __resume__ - but I don't believe that
> >> introducing these new methods would create any issues that aren't
> >> already present with __enter__ and __exit__.
> >>
> >> It's worth noting that the context manager used in those examples is
> >> essentially identical to contextlib's redirect_stdout and decimal's
> >> localcontext managers. Any context manager such as these which modifies
> >> global state or the behavior of global functions would benefit from
> >> this. It may also make sense to, for example, have the __suspend__
> >> method on file objects flush buffers without closing the file, similar
> >> to their current __exit__ behavior, but I'm unsure what impact this
> >> would have on performance.
> >>
> >> It is important, though, that yield and await not use __enter__ or
> >> __exit__, as not all context-managers are reusable. I'm unsure what
> >> the best term would be to describe this type of context, as the
> >> documentation for contextlib already gives a different definition for
> >> "reentrant" - I would then call them "suspendable" contexts. It would
> >> make sense to have an @suspendable decorator, probably in contextlib,
> >> to indicate that a context manager can use its __enter__ and __exit__
> >> methods rather than needing its own __suspend__ and __resume__. All it
> >> would need to do is define __suspend__ to call __exit__(None, None,
> >> None) and __resume__ to call __enter__().
> >>
> >> It is also important, since __suspend__ and __resume__ would be called
> >> after a context is entered but before it is exited, that __suspend__
> >> not accept any parameters and that __resume__ not use its return
> >> value. __suspend__ could not be triggered by an exception, only by a
> >> yield or await, and __resume__ could not have its return value named
> >> with as.
> >>
> >> Thanks,
> >>
> >> David

[Python-ideas] Allow Context Managers to Support Suspended Execution

2018-10-31 Thread David Allemang
I do not think there is currently a good way for Context Managers to
support suspended execution, as in await or yield. Both of these
instructions cause the interpreter to leave the with block, yet no
indication of this (temporary) exit or subsequent re-entrance is given
to the context manager. If the intent of a Context Manager is to say
"no matter how this block is entered or exited, the context will be
correctly maintained", then this needs to be possible.

I would propose magic methods __suspend__ and __resume__ as companions
to the existing __enter__ and __exit__ methods (and their async
variants). __suspend__, if present, would be called upon suspending
execution on an await or yield statement, and __resume__, if present,
would be called when execution is resumed. If __suspend__ or
__resume__ are not present then nothing should be done, so that the
behavior of existing context managers is preserved.

Here is an example demonstrating the issue with await:
https://gist.github.com/allemangD/bba8dc2d059310623f752ebf65bb6cdc
and one with yield:
https://gist.github.com/allemangD/f2534f16d3a0c642c2cdc02c544e854f
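
In case the gists are unavailable, here is a self-contained sketch of
the yield case using contextlib.redirect_stdout (the names gen and buf
are illustrative):

```python
# Demonstrates the problem: a with block stays "active" while the
# generator is suspended, because __exit__ is not called at the yield.
import contextlib
import io

buf = io.StringIO()

def gen():
    with contextlib.redirect_stdout(buf):
        yield  # execution suspends here, but __exit__ is not called

g = gen()
next(g)             # enters the with block, then suspends inside it
print("outside?")   # still redirected into buf -- the "leak"
g.close()           # GeneratorExit finally runs __exit__, restoring stdout
assert buf.getvalue() == "outside?\n"
```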

The context manager used is clearly not thread-safe, and I'm not
actually sure how to approach a thread-safe implementation with the
proposed __suspend__ and __resume__ - but I don't believe that
introducing these new methods would create any issues that aren't
already present with __enter__ and __exit__.

It's worth noting that the context manager used in those examples is
essentially identical to contextlib's redirect_stdout and decimal's
localcontext managers. Any context manager such as these which modifies
global state or the behavior of global functions would benefit from
this. It may also make sense to, for example, have the __suspend__
method on file objects flush buffers without closing the file, similar
to their current __exit__ behavior, but I'm unsure what impact this
would have on performance.

It is important, though, that yield and await not use __enter__ or
__exit__, as not all context-managers are reusable. I'm unsure what
the best term would be to describe this type of context, as the
documentation for contextlib already gives a different definition for
"reentrant" - I would then call them "suspendable" contexts. It would
make sense to have an @suspendable decorator, probably in contextlib,
to indicate that a context manager can use its __enter__ and __exit__
methods rather than needing its own __suspend__ and __resume__. All it
would need to do is define __suspend__ to call __exit__(None, None,
None) and __resume__ to call __enter__().
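
A hypothetical sketch of that decorator. Nothing here is real Python
machinery beyond plain attribute assignment: __suspend__ and __resume__
are the names proposed above, with suspension mapping to __exit__ and
resumption to __enter__.

```python
# Hypothetical @suspendable decorator: marks a reusable context manager
# as safe to tear down on suspension and re-establish on resumption.
def suspendable(cls):
    cls.__suspend__ = lambda self: self.__exit__(None, None, None)
    cls.__resume__ = lambda self: self.__enter__()
    return cls

@suspendable
class Tracked:
    def __init__(self):
        self.events = []
    def __enter__(self):
        self.events.append("enter")
        return self
    def __exit__(self, exc_type, exc, tb):
        self.events.append("exit")

t = Tracked()
t.__enter__()
t.__suspend__()  # delegates to __exit__
t.__resume__()   # delegates to __enter__
print(t.events)  # ['enter', 'exit', 'enter']
```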

It is also important, since __suspend__ and __resume__ would be called
after a context is entered but before it is exited, that __suspend__
not accept any parameters and that __resume__ not use its return
value. __suspend__ could not be triggered by an exception, only by a
yield or await, and __resume__ could not have its return value named
with as.

Thanks,

David