Since this started, I've been using variants of the following in my own
code, wrapping `accumulate`:

from itertools import accumulate, chain, islice

def accum(iterable, func=None, *, initial=None, skipfirst=False):
    if initial is not None:
        iterable = chain([initial], iterable)
    result = accumulate(iterable, func)
    return islice(result, 1, None) if skipfirst else result
[Raymond Hettinger]
> Q. Do other languages do it?
> A. Numpy, no. R, no. APL, no. Mathematica, no. Haskell, yes.
>
> * http://docs.scipy.org/doc/numpy/reference/generated/numpy.ufunc.accumulate.html
> * https://stat.ethz.ch/R-manual/R-devel/library/base/html/cumsum.html
> * http://
[Tim]
> Woo hoo! Another coincidence. I just happened to be playing with
> this problem today:
>
> You have a large list - xs - of N numbers. It's necessary to compute slice
> sums
>
> sum(xs[i:j])
>
> for a great many slices, 0 <= i <= j <= N.
Which brought to mind a different problem: w
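Tim's slice-sum problem has a standard prefix-sum answer, sketched here with toy data (the names and numbers are illustrative, not from the thread):

```python
from itertools import accumulate, chain

xs = [5, 2, 8, 3]          # toy stand-in for the large list of N numbers
N = len(xs)

# N+1 prefix sums: prefix[k] == sum(xs[:k]), with prefix[0] == 0
prefix = list(accumulate(chain([0], xs)))

def slice_sum(i, j):
    # sum(xs[i:j]) in O(1) after the one-time O(N) precomputation
    return prefix[j] - prefix[i]

print(slice_sum(1, 3))  # 10  (== 2 + 8)
```

After the single pass to build `prefix`, each of the "great many" slice sums is one subtraction.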
On Mon, Apr 9, 2018 at 11:35 PM, Stephen J. Turnbull <turnbull.stephen...@u.tsukuba.ac.jp> wrote:
> Tim Peters writes:
>
> > "Sum reduction" and "running-sum accumulation" are primitives in
> > many peoples' brains.
>
> I wonder what Kahneman would say about that. He goes to some length
> to e
Tim Peters writes:
> "Sum reduction" and "running-sum accumulation" are primitives in
> many peoples' brains.
I wonder what Kahneman would say about that. He goes to some length
to explain that people are quite good (as human abilities go) at
perceiving averages over sets but terrible at summing them.
[Peter O'Connor]
> Ok, so it seems everyone's happy with adding an initial_value argument.
Heh - that's not clear to me ;-)
> Now, I claim that while it should be an option, the initial value should NOT
> be returned by default. (i.e. the returned generator should by default
> yield N elements
* correction to brackets from first example:

def iter_cumsum_tolls_from_day(day, toll_amount_so_far):
    return accumulate(get_tolls_from_day(day), initial=toll_amount_so_far)
On Mon, Apr 9, 2018 at 11:55 PM, Peter O'Connor wrote:
> Ok, so it seems everyone's happy with adding an initial_valu
Ok, so it seems everyone's happy with adding an initial_value argument.
Now, I claim that while it should be an option, the initial value should
NOT be returned by default. (i.e. the returned generator should by default
yield N elements, not N+1).
Example: suppose we're doing the toll booth thin
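Peter's preferred semantics (seed the accumulation, but emit only N values) can be sketched with the itertools of the day; the toll numbers here are made up:

```python
from itertools import accumulate, chain, islice

toll_amount_so_far = 10.0
tolls_today = [2.5, 3.0, 2.5]

# Seed with the running total, then drop the seed itself,
# so the output has N elements rather than N+1.
running = list(islice(accumulate(chain([toll_amount_so_far], tolls_today)),
                      1, None))
print(running)  # [12.5, 15.5, 18.0]
```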
[Tim]
>> while we have N numbers, there are N+1 slice indices. So
>> accumulate(xs) doesn't quite work. It needs to also have a 0 inserted
>> as the first prefix sum (the empty prefix sum(xs[:0])).
>>
>> Which is exactly what a this_is_the_initial_value=0 argument would do
>> for us.
[Greg Ewing]
Tim Peters wrote:
while we have N numbers, there are N+1 slice indices. So
accumulate(xs) doesn't quite work. It needs to also have a 0 inserted
as the first prefix sum (the empty prefix sum(xs[:0])).
Which is exactly what a this_is_the_initial_value=0 argument would do
for us.
In this case,
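With the proposed argument (which later landed in Python 3.8 as `initial=`), Tim's N+1 prefix sums fall out directly:

```python
from itertools import accumulate

xs = [5, 2, 8]

# One prefix sum per slice index, including the empty prefix sum(xs[:0]) == 0.
prefix = list(accumulate(xs, initial=0))
print(prefix)  # [0, 5, 7, 15]
```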
Peter O'Connor wrote:
The behaviour where the first element of the return is the same as the
first element of the input can be weird and confusing. E.g. compare:
>> list(itertools.accumulate([2, 3, 4], lambda accum, val: accum-val))
[2, -1, -5]
>> list(itertools.accumulate([2, 3, 4], lambda
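The contrast Peter is driving at: without a seed, the first input is echoed untouched, while a seed (spelled `initial=` in the version of the API Python 3.8 adopted) makes `func` consume every input element:

```python
from itertools import accumulate

no_seed = list(accumulate([2, 3, 4], lambda acc, val: acc - val))
seeded = list(accumulate([2, 3, 4], lambda acc, val: acc - val, initial=0))

print(no_seed)  # [2, -1, -5]     -- the 2 is never fed to func as `val`
print(seeded)   # [0, -2, -5, -9] -- func applied to all three inputs
```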
Woo hoo! Another coincidence. I just happened to be playing with
this problem today:
You have a large list - xs - of N numbers. It's necessary to compute slice sums
sum(xs[i:j])
for a great many slices, 0 <= i <= j <= N.
For concreteness, say xs is a time series representing a toll booth
On Friday, April 6, 2018 at 9:03:05 PM UTC-4, Raymond Hettinger wrote:
>
> > On Friday, April 6, 2018 at 8:14:30 AM UTC-7, Guido van Rossum wrote:
> > On Fri, Apr 6, 2018 at 7:47 AM, Peter O'Connor
> wrote:
> >> So some more humble proposals would be:
> >>
> >> 1) An initializer to iterto
[Tim]
>> Then why was [accumulate] generalized to allow any 2-argument function?
[Raymond]
> Prior to 3.2, accumulate() was in the recipes section as pure Python
> code. It had no particular restriction to numeric types.
>
> I received a number of requests for accumulate() to be promoted
> to a r
Also Tim Peters' one-line example of:
print(list(itertools.accumulate([1, 2, 3], lambda x, y: str(x) + str(y))))
I think makes it clear that itertools.accumulate is not the right vehicle
for this change - we should make a new itertools function with a required
"initial" argument.
On Mon, Apr 9,
It seems clear that the name "accumulate" has been kind of antiquated since
the "func" argument was added and "sum" became just a default.
And people seem to disagree about whether the result should have a length N
or length N+1 (where N is the number of elements in the input iterable).
The behav
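The N-versus-N+1 disagreement, side by side (toy data; both spellings use only the itertools available at the time):

```python
from itertools import accumulate, chain, islice

xs, start = [1, 2, 3], 10

n_plus_1 = list(accumulate(chain([start], xs)))                 # seed echoed
n_only = list(islice(accumulate(chain([start], xs)), 1, None))  # seed dropped

print(n_plus_1)  # [10, 11, 13, 16] -- N+1 results
print(n_only)    # [11, 13, 16]     -- N results
```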
On 9 April 2018 at 14:38, Raymond Hettinger wrote:
>> On Apr 8, 2018, at 6:43 PM, Tim Peters wrote:
>> In short, for _general_ use `accumulate()` needs `initial` for exactly
>> the same reasons `reduce()` needed it.
>
> The reduce() function had been much derided, so I've had it mentally filed in
Raymond Hettinger wrote:
I don't want to overstate the case, but I do think a function signature that
offers a "first_value" option is an invitation to treat the first value as
being distinct from the rest of the data stream.
I conjecture that the initial value is *always* special,
and the onl
> On Apr 8, 2018, at 6:43 PM, Tim Peters wrote:
>
>> My other common case for accumulate() is building cumulative
>> probability distributions from probability mass functions (see the
>> code for random.choice() for example, or typical code for a K-S test).
>
> So, a question: why wasn't iter
Raymond Hettinger wrote:
For neither of those use case categories did I ever want an initial value and
it would have been distracting to have even had the option. For example, when
doing a discounted cash flow analysis, I was taught to model the various
flows as a single sequence of up and down arrows
> >>> list(accumulate([1, 2, 3]))
> [11, 13, 16]
Wow! I would have sworn that said
[1, 3, 6]
when I sent it. Damn Gmail ;-)
> >>> list(accumulate([1, 2, 3], initial=10))
> [10, 11, 13, 16]
___
Python-ideas mailing list
Python-ideas@python.org
https
[Raymond]
> The Bayesian world view isn't much different except they would
> prefer "prior" instead of "initial" or "start" ;-)
>
> my_changing_beliefs = accumulate(stream_of_new_evidence, bayes_rule,
> prior=what_i_used_to_think)
>
> Though the two analogies are cute, I'm not sure they tell u
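Raymond's Bayesian quip can be made concrete. The `bayes_rule` below is a hypothetical odds-ratio update invented for this sketch (only the variable names come from his post), and since no `prior=` keyword exists, the prior is chained in by hand:

```python
from itertools import accumulate, chain

def bayes_rule(prior, likelihood_ratio):
    # posterior odds = prior odds * likelihood ratio
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

what_i_used_to_think = 0.5
stream_of_new_evidence = [2.0, 2.0, 0.5]   # made-up likelihood ratios

my_changing_beliefs = list(
    accumulate(chain([what_i_used_to_think], stream_of_new_evidence),
               bayes_rule))
```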
> On Apr 8, 2018, at 12:22 PM, Tim Peters wrote:
>
> [Guido]
>> Well if you can get Raymond to agree on that too I suppose you can go ahead.
>> Personally I'm -0 but I don't really write this kind of algorithmic code
>> enough to know what's useful.
>
> Actually, you do - but you don't _think_
[Guido]
> Well if you can get Raymond to agree on that too I suppose you can go ahead.
> Personally I'm -0 but I don't really write this kind of algorithmic code
> enough to know what's useful.
Actually, you do - but you don't _think_ of problems in these terms.
Neither do I. For those who do: c
2018-04-08 8:19 GMT+03:00 Nick Coghlan :
> A name like "first_result" would also make it clearer to readers that
> passing that parameter has an impact on the length of the output
> series (since you're injecting an extra result), and also that the
> production of the first result skips calling fu
Another bit of prior art: the Python itertoolz package also supplies
`accumulate()`, with an optional `initial` argument. I stumbled into
that when reading a Stackoverflow "how can I do Haskell's scanl in
Python?" question.
https://toolz.readthedocs.io/en/latest/api.html#toolz.itertoolz.accumula
Well if you can get Raymond to agree on that too I suppose you can go
ahead. Personally I'm -0 but I don't really write this kind of algorithmic
code enough to know what's useful. I do think that the new parameter name
is ugly. But maybe that's the point.
On Sat, Apr 7, 2018 at 10:26 PM, Tim Peter
FYI:
[Raymond]
> ...
> Q. Do other languages do it?
> A. Numpy, no. R, no. APL, no. Mathematica, no. Haskell, yes.
>
>...
> * https://www.haskell.org/hoogle/?hoogle=mapAccumL
Haskell has millions of functions ;-) `mapAccumL` is a God-awful
mixture of Python's map(), reduce(), and accumulate().
Just a nit here:
[Tim]
>> ...
>> Arguing that it "has to do" something exactly the way `sum()` happens
>> to be implemented just doesn't follow - not even if they happen to
>> give the same name to an optional argument. If the function were
>> named `accumulate_sum()`, and restricted to numeric t
Top-posting just to say I agree with Nick's bottom line (changing the
name to `first_result=`). I remain just +0.5, although that is up a
notch from yesterday's +0.4 ;-)
--- nothing new below ---
On Sun, Apr 8, 2018 at 12:19 AM, Nick Coghlan wrote:
> On 8 April 2018 at 14:31, Guido van Rossum
On 8 April 2018 at 15:00, Tim Peters wrote:
> `accumulate()` accepts any two-argument function.
>
> >>> itertools.accumulate([1, 2, 3], lambda x, y: str(x) + str(y))
> >>> list(_)
> [1, '12', '123']
>
> Arguing that it "has to do" something exactly the way `sum()` happens
> to be implemented just
On 8 April 2018 at 14:31, Guido van Rossum wrote:
> Given that two respected members of the community so strongly disagree
> whether accumulate([], start=0) should behave like accumulate([]) or like
> accumulate([0]), maybe in the end it's better not to add a start argument.
> (The disagreement su
Nick, sorry, but your arguments still make little sense to me. I
think you're pushing an analogy between `sum()` details and
`accumulate()` way too far, changing a simple idea into a
needlessly complicated one.
`accumulate()` can do anything at all it wants to do with a `start`
argument (if
Given that two respected members of the community so strongly disagree
whether accumulate([], start=0) should behave like accumulate([]) or like
accumulate([0]), maybe in the end it's better not to add a start argument.
(The disagreement suggests that we can't trust users' intuition here.)
On Sat,
On 8 April 2018 at 13:17, Tim Peters wrote:
> [Nick Coghlan]
>> So I now think that having "start" as a parameter to one but not the
>> other, counts as a genuine API discrepancy.
>
> Genuine but minor ;-)
Agreed :)
>> Providing start to accumulate would then mean the same thing as
>> providing
[Nick Coghlan]
> I didn't have a strong opinion either way until Tim mentioned sum()
> and then I went and checked the docs for both that and for accumulate.
>
> First sentence of the sum() docs:
>
> Sums *start* and the items of an *iterable* from left to right and
> returns the total.
>
> Fi
On 8 April 2018 at 08:09, Tim Peters wrote:
[Raymond wrote]:
>> The docs probably need another recipe to show this pattern:
>>
>> def prepend(value, iterator):
>>     "prepend(1, [2, 3, 4]) -> 1 2 3 4"
>>     return chain([value], iterator)
>
> +1. Whether `accumulate()` s
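The recipe composes cleanly with accumulate(), which is the whole argument for it:

```python
from itertools import accumulate, chain

def prepend(value, iterator):
    "prepend(1, [2, 3, 4]) -> 1 2 3 4"
    return chain([value], iterator)

# An initial value for accumulate() without any new parameter.
out = list(accumulate(prepend(0, [5, 2, 8])))
print(out)  # [0, 5, 7, 15]
```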
...
[Tim]
>> Later:
>>
>> def coll(SHIFT=24):
>>     ...
>>     from itertools import accumulate, chain, cycle
>>     ...
>>     LIMIT = 1 << SHIFT
>>     ...
>>     abc, first, deltas = buildtab(SHIFT, LIMIT)
>>     ...
>>     for num in accumulate(chain([first], cycle(
On 7 April 2018 at 08:44, Raymond Hettinger wrote:
> Agreed that the "chain([x], it)" step is obscure. That's a bit of a bummer
> -- one of the goals for the itertools module was to be a generic toolkit for
> chopping-up, modifying, and splicing iterator streams (sort of a CRISPR for
> iterato
> On Apr 6, 2018, at 9:06 PM, Tim Peters wrote:
>
>>
>>What is this code trying to accomplish?
>
> It's quite obviously trying to bias the reader against the proposal by
> presenting a senseless example ;-)
FWIW, the example was not from me. It was provided by the OP on the tracker.
[Raymond Hettinger]
> ...
> Q. How readable is the proposed code?
> A. Look at the following code and ask yourself what it does:
>
> accumulate(range(4, 6), operator.mul, start=6)
>
> Now test your understanding:
>
> How many values are emitted?
3
> What is the first v
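Working Raymond's quiz (spelled here with the `initial=` keyword that Python 3.8 eventually adopted, rather than the `start=` of his example):

```python
import operator
from itertools import accumulate

result = list(accumulate(range(4, 6), operator.mul, initial=6))
print(result)  # [6, 24, 120] -- three values; the first is the start value itself
```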
> On Friday, April 6, 2018 at 8:14:30 AM UTC-7, Guido van Rossum wrote:
> On Fri, Apr 6, 2018 at 7:47 AM, Peter O'Connor wrote:
>> So some more humble proposals would be:
>>
>> 1) An initializer to itertools.accumulate
>> functools.reduce already has an initializer, I can't see any controversy