Joh wrote:
from itertools import tee, islice

def gen(iterable, start, end):
    it = iter(iterable)
    while True:
        it, a = tee(it)
        a = tuple(islice(a, end-1))
        for sz in xrange(start, len(a)+1):
            yield a[:sz]
        it.next()

if __name__ == '__main__':
    print list(gen(range(1, 5), 2, 4))
please, this one looks interesting, could you explain a
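For reference, a runnable sketch of what the posted code does, ported to modern Python 3 syntax (next() instead of it.next(), and an explicit return where the original relied on StopIteration escaping from it.next()); 'windows' is my name for it, not the poster's:

```python
from itertools import tee, islice

def windows(iterable, start, end):
    # yield runs of consecutive elements, sizes start .. end-1,
    # shortest-first at each starting position
    it = iter(iterable)
    while True:
        it, a = tee(it)                  # duplicate the stream
        a = tuple(islice(a, end - 1))    # peek at the next window
        if len(a) < start:               # not enough elements left: stop
            return
        for sz in range(start, len(a) + 1):
            yield a[:sz]
        next(it)                         # slide the window forward by one

print(list(windows(range(1, 5), 2, 4)))
# [(1, 2), (1, 2, 3), (2, 3), (2, 3, 4), (3, 4)]
```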
hello,
thanks to all who replied to my post (2005-01-21) - (I cannot post
inside the original thread as I get "Unable to retrieve message
[EMAIL PROTECTED]" from googlenews :(
Do you mean:
[1,2], [2,3], [3,4], [1,2,3], [2,3,4], [1,3,4]
(E.g. all elements in the power set except the empty set,
Francis Girard [EMAIL PROTECTED] wrote:
...
But besides the fact that generators are either produced with the new yield
reserved word or by defining the __new__ method in a class definition, I
don't know much about them.
Having __new__ in a class definition has nothing much to do with
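A minimal sketch of the distinction being drawn here (names are mine, Python 3 syntax): a generator comes from a 'def' whose body uses yield, while a class-based iterator implements the iterator protocol, __iter__ and __next__ (next() in Python 2); __new__ plays no part in either.

```python
def squares(n):
    # generator function: 'yield' in the body makes calling it
    # return a generator -- no __new__ involved
    for i in range(n):
        yield i * i

class Countdown:
    # class-based iterator: implements __iter__ and __next__,
    # again without touching __new__
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return self
    def __next__(self):
        if self.n == 0:
            raise StopIteration
        self.n -= 1
        return self.n + 1

print(list(squares(4)))    # [0, 1, 4, 9]
print(list(Countdown(3)))  # [3, 2, 1]
```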
Nick Coghlan [EMAIL PROTECTED] wrote:
5. Several builtin functions return iterators rather than lists, specifically
xrange(), enumerate() and reversed(). Other builtins that yield sequences
(range(), sorted(), zip()) return lists.
Yes for enumerate and reversed, no for xrange:
xx=xrange(7)
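The point can be checked directly: an iterator returns itself from iter(), which enumerate and reversed objects do, while xrange (range in Python 3, used in this sketch) hands out a fresh iterator each time and can therefore be traversed repeatedly:

```python
# range here plays the role of 2.x xrange: a re-iterable sequence
xx = range(7)
print(iter(xx) is xx)   # False: iter() returns a fresh iterator

ee = enumerate("abc")
print(iter(ee) is ee)   # True: enumerate *is* an iterator

rr = reversed([1, 2, 3])
print(iter(rr) is rr)   # True: so is reversed

# being re-iterable, xx can be looped over more than once
print(list(xx) == list(xx))  # True
```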
On Sat, 2005-01-22 at 10:10 +0100, Alex Martelli wrote:
The answer for the current implementation, BTW, is in between -- some
buffering, but bounded consumption of memory -- but whether that tidbit
of pragmatics is part of the file specs, heh, that's anything but clear
(just as for other
On Saturday 22 January 2005 at 10:10, Alex Martelli wrote:
Francis Girard [EMAIL PROTECTED] wrote:
...
But besides the fact that generators are either produced with the new
yield reserved word or by defining the __new__ method in a class
definition, I don't know much about them.
Having
On Sat, 2005-01-22 at 17:46 +0800, I wrote:
I'd be interested to know if there's a better solution to this than:
    inpath = '/tmp/msg.eml'
    infile = open(inpath)
    initer = iter(infile)
    headers = []
    for line in initer:
        if not line.strip():
            break
Francis Girard [EMAIL PROTECTED] wrote:
...
A 'def' of a function whose body uses 'yield', and in 2.4 the new genexp
construct.
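In Python 3 terms, both constructs named here produce genuine generator objects, which can be verified against types.GeneratorType:

```python
import types

# a 'def' whose body uses yield: calling it returns a generator
def evens(n):
    for i in range(n):
        if i % 2 == 0:
            yield i

# a generator expression (the 2.4 'genexp'): a generator directly
gen_exp = (i for i in range(10) if i % 2 == 0)

print(isinstance(evens(10), types.GeneratorType))  # True
print(isinstance(gen_exp, types.GeneratorType))    # True
```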
Ok. I guess I'll have to update to version 2.4 (from 2.3) to follow the
discussion.
It's worth upgrading even just for the extra speed;-).
Since you
Craig Ringer [EMAIL PROTECTED] wrote:
    data = ''.join(x for x in infile)
Maybe ''.join(infile) is a better way to express this functionality?
Avoids 2.4 dependency and should be faster as well as more concise.
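The two spellings are equivalent because a file object iterates over its own lines, so join can consume it directly; a quick check with io.StringIO standing in for the file:

```python
import io

# io.StringIO stands in for the file object in the post
infile = io.StringIO("line 1\nline 2\nline 3\n")
via_genexp = ''.join(x for x in infile)

infile.seek(0)
via_join = ''.join(infile)     # a file object iterates over its lines

print(via_genexp == via_join)  # True
```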
Might it be worth providing a way to have file objects seek back to the
current
On Sat, 2005-01-22 at 12:20 +0100, Alex Martelli wrote:
Craig Ringer [EMAIL PROTECTED] wrote:
    data = ''.join(x for x in infile)
Maybe ''.join(infile) is a better way to express this functionality?
Avoids 2.4 dependency and should be faster as well as more concise.
Thanks - for some
Francis Girard [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]
If I understand correctly,
Almost...
a generator produces something over which you can
iterate with the help of an iterator.
To be exact, the producer is a generator function, a function whose body
contains 'yield'.
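A short illustration of the corrected terminology, in Python 3 syntax ('nat' is a made-up example): the generator function is the producer, and calling it returns the generator, which is itself the iterator you consume:

```python
def nat(limit):
    # 'nat' is a generator function: its body contains yield
    n = 0
    while n < limit:
        yield n
        n += 1

g = nat(3)           # calling it produces the generator
print(iter(g) is g)  # True: a generator is its own iterator
print(next(g))       # 0
print(list(g))       # [1, 2] -- resumes where next() left off
```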
hello,
i'm trying to understand how i could build following consecutive sets
from a root one using generator :
l = [1,2,3,4]
would like to produce :
[1,2], [2,3], [3,4], [1,2,3], [2,3,4]
but unfortunately can not, i guess i can do it by using sub generator
and maybe enumerate, please if you
On 21 Jan 2005 05:58:03 -0800
[EMAIL PROTECTED] (Joh) wrote:
i'm trying to understand how i could build following consecutive sets
from a root one using generator :
l = [1,2,3,4]
would like to produce :
[1,2], [2,3], [3,4], [1,2,3], [2,3,4]
def consecutive_sets(l):
... for i in
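The reply is cut off at this point; one plausible completion as a generator function (my reconstruction, not necessarily what the poster wrote), producing exactly the requested output with the shorter runs first:

```python
def consecutive_sets(l):
    # sizes 2 .. len(l)-1, all positions for each size
    for size in range(2, len(l)):
        for i in range(len(l) - size + 1):
            yield l[i:i + size]

print(list(consecutive_sets([1, 2, 3, 4])))
# [[1, 2], [2, 3], [3, 4], [1, 2, 3], [2, 3, 4]]
```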
Hi,
I recently read David Mertz (IBM DeveloperWorks) about generators and got
excited about using lazy constructs in my Python programming.
But besides the fact that generators are either produced with the new yield
reserved word or by defining the __new__ method in a class definition, I
On Fri, 2005-01-21 at 22:38 +0800, Craig Ringer wrote:
consecutive_sets = ( x[offset:offset+subset_size]
                     for subset_size in xrange(2, len(x))
                     for offset in xrange(0, len(x) + 1 - subset_size) )
Where 'x' is the list to operate on, as I should've initially
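Evaluating the expression as given (with range substituted for Python 2's xrange) reproduces the requested output:

```python
x = [1, 2, 3, 4]

# Craig's generator expression, Python 3 spelling
consecutive_sets = ( x[offset:offset + subset_size]
                     for subset_size in range(2, len(x))
                     for offset in range(0, len(x) + 1 - subset_size) )

print(list(consecutive_sets))
# [[1, 2], [2, 3], [3, 4], [1, 2, 3], [2, 3, 4]]
```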
On Fri, 2005-01-21 at 16:05 +0100, Francis Girard wrote:
I recently read David Mertz (IBM DeveloperWorks) about generators and
got excited about using lazy constructs in my Python programming.
Speaking of totally great articles, and indirectly to laziness (though
not lazily evaluated
On Friday 21 January 2005 at 16:06, Craig Ringer wrote:
On Fri, 2005-01-21 at 22:38 +0800, Craig Ringer wrote:
consecutive_sets = ( x[offset:offset+subset_size]
                     for subset_size in xrange(2, len(x))
                     for offset in xrange(0, len(x) + 1 - subset_size) )
Really, thank you Craig Ringer for your great answer.
I'm afraid I can't help you with that. I tend to take the view that side
effects in lazily executed code are a bad plan, and use lazy execution
for things where there is no reason to care when the code is executed.
I completely agree with
On Fri, 2005-01-21 at 16:54 +0100, Francis Girard wrote:
First, I think that you mean :
consecutive_sets = [ x[offset:offset+subset_size]
                     for subset_size in xrange(2, len(x))
                     for offset in xrange(0, len(x) + 1 - subset_size)]
(with square
Thank you,
I immediately downloaded version 2.4, switching from version 2.3.
Francis Girard
FRANCE
On Friday 21 January 2005 at 17:34, Craig Ringer wrote:
On Fri, 2005-01-21 at 16:54 +0100, Francis Girard wrote:
First, I think that you mean :
consecutive_sets = [
Francis Girard wrote:
In particular, I don't know what Python constructs generate a generator.
I know this is now the case for reading lines in a file or with the new
iterator package. But what else ? Does Craig Ringer answer mean that list
comprehensions are lazy ? Where can I find a
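To the question asked here: list comprehensions are eager and generator expressions are lazy, which a side-effecting helper (made up for the demonstration) shows directly:

```python
log = []

def noisy(i):
    # helper with a visible side effect, to reveal when code runs
    log.append(i)
    return i

lc = [noisy(i) for i in range(3)]   # list comprehension: eager
print(log)                          # [0, 1, 2] -- already evaluated

log.clear()
ge = (noisy(i) for i in range(3))   # generator expression: lazy
print(log)                          # [] -- nothing has run yet
next(ge)
print(log)                          # [0] -- one element, on demand
```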