Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-20 Thread Vincent Davis
Thanks for the help, this is considerably faster and easier to read (see below). I changed it to avoid the break, and I think that makes it easier to understand. Checking the conditions each time slows it down, but it is worth it to me at this time. Thanks again, Vincent def read_data_file(filename):
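The quoted function is cut off in the archive preview. A minimal sketch of what a break-free version that checks the condition on every row might look like, assuming the '[MASKS]' marker row and tab-delimited format mentioned elsewhere in the thread (Python 3 shown; the thread's code used Python 2's 'U' open mode):

import csv

def read_data_file(filename):
    # Sketch only: collect rows until the '[MASKS]' marker has been seen,
    # checking a flag on every iteration instead of breaking out of the loop.
    with open(filename, newline='') as f:
        reader = csv.reader(f, delimiter='\t')
        data = []
        seen_marker = False
        for row in reader:
            if '[MASKS]' in row:
                seen_marker = True
            if not seen_marker:
                data.append(row)
        return data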

Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-20 Thread Jonathan Gardner
On Sat, Feb 20, 2010 at 4:21 PM, Vincent Davis vinc...@vincentdavis.net wrote: Thanks for the help, this is considerably faster and easier to read (see below). I changed it to avoid the break, and I think that makes it easier to understand. Checking the conditions each time slows it down, but it is

Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-20 Thread Vincent Davis
Thanks again for the comment; not sure I will implement all of it, but I will separate out the 'if not row' check. The files have some extraneous blank rows in the middle that I need to be sure not to import as blank rows. I am actually having trouble with this filling my system memory; I posted a separate
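A small sketch of separating out the blank-row check, assuming csv.reader yields a blank line as an empty list (the generator name is made up for illustration):

def non_blank(rows):
    # Skip extraneous blank rows (empty lists) anywhere in the stream;
    # this is the 'if not row' test pulled out into its own step.
    for row in rows:
        if row:
            yield row

Feeding the reader through non_blank() before the marker check keeps the blank rows out of the imported data without copying everything into memory first.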

speed question, reading csv using takewhile() and dropwhile()

2010-02-19 Thread Vincent Davis
I have some (~50) text files that have about 250,000 rows each. I am reading them in using the following, which gets me what I want, but it is not fast. Is there something I am missing that would help? This is mostly a question to help me learn more about Python. It takes about 4 min right
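The code in the original post is cut off in the archive. Going by the subject line, a sketch of how takewhile() and dropwhile() can pull out the section of a tab-delimited file between two marker rows might look like this; the function name and start marker are assumptions, and '[MASKS]' comes from later messages in the thread:

import csv
from itertools import dropwhile, takewhile

def read_section(filename, start_marker='[DATA]', end_marker='[MASKS]'):
    # Sketch only: drop rows up to and including the start marker,
    # then take rows until the end marker appears.
    with open(filename, newline='') as f:          # thread used Python 2 mode 'U'
        reader = csv.reader(f, delimiter='\t')
        after_start = dropwhile(lambda row: start_marker not in row, reader)
        next(after_start, None)                    # skip the marker row itself
        return list(takewhile(lambda row: end_marker not in row, after_start))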

Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-19 Thread John Posner
On 2/19/2010 3:02 PM, MRAB wrote: Is this any better? def read_data_file(filename): reader = csv.reader(open(filename, 'U'), delimiter='\t') data = [] for row in reader: if '[MASKS]' in row: break data.append(row) As noted in another thread recently, you
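MRAB's suggestion, reflowed from the one-line preview into runnable form; the trailing return is assumed since the preview is cut off, and newline='' stands in for the Python 2 'U' mode:

import csv

def read_data_file(filename):
    # Stop reading at the first row containing the '[MASKS]' marker.
    with open(filename, newline='') as f:
        reader = csv.reader(f, delimiter='\t')
        data = []
        for row in reader:
            if '[MASKS]' in row:
                break
            data.append(row)
        return data    # assumed; the preview ends before any return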

Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-19 Thread Jonathan Gardner
On Fri, Feb 19, 2010 at 10:22 AM, Vincent Davis vinc...@vincentdavis.net wrote: I have some (~50) text files that have about 250,000 rows each. I am reading them in using the following, which gets me what I want, but it is not fast. Is there something I am missing that would help? This is

Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-19 Thread Vincent Davis
In reference to the several comments that [x for x in read] is basically a copy of the entire list and isn't necessary (or list(read)): I had thought I had a problem with having iterators in the takewhile() statement. I thought I tested it and it didn't work. It seems I was wrong. It clearly works.
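A short illustration of the point, using the same tab-delimited reader and '[MASKS]' marker as elsewhere in the thread (the file name is hypothetical):

import csv
from itertools import takewhile

with open('example.tsv', newline='') as f:
    reader = csv.reader(f, delimiter='\t')

    # Unnecessary: [x for x in reader] (or list(reader)) materialises every
    # row in memory before takewhile() ever sees it.
    # rows = [x for x in reader]
    # data = list(takewhile(lambda row: '[MASKS]' not in row, rows))

    # takewhile() accepts the reader iterator directly, so rows are consumed
    # lazily and reading stops at the marker row.
    data = list(takewhile(lambda row: '[MASKS]' not in row, reader))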

Re: speed question, reading csv using takewhile() and dropwhile()

2010-02-19 Thread Jonathan Gardner
On Fri, Feb 19, 2010 at 1:58 PM, Vincent Davis vinc...@vincentdavis.net wrote: In reference to the several comments that [x for x in read] is basically a copy of the entire list and isn't necessary (or list(read)): I had thought I had a problem with having iterators in the takewhile()