Hey,

On Sat, Nov 28, 2009 at 5:30 PM, Simon King <[email protected]> wrote:
> Hi Carlo!
>
> On 28 Nov., 10:22, Carlo Hamalainen <[email protected]> wrote:
> > On Sat, Nov 28, 2009 at 9:48 AM, Simon King <[email protected]> wrote:
> > > Error: field larger than field limit
> >
> > It's a Python error message:
> >
> > http://svn.python.org/projects/python/trunk/Modules/_csv.c
> >
> > So you're working with huge CSV files?
>
> Thank you! You are right!
>
> In fact I am not using files but lists of strings, but I guess that's
> the same for csv.
>
> So, then I have to think about what I can do to cut my strings into
> smaller pieces before sending them to csv.
>
> One question: Is my impression right that list(csv.reader([s]))[0] is
> often faster than s.split(',')? Or is it a stupid idea to not use
> split(',') for that purpose?

Microbenchmarks are perfect for this.

sage: str = ','.join(<lorem ipsum text from lipsum.com>.split(' ')).replace('\n', '')
sage: timeit('list(csv.reader([str]))[0]')
625 loops, best of 3: 95.5 µs per loop
sage: timeit('str.split(",")')
625 loops, best of 3: 50.1 µs per loop

So it seems that using the CSV reader for this case is around twice as
slow as using split (probably due to the type conversions required).

> Cheers,
> Simon

-- 
Tim Joseph Dumol <tim (at) timdumol (dot) com>
http://timdumol.com

-- 
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at
http://groups.google.com/group/sage-support
URL: http://www.sagemath.org
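P.S. For anyone who wants to repeat the comparison outside Sage, here is a plain-Python sketch using the standard timeit module. The sample string is a stand-in for the lorem-ipsum text from the thread, not the original data, and exact timings will of course differ by machine:

```python
import csv
import timeit

# Stand-in sample data (hypothetical): a long comma-separated string.
s = ','.join('word%d' % i for i in range(1000))

# Time both approaches over many runs.
csv_time = timeit.timeit(lambda: list(csv.reader([s]))[0], number=10000)
split_time = timeit.timeit(lambda: s.split(','), number=10000)

print('csv.reader: %.4f s for 10000 runs' % csv_time)
print('str.split:  %.4f s for 10000 runs' % split_time)

# Sanity check: for simple, unquoted comma-separated data the two
# approaches produce the same list of fields.
assert list(csv.reader([s]))[0] == s.split(',')
```

Note that the two are only interchangeable when the fields contain no quoting or embedded commas; csv.reader does extra work precisely because it handles those cases.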
