I have a data file with over 110,000 entries of 3-column data (string, float, float). Currently my program does entry-by-entry processing with Zope. The operation looks like this:

1. read data (from the data file)
2. create product (a Python product that stores three fields: one string and two floats)
3. update product (update the three field entries)

When I first tried it on the first 1000 entries it took about 30 seconds, which means it's going to take 50 to 60 minutes for all 110,000 entries.
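A common way to speed up this kind of per-entry workflow is to commit once per batch of entries instead of once per entry, since each Zope/ZODB transaction commit carries real overhead. Here is a minimal sketch of that idea. The `handle_entry` and `commit` callbacks are hypothetical stand-ins for the real create/update-product code and the transaction commit (in old Zope that would be something like `get_transaction().commit()`, or a subtransaction commit to keep memory bounded); only the batching structure is the point.

```python
def parse_line(line):
    """Split one 'name value1 value2' line into (str, float, float)."""
    name, v1, v2 = line.split()
    return name, float(v1), float(v2)

def process_file(lines, handle_entry, commit, batch_size=1000):
    """Apply handle_entry to each parsed line, committing every batch_size entries."""
    count = 0
    for line in lines:
        handle_entry(parse_line(line))
        count += 1
        if count % batch_size == 0:
            commit()          # one transaction per batch, not per entry
    commit()                  # flush the final (possibly partial) batch
    return count

# Example run with in-memory stand-ins for the product store and the commit:
store = {}
commits = []
n = process_file(
    ["a 1.0 2.0", "b 3.5 4.5", "c 5.0 6.0"],
    handle_entry=lambda rec: store.__setitem__(rec[0], rec[1:]),
    commit=lambda: commits.append(len(store)),
    batch_size=2,
)
```

With `batch_size=2` the three entries trigger two commits (one full batch plus the final flush) rather than three. Tuning the batch size trades peak memory against commit overhead.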
It's not every day that you have to process over 110,000 data entries, but a run of over 60 minutes is still rather long. So I was wondering if anyone could propose a different method of doing this. Would love to hear any replies.

Allen
_______________________________________________ Zope maillist - Zope@zope.org http://mail.zope.org/mailman/listinfo/zope ** No cross posts or HTML encoding! ** (Related lists - http://mail.zope.org/mailman/listinfo/zope-announce http://mail.zope.org/mailman/listinfo/zope-dev )