Been there, done that, don't want to do it again.

I worked on an app that allowed a site admin to upload a .csv file
with updated store data (from his POS system).  I found a UDF at
cflib.org that turned a .csv file into a query object, and I looped
through that.  However, even on a fast machine (my workstation: Athlon64
3200+, 2GB RAM, fast drives), it took over half an hour to process a
1,400-line .csv file with fewer than a dozen columns.  Absolutely unsat.

What I ended up doing was importing the .csv file into MS Access, which
ripped through the same data in 30 seconds.  I created a template
database for my client, gave him instructions on how to import, set up
a DSN for the Access database, and built an upload interface.  It worked
like a charm, and if something like this is an option for you, I'd
recommend it.

Pete

On 1/11/06, Robert Everland III <[EMAIL PROTECTED]> wrote:
> We have a file upload process here where a user needs to upload a file; the 
> file then needs to be parsed to get data out of it and inserted into 
> a database. The file is 1.4 MB. I'm having issues just reading and 
> looping through the file. Is there a better way to do something like this 
> than to read the file and then do listGetAt(string, x, delimiter)? Seems there 
> has to be another way to do this.
>
> Bob

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
Message: http://www.houseoffusion.com/lists.cfm/link=i:4:229282
Archives: http://www.houseoffusion.com/cf_lists/threads.cfm/4
