On Thu, 24 Jan 2002 16:09:54 -0800, "Chris Giminez" <[EMAIL PROTECTED]> wrote:
>I'm trying to retrieve data and put it into a database from a text file.
>The text file is over 100MB with (I'm guessing) around 260,000 records,
>each record contains 85 fields.
>
>I tried to create a conditional loop at each line break, which works fine
>if I am working on a smaller version of the file, but chokes on the 100MB
>monster. Is there a more efficient way to deal with this other than looping
>through it, or is there a way to break this file up into smaller chunks
>and deal with it piece by piece?

Been a while since I used it, but I wrote a CFX called... hmm... I don't
recall what it was called /-) ... anyway, the CFX was written to read in log
files one line at a time. Which is what it does: it returns one
carriage-return-delimited line per call, rather than reading the entire file
all at once. It's somewhere at http://www.intrafoundation.com/freeware.html.

--min
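For illustration, here is a minimal sketch of the same idea in Python (the CFX's own code isn't shown here): stream the file one line at a time and insert rows in batches, so the 100MB file never has to fit in memory. The sample file, table name, and field count are hypothetical; a real import would use the actual 85-field layout.

```python
import os
import sqlite3
import tempfile

# Build a tiny sample "export" file with tab-delimited records;
# the real file would hold ~260,000 records of 85 fields each.
sample = tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt")
for i in range(10):
    sample.write("\t".join(f"field{j}_{i}" for j in range(3)) + "\n")
sample.close()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (f0 TEXT, f1 TEXT, f2 TEXT)")

BATCH = 4   # commit every BATCH rows instead of once per row
batch = []
with open(sample.name) as fh:      # iterating the handle reads line by line
    for line in fh:
        batch.append(line.rstrip("\n").split("\t"))
        if len(batch) >= BATCH:
            conn.executemany("INSERT INTO records VALUES (?, ?, ?)", batch)
            conn.commit()
            batch = []
if batch:                          # flush the final partial batch
    conn.executemany("INSERT INTO records VALUES (?, ?, ?)", batch)
    conn.commit()

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # → 10
os.unlink(sample.name)
```

The key point is the same as the CFX's: only one line (plus the current batch) is ever in memory, so the approach scales to files far larger than available RAM.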

