Hi Martin,

Try using a bulk import or bulk copy method with your SQL database.  MSSQL has 
DTS, which can do this, and MySQL has the following:

<cfset newline = Chr(13) & Chr(10) >
<cfquery ....the usual stuff here ... >
    LOAD DATA INFILE '#absolute_file_path#' INTO TABLE RAWDATA
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '#newline#'
    IGNORE 1 LINES
</cfquery>

Create your RAWDATA table first - you can even do this dynamically if the number 
and naming of your data columns varies with each file.
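A minimal sketch of that first step - the datasource name (myDSN) and the 
column names/types are placeholders, so match them to your own file:

<cfquery datasource="#myDSN#">
    CREATE TABLE RAWDATA (
        order_id   VARCHAR(50),
        order_date VARCHAR(50),
        amount     VARCHAR(50)
    )
</cfquery>

Declaring everything as VARCHAR in the staging table keeps the import forgiving; 
you can convert types when you move the rows to their final tables.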
The newline variable holds the carriage-return and line-feed ASCII characters 
that terminate each line.
IGNORE 1 LINES skips the first line if it contains column titles.
Be sure that you use double backslashes in your file path for MySQL (you can do 
this with the ReReplace function).
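For example, a sketch of escaping the path - here with the plain Replace 
function, since no regex is needed; uploaded_path is a placeholder for your own 
variable:

<cfset mysql_file_path = Replace(uploaded_path, "\", "\\", "ALL") >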
Use DROP TABLE RAWDATA to quickly empty your database of that table's contents, 
then just CREATE it again.

What could take many minutes with CFFile, etc. will take only seconds with a 
bulk import/copy.

Once you have your data in your temporary table, then just use SQL statements 
to put it into your other tables as required.
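For instance, a set-based move out of the staging table might look like this - 
the target table and column names are illustrative only:

<cfquery datasource="#myDSN#">
    INSERT INTO orders (order_id, order_date, amount)
    SELECT order_id, order_date, amount
    FROM RAWDATA
</cfquery>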

Cheers,
Martyn



On 7/28/2006 7:48:40 PM, gert franz ([EMAIL PROTECTED]) wrote:
> Hi Martin,
> 
> you could use Railo instead, since Railo can loop through a large file
> line by line without reading it completely. Just use the <cfloop
> file="..."> tag to process the file, just as explained here:
> 
> http://www.railo.ch/en/index.cfm?treeID=144
> 
> Regards Gert
> 
> Greetings / Grüsse
> Gert Franz
> Customer Care
> [EMAIL PROTECTED]
> www.railo.ch
> 
> Join our Mailing List / Treten Sie unserer Mailingliste bei:
> deutsch: http://de.groups.yahoo.com/group/railo/
> english: http://groups.yahoo.com/group/railo_talk/
> 
> 
> 
> Martin Thorpe wrote:
> > Hello all.
> >
> >
> >
> > I am uploading a 74 + MB tab delimited text file that I am then reading
> > and inserting the values into a database. The problem is it always
> > times out, or just takes too long (did not finish over night!!!!) to
> > read the file. It is uploaded fine.
> >
> > Any suggestions about how I may approach this to make it work with such
> > large files?
> >
> > I looked at a bit of Java code someone had posted here but it went no
> > quicker really.
> >
> > I was thinking of maybe chopping the file into slices and then
> > processing but



Archive: 
http://www.houseoffusion.com/groups/CF-Talk/message.cfm/messageid:247981