CF5 is (apparently) much better in this regard.
I haven't been able to test it yet - too busy with other issues.
<snip>
This is just a suggestion, but something I have encountered with CF is
that it does not give up the memory heap once it's allocated, so when
you try to load a huge file it eats up that much RAM and then some. The
trick is to balance the amount of RAM against the file upload size - if
you are going to do continuous uploads, you need to write a little
script to cycle the ColdFusion service to free up enough memory.
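For what it's worth, a minimal sketch of such a cycle script on Windows - assuming CF is installed as a service under the default name (check your Services panel for the exact name on your box, it varies by version):

```
rem Sketch only: stops and restarts the CF service to release the heap.
rem The service name below is an assumption - verify it on your server.
net stop "ColdFusion Application Server"
net start "ColdFusion Application Server"
```

You could run that from the Windows scheduler between upload runs.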
-----Original Message-----
From: Matt Lewis [mailto:[EMAIL PROTECTED]]
Sent: Monday, August 20, 2001 8:07 PM
To: CF-Talk
Subject: Loading extremely large files via CFFILE
Hi All.
I'm trying to load an extremely large file via CFFILE. This file is the
RDF content file from the Open Directory Project.
I've already built the script, which imports the file, loads it into a
set of structures, and then strips out the URLs and inserts them into a
database. This script has been tested with a subset of the entire RDF
file.
However, when I try to run this script with the entire file
(approximately 820MB in size), I get a memory error. I've resorted to
splitting the file into 10MB segments, but each segment still takes a
huge amount of time to load.
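The underlying issue is that CFFILE reads the whole file into RAM at once. The general fix is to stream the file and insert in batches instead. A rough sketch of that pattern - shown in Python just to illustrate the idea, since CF5 has no native line-by-line read; the function name and batch size are placeholders, not CF code:

```python
def process_in_batches(path, handle_batch, batch_size=1000):
    """Stream a large file line by line, handing off fixed-size batches.

    Only one batch is ever held in memory, regardless of file size.
    handle_batch is a placeholder for e.g. one multi-row INSERT per call.
    """
    batch = []
    with open(path, "r") as f:
        for line in f:  # iterates the file; never loads it whole
            batch.append(line.rstrip("\n"))
            if len(batch) >= batch_size:
                handle_batch(batch)
                batch = []
    if batch:  # flush the final partial batch
        handle_batch(batch)
```

In CF5 you could approximate this by wrapping a java.io.BufferedReader via CFOBJECT, or by pre-splitting the file as you are already doing and keeping each segment small enough to parse and insert in one pass.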
Can others give me an idea of how they have managed extremely large
files
that need to be loaded into a database via CFFILE? The file format is
too
complex to simply write a SQL import script.
Thanks.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Structure your ColdFusion code with Fusebox. Get the official book at
http://www.fusionauthority.com/bkinfo.cfm
FAQ: http://www.thenetprofits.co.uk/coldfusion/faq
Archives: http://www.mail-archive.com/[email protected]/
Unsubscribe: http://www.houseoffusion.com/index.cfm?sidebar=lists