Will,

I had some time to throw something together. This should take care of
the out-of-memory issues, as it doesn't read in the whole file at once:


http://www.bennadel.com/blog/436-Breaking-Enormous-CSV-Files-Into-Smaller-CSV-Files.htm
(OR http://bennadel.com/index.cfm?dax=blog:436.view)


This splits the CSV file into many smaller files. Then, you would run
your original parser on the smaller files, perhaps one per page
request. This should not kill your memory at all. Just remember to
delete the smaller CSV files once you are done with them (or they might
corrupt the next night's CSV data).
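For reference, here is a rough sketch of the same idea in plain Java, since you mentioned LineNumberReader. This is just illustrative (the CsvSplitter class name and the chunk_N.csv naming are my own placeholders, not from the blog post) -- it reads one line at a time, so the whole file never sits in memory:

```java
import java.io.*;
import java.nio.file.*;

public class CsvSplitter {
    // Split a large CSV into chunks of at most linesPerChunk lines each,
    // streaming one line at a time instead of reading the whole file.
    // Returns the number of chunk files written.
    public static int splitCsv(Path input, Path outDir, int linesPerChunk)
            throws IOException {
        Files.createDirectories(outDir);
        int chunkCount = 0;
        try (LineNumberReader reader =
                new LineNumberReader(Files.newBufferedReader(input))) {
            String line = reader.readLine();
            while (line != null) {
                chunkCount++;
                Path chunk = outDir.resolve("chunk_" + chunkCount + ".csv");
                try (BufferedWriter writer = Files.newBufferedWriter(chunk)) {
                    int written = 0;
                    // Copy up to linesPerChunk lines into this chunk file.
                    while (line != null && written < linesPerChunk) {
                        writer.write(line);
                        writer.newLine();
                        written++;
                        line = reader.readLine();
                    }
                }
            }
        }
        return chunkCount;
    }

    public static void main(String[] args) throws IOException {
        // Tiny demo: split a 12-line file into 5-line chunks -> 3 files.
        Path tmp = Files.createTempDirectory("csvsplit");
        Path input = tmp.resolve("big.csv");
        StringBuilder sb = new StringBuilder();
        for (int i = 1; i <= 12; i++) sb.append("row,").append(i).append('\n');
        Files.writeString(input, sb.toString());
        int files = splitCsv(input, tmp.resolve("out"), 5);
        System.out.println(files); // 3
    }
}
```

You'd then loop the chunk files through your existing parser and delete each one when it's done. In production you would use 5,000 lines per chunk rather than 5, of course.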

Have a great weekend. 

......................
Ben Nadel
Certified Advanced ColdFusion MX7 Developer
www.bennadel.com
 
Need ColdFusion Help?
www.bennadel.com/ask-ben/

-----Original Message-----
From: Will Swain [mailto:[EMAIL PROTECTED] 
Sent: Friday, December 15, 2006 4:23 PM
To: CF-Talk
Subject: RE: splitting large csv file into smaller parts

Well, my test has just run into problems and thrown a
java.lang.OutOfMemoryError.

I'm thinking though maybe I can use the LineNumberReader code to split
the csv file into smaller parts. Maybe take 5,000 rows at a time and
write them out to a new csv file.

Then I can just run the processing code that I already have on each
smaller file. Does that logic seem sound to you?
        
Cheers

Will
                

Archive: 
http://www.houseoffusion.com/groups/CF-Talk/message.cfm/messageid:264225
