I know when I had to do this at a previous job I used ArrayAppend to build
each line in the CSV, but I see you are using the string buffer. I saw no
performance difference at the time, so I just stayed with the CF solution.
The one thing I would look at is not using list functions, but instead
using array functions and then one ArrayToList at the end. Also, make sure
the queries being executed aren't intensive either. I found in our CSV
generation that for every second the query took, output took another
second, so my resources were split essentially 50/50 between query and
output.
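
For the memory side, Rick's idea of a homegrown java class could be as
simple as streaming each row straight to disk instead of accumulating the
whole file in a buffer. A rough sketch, not a drop-in solution (the file
name, columns, and row count below are made up for illustration):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class StreamingCsvWriter {
    public static void main(String[] args) throws IOException {
        // Write each row straight to disk as it is built, so memory use
        // stays flat no matter how many rows the drop contains.
        try (BufferedWriter out = new BufferedWriter(new FileWriter("drop.csv"))) {
            String[] header = {"id", "name", "value"};  // hypothetical columns
            out.write(String.join(",", header));
            out.newLine();
            for (int row = 0; row < 7200; row++) {      // stand-in for the query loop
                // Build one line at a time; only this row is ever in memory.
                StringBuilder line = new StringBuilder();
                line.append(row).append(',')
                    .append("name").append(row).append(',')
                    .append(row * 2);
                out.write(line.toString());
                out.newLine();
            }
        }
    }
}
```

Since nothing is held beyond the current row, the heap cost is the same for
7,200 rows or 80,000.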

Phil

On Mon, Jun 2, 2008 at 10:33 AM, Rick Root <[EMAIL PROTECTED]>
wrote:

> SQL Server 2005.
>
> I'm open to suggestion.  This is part of an application that allows
> users to generate CSV files of their own based on their own criteria,
> so though I'm open to "non-CF" solutions, I'm not sure there really
> would be any way except maybe a homegrown java class to handle the
> work and be more strict with memory consumption....
>
> Rick
>
> On Mon, Jun 2, 2008 at 10:13 AM, Mark Kruger <[EMAIL PROTECTED]>
> wrote:
>
> > Rick,
> >
> > What's your DB platform?  Are you sure there is not a better "non-cf"
> > way to do it?
> >
> >
> > Mark A. Kruger, CFG, MCSE
> > (402) 408-3733 ext 105
> > www.cfwebtools.com
> > www.coldfusionmuse.com
> > www.necfug.com
> >
> > -----Original Message-----
> > From: Rick Root [mailto:[EMAIL PROTECTED]
> > Sent: Monday, June 02, 2008 9:04 AM
> > To: CF-Talk
> > Subject: CSV Generation MEMORY SUCK
> >
> > So I've got a problem with generating large csv files.. it's a memory
> > suck.
> >
> > I do this in an event gateway so that these file drops are generated "in
> > the
> > background"... here's the gateway code:
> >
> > http://cfm.pastebin.org/40043
> >
> > The larger the file drop, the worse the memory suck is.  A relatively
> > small drop of about 7200 rows and 138 columns (just under 1 million
> > pieces of data) took 68 seconds.  In my production environment, I've
> > estimated that I can generate between 15,000 and 20,000 pieces of data
> > per second using the code above.
> >
> > The problem is this drop (which only generates a 5MB file) causes a
> > memory suck of about 100MB...
> >
> > Take a look at this output from the server monitor:
> > www.it.dev.duke.edu/public/temp.rtf
> >
> > It shows the memory graph generated from two file drops, at 9:38 and
> > 9:45 am... the first one spiked the memory from 70MB to 170MB... the
> > second one dropped the memory back to about 90MB and then spiked it
> > to 140MB.
> >
> > Of course, this size drop is not what causes my concern, it's when
> > people are dropping 10x that amount.. say 80,000 rows at 130 columns.
> > Over 10 million pieces of data would take nearly 9 minutes ASSUMING
> > you had no memory issues, which I would.  Such a drop would basically
> > crash the instance.
> >
> > --
> > Rick Root
> > New Brian Vander Ark Album, songs in the music player and cool behind the
> > scenes video at www.myspace.com/brianvanderark
> >
> >
> >
> >
> >
>
> 


Archive: 
http://www.houseoffusion.com/groups/CF-Talk/message.cfm/messageid:306569
Subscription: http://www.houseoffusion.com/groups/CF-Talk/subscribe.cfm
Unsubscribe: http://www.houseoffusion.com/cf_lists/unsubscribe.cfm?user=89.70.4
