On Wed, 14 May 2014, Randall Davis wrote:
Recently I've run into memory issues because each spreadsheet produced seems to permanently consume ~3MB of memory, and now that I've got more than a thousand data files to process (and the same number of xls files to produce), I'm running out of memory. In principle I can do 500 files at a time I suppose, but there are a variety of reasons why it's much easier to do them all in one pass.
As long as you're closing the input and output streams, and not holding references to anything in maps, lists, etc. that you never null out or clear, the memory should be released after a few GC runs.
I can only suggest you check that you're not accidentally keeping references to objects in maps, caches, other long-lived structures, etc.
Otherwise you'll need to dust off a profiler or similar, and use it to track down where the memory is going and which kinds of objects it's going to.
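The checklist above (close both streams every iteration, keep no per-file references alive between iterations) can be sketched as a per-file loop. This is a minimal sketch using plain `java.io`/`java.nio` streams as stand-ins for the real workbook read/write calls; the `convert` helper and the file names are hypothetical, not from the original thread:

```java
import java.io.*;
import java.nio.file.*;

public class BatchConvert {
    // Hypothetical stand-in for "read one data file, build the workbook,
    // write one xls". In real code this would wrap the POI workbook calls.
    static void convert(Path in, Path out) throws IOException {
        // try-with-resources closes both streams even on exceptions,
        // so no stream (or the buffers behind it) outlives this file.
        try (InputStream is = Files.newInputStream(in);
             OutputStream os = Files.newOutputStream(out)) {
            os.write(is.readAllBytes());
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("batch");
        // Create a few fake "data files" to stand in for the thousand real ones.
        for (int i = 0; i < 3; i++) {
            Files.writeString(dir.resolve("data" + i + ".txt"), "row " + i);
        }
        try (DirectoryStream<Path> files = Files.newDirectoryStream(dir, "data*.txt")) {
            for (Path in : files) {
                Path out = dir.resolve(in.getFileName() + ".out");
                convert(in, out);
                // Deliberately no shared list/map collecting per-file objects:
                // once this iteration ends, everything it built is GC-eligible.
            }
        }
        System.out.println(Files.exists(dir.resolve("data0.txt.out")));
    }
}
```

The point of the structure is that each iteration's workbook, streams, and buffers become unreachable as soon as the iteration ends; if memory still grows per file, something outside the loop is retaining a reference, and a profiler or heap histogram is the next step.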
Nick
