Hello everyone,

I have a script that loops through a directory of spreadsheets and, one
by one, dumps each file's contents to a CSV using xls2csv from
Spreadsheet::ParseExcel::Utility.
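
Since the attached itmstsload.pl isn't pasted inline, here is roughly what
the loop boils down to (the directory names and the '1-A1:Z500' region
string are placeholders, not the exact values from my script):

#!/usr/bin/perl
use strict;
use warnings;

use Spreadsheet::ParseExcel::Utility qw(xls2csv);

my $xls_dir = 'spreadsheets';   # placeholder paths
my $csv_dir = 'csv_out';

opendir my $dh, $xls_dir or die "Can't open $xls_dir: $!";
my @files = grep { /\.xls$/i } readdir $dh;
closedir $dh;

for my $file (@files) {
    # xls2csv(file, "sheet-range", rotate) returns the requested
    # region of the workbook as one CSV string
    my $csv = xls2csv("$xls_dir/$file", '1-A1:Z500', 0);

    (my $out = $file) =~ s/\.xls$/.csv/i;
    open my $fh, '>', "$csv_dir/$out" or die "Can't write $csv_dir/$out: $!";
    print $fh $csv;
    close $fh;
}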

The only problem is that xls2csv uses far too much memory, given that my
biggest spreadsheet is only 1.5 MB.  When started, the script immediately
takes 27 MB of memory from the system, and then, as it progresses, instead
of reusing freed memory it keeps grabbing more until it's killed by the
system.

I'm running it on a Fedora Core 2 installation with a P4 2.8 GHz and 512 MB of RAM.

I've figured out a way to get my data through the script by running it on
only a few files at a time rather than all of them (sketched below), but
I'd like to know whether I'm doing something that is causing this "memory
leak".
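
The workaround is nothing fancy, just passing a short list of files per
invocation instead of walking the whole directory (again, the region
string is a placeholder):

#!/usr/bin/perl
use strict;
use warnings;

use Spreadsheet::ParseExcel::Utility qw(xls2csv);

# run as: perl itmstsload.pl a.xls b.xls c.xls   (a few files per run)
for my $file (@ARGV) {
    my $csv = xls2csv($file, '1-A1:Z500', 0);   # region is a placeholder
    (my $out = $file) =~ s/\.xls$/.csv/i;
    open my $fh, '>', $out or die "Can't write $out: $!";
    print $fh $csv;
    close $fh;
}

Running it in small batches means each run exits and hands its memory back
before the next one starts, but it obviously doesn't fix the underlying
growth.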

Thanks,
Kevin
-- 
Kevin Old
[EMAIL PROTECTED]

Attachment: itmstsload.pl
Description: Binary data
