At 1:01 PM +0530 8/31/09, Ajay Kumar wrote:
hi All

I have a zip archive containing many different files. When I unzip it, I get hundreds of files, and I pass each file individually (approximately 196 files) to a subroutine that parses the data and writes it into an Excel sheet.

During the parsing and writing to Excel, the script consumes too much CPU and takes more than one (1) hour to execute.

Can anybody tell me how to resolve the CPU and time issues here?

What is your platform? How big are the files? How are you writing the Excel spreadsheet?

I am doing something similar: I generate text reports in separate directories, use File::Find to locate each report, read each one and extract some data from its first page, then write the data to one row per file in a spreadsheet using Spreadsheet::WriteExcel. I am currently up to 51 reports extracted and summarized in an Excel spreadsheet. The whole process takes about 3 seconds, most of which is probably spent reading the files. Each report file is about 800 lines, or 45 KB.
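A minimal sketch of that workflow. The './reports' directory, the '.txt' suffix, and the "Total: <value>" line format are assumptions for illustration only; substitute your own paths and parsing logic:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use Spreadsheet::WriteExcel;

# Collect report files under ./reports (path and suffix are assumptions).
my @reports;
find( sub { push @reports, $File::Find::name if /\.txt$/ }, './reports' );

my $workbook  = Spreadsheet::WriteExcel->new('summary.xls');
my $worksheet = $workbook->add_worksheet();

# Passing an array ref writes a whole row at once.
$worksheet->write( 0, 0, [ 'File', 'Value' ] );

my $row = 1;
for my $file (@reports) {
    open my $fh, '<', $file or die "Cannot open $file: $!";
    my $value;
    while ( my $line = <$fh> ) {
        # Hypothetical first-page datum: the first "Total: <value>" line.
        if ( $line =~ /^Total:\s*(\S+)/ ) {
            $value = $1;
            last;    # stop reading once found; no need to scan all 800 lines
        }
    }
    close $fh;
    $worksheet->write( $row, 0, $file );
    $worksheet->write( $row, 1, $value );
    $row++;
}

$workbook->close();
```

Bailing out of each file with `last` as soon as the first-page data is found keeps the per-file cost small, which is one reason this approach stays fast even as the report count grows.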

Are you using Spreadsheet::WriteExcel to write the spreadsheet, or one of the OLE modules to drive Microsoft Excel on Windows? I would guess that the latter is slower than the former, but I have never compared the two methods.

--
Jim Gibson
[email protected]

--
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
http://learn.perl.org/
