You have not given a sample of your code, but if you are processing the files in-line, 
undef the variables after each file is processed so the memory can be reused for 
subsequent files.  Otherwise Perl will keep requesting more and more memory and will 
only release it when the entire process finishes.  When the OS (Windows?) sees these 
huge memory requests it will start swapping memory to disk -- this is where you slow down.

You need to analyze your code and release (undef) memory immediately after it is no 
longer needed.  
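For example, if each file is being slurped into a scalar, the loop might look like 
this (a sketch only -- process_file() stands in for whatever per-file work you are 
actually doing):

    #!/usr/bin/perl
    use strict;
    use warnings;

    for my $file (@ARGV) {
        open my $fh, '<', $file or die "Cannot open $file: $!";
        my $data = do { local $/; <$fh> };   # slurp the entire file
        close $fh;

        process_file($data);                 # your per-file work here

        undef $data;    # release the buffer before starting the next file
    }

    sub process_file {
        my ($data) = @_;
        print "Processed ", length($data), " bytes\n";
    }

Note that Perl normally returns freed memory to its own allocator rather than to the 
operating system, so the point of the undef is to let later files reuse that memory 
within the process instead of forcing fresh allocations.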

-----Original Message-----
From: Gurpreet Sachdeva [mailto:[EMAIL PROTECTED] 
Sent: Thursday, May 13, 2004 12:26 AM
To: [EMAIL PROTECTED]
Subject: Problem in Buffers

I have written a script in Perl in which a huge amount of file I/O takes place, but 
after working through 2-3 files it slows to a crawl...

And I need to start it again on the rest of the files....

Is there any command with which I can flush my buffers or memory so that it runs 
smoothly...

Thanks and Regards,
GSS

