On 01/02/2014 10:39 AM, David Precious wrote:
On Thu, 2 Jan 2014 23:21:22 +0800 (SGT)
mani kandan <mani_nm...@yahoo.com> wrote:

Hi,

We have a huge file, 500MB in size, and need to manipulate it: do some
replacements and then write it back out. I have used File::Slurp and it
works for a 300MB file (thanks Uri), but for this huge 500MB file it
does not process and exits with an error. I have also tried the
Tie::File module, with the same result. Any guidance?

Firstly, be specific - "come out with error" doesn't help us - what is
the error?

Secondly - do you need to work on the file as a whole, or can you just
loop over it, making changes, and writing them back out?  In other
words, do you *need* to hold the whole file in memory at one time?
More often than not, you don't.

If it's per-line changes, then File::Slurp::edit_file_lines should work
- for example:

   use File::Slurp qw(edit_file_lines);
   my $filename = '/tmp/foo';
   edit_file_lines(sub { s/badger/mushroom/g }, $filename);

The above would of course replace every occurrence of 'badger' with
'mushroom' in the file.

If there is a size issue, edit_file_lines would be just as bad as slurping in the whole file; in fact it uses even more storage, since internally it holds an array of all the lines. Slurping in 500MB is not a smart thing unless you have many gigabytes of free RAM. Otherwise the process will just be going to disk via swap, and you don't gain much other than simpler logic.

But I agree: knowing the error message, and what is generating it, will be valuable. It could be a virtual memory limit imposed by the OS, which can be changed with the ulimit utility (or with BSD::Resource, if you have that module).
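If the limit theory holds, it's easy to check from the shell before
re-running the script. A minimal sketch (exact defaults and units vary
by OS and shell; the script name below is a placeholder):

```shell
# Show the current per-process virtual memory cap, in KB
# ("unlimited" means no cap is in effect)
ulimit -v

# To raise the cap for this shell and its children, e.g. to ~2GB,
# and then run the script under the new limit:
#   ulimit -v 2097152
#   perl process_file.pl
```

If `ulimit -v` already prints "unlimited", the failure is coming from
somewhere else, which is another reason the actual error message
matters.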

uri


--
Uri Guttman - The Perl Hunter
The Best Perl Jobs, The Best Perl Hackers
http://PerlHunter.com
