Just to verify, since you didn't specify: you're not loading the entire file
into memory before you start working with it, are you?  For example:

   open(BIGFILE, "log.tmp") or die "Can't open log.tmp: $!";
   @bigfile = <BIGFILE>;   # slurps the whole file into memory at once

probably wouldn't be a great idea.

   open(BIGFILE, "log.tmp") or die "Can't open log.tmp: $!";
   while (<BIGFILE>) {
      # do something with each line, which arrives in $_
   }
   close(BIGFILE);

would be better.
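
If it turns out you really do need random access to more data than will fit
in memory, one thing worth a look is tying a hash to an on-disk DBM file with
the standard DB_File module, so the data lives in a file instead of RAM.
Roughly like this (untested sketch; "scratch.db" is just an example filename):

   use DB_File;
   use Fcntl;

   # entries written to this hash go to the file, not to memory
   tie(%cache, 'DB_File', "scratch.db", O_RDWR|O_CREAT, 0666, $DB_HASH)
      or die "Can't tie scratch.db: $!";

   $cache{"widgets"} = 42;         # stored on disk
   print $cache{"widgets"}, "\n";  # read back from disk

   untie %cache;

Keep in mind that DB_File stores flat string keys and values only, so nested
structures would have to be serialized first (the standard Storable module
can do that). Slower than RAM, but it won't run you out of memory.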

-----Original Message-----
From: Balint, Jess [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 14, 2002 12:11 PM
To: [EMAIL PROTECTED]
Subject: Caching Large Data Structures To Disk


Hello all. First off, I want to thank everybody on this list who has helped
me with my previous questions. Thank you.

I am working with input files of very large size, sometimes up to and greater
than a gig of data. I have previously gotten out-of-memory errors (people at
work don't like it when I do this). I was wondering if there is a way to cache
large data structures to disk. We have a program here called SAS that runs
on a Windows server (blek). Anyway, SAS allows me to operate on unlimited
sizes of data, and always uses large temp files. Maybe there is a
Disk::Cache module or something that I can use to store my data on disk
instead of in memory. I know it will be slower, but that is something I am
willing to deal with. TIA.

--J-e-s-s--

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]

