Hi Rob: I'm trying to merge a whole bunch of files (possibly tens of thousands) into one file. Here's my code (with the error checking removed for readability):
    opendir(INDIR, $indir);
    @logfile = grep(/$mask/i, readdir INDIR);
    closedir(INDIR);
    $nbrfiles = @logfile;    # number of files matching mask

    open(OUTFILE, ">$outdir$outfile");
    for ( $ctr = 0; $ctr < $nbrfiles; $ctr++ ) {
        open(INFILE, "$indir$logfile[$ctr]");
        print OUTFILE <INFILE>;
        close(INFILE);
    }
    close(OUTFILE);

Then I write a file from @logfile, which is later processed to delete the
input files. It's done this way for restartability: if I fail after creating
the merged file, I can restart and know which files still need deleting.

I'd appreciate an alternative to reading each entire file - you're right, I
don't need the whole thing in memory at the same time. However, wouldn't
processing a file one record at a time be much slower? I'll go that route if
I have to...

Thanks for your assistance
Rob

-----Original Message-----
From: Rob Dixon [mailto:[EMAIL PROTECTED]
Sent: Wednesday, June 11, 2003 4:11 PM
To: [EMAIL PROTECTED]
Subject: Re: File not getting written

Rob Das wrote:
> Hi All:
>
> I added the following line to my program:
>
>     $/ = \65536;
>
> This was because I'm running out of memory when slurping entire files
> into memory (potentially hundreds of meg). However, the (separate) log
> file I'm writing afterwards is not getting created - nor am I getting
> any error messages. If I comment this line out, it works fine. I tried
> the following:
>
>     $opfh = select(OUTFILE);
>     $| = 1;
>     select($opfh);
>
> ... to try to flush the buffer, but to no avail.
>
> Would someone tell me what I need to do to get this file written out,
> please?

You need to comment out the line

    $/ = \65536;

since it is preventing your log file from printing, and doing nothing else
useful. (I suspect that it is being treated as a 16-bit integer and, since
65536 == 0x10000, it is being evaluated as zero.) Setting the record size
cannot help you to accommodate overly large files: if you try to slurp the
entirety of a file that will not fit into your virtual memory, then you
cannot do it, even by reading it in Very Big Chunks.

I have never seen a problem whose solution needs all of a file's contents
in memory simultaneously. Why not tell us what you are trying to do and how
you have tried to do it? Then we will help.

By the way, the cleanest way to autoflush a filehandle is

    use IO::Handle;
    autoflush OUTFILE;

Rob
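For reference, a minimal sketch of the chunk-at-a-time copy being discussed
above, reusing the names from the merge code ($indir, @logfile, OUTFILE).
The append_file helper and the 1 MB chunk size are illustrative choices, not
anything from the original posts; because each read() pulls a large block
rather than a single line, this is usually about as fast as a slurp while
capping memory use at one chunk per file.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Chunk size for each read(); 1 MB is an arbitrary, tunable choice.
    my $CHUNK = 1024 * 1024;

    # Append one input file to an already-open output handle, a chunk at a
    # time, instead of slurping the whole file into memory.
    sub append_file {
        my ($path, $out_fh) = @_;
        open my $in_fh, '<', $path or die "Can't open $path: $!";
        binmode $in_fh;
        my $buf;
        while ( read($in_fh, $buf, $CHUNK) ) {
            print {$out_fh} $buf;
        }
        close $in_fh;
    }

Inside the existing for loop, a call such as

    append_file("$indir$logfile[$ctr]", \*OUTFILE);

would replace the open/print/close of INFILE, so no file is ever held in
memory in its entirety.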