I would use the second approach.  I would expect it to perform better, not
to mention I always like to have a backup in case things don't work the way
I expected.

Another thing to try to do is read in only as much data at a time as you
need, because a million rows in memory could really kill you.

Maybe try something like this (no doubt someone will have a much better
solution, but I figured I would try anyway):

#!/usr/bin/perl

use strict;
use warnings;
use English;   # gives us $NR, the awk-style name for $.

my $infile  = "somefile";
my $outfile = "someotherfile";

open my $in,  '<', $infile  or die "Can't open $infile: $!";
open my $out, '>', $outfile or die "Can't open $outfile: $!";

my @buffer = ();

while ( my $line = <$in> ) {
    # $NR is our current line number in the file
    if ( $NR < 4 ) {
        # buffer the row WITH the newline char
        push @buffer, $line;
    }
    # line 4 holds the row to append (row D)
    elsif ( $NR == 4 ) {
        chomp $line;

        for ( my $i = 0; $i < @buffer; $i++ ) {
            my $bufLine = $buffer[$i];
            chomp $bufLine;

            # recover the original line number of this buffered row
            my $lineno = $NR - @buffer + $i;

            # append row D to rows 1 and 3
            if ( $lineno == 1 or $lineno == 3 ) {
                $bufLine .= $line;
            }

            # add the line to the output file
            print $out "$bufLine\n";
        }

        # clear the buffer
        @buffer = ();
    }
    else {
        print $out $line;
    }
}

close $in;
close $out;
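Since your data actually lives in two files, another way to sketch it: read
row D from the second file first, then stream the first file and append D
wherever the line number matches your condition.  The in-memory strings here
are just stand-ins for your real files, and the "append on lines 1 and 3"
condition is only an example -- substitute whatever condition you really have:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-ins for the two real files (open on a string reference for the demo).
my $file1 = "A\nB\nC\n";   # 3 data rows
my $file2 = "D\n";         # 1 data row

# Read row D from the second file.
open my $fh2, '<', \$file2 or die $!;
my $d = <$fh2>;
close $fh2;
chomp $d;

# Stream the first file, appending D where the condition holds.
my $result = '';
open my $fh1, '<', \$file1 or die $!;
while ( my $line = <$fh1> ) {
    chomp $line;
    # $. is the current input line number; example condition: rows 1 and 3
    $line .= $d if $. == 1 or $. == 3;
    $result .= "$line\n";
}
close $fh1;

print $result;
```

This keeps only one row of the first file in memory at a time, so it scales
to a million rows.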

Rob


-----Original Message-----
From: Ho, Tony [mailto:[EMAIL PROTECTED]]
Subject: appending to rows


Hi guys
I was wondering if you could help me
 
I have 2 files.
One file has 3 data rows: A, B and C.
The other file has 1 data row: D.
 
I need to append row D to rows A and C in the first file, based on some
condition.
Any ideas how I could go about this ?
 
Another alternative would be for me to open up a new file and write
row 1 : AD
row 2 : B
row 3 : CD
 
Is the second approach better in terms of performance than the first
approach as I will have more than 1 million rows to deal with ?
 
Thanks in advance
Tony
