Hi there,

I'm writing some code that has to parse a text file containing a series
of records. The records are a direct export from a third party DB and
are delimited by the "¬" char (I hope that comes through on all your email
clients, but I guess it doesn't really matter).

What bothers me is that the text file contains only one line, which in
turn contains many records. The test files I have are only 300 or 400K,
but I'm a bit worried about the risk of receiving a huge file to parse,
which may leave me short of memory.

I'm used to parsing files where each record is written to its own line.
I.e.

    while ( <FD> ) {
        my $job = Job->new();
        chomp;
        $job->FromStr( $_ );
        push ( @{ $self->{people} }, $job );
    }

I could do:

    while ( <FD> ) {
        my @jobs = split /¬/;
        foreach my $jobstr ( @jobs ) {
            my $job = Job->new();
            $job->FromStr( $jobstr );
            push ( @{ $self->{people} }, $job );
        }
    }

But this seems a little clunky to me, given that I may receive a HUGE
one-line file one day (and since the whole file is a single line, the
while loop would still slurp it all into memory in one go). Is this a
valid risk, or am I being too careful?
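
For what it's worth, one alternative I was toying with (just a rough,
untested sketch; it assumes the export is UTF-8 and that Perl's input
record separator $/ can safely be set to "¬", and $file is only a
placeholder name) would be to stream the file one record at a time
instead of slurping the whole line:

    use strict;
    use warnings;

    open my $fh, '<:encoding(UTF-8)', $file   # $file is a placeholder
        or die "Can't open $file: $!";

    {
        local $/ = "\x{AC}";            # make "¬" the record separator
        while ( my $rec = <$fh> ) {
            chomp $rec;                 # chomp strips $/, i.e. the trailing "¬"
            next unless length $rec;    # skip an empty record after the last "¬"
            my $job = Job->new();
            $job->FromStr( $rec );
            push ( @{ $self->{people} }, $job );
        }
    }
    close $fh;

That way only one record needs to be in memory at a time, however big
the file gets.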

Cheers

Breezy
