Maybe because you aren't closing each file after you have done your thing with it, so it remains in memory?
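
Roughly what I have in mind (an untested sketch, assuming the growth comes from keeping every file's rows, plus the per-file parse trees, alive in @ldata at once; handle_rows is just a made-up placeholder for whatever you do with the data):

use strict;
use warnings;
use HTML::TableExtract;

# Placeholder for whatever you actually do with one file's rows.
sub handle_rows {
    my ( $file, $rows ) = @_;
    print scalar(@$rows), " rows parsed from $file\n";
}

sub zParseHTMLFilesOneAtATime {
    my ( $lrefFileList, $lrefColNames ) = @_;
    foreach my $lFile (@$lrefFileList) {
        chomp($lFile);
        # A fresh parser per file, scoped to this iteration, so it is
        # freed before the next file is read.
        my $lTableExtract = HTML::TableExtract->new( headers => [ @$lrefColNames ] );
        $lTableExtract->parse_file($lFile)
            or warn "Cannot open $lFile: $!";
        foreach my $lTable ( $lTableExtract->tables ) {
            my @lrows = $lTable->rows;          # rows of *this* table
            handle_rows( $lFile, \@lrows );     # process now, don't accumulate
        }
    }
    return;
}

If you really do need every row in one big array at the end, this at least keeps only one parser and one file's parse tree in memory at a time.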

On 2011-01-06 02:26:13 -0500, Jins Thomas said:

Hi experts,

Have you ever experienced an out-of-memory problem while using HTML::TableExtract? My HTML files are a little large, but I still didn't expect this to happen.

Would you be able to suggest some workarounds for this? I'm calling this subroutine inside another for loop.

sub zParseHTMLFiles ($$) {
    my ( $lrefFileList, $lrefColNames ) = @_;
    my @ldata;
    foreach my $lFile (@$lrefFileList) {
        my $lTableExtract = HTML::TableExtract->new( headers => [ @$lrefColNames ] );
        chomp($lFile);
        $lTableExtract->parse_file($lFile);
        foreach my $ls ( $lTableExtract->tables ) {
            foreach my $lrow ( $ls->rows ) {
                chomp( $lrow->[-1] );
                push( @ldata, $lrow );
            }
        }
    }
    return \@ldata;
}



Thanks

Jins Thomas



--
Robert


