Hi experts,

Have you ever experienced an out-of-memory problem while using
HTML::TableExtract? My HTML files are a little large, but I still
didn't expect this to happen.

Would you be able to suggest some workarounds? I'm calling this
subroutine inside another for loop.

sub zParseHTMLFiles ($$) {

    my ( $lrefFileList, $lrefColNames ) = @_;
    my @ldata;
    foreach my $lFile (@$lrefFileList) {
        my $lTableExtract = HTML::TableExtract->new( headers => [@$lrefColNames] );
        chomp($lFile);
        $lTableExtract->parse_file($lFile);
        foreach my $ls ( $lTableExtract->tables ) {
            foreach my $lrow ( $ls->rows ) {
                chomp( $lrow->[-1] );
                push( @ldata, $lrow );
            }
        }
    }
    return \@ldata;
}

Thanks
Jins Thomas
