> Here is an example of a program and a Perl module that parse a .xls file
> and eat all available memory.
> I have tried it under Linux and under Windows and it has the same problem
> on both OSs, so there must be a serious bug somewhere.
[snip]
> #Insert into database
> my $rapoarte_i = $dbh->prepare("insert ignore into temp_rapoarte values(?,
> ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)");
> 
> foreach my $cod(sort keys %f) {
>   foreach my $market(keys %{$f{$cod}}) {
>     $rapoarte_i->execute(...);
>   }
> }

You should call $rapoarte_i->finish() here. I know that with Oracle,
not calling finish() keeps cursors open, and that could be what is
hogging your memory. I'm not sure about other databases, though. Which
database are you using? Let us know if this solves your problem.
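For illustration, here is a minimal sketch of the cleanup I mean. %f and
the execute() arguments come from your snipped code, so this is not
runnable as-is; the point is where finish() and disconnect() go:

```perl
foreach my $cod (sort keys %f) {
    foreach my $market (keys %{ $f{$cod} }) {
        $rapoarte_i->execute(@args);   # whatever args you pass now
    }
}
$rapoarte_i->finish();    # release the cursor and its buffers
$dbh->disconnect();       # when you are completely done with the handle
</imports>
```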

> sub DESTROY {
>     my $self = shift;
>     undef $self;
> 
>     # I don't know if this sub helps, or if it is correctly written
> }

This is correctly written, but it's redundant: when something is
garbage collected, it gets undef'd anyway. The exception is when you
have circular references or some other oddity (see Programming Perl,
3rd edition, chapter 12), which doesn't look like the case here, at
first glance.
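If you did turn out to have circular references, the usual fix is not a
DESTROY method but Scalar::Util::weaken, which is in the core
distribution. A minimal sketch with made-up hash keys:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

my $parent = { name => 'parent' };
my $child  = { name => 'child', parent => $parent };
$parent->{child} = $child;

# Without this, $parent and $child point at each other, so neither
# refcount ever drops to zero and the pair is never freed.
weaken($child->{parent});
```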

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>

