On Mon, Dec 22, 2008 at 10:46 AM, Lossoth <[email protected]> wrote:
>
> Hello,
>
> I'm using Zend_Dom to parse some web pages in a loop.
> The loop is very long (200,000 pages).
> I noticed that on every iteration the memory used by PHP grows.
> It seems that Zend_Dom doesn't release the memory it used after each iteration.
> The code is similar to this:
>
> foreach($urlArray as $url)
> {
>    $response = getHTM($url); // simple function using fopen; the problem isn't here
>    $dom = new Zend_Dom_Query($response);
>    ...
>    //do something with a $dom->query
>
>    unset($dom);
> }
>
> The memory limit in my php.ini is 1 GB.
> After 20,000 iterations I get an out-of-memory error.
>
> Could somebody help me?
>
> Thanks!
>
> Marco

A rough idea would be to implement __destruct(), which cleans up the object:

class Marco_Dom_Query extends Zend_Dom_Query
{
  public function __destruct()
  {
    // Drop the DOMDocument held by the parent so its memory can be freed.
    unset($this->_document);
  }
}

Use it just like Zend_Dom_Query, keep unset($dom) in the loop, and see
whether this triggers __destruct() and really gets rid of the document you
added to the object. You could also check whether memory_get_usage()
grows with each iteration.
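For reference, here is a sketch of how that check might look in the loop (untested against Zend_Dom; getHTM() and $urlArray are the names from Marco's mail, and Marco_Dom_Query is the subclass idea above). If __destruct() alone doesn't help, DOM objects often sit in reference cycles that plain refcounting never frees, so forcing the cycle collector (gc_collect_cycles(), available from PHP 5.3) after each unset() is worth trying too:

```php
<?php
foreach ($urlArray as $url) {
    $response = getHTM($url);          // Marco's fetch function
    $dom = new Marco_Dom_Query($response);

    // ... do something with $dom->query(...) ...

    unset($dom);                       // should now trigger __destruct()

    // PHP 5.3+: collect reference cycles that refcounting alone misses.
    if (function_exists('gc_collect_cycles')) {
        gc_collect_cycles();
    }

    // Log memory so you can see whether usage still grows per iteration.
    echo memory_get_usage(true), "\n";
}
```

If the logged number stabilizes after the first few iterations, the leak is fixed; if it keeps climbing, something else is holding a reference to the documents.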

Till
