Yesterday I experienced a very similar problem; this is some example code
from a crawler I'm writing with ZF:
require_once 'bootstrap.php';

while (1) {
    $u = new Users();           // model extending Zend_Db_Table_Abstract
    $us = $u->getById(18);
    unset($us);
    unset($u);
    echo number_format(memory_get_usage()) . "\n";
}
exit();
And this is the output (with PHP 5.2.5):
...
8,282,544
8,283,844
8,285,156
Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 900 bytes) in
/Users/fabionapoleoni/Documents/workspaces/lib/ZendFramework-1.7.0/library/Zend/Db/Adapter/Abstract.php
on line 873
and so on: the memory usage keeps growing until the script runs out of
memory.
I'm not posting the User and Users classes, but they are simply models
extending Zend_Db_Table_Abstract and Zend_Db_Table_Row_Abstract, with some
helper methods.
AFAIK this is a well-known PHP 5.2.x bug; these blog posts explain it:
http://www.alexatnet.com/node/73
http://paul-m-jones.com/?p=262
and this is the bug report:
http://bugs.php.net/bug.php?id=33595
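For illustration, the leak described in that bug report boils down to a
reference cycle that the refcounting collector in PHP 5.2.x can never free.
This is a minimal sketch of the pattern, not ZF code:

```php
<?php
// Two objects referencing each other form a cycle. On PHP 5.2.x the
// refcount of each never drops to zero after unset(), so the memory is
// only reclaimed when the process exits.
class Node
{
    public $other;
}

for ($i = 0; $i < 3; $i++) {
    $a = new Node();
    $b = new Node();
    $a->other = $b;     // $a holds a reference to $b ...
    $b->other = $a;     // ... and $b holds one back to $a: a cycle
    unset($a, $b);      // on 5.2.x this does NOT free the pair
    echo number_format(memory_get_usage()) . "\n";
}
```

From what I've read, the upcoming PHP 5.3 adds a cycle collector
(gc_collect_cycles()) that is meant to handle exactly this case, but on
5.2.x the only option is to break the cycles by hand.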
Maybe this is not a big problem when PHP is used for web page generation
(though occasionally some of my ZF web applications run out of memory too),
but it is a REAL problem on the CLI. In my case I've written an endless
loop that crawls web data; I need approximately 1 million pages for my
study, and I have to restart the crawler every time it crashes.
I think there are a lot of circular references in the ZF implementation.
Are you aware of this problem? Is there any possible fix?
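A possible stopgap, until there is a proper fix, is to break the cycles by
hand before unset(). The sketch below only reaches public properties (ZF
mostly uses protected ones), so it's illustrative rather than a real fix,
and breakCycles() is a hypothetical helper, not a ZF function:

```php
<?php
// Hypothetical workaround: null out the object's public properties before
// unset(), dropping any back-references so refcounts can reach zero on
// PHP 5.2.x. Called from outside, get_object_vars() only sees public
// properties, which is why this is a sketch rather than a general fix.
function breakCycles($object)
{
    foreach (get_object_vars($object) as $name => $value) {
        if (is_object($value) || is_array($value)) {
            $object->$name = null;  // drop the reference held here
        }
    }
}

// Example: a small cycle between two stdClass objects.
$us = new stdClass();
$us->parent = new stdClass();
$us->parent->child = $us;   // cycle: $us <-> $us->parent
breakCycles($us);           // severs $us->parent, breaking the cycle
unset($us);
```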
Thank you.
Matthew Weier O'Phinney-3 wrote:
>
> -- till <[email protected]> wrote
> (on Monday, 22 December 2008, 11:14 AM +0100):
>> On Mon, Dec 22, 2008 at 10:46 AM, Lossoth <[email protected]>
>> wrote:
>> > I'm using Zend_Dom to parse some web pages in a loop.
>> > The loop is very long (200000 pages).
>> > I noticed that on every cycle the memory used by PHP grows;
>> > it seems that Zend_Dom doesn't release the memory used after each loop.
>> > The code is similar to this:
>> >
>> > foreach ($urlArray as $url) {
>> >     $response = getHTM($url); // simple function using fopen; the
>> >                               // problem isn't here
>> >     $dom = new Zend_Dom_Query($response);
>> >     ...
>> >     // do something with a $dom->query
>> >
>> >     unset($dom);
>> > }
>> >
>> > The memory limit in my php.ini is 1GB.
>> > After 20000 iterations there is an out-of-memory error.
>> >
>> > Could somebody help me?
>> >
>> > Thanks!
>> >
>> > Marco
>>
>> A rough idea would be to implement __destruct which cleans up the object:
>>
>> class Marco_Dom_Query extends Zend_Dom_Query
>> {
>>     public function __destruct()
>>     {
>>         unset($this->_document);
>>     }
>> }
>>
>> Use it just like Zend_Dom_Query, keep unset($dom) in the loop, and see
>> if this triggers __destruct() and really gets rid of the document you
>> added to the object. You could also check whether memory_get_usage()
>> grows with each loop, or not.
>
> If this *does* work, please put an issue in the tracker, and we'll add
> this functionality.
>
> --
> Matthew Weier O'Phinney
> Software Architect | [email protected]
> Zend Framework | http://framework.zend.com/
>
>
--
View this message in context:
http://www.nabble.com/Zend_Dom-out-of-memory-tp21125082p21128721.html
Sent from the Zend Framework mailing list archive at Nabble.com.