PHP is the easiest option, but not the best one for this purpose. If you're
crawling web sites, why not use dedicated web crawling software? Take a look at
Apache Nutch, the Apache web crawler. The Apache foundation has some amazing
software in its 'forge'.

If you're doing web scraping, try dapper.net. Follow a link on the
main page to get to the 'old site'.
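On gabriel's point below about moving the leaky code into functions: a minimal sketch of the idea, assuming a hypothetical processPage() helper (the name and the regex are just illustration, not your actual task code). Variables local to the function become unreachable when it returns, so PHP can free them between loop iterations instead of accumulating them across the whole run.

```php
<?php
// Sketch: keep the heavy per-item work inside a function so its locals
// go out of scope (and become collectable) after every call.
function processPage(string $html): int
{
    // Heavy preg_* work stays local to this scope; $matches is freed
    // when the function returns.
    preg_match_all('/<a\s+[^>]*href=/i', $html, $matches);
    return count($matches[0]);
}

$pages = ['<a href="/a">x</a>', '<a href="/b">y</a><a href="/c">z</a>'];
$total = 0;
foreach ($pages as $html) {
    $total += processPage($html);
    // On PHP 5.3+ you can also force the cycle collector between
    // iterations if reference cycles are involved:
    gc_collect_cycles();
}
echo $total, "\n";
```

For a crawler that runs for hours, combining this with unset() on large buffers you no longer need is usually enough to keep memory flat.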

On Sep 1, 11:17 am, pghoratiu <[email protected]> wrote:
> Hi!
>
> My suggestion is to use PHP 5.3.x: it has improved garbage collection,
> which should help with reclaiming unused memory. Also, group the code
> that is leaking into separate functions; that way the PHP runtime
> knows it can release the memory for variables once they go out of
> scope.
>
>     gabriel
>
> On Sep 1, 12:11 pm, "PieR." <[email protected]> wrote:
>
> > Hi,
>
> > I have a sfTask in CLI which uses a lot of foreach loops and
> > preg_match calls, and unfortunately PHP returns an "Allowed memory
> > size..." error within a few minutes.
>
> > I read that PHP clears the memory when a script ends, so I tried to
> > run tasks inside the main task, but the problem remains.
>
> > How can I manage this memory issue? Should I clear the memory
> > manually, or launch tasks in separate processes?
>
> > The final aim is to build a web crawler which runs many hours per
> > day.
>
> > Thanks in advance for help,
>
> > Regards,
>
> > Pierre
>
>

-- 
If you want to report a vulnerability issue on symfony, please send it to 
security at symfony-project.com

You received this message because you are subscribed to the Google
Groups "symfony users" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/symfony-users?hl=en
