Darn.  Theoretically you could exceed any limit you place, so the only
logical answer is to run them as individual queries, one at a time, and see
if that works.  If that doesn't, then I guess you could have it shell out and
create another instance of the batch after the first one has completed (and
processed, say, 50 records).
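
Something along these lines is what I'm picturing.  Just a rough, untested
sketch: fetchRecords() and processRecord() are stand-ins for whatever your
batch actually does, and the chunk size is arbitrary.

<?php
// batch.php - process one chunk of records, then hand the rest off to a
// fresh PHP process so this one can exit and release all of its memory.

define('BATCH_SIZE', 50);

$offset = isset($argv[1]) ? (int) $argv[1] : 0;

// Stand-in for your real query; fetch only BATCH_SIZE rows starting at $offset.
$records = fetchRecords($offset, BATCH_SIZE);

foreach ($records as $record) {
    processRecord($record); // whatever work each row needs
}

// A full chunk probably means more rows remain: launch a new instance of
// this script in the background and let the current one terminate.
if (count($records) == BATCH_SIZE) {
    $next = $offset + BATCH_SIZE;
    exec('php ' . escapeshellarg(__FILE__) . ' ' . $next . ' > /dev/null 2>&1 &');
}

The one-query-at-a-time idea is the same thing with a chunk size of 1.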

// andrew

On 15/03/07, Michel D'HOOGE <[EMAIL PROTECTED]> wrote:
>
> On Wednesday 14 March 2007 19:45, Andrew Backer wrote:
> > Aye :) But I think properly managing memory rather than aborting
> Well, AFAIK PHP is really bad at managing memory and garbage collection. I
> tried to put some unset() calls in my code but with no difference at all.
> As ved suggested, I guess most of the memory is consumed by either Propel
> or Creole (or both!).
>
> > if you do know the max you need there is a php max memory setting in the
> > ini file that is tweakable.
> Hard to say, since the data are collected from a table with an unbounded
> TEXT field and the number of records is also unknown...
> --
> Michel
>
>
