Thomas, I know very well how to parse XML and load it into the DB.
However, for this import I need all the business logic from the models. Apart from splitting the load into smaller job batches, what else can I do to minimise memory leaks?

On Aug 7, 12:09 am, Thomas Rabaix <[email protected]> wrote:
> The ORM does not perform well with this kind of operation.
> You should use this lib http://fr.php.net/xml and raw SQL to import your
> data.
>
> On Thu, Aug 6, 2009 at 2:00 PM, Fabian Lange <[email protected]> wrote:
> > Jon can comment on Doctrine internals.
> > As far as I can say: wow, 50k objects.
> > Try PHP 5.3, because it finally has a GC that does not rely only on
> > reference counting.
> >
> > Fabian
> >
> > On Thu, Aug 6, 2009 at 1:56 PM, Ken Golovin <[email protected]> wrote:
> > > Hi,
> > >
> > > I am processing a large XML file. The module I created iterates
> > > through about 50,000 records, creates an object for each record, and
> > > saves it to the DB, and I am running into memory issues. The memory
> > > usage measured by memory_get_usage() keeps growing with each new
> > > object created.
> > > I have tried flushing Doctrine's identity map and unsetting the
> > > objects like this:
> > >
> > > $product->getTable()->clear();
> > > unset($product);
> > >
> > > however, that did not help at all.
> > >
> > > Is there anything else I can do to free up memory?
>
> --
> Thomas Rabaix
> http://rabaix.net
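[Editor's note] The batching Ken asks about might look like the sketch below under Doctrine 1.x. The `Product` model name, batch size, and `$batches` input are hypothetical; `free(true)` (deep-frees a record graph), `getTable()->clear()` (flushes the identity map), and `gc_collect_cycles()` (PHP 5.3+) are the documented calls this thread revolves around. This is a sketch of the technique, not a drop-in import script.

```php
<?php
// Sketch: batched Doctrine 1.x import that keeps the model's business logic.
// Assumes $batches is an iterable of arrays of record data (e.g. 500 records
// per batch), produced by a streaming XML parse elsewhere.
$conn = Doctrine_Manager::connection();

foreach ($batches as $batch) {
    $conn->beginTransaction();
    foreach ($batch as $data) {
        $product = new Product();     // hypothetical model class
        $product->fromArray($data);   // model hooks/validators still run
        $product->save();
        $product->free(true);         // break circular references deeply
        unset($product);
    }
    $conn->commit();

    // Drop the identity map so flushed records can actually be collected.
    Doctrine::getTable('Product')->clear();

    // On PHP 5.3+, force the cycle collector to reclaim leftover cycles.
    if (function_exists('gc_collect_cycles')) {
        gc_collect_cycles();
    }
}
```

The key point is that `unset()` alone cannot reclaim Doctrine 1 records: each record and its table hold references to each other, which pure refcounting never frees. `free(true)` breaks those cycles explicitly, and PHP 5.3's cycle collector catches whatever remains.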
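[Editor's note] For the raw-SQL route Thomas suggests, a streaming parser keeps memory flat no matter how large the file is. A minimal runnable sketch with PHP's built-in XMLReader, assuming records look like `<product><sku>…</sku></product>` (the element names are illustrative):

```php
<?php
// Streaming parse: XMLReader pulls one node at a time instead of loading
// the whole document, so memory use does not grow with file size.
$xml = '<catalog><product><sku>A1</sku></product>'
     . '<product><sku>A2</sku></product></catalog>';

$reader = new XMLReader();
$reader->XML($xml);

$skus = array();
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'product') {
        // expand() returns a small DOM fragment for just this record;
        // importing it into a throwaway DOMDocument lets SimpleXML read it.
        $doc  = new DOMDocument();
        $node = simplexml_import_dom($doc->importNode($reader->expand(), true));
        $skus[] = (string) $node->sku;
        // In a real import you would issue a raw INSERT (ideally a prepared
        // statement inside a transaction) here instead of collecting values.
    }
}
$reader->close();
```

Each record is materialised and discarded individually, so peak memory is bounded by the largest single record rather than the 50k total.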
