I've written a Symfony task which executes this simple loop:
$count = 0;
while (($line = fgetcsv($in)) !== false)
{
    if (count($line) != 7)
    {
        // Blank lines, other crud
        continue;
    }
    $count++;
    if (!($count % 1000))
    {
        echo("Count: $count\n");
    }
    // Just update any zips we already have.
    $tbZip = Doctrine::getTable('tbZip')->find($line[0]);
    if (!$tbZip)
    {
        $tbZip = new tbZip();
    }
    // Copy the CSV columns in order.
    $f = 0;
    $tbZip->zip = $line[$f++];
    $tbZip->city = $line[$f++];
    $tbZip->state = $line[$f++];
    $tbZip->latitude = $line[$f++];
    $tbZip->longitude = $line[$f++];
    $tbZip->timezone = $line[$f++];
    $tbZip->dst = $line[$f++];
    $tbZip->save();
    unset($tbZip);
}
The records tick by at roughly a thousand per second, but at around
43,000 records (alas, just before it's done) the task hits the 128MB
memory limit. Watching in top, it burns RAM at a steady rate as it
goes along.
Adding unset($tbZip); didn't help (and I wouldn't expect it to,
really, since I reassign the variable on the next iteration anyway).
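One thing I haven't tried yet is Doctrine_Record::free(), which I gather
is meant to break the record's internal circular references so PHP can
actually reclaim the memory. I'm not certain of its exact semantics or
whether it covers whatever the identity map is holding, but something
like this:

```php
// Untested idea: explicitly free the record after saving, instead of
// relying on unset(). My understanding is that passing true also frees
// related records, but I haven't verified this releases everything.
$tbZip->save();
$tbZip->free(true);
unset($tbZip);
```

Would that make a difference here, or does Doctrine's table repository
still keep its own reference to each record?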
I could raise the memory limit, but there's a larger issue here:
Doctrine shouldn't leak epic amounts of memory on every little insert.
We're talking 43,000 zip code records here... not the kind of thing
that should break the bank.
Thanks for any advice!
(On a related subject, I see Doctrine 1.1 can do spatial indexes...
perhaps? And if so, can Doctrine 1.1 be used with Symfony 1.2?)
--
Tom Boutell
www.punkave.com
www.boutell.com
You received this message because you are subscribed to the Google Groups
"symfony users" group.