Hey Mike, it's not really a matter of performance here.
First of all, there's no reason to use this approach for that huge an
amount of data; you should probably import it into your db directly.
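For the direct-import route, MySQL can load a CSV in one statement with
LOAD DATA INFILE. A rough sketch - the table name, column names and file
path below are placeholders, adjust them to your schema:

```sql
-- Assumes MySQL and a comma-separated file with one row per line.
-- 'posts', 'title', 'body' and the path are hypothetical.
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE posts
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(title, body);
```

That keeps PHP out of the loop entirely, so neither max_execution_time
nor the memory limit ever comes into play.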
But if you want to do this in the app layer - maybe it's a feature, I
don't know - you should definitely build INSERT SQL statements and use
Model::execute(), because there are lots of things happening behind
the scenes in Model::save() and Model::saveAll(), and it's really hard
not to run out of memory while executing them thousands of times.

Here's a step-by-step version - merge some steps and you'll get prettier code:

function saveAllThousands($CSVfile) {

    $blacklist = array($this->primaryKey); // columns not to save
    $columns = array_keys($this->schema()); // get table column names
    $whitelist = array_diff($columns, $blacklist);

    $fields = join(',', $whitelist);
    $sql = "INSERT INTO {$this->useTable} ({$fields}) VALUES ";
    $values = array();
    foreach ($CSVfile as $line) {
        // here you can perform any action on the incoming data -
        // escaping, quoting, glueing...
        $vals = join(',', $line);
        $values[] = "({$vals})";
    }
    $sql .= join(',', $values);
    return $this->execute($sql); // execute as plain old MySQL

}
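One caveat: a single INSERT with thousands of value groups can itself blow
past MySQL's max_allowed_packet. A plain-PHP sketch of splitting the rows
into batches - the function name, table and columns are made up for
illustration, and real code must escape each value before quoting:

```php
<?php
// Hypothetical helper: build one multi-row INSERT per batch of rows,
// so neither PHP memory nor the MySQL packet size gets out of hand.
function buildBatchedInserts($table, array $columns, array $rows, $batchSize = 500) {
    $fields = join(',', $columns);
    $statements = array();
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        $values = array();
        foreach ($chunk as $row) {
            // escape $row values here in real code before quoting
            $values[] = "('" . join("','", $row) . "')";
        }
        $statements[] = "INSERT INTO {$table} ({$fields}) VALUES " . join(',', $values);
    }
    return $statements; // run each one through Model::execute()
}
```

Then loop over the returned statements and fire each through
Model::execute() - a handful of round trips instead of thousands.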

Not much different than plain old PHP, but I assure you it's the safest
way to do it in PHP, in the app layer - executing on the db directly is
still much quicker. Last but not least, comparing raw PHP access to
MySQL against CakePHP's data source interaction is madness - of course
raw PHP is much faster.

rafaelbandeira3
http://rafaelbandeira3.wordpress.com

On 23 set, 22:19, Mike <[EMAIL PROTECTED]> wrote:
> Hello!
>
> I'm trying to load a bunch of rows into a table - on the order of
> 1,000 or so.  This isn't huge, but I'm having problems when I try to
> do it using CakePHP.  My basic approach is to upload a .CSV file,
> which i then pull apart, and use to repeatedly save() new rows into
> the appropriate model.  I'm uploading & parsing the .CSV file just
> fine (according to the debug output), but the process dies before
> completion.
>
> Initially, PHP timed out (it took more than 30 seconds to process),
> and when I reset the max_execution (in PHP.ini) to 300 seconds, it
> then ran out of memory.
>
> My questions:
>
> 1) Are there any obvious things to look for / avoid in my code?
> Adding 1K of new records seems like it ought to be fine (and was fine,
> when I used an earlier version of my web app that was written in raw
> PHP/MySQL)
>
> 2) How do I begin trying to solve this problem?  How does one identify
> what's going wrong here, and then fix it?
>
> 3) Does anyone have any good suggestions for perf-tuning CakePHP apps?
>
> Sorry for not doing more digging myself - the first N Google results
> weren't promising, so I figured that I'd go for help now, since I'm in
> a bit of a rush :(
>
> Thanks!
> --Mike
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"CakePHP" group.
To post to this group, send email to cake-php@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/cake-php?hl=en
-~----------~----~----~----~------~----~------~--~---
