Hi All,

I am working on an application that updates a very large amount of data in a
shell process.
Customers send very big files of data that I need to match and
update .. so far so good ....

I realized that after a while (about 40 000 records) my application
runs out of memory (32 MB) ... I have 3 million records to update ..
Digging around, I discovered that the validations are consuming the
memory.
As a simple test I commented out this line (nb: 2346 on rev: 7847)
in the core model.php:

$valid = $Validation->dispatchMethod($rule, $ruleParams);

I log memory_get_usage(true) every 1000 records. When the above line is NOT
commented out (whether the validation passes or fails does not
matter), the memory grows slowly .. but doubles after 10 000 records.
When the line is commented out in model.php, the memory is stable
(same amount after 3 million records).
Again, I use this in a shell, so memory should be quite stable since
all objects are instantiated after a few records, but it always fails
while "trying to allocate memory" for validation.php. I guess RC2 did
not have this problem; I validated 3 million records before ...
but now I am stuck around 40 000 (about 8 models are involved in
transactional mode).

Maybe someone on the core team has an idea about this? It is easy to
reproduce: just validate the same data 1000 times .. and watch the
memory used (hard to do with the test suite though, since you can't
"expect" any memory usage at first) ..
Cheers




--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"CakePHP" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/cake-php?hl=en
-~----------~----~----~----~------~----~------~--~---