A few years ago, I ran into the same problem in a database migration
script.
The problem came from the xdebug module. I don't know exactly how the
module works, but it was making the script's memory usage grow steadily.
Disabling the xdebug module resolved the problem.
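
If you want to check quickly whether xdebug is what is eating the
memory, a tiny check like this (just a sketch of mine, not part of the
original script) tells you whether the extension is loaded at all:

  <?php
  // Warn if the xdebug extension is active; it can make long-running
  // imports keep growing in memory.
  if (extension_loaded('xdebug')) {
      echo "xdebug is loaded - try the import again with it disabled\n";
  }

To disable it, comment out the zend_extension line that loads xdebug in
php.ini (or in the separate xdebug .ini file) and restart Apache / PHP
before re-running the import.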

On 21 May, 16:55, mohdshakir <mohdsha...@gmail.com> wrote:
> Can you elaborate?
>
> On May 21, 1:34 am, Galou <gael.duc...@gmail.com> wrote:
>
> > It could be due to the php xdebug module.
>
> > Galou
>
> > On 20 May, 09:14, Mohd Shakir Zakaria <mohdsha...@gmail.com> wrote:
>
> > > I keep getting this error when running one of my scripts:
>
> > > PHP Fatal error:  Allowed memory size of 1073741824 bytes exhausted
> > > (tried to allocate 71 bytes) in ...
> > > lib/symfony-1.4.11/lib/plugins/sfDoctrinePlugin/lib/vendor/doctrine/Doctrine/Connection/Statement.php
> > > on line 246, ...
>
> > > The following is a stripped-down version of the script that
> > > triggers the error:
>
> > > public function executeImportFile(sfWebRequest $request)
> > > {
> > >   ini_set('memory_limit', '1024M');
> > >   set_time_limit ( 0 );
>
> > >   //more codes here...
>
> > >   $files = scandir($workspace.'/'.$directory);
>
> > >   foreach ($files as $file) {
> > >     $path = $workspace.'/'.$directory.'/'.$file;
>
> > >     if ($file != "." && $file != "..") {
> > >       $this->importfile($path);
> > >     }
> > >   }
>
> > > }
>
> > > protected function importfile($path){
> > >   ini_set('memory_limit', '1024M');
> > >   set_time_limit ( 0 );
>
> > >   $connection = sfContext::getInstance()->getDatabaseManager()->getDatabase('doctrine')->getDoctrineConnection();
> > >   $connection->beginTransaction();
> > >   try {
>
> > >     //more codes here...
>
> > >     while ($data = $reader->read()) //reads each line of a csv file
> > >     {
> > >       // send the line to another private function to be processed
> > >       // and then write to database
> > >       $this->writewave($data);
> > >     }
>
> > >     $connection->commit();
>
> > >   } catch (Exception $e) {
> > >     $connection->rollback();
> > >   }
>
> > > }
>
> > > What the script basically does is read all the CSV files in a
> > > folder (each containing tens of thousands of lines), process them,
> > > and then write them to the database using a Doctrine transaction.
>
> > > While I don't think I need to set the memory limit and the time
> > > limit in both functions, the script quits once Doctrine uses up all
> > > of the allocated 1 GB of memory.
>
> > > It normally stops after processing 10 files; allocating more memory
> > > lets it process a few more files, but it still crashes.
>
> > > Is there anything I'm missing here that's causing the memory not to
> > > be freed after processing each file?
>
> > > Mohd Shakir Zakaria
> > > http://www.mohdshakir.net
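
For the import loop in the quoted script, one more thing that may be
worth trying with Doctrine 1.x (just a sketch under the assumption that
writewave() builds and saves one record per CSV line - the real body is
not shown above, and the "Wave" model name is made up for illustration)
is to free each record after it has been saved, so that Doctrine does
not keep a reference to every imported row for the whole run:

  protected function writewave($data)
  {
      // Hypothetical body: the original writewave() was stripped out.
      $wave = new Wave();               // assumed model name
      $wave->fromArray($data);
      $wave->save();

      // Release the record and its references once it has been written;
      // otherwise Doctrine holds on to every saved object until the end
      // of the import.
      $wave->free(true);
      unset($wave);
  }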
