On Mon, January 30, 2006 5:33 am, Peter Hoskin wrote:
> I have written a script to parse a large amount of XML data and insert
> it into SQL... deals with approx 80,000 rows each time I run it. I
> cannot successfully complete this script without running out of
> memory.
> Is pcntl_fork suitable to overcome this?

Probably not...

> I have been playing with pcntl_fork and became completely stuck trying
> to select data from a database and spawn a new fork for each row.
> Later
> I'll have to use something like shmop so each child knows what row its
> working on, and will do a series of xml parsing based on supplied
> data..
> a simple example of forking with a foreach this way would be
> appreciated

Woof.  Forking a new process for each row is going to swamp your
machine with processes very, very quickly...
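For what it's worth, here is roughly what that fork-per-row foreach
would look like ($rows and process_row() are just placeholders), and
why it goes bad -- nothing in the parent throttles the children:

  <?php
  // Sketch only; assumes the CLI SAPI with the pcntl extension loaded.
  foreach ($rows as $row) {
      $pid = pcntl_fork();
      if ($pid == -1) {
          die("fork failed\n");
      } elseif ($pid == 0) {
          // Child: handle exactly one row, then exit so it doesn't
          // fall back into the loop and start forking too.
          process_row($row);
          exit(0);
      }
      // Parent: with no pcntl_wait() here, tens of thousands of
      // children can pile up before the first ones finish.
  }
  // Reap any remaining children so they don't linger as zombies.
  while (pcntl_wait($status) > 0) {
      // just collecting exit statuses
  }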

First, though, are you running this from the command-line (CLI) interface?

If so, just supply a custom php.ini with "memory_limit = 16M" or
whatever it takes to get the job done, and call it quits.
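For example (script name and limit are placeholders, pick whatever
value the job actually needs):

  # point the CLI at your own php.ini:
  php -c /path/to/custom/php.ini import.php

  # or override just the one setting:
  php -d memory_limit=64M import.php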

If you're trying to run pcntl_fork from a web application, you're
doomed anyway -- it's not suitable in that environment.

You could probably rewrite the script so it doesn't need so much RAM
by using fopen/fgets to read one LINE at a time, dealing with each
line, and then moving on to the next.
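Something along these lines (the filename and the per-line handler
are placeholders) keeps memory flat no matter how big the file gets:

  <?php
  // Sketch of the line-at-a-time approach.
  $fh = fopen('data.xml', 'r');
  if (!$fh) {
      die("could not open data.xml\n");
  }
  while (($line = fgets($fh)) !== false) {
      // Pull what you need out of this one line and INSERT it,
      // then let $line get overwritten on the next pass.
      handle_line($line);
  }
  fclose($fh);

If the XML doesn't break cleanly on line boundaries, a pull parser
like XMLReader gets you the same constant-memory behavior.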

-- 
Like Music?
http://l-i-e.com/artists.htm
