This isn't the most elegant solution, but it might help bring down the
total time. Can you set up a small script that retrieves the content of
one URL (in PHP if you wish), and then from your web app spawn 5 of
those processes in the background, each writing to its own temporary
file? You can then poll the files for completion (with a microsleep
here and there :).
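
Something along these lines, maybe (a rough, untested sketch; fetch.php
is a hypothetical one-line helper that reads the page for the URL in
$argv[1] and prints it to stdout, and the URLs are just placeholders):

<?php
// Untested sketch. Assumes a hypothetical fetch.php that writes the
// page body for $argv[1] to stdout, e.g. a one-liner calling
// readfile($argv[1]).
$urls = array('http://example.com/a', 'http://example.com/b');

$tmpfiles = $donefiles = array();
foreach ($urls as $i => $url) {
    $tmpfiles[$i]  = tempnam('/tmp', 'fetch');
    $donefiles[$i] = $tmpfiles[$i] . '.done';
    // Run the fetch in the background; touch a ".done" marker when it
    // finishes so the parent knows the temp file is complete.
    $cmd = sprintf('(php fetch.php %s > %s; touch %s) > /dev/null 2>&1 &',
                   escapeshellarg($url),
                   escapeshellarg($tmpfiles[$i]),
                   escapeshellarg($donefiles[$i]));
    exec($cmd);
}

// Poll for completion, microsleeping between passes.
do {
    usleep(100000);   // 0.1 sec
    clearstatcache(); // otherwise file_exists() results are cached
    $remaining = 0;
    foreach ($donefiles as $done) {
        if (!file_exists($done)) $remaining++;
    }
} while ($remaining > 0);

$pages = array();
foreach ($tmpfiles as $i => $tmp) {
    $pages[$i] = implode('', file($tmp));
    unlink($tmp);
    unlink($donefiles[$i]);
}
?>

No error handling, obviously, and in real life you'd want a timeout on
that polling loop in case one of the fetches dies before touching its
marker file.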

Cheers,
Rob.

On Thu, 2003-08-28 at 18:49, Evan Nemerson wrote:
> php.net/pcntl_fork
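
That would work too if you're running from the command line. A rough,
untested sketch of the fork approach (assumes PHP built with
--enable-pcntl and allow_url_fopen on; forking inside the web server
module is asking for trouble, and the URLs are placeholders):

<?php
// Untested sketch. Needs PHP built with --enable-pcntl (CLI use only,
// really) and allow_url_fopen enabled for file() on URLs.
$urls = array('http://example.com/a', 'http://example.com/b');

$pids = $files = array();
foreach ($urls as $i => $url) {
    $files[$i] = tempnam('/tmp', 'fetch');
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("fork failed\n");
    } elseif ($pid == 0) {
        // Child: fetch one URL into its temp file, then exit.
        $fp = fopen($files[$i], 'w');
        fwrite($fp, implode('', file($url)));
        fclose($fp);
        exit(0);
    }
    $pids[$i] = $pid; // parent keeps the child's pid
}

// Parent: wait for every child, then collect the results.
$pages = array();
foreach ($pids as $i => $pid) {
    pcntl_waitpid($pid, $status);
    $pages[$i] = implode('', file($files[$i]));
    unlink($files[$i]);
}
?>

Same caveats as the exec() version above: no error handling, and you'd
probably want some kind of per-child timeout.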
> 
> 
> 
> On Thursday 28 August 2003 11:09 am, David Otton wrote:
> > On Thu, 28 Aug 2003 20:25:05 +0300, you wrote:
> > >I am looking for a PHP analog of Perl's LWP::Parallel.
> > >I need to fetch several URLs (pages) from PHP at the same time.
> > >I have a script which fetches 5-10 URLs, and each URL takes 0.5 - 2 sec.
> > >In total that's about 10 seconds on average.
> > >I suppose if I fetched them in parallel I would speed the script up about 5 times.
> > >I am looking mostly for PHP solutions.
> > >(Hope they exist :)
> >
> > I think you're out of luck. Yes, it's a problem I've run up against more
> > than once. There's no thread support in PHP 4 [anyone know if it's in 5?].
> >
> > I suppose you might be able to hack something together by spawning external
> > processes, but at that point you might as well just do it all externally.
> 
> -- 
> "The missionaries go forth to Christianize the savages- as if the savages 
> weren't dangerous enough already."
> 
> -Edward Abbey
> 
-- 
.---------------------------------------------.
| Worlds of Carnage - http://www.wocmud.org   |
:---------------------------------------------:
| Come visit a world of myth and legend where |
| fantastical creatures come to life and the  |
| stuff of nightmares grasp for your soul.    |
`---------------------------------------------'

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
