Then I would have to write a parser when I have already written my own
spider. But the problem isn't the spider itself, rather the lack of
multi-threading in PHP (which is needed in quite a lot of situations,
depending on what you're coding). I'm not sure that it really does lack
multi-threading, though; it might just be that I don't know about it.
Is there any way to execute several functions simultaneously?
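
One thing I have been looking at is the curl extension's curl_multi
interface, which multiplexes several connections inside a single
process. A rough sketch, assuming the curl extension is compiled in
(the URLs are placeholders):

<?php
// Fetch several URLs concurrently with one curl_multi loop.
$urls = array('http://example.com/a', 'http://example.com/b');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body, don't print
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until every handle has finished.
$running = 0;
do {
    curl_multi_exec($mh, $running);
    if ($running > 0) {
        curl_multi_select($mh); // wait for socket activity instead of spinning
    }
} while ($running > 0);

foreach ($handles as $url => $ch) {
    $page = curl_multi_getcontent($ch); // HTML, ready for link extraction
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>

It isn't real threading, but for a spider the bottleneck is the
network, so overlapping the connections is what actually matters.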

Regards,
Mattias Andersson

Software Developer, humany AB
Direct: 08-4540701
Mobile: 0704-526685


-----Original Message-----
From: bvr [mailto:[EMAIL PROTECTED]] 
Sent: 24 January 2002 15:33
To: [EMAIL PROTECTED]; Mattias Andersson
Subject: Re: [PHP] multi-threading within php?


You may want to use wget's spider function, then parse the downloaded
files with PHP if necessary.

http://wget.sunsite.dk/
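
For example, you could let wget mirror the site and post-process the
saved files afterwards. A sketch; the depth, paths and URL are just
example values:

<?php
// Let wget do the recursive crawling, then parse the files locally.
// -r   recurse through links
// -l 3 limit recursion depth to 3
// -np  never ascend above the start URL
// -P   store the mirror under ./mirror
exec('wget -r -l 3 -np -P ./mirror http://www.example.com/',
     $output, $status);

if ($status === 0) {
    // walk ./mirror here and parse each saved page with PHP
}
?>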

bvr.


On Thu, 24 Jan 2002 14:52:54 +0100, Mattias Andersson wrote:

>Hello!
> 
>I was making a spider for a simple search engine and all was well until I
>started testing it on larger sites.
>The problem isn't that it doesn't work, it does, but it is very, very
>slow, handling just one connection at a time.
>Basically, it downloads a page, extracts all links from it, then downloads
>those pages and extracts their links, until all pages on a site have been
>downloaded and archived.
>Is there any way to do it in several threads so that it handles several
>connections simultaneously?
>If it could dynamically allocate threads, that would be even better.
> 
>Regards,
>Mattias Andersson
> 
>Software Developer, humany AB
>Direct: 08-4540701
>Mobile: 0704-526685
> 
>
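
For the link-extraction step described in the quoted message, something
like this works on each saved file (a rough sketch; the pattern only
catches plain double-quoted href attributes, and relative URLs still
have to be resolved against the page's base URL):

<?php
// Pull href targets out of one downloaded page.
$html = implode('', file('mirror/index.html')); // example path
preg_match_all('/href\s*=\s*"([^"]+)"/i', $html, $matches);
foreach ($matches[1] as $link) {
    echo $link, "\n";
}
?>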



--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]
