[PHP] Multi Threading in PHP
Hi, a few days ago I saw a thread here about multi-threading, but it didn't answer my question. I have a page that must search for the best price across multiple databases/plain-text files, and perhaps one day XML interfaces; some of them may be located on external hosts, and every search can be a single function/object method. In my dreams I run these searches in multiple threads (with a timeout) so that lost connections, or anything else that would cause an endless query, are handled, and then return to print out the results. Is this possible with any function today? Or will it be possible in the future? Best Regards, Bart Frackiewicz -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
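There is no threading function in mainstream PHP, but the network I/O can be multiplexed in a single process with the curl extension's "multi" interface, which drives several HTTP transfers in parallel and supports per-transfer timeouts. A minimal sketch under those assumptions (the function name `fetch_all_parallel` is made up for illustration, and it assumes the price sources speak HTTP):

```php
<?php
// Fetch several URLs in parallel with a hard per-transfer timeout,
// using the curl "multi" interface (no threads involved).
// fetch_all_parallel() is an illustrative name, not a built-in.
function fetch_all_parallel(array $urls, $timeout = 10)
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);    // total time limit
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // connect time limit
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive every transfer until all have finished or timed out.
    do {
        curl_multi_exec($mh, $running);
        if ($running > 0) {
            curl_multi_select($mh); // sleep until there is socket activity
        }
    } while ($running > 0);

    $results = array();
    foreach ($handles as $url => $ch) {
        if (curl_errno($ch) === 0) {
            $results[$url] = curl_multi_getcontent($ch); // successes only
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Sources that hang or die simply miss the timeout and drop out of the result array, so the page can still print the prices that did arrive in time.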
[PHP] multi-threading within php?
Hello! I was making a spider for a simple search engine and all was well until I started testing it on larger sites. The problem isn't that it doesn't work, it does, but it is very, very slow, handling just one connection at a time. Basically, it downloads a page, extracts all links from it, then downloads those pages and extracts their links, until all pages on a site have been downloaded and archived. Is there any way to do this in several threads so that it handles several connections simultaneously? If it could dynamically allocate threads, that would be even better. Regards, Mattias Andersson Software Developer, humany AB Direct: 08-4540701 Mobile: 0704-526685
Re: [PHP] multi-threading within php?
Sorry, I don't have an answer to your question, but a question for you: I'm currently building a class to browse a page and was interested in the source of your script, especially the part that extracts the links. Would you mind providing the source, or is it commercial? best regards Stefan Rusterholz - Original Message - From: Mattias Andersson [EMAIL PROTECTED] To: [EMAIL PROTECTED] Sent: Thursday, January 24, 2002 2:52 PM Subject: [PHP] multi-threading within php?
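Not Mattias's actual code, but link extraction of this kind is commonly done with a regular expression over the fetched HTML. A rough sketch (it will miss JavaScript-generated links and can be fooled by unusual markup, so treat it as a starting point rather than a full HTML parser):

```php
<?php
// Pull the href targets out of all <a> tags in a chunk of HTML.
// A regex approximation, not a real HTML parser: good enough for
// a simple spider, fragile on exotic markup.
function extract_links($html)
{
    preg_match_all('/<a\s[^>]*href\s*=\s*["\']?([^"\'\s>]+)/i', $html, $m);
    return $m[1]; // array of href values, possibly relative URLs
}

// Example: extract_links('<a href="http://example.com/">home</a>')
// returns array('http://example.com/').
```

Relative URLs still need to be resolved against the page's base URL before they can be fetched; that step is left out here.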
Re: [PHP] multi-threading within php?
Then I would have to write a parser when I have already written my own spider. But the problem isn't the spider itself, rather the lack of multi-threading within PHP (which is needed in quite a lot of situations, depending on what you're coding). I'm not sure it really does lack multi-threading, though; it might just be me not knowing about it. Is there any way to execute several functions simultaneously? Best regards, Mattias Andersson Software Developer, humany AB Direct: 08-4540701 Mobile: 0704-526685 -----Original Message----- From: bvr [mailto:[EMAIL PROTECTED]] Sent: January 24, 2002 15:33 To: [EMAIL PROTECTED]; Mattias Andersson Subject: Re: [PHP] multi-threading within php? You may want to use wget's spider function, then parse the files with PHP if necessary. http://wget.sunsite.dk/ bvr.
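As a direct answer to that last question: PHP has no user-level threads, but command-line PHP built with the pcntl extension (Unix only; not available in the web-server module) can fork real child processes, which gives concurrency at process granularity. A sketch under those assumptions, with placeholder job names:

```php
<?php
// Fork one worker process per job (CLI builds with pcntl only).
// The $jobs entries are stand-ins for real URLs to fetch.
$jobs = array('job-one', 'job-two', 'job-three');
$pids = array();

if (function_exists('pcntl_fork')) {
    foreach ($jobs as $job) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            die("fork failed\n");
        } elseif ($pid === 0) {
            // Child process: do the slow work (fetch/parse) here, then quit.
            // Each child gets its own copy of the script's state.
            exit(0);
        }
        $pids[] = $pid; // parent remembers each child
    }
    foreach ($pids as $pid) {
        pcntl_waitpid($pid, $status); // wait for every child to finish
    }
}
```

Note that forked children do not share variables with the parent, so results have to come back through a database, files, or pipes.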
Re: [PHP] multi-threading within php?
If the links are placed in a db as they are found, couldn't you run the same script in multiple instances (browser windows, DOS boxes/command lines) against the same db? That way you would be fetching and parsing multiple web pages concurrently. The only 'multi-threading' would be against the db, and your Internet link should be able to cope. =dn - Original Message - From: Mattias Andersson [EMAIL PROTECTED] To: 'bvr' [EMAIL PROTECTED]; [EMAIL PROTECTED] Sent: 24 January 2002 17:14 Subject: Re: [PHP] multi-threading within php?
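The multiple-instances idea can be sketched concretely: keep the URL queue in a shared database table and let each copy of the script atomically claim the next unfetched row. The sketch below uses SQLite via PDO only so that it is self-contained; the `queue` table and the `claim_next_url` helper are made-up names, and a real spider would use whatever database it already writes to:

```php
<?php
// A shared URL queue that several independent PHP processes can work
// against. SQLite via PDO keeps the sketch self-contained; the table
// and function names are illustrative, not from the original spider.
$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE queue (url TEXT PRIMARY KEY, status TEXT DEFAULT 'pending')");
$db->exec("INSERT INTO queue (url) VALUES ('http://example.com/')");

// Atomically claim one pending URL, or return null when the queue is empty.
function claim_next_url(PDO $db)
{
    while (true) {
        $row = $db->query("SELECT url FROM queue WHERE status = 'pending' LIMIT 1")
                  ->fetch(PDO::FETCH_ASSOC);
        if ($row === false) {
            return null; // nothing left to crawl
        }
        // The status check in the WHERE clause makes the claim atomic:
        // if another instance got here first, rowCount() will be 0.
        $st = $db->prepare("UPDATE queue SET status = 'claimed' WHERE url = ? AND status = 'pending'");
        $st->execute(array($row['url']));
        if ($st->rowCount() === 1) {
            return $row['url']; // we won the race for this row
        }
        // Another instance claimed it first; loop and try the next row.
    }
}
```

Each spider instance then loops: claim a URL, fetch it, INSERT any newly found links back into `queue`, repeat. Only the tiny claim step is serialized by the database, so the instances fetch pages concurrently, which is exactly the effect asked for.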