Hello

>The usefulness of the single-host spiders is pretty obvious to me.
>But why do people want to write spiders that potentially span all/any hosts?
>(Aside from people who are working for Google or similar.)
>
Maybe the better question is: why do people want to write spiders that
span many different hosts? Because there are many possible new search
services that Google has not implemented yet.
For example, I implemented a piece of software that can identify events,
parties, and so on, on web pages. The most useful way to use software
like this was to combine it with a crawler that scans as many German web
pages as possible for events, which yields an event search engine.
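A minimal sketch of that combination, purely as an illustration: the keyword test below stands in for the real event classifier (which I haven't described here), and the `crawl`/`fetch`/`looks_like_event_page` names are made up for this example. The caller supplies `fetch` (e.g. built on urllib), and the crawl stays restricted to one top-level domain rather than truly spanning the whole web.

```python
import re
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Crude stand-in for a real event classifier: a few German/English keywords.
EVENT_KEYWORDS = {"event", "party", "konzert", "veranstaltung", "tickets"}

def looks_like_event_page(text):
    """Heuristic: does the page text mention at least two event-related terms?"""
    words = set(re.findall(r"[a-zäöüß]+", text.lower()))
    return len(words & EVENT_KEYWORDS) >= 2

class LinkExtractor(HTMLParser):
    """Collect absolute link targets from <a href="..."> tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(seed_urls, fetch, max_pages=100, tld=".de"):
    """Breadth-first crawl across hosts within one TLD, returning URLs of
    pages that look like event pages. `fetch(url)` returns HTML or None."""
    seen, queue, hits = set(), list(seed_urls), []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        if html is None:
            continue
        if looks_like_event_page(html):
            hits.append(url)
        parser = LinkExtractor(url)
        parser.feed(html)
        for link in parser.links:
            host = urlparse(link).hostname
            if host and host.endswith(tld):
                queue.append(link)
    return hits
```

In a real system the keyword heuristic would be replaced by the actual event-recognition component, and the frontier would need politeness delays, robots.txt handling, and deduplication, but the structure (classifier plugged into a multi-host crawler) is the same.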

Matthias
-- 
http://www.eventax.com




--
This message was sent by the Internet robots and spiders discussion list 
([EMAIL PROTECTED]).  For list server commands, send "help" in the body of a message 
to "[EMAIL PROTECTED]".
