At 3:43 PM -0700 3/7/02, Sean M. Burke wrote:
>The usefulness of the single-host spiders is pretty obvious to me.
>But why do people want to write spiders that potentially span all/any hosts?
>(Aside from people who are working for Google or similar.)
People imagine a robot can act as an intelligent agent, hunting down
relatively obscure topics or monitoring specific web pages. In many
cases, clever search syntax in the public search engines would cover
this, but the hype around intelligent agents is quite seductive.
Avi
--
Complete Guide to Search Engines for Web Sites and Intranets
<http://www.searchtools.com>
--
This message was sent by the Internet robots and spiders discussion list
([EMAIL PROTECTED]). For list server commands, send "help" in the body of a message
to "[EMAIL PROTECTED]".