--- Avi Rappoport <[EMAIL PROTECTED]> wrote:
> At 3:43 PM -0700 3/7/02, Sean M. Burke wrote:
> >The usefulness of the single-host spiders is pretty obvious to me.
> >But why do people want to write spiders that potentially span all/any hosts?
> >(Aside from people who are working for Google or similar.)
> 
> People think a robot can be an intelligent agent, looking for 
> relatively obscure topics or on specific web pages.  In many cases, 
> clever search syntax in the public search engines would take care of 
> this, but the hype around intelligent agents is quite seductive.
> 
> Avi

Every so often I try to write one myself based on existing Perl or Java code, but I find it too much hassle.

I am trying to find every book review on the internet (starting with Science Fiction ones, if possible). I have some loose algorithms for identifying a book review, but I haven't yet found a suite of code that properly lets me use them. The libraries all assume I am searching either one site or the whole internet, not a directed subset of pages.
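
Roughly, what I am after looks like the sketch below: crawl from a few
seed pages, follow links only to hosts on an explicit allowlist, and run
each fetched page through a review-spotting test. It is untested, and the
seed URL, the host allowlist, and the regex test are placeholders for
illustration rather than my real heuristics:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;

# The "directed subset": seed pages plus the hosts we are willing to
# follow links into.  Both are made-up examples.
my @queue        = ('http://example.com/sf-reviews/index.html');
my %allowed_host = map { $_ => 1 } qw(example.com reviews.example.org);

my $ua = LWP::UserAgent->new(agent => 'review-finder/0.1');
my %seen;

while (my $url = shift @queue) {
    next if $seen{$url}++;
    my $resp = $ua->get($url);
    next unless $resp->is_success
             && $resp->content_type eq 'text/html';
    my $html = $resp->decoded_content;
    next unless defined $html;

    # Stand-in for a real review classifier.
    print "Possible review: $url\n"
        if $html =~ /book review/i && $html =~ /ISBN/i;

    # Extract links (resolved against $url) and queue only those
    # whose host is on the allowlist.
    my $extor = HTML::LinkExtor->new(undef, $url);
    $extor->parse($html);
    $extor->eof;
    for my $link ($extor->links) {
        my ($tag, %attr) = @$link;
        next unless $tag eq 'a' && $attr{href};
        my $uri = URI->new($attr{href});
        next unless $uri->scheme && $uri->scheme eq 'http';
        push @queue, $uri->as_string if $allowed_host{ $uri->host };
    }
}

In practice I would use LWP::RobotUA instead of LWP::UserAgent so the
crawl honours robots.txt, and swap the regex for the real heuristics; the
point is just that the crawl frontier is filtered by the allowlist rather
than being pinned to one host or opened to the whole web.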

Alex



=====
Alex McLintock        [EMAIL PROTECTED]    Open Source Consultancy in London
OpenWeb Analysts Ltd, http://www.OWAL.co.uk/ 
---
SF and Computing Book News and Reviews: http://news.diversebooks.com/
Get Your XML T-Shirt <t-shirt/> at http://www.inversity.co.uk/
Please Remove [EMAIL PROTECTED] from your address book.


