Mark Tinka wrote:
> If Google's long-term goal is to gather surfing patterns of users so it
> can exponentially grow the number of web resources it indexes for
> searching, that would seem plausible, but the initial cost, the
> accelerator, is a huge loss leader for such a goal. However, if they
> have the cash, why not!

I don't think the loss leader is an issue. Google's front page says that they are searching 8,058,044,651 pages. If each page has only two links, that means we have 16,116,089,302 choices on what to crawl next. A full crawl would have to fetch a fantastic amount of content. So using people's browsing patterns to selectively decide what to crawl next (and much more often) would make Google's index fresher and more relevant. People who do not use the web accelerator will also benefit.
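To make the idea concrete, here is a toy sketch (entirely my own illustration, not anything Google has described) of prioritizing a crawl frontier by observed user visits, so the most-visited pages get re-crawled first:

```python
import heapq

def build_crawl_queue(visit_counts):
    """visit_counts: dict mapping url -> number of observed user visits.

    Returns a heap ordered so the most-visited URL pops first.
    """
    # heapq is a min-heap, so negate the counts to get max-first ordering
    heap = [(-count, url) for url, count in visit_counts.items()]
    heapq.heapify(heap)
    return heap

def next_to_crawl(heap):
    """Pop and return the URL the crawler should fetch next."""
    _neg_count, url = heapq.heappop(heap)
    return url

# Hypothetical visit data gathered from accelerator users
visits = {
    "example.com/news": 120,
    "example.com/archive": 3,
    "example.com/popular": 500,
}
queue = build_crawl_queue(visits)
print(next_to_crawl(queue))  # the most-visited URL comes out first
```

The point is only that a priority queue keyed on real browsing frequency lets the crawler spend its bandwidth on the pages people actually look at, instead of re-fetching all sixteen billion candidates blindly.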
Also, if you look at it as a choice between a hard problem and a harder one, it is easier to deploy a web accelerator than to go out and crawl everything again!
-- G.
_______________________________________________
LUG mailing list
[email protected]
http://kym.net/mailman/listinfo/lug
LUG is generously hosted by INFOCOM http://www.infocom.co.ug/
