Hello,

In fact, it's not really relevant to add this, because only the crawler decides how many 
documents per second it fetches.
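
(For readers who haven't seen it yet, the directive in question looks roughly like this; 
the robot name and the 5-second value are only illustrative:

    User-agent: Slurp
    Crawl-delay: 5

It asks the named robot to wait that many seconds between successive fetches from the site.)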

A crawler is not a "vacuum cleaner": it is not supposed to choke a website's traffic while 
crawling. If it does... it's not a crawler!

Just imagine if every website decided to throttle crawling to about 1 URL per 5 
seconds... Google would take a year to revisit the whole web (sic!)...
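
A rough back-of-envelope sketch of that scale (the 6-million-page site is an assumed 
figure, not anything from the messages above):

    # Back-of-envelope check of the "1 URL per 5 seconds" pace (assumed figures).
    seconds_per_url = 5
    urls_per_day = 24 * 60 * 60 // seconds_per_url   # 17,280 URLs per day
    site_size = 6 * 10**6                            # hypothetical 6-million-page site
    print(site_size / urls_per_day)                  # ~347 days: roughly a year for one
                                                     # large site, let alone the whole web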

A bad game, isn't it?

Verticrawl.com team

<[EMAIL PROTECTED]> wrote:
> I am surprised that after all that talk about adding new semantic
> elements to robots.txt several years ago, nobody commented that the new
> Yahoo crawler (former Inktomi crawler) took a brave step in that
> direction by adding "Crawl-delay:" syntax.
> 
> http://help.yahoo.com/help/us/ysearch/slurp/slurp-03.html
> 
> Time to update your robots.txt parsers!
> 
> Otis Gospodnetic
> 
_______________________________________________
Robots mailing list
[EMAIL PROTECTED]
http://www.mccmedia.com/mailman/listinfo/robots
