Adding a version number after the fact could break existing
robots, so it is not an acceptable design: it violates the
robustness principle. A parser written to the original 1994
convention skips fields it does not recognize, so it would
silently apply the old semantics to a new-format file.
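
To make that failure mode concrete, here is a minimal sketch in
Python, assuming a parser written to the original 1994 convention
(unrecognized fields are silently skipped); the "Version" field
name is hypothetical:

    def parse_robots(text):
        """A 1994-style parser: fields it does not recognize are skipped."""
        rules = {}        # user-agent -> list of Disallow prefixes
        current = None    # rule list for the most recent User-agent
        for raw in text.splitlines():
            line = raw.split('#', 1)[0].strip()   # drop comments
            if not line:
                continue
            field, _, value = line.partition(':')
            field, value = field.strip().lower(), value.strip()
            if field == 'user-agent':
                current = rules.setdefault(value, [])
            elif field == 'disallow' and current is not None:
                current.append(value)
            # Anything else -- including a hypothetical "Version: 2.0"
            # line -- falls through here unseen, so an old robot would
            # quietly apply version-1 semantics to a version-2 file.
        return rules

    # The version marker is invisible to the old parser:
    print(parse_robots("Version: 2.0\nUser-agent: *\nDisallow: /private\n"))
    # -> {'*': ['/private']}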

A new format would need to be a separate file, probably "robots2.txt".

The proposal assumes a time-based robot with centralized 
control. I seriously doubt that any of the WWW-wide search engines
still use robots like that. The rate directives, in particular,
might not be implementable by any of the high-volume robots.
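
For illustration, a sketch of the single-process case, with a
hypothetical per-host minimum delay standing in for whatever form the
rate directive takes. A high-volume engine runs many independent
fetchers, and each fetcher honoring this delay on its own would still
hit the host at N times the intended rate unless they all shared the
timing state:

    import time
    from collections import defaultdict

    class HostThrottle:
        """Per-host minimum delay; workable only inside one process."""
        def __init__(self, min_delay=5.0):     # hypothetical directive value
            self.min_delay = min_delay
            self.last = defaultdict(float)     # host -> time of last fetch

        def wait(self, host):
            # Sleep until min_delay has passed since the last fetch of host.
            gap = self.min_delay - (time.time() - self.last[host])
            if gap > 0:
                time.sleep(gap)
            self.last[host] = time.time()

    # Fine for one robot process; N processes each doing this still
    # make N requests per min_delay to the same host.
    throttle = HostThrottle(min_delay=2.0)
    for url_path in ('/a', '/b'):
        throttle.wait('example.com')
        # fetch('http://example.com' + url_path) would go here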

Has there been a recent survey of robots.txt usage and correctness?
Last I heard, usage was pretty consistent at about 5% of sites.

As long as 95% of sites don't use robots.txt at all, it seems odd
to make it more complicated.

wunder
--
Walter Underwood
Principal Architect
Verity Ultraseek
