--On Sunday, January 11, 2004 8:13 PM -0500 Sean 'Captain Napalm' Conner <[EMAIL PROTECTED]> wrote:
> 
> And there you go.  Using the different directives makes it backwards
> compatible with the original robots.txt (where an older robot will ignore
> the new directives) and without overloading the meaning of existing
> directives (one of the downsides of my own proposed extension).

No, it does not make it backwards compatible. It makes it an
illegal robots.txt file. Parsers built to ignore unknown directives
would still be able to use it, but parsers not built that way would
not be able to parse the file, and would probably miss all the
legal directives as well as the non-standard ones.
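
To make the two behaviours concrete, here is a rough Python sketch.
This is just an illustration, not any real robot's parser, and the
"Visit-time" directive is a made-up stand-in for a new extension:

from typing import Dict, List

KNOWN_DIRECTIVES = {"user-agent", "disallow", "allow"}

ROBOTS_TXT = """\
User-agent: *
Visit-time: 0200-0600
Disallow: /private/
"""

def lenient_parse(text: str) -> List[Dict[str, str]]:
    """Skip any line whose directive is unknown; keep the rest."""
    rules = []
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if not line or ":" not in line:
            continue
        field, value = line.split(":", 1)
        if field.strip().lower() in KNOWN_DIRECTIVES:
            rules.append({field.strip().lower(): value.strip()})
    return rules

def strict_parse(text: str) -> List[Dict[str, str]]:
    """Give up on the whole file at the first unknown directive."""
    rules = []
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if not line or ":" not in line:
            continue
        field, value = line.split(":", 1)
        if field.strip().lower() not in KNOWN_DIRECTIVES:
            raise ValueError("unknown directive: " + field.strip())
        rules.append({field.strip().lower(): value.strip()})
    return rules

print(lenient_parse(ROBOTS_TXT))        # the Disallow rule survives
try:
    print(strict_parse(ROBOTS_TXT))
except ValueError as err:
    print("strict parser gave up:", err)  # every rule is lost

The lenient robot still honors the Disallow line; the strict robot
loses the entire file, non-standard and standard rules alike.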

I mentioned the internet robustness principle before, but folks
seem to have missed that. It is:

  Be conservative in what you send, liberal in what you accept.

In our case, the contents of the robots.txt file are what is "sent".
By the robustness principle, we must not add extra stuff on
the assumption that parsers can deal with it.

Because the original format does not have a version number,
there is no way to change the format safely.
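
Purely as an illustration of what a version number would have bought
us (the "Robots-version" directive below is made up; the real format
defines nothing like it): a parser could check the claimed version
before acting on any of the rules, instead of guessing.

SUPPORTED_VERSION = 1

def understands(text: str) -> bool:
    """Return True only if the file claims a version we support."""
    for line in text.splitlines():
        field, sep, value = line.partition(":")
        if sep and field.strip().lower() == "robots-version":
            try:
                return int(value.strip()) <= SUPPORTED_VERSION
            except ValueError:
                return False
    # No version directive at all: treat it as the original format.
    return True

Without something like that, every change to the format is a gamble
on how deployed parsers happen to behave.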

wunder
--
Walter Underwood
Principal Architect
Verity Ultraseek
