https://bugzilla.wikimedia.org/show_bug.cgi?id=45347

Andre Klapper <[email protected]> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Keywords|ops                         |
           Priority|Low                         |Lowest
                 CC|                            |[email protected]

--- Comment #8 from Andre Klapper <[email protected]> ---
(In reply to comment #0)
> http://en.wikipedia.org/wiki/Humans.txt is the equivalent of robots.txt;
> neither is defined by a standard.

1) Is there any real use? Who would access that file? robots.txt is at least
read by crawlers, but this just sounds like creating yet another file with
duplicated data for no good purpose.

2) What is the scope of who to list in that file? It feels extremely vague to
me. We wouldn't list all Wikipedia editors and try to keep that updated, would
we?
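For context on what is being proposed: humans.txt (as described at
humanstxt.org) is a free-form, human-readable text file served at the site
root, conventionally divided into commented sections listing contributors and
site technology. A minimal sketch of what such a file might look like for a
wiki (the names and sections below are illustrative placeholders, not an
actual proposal for Wikipedia's content):

```
/* TEAM */
    Role: Site maintainer
    Contact: example [at] example.org

/* THANKS */
    Name: (contributors)

/* SITE */
    Last update: 2013/02/25
    Language: English
    Software: MediaWiki
```

Because the format is free-form and unstandardized, nothing consumes it
automatically, which is the crux of the objection in point 1.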

-- 
You are receiving this mail because:
You are on the CC list for the bug.
You are the assignee for the bug.
You are watching all bug changes.
_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l