https://bugzilla.wikimedia.org/show_bug.cgi?id=45347

--- Comment #3 from Dereckson <[email protected]> ---
[ humans.txt standard ]

First, humans.txt HAS a proposed standard, which matches this presentation:
http://humanstxt.org/humans.txt

Some sites don't respect it, e.g. http://www.google.com/humans.txt

It's documented on http://humanstxt.org/Standard.html (scroll down).
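
Roughly, the proposed template (quoted from memory, so check humanstxt.org
for the authoritative version) looks like this:

    /* TEAM */
        Your title: Your name
        Site: email, link to a contact form, etc.
        Location: City, Country

    /* THANKS */
        Name: name or url

    /* SITE */
        Last update: YYYY/MM/DD
        Standards: HTML5, CSS3, ...
        Components: Modernizr, jQuery, etc.
        Software: software used for the development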


[ robots.txt standard ]

There is a de facto standard, understood by the crawlers and bots visiting
your site. See http://en.wikipedia.org/wiki/Robots_exclusion_standard


[ Is humans.txt an equivalent to robots.txt? ]

The robots.txt file is useful and needed by the community: it tells general
search engines not to index some sensitive pages (e.g. deletion votes) and
tells specific engines not to crawl the site at all.

See http://en.wikipedia.org/robots.txt for the list.
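
For instance, a couple of rules telling every crawler to stay out of deletion
votes could read like this (illustrative paths, not the live file):

    # Applies to all crawlers; the paths are examples only.
    User-agent: *
    Disallow: /wiki/Wikipedia:Articles_for_deletion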

The humans.txt file, by contrast, would contain arbitrary information that is
already offered, and better kept up to date, on project pages. My main fear is
that the humans.txt updates would be complicated or neglected.


[ Goal ]

You propose to "describe the page and invite contributors".

First, visitors shouldn't need to read humans.txt to be invited to contribute.
It's rather dubious that this outreach strategy would work.

Finally, you seem to have missed the goal of the humans.txt specification
authors: their aim is not so much to provide a text file to humans as to
provide a file describing the humans behind the site (like a colophon).


[ Content if we follow the standard ]

Credits (team or thanks) would be too long to list: there are too many active
contributors. A generic mention like "See the page histories" wouldn't really
be useful.

The tools used to build the sites vary widely. Sure, we use MediaWiki. But
beyond that, every article editor has their favorite tools, every developer
their favorite environment, and there are also the bots, the external tools,
etc.


[ Technical implementation ]

The humans.txt file should also be localized into every language we support,
so the proposed solution should include an l10n effort. This localization
should remain compatible with the standard, which is currently English only.

Note that we already have code generating robots.txt from the
MediaWiki:Robots.txt system message, so technically this would be fairly
simple to implement.
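
As a minimal sketch of the idea (plain Python with a hypothetical message
store, not MediaWiki's actual API), per-language generation with the usual
fallback to English could look like:

    # Hypothetical contents of MediaWiki:Humans.txt and its /xx subpages.
    HUMANS_TXT_MESSAGES = {
        "en": "/* TEAM */\n    See the page histories.\n",
        "fr": "/* TEAM */\n    Voir l'historique des pages.\n",
    }

    def humans_txt_for(lang, fallback="en"):
        """Return the humans.txt body for lang, falling back to English,
        mirroring how MediaWiki system messages fall back."""
        return HUMANS_TXT_MESSAGES.get(lang, HUMANS_TXT_MESSAGES[fallback])

    print(humans_txt_for("fr"))  # localized body
    print(humans_txt_for("de"))  # no German text yet: falls back to English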


[ Next steps ]

Given the analysis above and the dubious benefits we would get, I recommend a
WONTFIX resolution.

If you really want to pursue this proposal, please submit sample humans.txt
content, so we'll have a basis for discussion. I will then give you some
technical notes on how to generate that content if there are fields to
automate. Then, you'll be able to launch a discussion with the community on
en. or meta.
