https://bugzilla.wikimedia.org/show_bug.cgi?id=61132

--- Comment #5 from Tim Landscheidt <t...@tim-landscheidt.de> ---
(In reply to comment #4)
> [...]

> > and any spider accessing them brings the house down.  As tools are
> > created and updated without any review by admins, and wiki edits are
> > not monitored either, blacklisting them after the meltdown doesn't
> > work.

> > So unlimited spider access is not possible.

> Nobody said unlimited.  This works on Toolserver; it's not inherently
> impossible.  It's unfortunate that migration implies such usability
> regressions, because then tool developers will try to postpone
> migration as long as possible and we'll have little time.

I haven't met a tool developer who postpones migration because of
robots.txt (or who cares about it at all, since their tools are linked
from Wikipedia).  No one has even asked to change robots.txt.  Who are
they?

If tool developers guarantee that a specific tool is resistant to
spiders, we can whitelist it (even in an automated fashion, à la
~/.description); a sketch of what that could look like follows.
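
For illustration only, a whitelist-style robots.txt could look like the
following (the tool path is hypothetical, and how each crawler resolves
the rules is an assumption that would need checking per bot):

    User-agent: *
    # Hypothetical tool whose maintainer vouches for its spider-resistance:
    Allow: /vetted-tool/
    # Everything else stays off-limits by default:
    Disallow: /

Allow is not part of the original 1994 robots.txt conventions; crawlers
that support it differ in precedence (first match vs. longest path),
which is exactly why each entry would have to be vetted rather than
assumed safe.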

> [...]

> > msnbot.  And the other spiders?  Will they read it in the same way
> > or as "/"?

> You'll find out with experience.

> [...]

Why would we take that risk for only a marginal benefit?  "Experience"
here means a lot of people yelling.
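
To make that risk concrete (a hypothetical sketch, not the actual file
under discussion), take a record that relies on Allow but puts it after
the catch-all:

    User-agent: *
    Disallow: /
    Allow: /some-tool/

A parser that knows only the original 1994 conventions ignores the
unrecognized Allow line and fetches nothing; a first-match parser stops
at "Disallow: /" and also fetches nothing; a longest-path-match parser
happily crawls /some-tool/.  The effective policy is whatever each
bot's parser makes of it, and we only learn which after the traffic
arrives.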
