MZMcBride <> changed:

           What    |Removed                     |Added
                 CC|                            |

--- Comment #4 from MZMcBride <> ---
Is there any reason to believe that more aggressive URL canonicalization will
affect robots.txt entries? I'm not sure there's a valid use case here.

In reply to comment 3, I'd suggest that you could turn each of those
underscores into " " or "%20" or "__" and come up with thousands more
permutations. :-)

Given that Squid caching is prefix-based, more aggressive URL canonicalization
would have been (or would be) helpful in that context. That is, as I understand
it, Squid viewed "/wiki/Wikipedia_talk%3AB" and "/wiki/Wikipedia_talk:B" as
distinct URLs and would cache both separately.
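As a rough sketch of the kind of canonicalization that would merge such cache keys, something like the following could map the percent-encoded and literal forms of a title URL to one string before hashing. The helper name and the underscore-collapsing rules here are assumptions for illustration, not MediaWiki's actual title normalizer:

```python
from urllib.parse import unquote

def canonicalize_title_path(path: str) -> str:
    """Collapse equivalent /wiki/ title paths into one canonical form.

    Percent-decodes the title portion and folds spaces into
    underscores, so "/wiki/Wikipedia_talk%3AB",
    "/wiki/Wikipedia talk:B", and "/wiki/Wikipedia_talk:B"
    all yield the same cache key.
    """
    prefix = "/wiki/"
    if not path.startswith(prefix):
        return path
    title = unquote(path[len(prefix):])
    # MediaWiki URLs use underscores where the displayed title has spaces.
    title = title.replace(" ", "_")
    # Collapse runs of underscores (the "__" permutation from comment 3).
    while "__" in title:
        title = title.replace("__", "_")
    return prefix + title
```

With a normalizer like this in front of the cache, the two URLs mentioned above would hash to the same object instead of being stored twice.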

I'm not sure the same is true of Varnish (which is what Wikimedia wikis now
use), though improving Squid behavior alone might make this a valid request.

Wikibugs-l mailing list
