> One idea is the proposal to install the AbuseFilter in a global mode,
> i.e. rules loaded at Meta that apply everywhere. If that were done
> (and there are some arguments about whether it is a good idea), then
> it could be used to block these types of URLs from being installed,
> even by admins.
Identifying client-side generated URLs from the server side opens up a whole lot of problems of its own. Basically you need a script that runs in a hostile environment and reports back to a server when a whole series of URLs is injected by code loaded from some sources (MediaWiki space) but not from other sources (user space); still, code loaded from user space through a call to MediaWiki space should be allowed. Add to this that your URL-identifying code has to run after a script has generated the URL and before it does any cleanup. The URL verification can't just say that a URL is hostile, it has to check it somehow, and that leads to reporting of the URL, if the reporting code still executes at that moment.

Urk...

John
_______________________________________________
foundation-l mailing list
[email protected]
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
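The origin-based distinction John describes (URLs allowed from MediaWiki space, flagged from user space) could be sketched roughly as below. This is a minimal illustration, not anything in MediaWiki: the `TRUSTED_ORIGINS` allowlist and `isSuspiciousUrl` name are made up for the example, and, as the post points out, in a truly hostile environment user-space code could simply overwrite or disable this check before it runs.

```typescript
// Hypothetical sketch of an origin-based URL check. TRUSTED_ORIGINS and
// isSuspiciousUrl are illustrative names, not a real MediaWiki API.
const TRUSTED_ORIGINS = new Set<string>([
  "https://meta.wikimedia.org",
  "https://en.wikipedia.org",
]);

function isSuspiciousUrl(raw: string): boolean {
  try {
    // Resolve relative links against the wiki's own origin (assumed base).
    const parsed = new URL(raw, "https://en.wikipedia.org");
    // Reject non-web schemes outright (javascript:, data:, etc.).
    if (parsed.protocol !== "https:" && parsed.protocol !== "http:") {
      return true;
    }
    // Anything pointing outside the allowlisted origins gets flagged.
    return !TRUSTED_ORIGINS.has(parsed.origin);
  } catch {
    // Unparseable input is treated as hostile.
    return true;
  }
}
```

Note this only covers the easy half of the problem: classifying a URL once you have it. Catching the URL after a script generates it but before the script cleans up, and getting the report out before hostile code kills the reporter, is the part the post argues is genuinely hard.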
