https://bugzilla.wikimedia.org/show_bug.cgi?id=14951

--- Comment #4 from Chris Steipp <[email protected]> 2012-08-24 17:02:36 UTC ---
There are several options available:

1) The 15-minute cache applies only to remote blacklists, like the one fetched
from meta on most of the other wikis. We could reduce the timeout setting, but
that would come at a direct performance cost.

2) Pulling from a database article is supported, so each wiki is free to
define a database (wiki) + article that contains a spam blacklist. The article
text is read each time from a slave database, without caching. So even
re-defining the blacklist location as "DB: metawiki Spam_blacklist" instead
of "https://meta.wikimedia.org/wiki/Spam_blacklist" would keep the list from
being cached.
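As a sketch (assuming the extension's $wgSpamBlacklistFiles configuration
variable, which accepts both URL and "DB:" sources), option 2 would look
something like this in LocalSettings.php:

```php
// LocalSettings.php — untested sketch, not a definitive configuration.
// "DB: <dbname> <title>" sources are read from a slave database on each
// check, so they bypass the 15-minute cache used for remote URLs.
$wgSpamBlacklistFiles = array(
    "DB: metawiki Spam_blacklist",
);
```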

3) Multiple blacklists are supported, so it would be easy for each wiki to
define another wiki-specific list, or an article shared on meta, for rules
that are being updated quickly / frequently.

It sounds to me like you probably want to keep the cached version of meta's
Spam_blacklist, so that all 15k rules we want to keep stay stored and cached,
and then define another page on meta that all of the wikis can share: a small
list of rules that can be updated frequently and takes effect immediately
after each edit.
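That combined setup could be sketched as follows (again assuming
$wgSpamBlacklistFiles; the page title "Spam_blacklist/fast" is a hypothetical
name for the shared fast-moving list):

```php
// LocalSettings.php — sketch of the suggested two-source setup.
$wgSpamBlacklistFiles = array(
    // Large, stable list: fetched remotely and cached for ~15 minutes.
    "https://meta.wikimedia.org/wiki/Spam_blacklist",
    // Small, fast-moving list (hypothetical title): read from the DB on
    // each check, so changes take effect immediately.
    "DB: metawiki Spam_blacklist/fast",
);
```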

_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l