It was chance that the English Wikipedia was the first noticeable
target of a "revenge porn" attack. Serious concern about the potential
public impact prompted me to look at building a report to help
address the problem of image overwrites not being "visible" to
biography article patrollers. I would be happy to hear of non-English
Wikipedia cases that would provide an incentive to extend the report
to other projects based on experience. The SQL underpinning the report
identifies images of living people by searching relevant categories on
the English Wikipedia, but several language databases could
be added (I would need local-language help in choosing similar
categories), with the constraint that it would be nice to keep the
report updates within the current 5-minute cycle.
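To illustrate the idea, here is a minimal sketch of the kind of per-wiki query generation described above. This is NOT the report's actual SQL: the database name, category name, and helper function are assumptions for illustration, and extending to other languages would need locally chosen equivalent categories. It uses the standard MediaWiki schema (categorylinks, imagelinks, page) as exposed on the database replicas.

```python
# Sketch only: generate a query per wiki that lists images used on
# main-namespace articles in a "living people"-style category.
# Standard MediaWiki schema assumed: categorylinks (cl_from, cl_to),
# imagelinks (il_from, il_to), page (page_id, page_namespace).

CATEGORY_BY_WIKI = {
    # English Wikipedia replica; other language databases would be
    # added here with their locally chosen equivalent categories.
    "enwiki_p": "Living_people",
}

def build_query(db: str, category: str) -> str:
    """Return SQL listing images used on articles in the given category."""
    return (
        f"SELECT DISTINCT il_to FROM {db}.page "
        f"JOIN {db}.categorylinks ON cl_from = page_id "
        f"JOIN {db}.imagelinks ON il_from = page_id "
        f"WHERE cl_to = '{category}' AND page_namespace = 0"
    )

for db, cat in CATEGORY_BY_WIKI.items():
    print(build_query(db, cat))
```

Adding a wiki is then just another entry in the mapping, which keeps each extra language cheap enough that the report could plausibly stay within its 5-minute update cycle.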

This is a report for Commons-hosted images that are used on *all*
other projects. The majority of images currently listed are used in
many languages, so if they are overwritten with defamatory
material, the internet "footprint" can be extremely large. As a real
example, the current report highlights
<>, which is used
on 28 different Wikimedia projects.


On 13 March 2015 at 09:56, Andy Mabbett <> wrote:
> On 13 March 2015 at 09:06, Strainu <> wrote:
>> Fae, you are aware that this is NOT the list for en.wp, right?
> Perhaps you missed the part of Fae's email which read:
>     If there are other Wikipedias that may benefit from a
>     similar report, please drop me a note on Commons or email me.
> together with the lengthy part of his email which discussed matters
> relating to Wikimedia Commons.
> I'm sure Fae will appreciate your apology.
> --
> Andy Mabbett
> @pigsonthewing

Wikimedia-l mailing list, guidelines at: