We of course do not have as many problematic uploads as FB does (and to be
honest, having had personal experience, I am not really impressed with the
quality of their moderators), but we still get several hundred obvious
copyright violations per day uploaded to Commons, and several hundred junk
With regard to the issue Facebook is having, if that were to become an
issue on Wikimedia projects, something likely would have happened already.
The majority of disturbing content is handled by volunteers, and that which
T&S handles is often sent to them by volunteers.
Also, given the relatively
Is anyone not already aware of the recent issue facing Facebook over
compensation for moderators?
https://techcrunch.com/2020/05/12/facebook-moderators-ptsd-settlement/ To
me there appears to be a potential risk that the Board and the WMF must
consider in relation to any role that involves any
What Martin mentions should be covered in the recommendations for the 2030
strategy, the measures mentioned here being "fast-tracked" to provide a
starting point for improving Community Health.
Conflict resolution needs to happen at the lowest possible level so that we
don't run into situations
> A former steward fellow and I
> discussed this topic at the Safety Space at Wikimania. Due to the nature of
> the space, the discussion has not been documented, but you can find the
> presentation with background on the situation and open questions on
> Commons
The board resolution aims at "addressing harassment and incivility on
Wikimedia projects".
I don't see that this covers "disputes", i.e. disputes over content. We
can, of course, disagree with someone totally over a topic, as long as we
discuss our differences in a civil and respectful way - and
That's a tricky topic, especially when local dispute resolution bodies
(which should in most cases be approached first, I agree here) cannot solve
the dispute or when multiple projects are involved. At the moment, there is
in fact a lack of such a body, and of course it should be transparent,
How is a one-off ban comparable in any way with a structured effort to
develop a policy in consultation with the community, and then implement it
together?
Lodewijk
On Sat, May 23, 2020 at 10:02 PM Todd Allen wrote:
> Worked out great the last time WMF tried to pull something like this,
>
Worked out great the last time WMF tried to pull something like this,
didn't it?
https://en.wikipedia.org/wiki/Wikipedia:Community_response_to_the_Wikimedia_Foundation%27s_ban_of_Fram
Oh, wait. By "worked out great" I mean "was an unmitigated disaster." One
wonders if the folks at the WMF are
On Sun, 24 May 2020 at 04:25, AntiCompositeNumber <
anticompositenum...@gmail.com> wrote:
> Would it be fair to say that:
> - Enforcement of a universal code of conduct would happen through a
> fair, clearly-defined process without significant bias and with
> significant community oversight and
I like the concept, as it means the WMF can step up and address the dodgy
corporate players a lot more effectively across all platforms, including
taking big-stick tools to prevent them whitewashing articles or providing
paid-for services
On Sun, 24 May 2020 at 10:25, AntiCompositeNumber <
While I'm pretty sure that this wasn't the intention, that sounds a
lot like "ban first and ask questions later". As Pine noted, this is
a topic where great care must be taken to communicate intentions
clearly and diplomatically. This point was likely introduced to
respond to concerns about
international experts on harassment, community health
> and children’s rights, as well as additional hiring."
>
> Best,
> Shani.
>
>
>>
>> From: effe iets anders
>> Date: Sat, May 23, 2020 at 4:26 AM
>> Subject: Re: [Wikimedia-l] Trust and safety on Wikimedia projects
>> To: Wikimedia Mailing List
Thanks for this step - I wish that it wouldn't be necessary. I'm not sure
of all the implications, but was mostly wondering: will this be primarily a
stick, or is the foundation also going to invest more heavily in carrots
and education?
I get the impression that we have much progress to make in
Hello, Dennis!
Not at all. What it means is that this is not a process that comes into play
*before* a decision to act is made, but *after*. It should stand as an
option for those who want to ensure that actions taken are fair, as long as
the case does not relate to legal risks or other severe
"Work with community functionaries to create and refine a retroactive
review process for cases brought by involved parties, excluding those cases
which pose legal or other severe risks."
What does "retroactive review process" mean?
I hope it doesn't mean applying standards that were not
Hello everyone,
Today, the Wikimedia Foundation Board of Trustees unanimously passed a
resolution and published a statement[1] regarding the urgent need to make
our movement more safe and inclusive by addressing harassment and
incivility on Wikimedia projects. The statement builds on prior