Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-26 Thread Yaroslav Blanter
We of course do not have as many problematic uploads as Facebook does (and,
to be honest, having some personal experience, I am not really impressed
with the quality of their moderators), but we still get several hundred
obvious copyright violations uploaded to Commons per day, and several
hundred junk articles started on the English Wikipedia that do not pass the
new page patrol barrier (and are deleted or draftified forever). I am sure
the situation is similar on other big projects. All of this is cleaned up
by a very few people who, on top of the time lost on these tasks, are also
subject to constant abuse. Note that I am not saying that the WMF must pay
admins compensation (stronger still, I will likely leave WMF projects if it
starts doing so), but the problem of the emotional drain on those who deal
with this shit on a daily basis is real. I am afraid, though, that it has
no solution, because we know that the obvious solution - get more people -
does not work.

I am not even talking about off-wiki harassment, which in my experience is
rarer but much worse, because you do not know how real the threats are. The
last time, I had to report it to the police. This one has no solution
either.

Best
Yaroslav


Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-26 Thread Chris Gates via Wikimedia-l
With regard to the issue Facebook is having: if that were to become an
issue on Wikimedia projects, something likely would have happened already.
The majority of disturbing content is handled by volunteers, and that which
T&S handles is often sent to them by volunteers.

Also, given the relatively complicated upload process (compared to
Facebook), we simply don’t get nearly as many problematic uploads as they
do.


Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-26 Thread Gnangarra
Is anyone not already aware of the recent issue facing Facebook over
compensation for moderators?
https://techcrunch.com/2020/05/12/facebook-moderators-ptsd-settlement/
To me there appears to be a potential risk that the Board and the WMF must
consider in relation to any role that involves any form of moderation:

   1. is there a problem with setting standards against harassment, toxic
   behavior, and incivility that are at a minimum equal, understandable, and
   respected on all projects, committees, affiliates, events and everything
   else we do;
   2. is there concern about being asked to contribute under these standards;
   3. is the concern how much the WMF needs to be part of the process; or
   4. how long should it be allowed to go unaddressed before it is escalated?

I go back far enough that I remember a group engaged in targeted stalking
of female admins; I was part of a group of admins who were willing to take
action against them. We lost some very good people during that. Harassment
has been an ongoing issue for all of my 15 years: we have had some of the
worst people become tool holders, while others have created thousands of
socks. There are still trolls and harassers contributing today, and we know
that our failures to deal with them effectively and quickly are legendary.
Whatever we do, we need to keep improving our response and our ability to
respond across projects; the alternative is that the Board and the WMF will
have to step in and take responsibility out of the community's hands.






-- 
GN.

*Power of Diverse Collaboration*
*Sharing knowledge brings people together*
Wikimania Bangkok 2021
August
hosted by ESEAP

Wikimania: https://wikimania.wikimedia.org/wiki/User:Gnangarra
Noongarpedia: https://incubator.wikimedia.org/wiki/Wp/nys/Main_Page
My print shop: https://www.redbubble.com/people/Gnangarra/shop?asc=u
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 


Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-26 Thread Philip Kopetzky
What Martin mentions should be covered in the recommendations for the 2030
strategy, with the measures mentioned here being "fast-tracked" to provide
a starting point for improving Community Health.
Conflict resolution needs to happen at the lowest possible level, so that
we don't run into the situations we've encountered in the past. Of course,
it's difficult for one aspect to work without the other, so the overall
goal won't be achieved until every part is in place.



Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-25 Thread Samuel Klein
> A former steward fellow and I
> discussed this topic at the Safety Space at Wikimania. Due to the nature of
> the space, the discussion has not been documented but you can find the
> presentation with backgrounds of the situation and open questions on
> Commons
> <
> https://commons.wikimedia.org/wiki/File:Wikimania_2019_%E2%80%93_Do_we_need_a_global_dispute_resolution_committee%3F.pdf
> >.
> Maybe it can give some ideas how to proceed with this.
>

Yes -- I was just thinking of your discussions of this while reading the
thread. I hope these steward reflections are considered as people move
forward.

The case of disputes that embroil an entire community and their admins
should (also) specifically be addressed.
S


Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-25 Thread Thyge
The board resolution aims at "addressing harassment and incivility on
Wikimedia projects".

I don't see that this covers "disputes", i.e. disputes over content. We
can, of course, disagree with someone totally about a topic, as long as we
discuss our differences in a civil and respectful way - and consider our
opponent's point of view and arguments seriously.

Regards,
Thyge - Sir48





Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-25 Thread DerHexer via Wikimedia-l
That's a tricky topic, especially when local dispute resolution bodies
(which should in most cases be approached first, I agree here) cannot solve
the dispute or when multiple projects are involved. At the moment, there is
in fact a lack of such a body, and of course it should be transparent,
composed of diverse community members who are trained and supported by
professional mediation, etc., as pointed out. Currently, stewards like me
are quite often approached with such topics, but this user group is more
focused on technical matters like user rights. A former steward fellow and
I discussed this topic at the Safety Space at Wikimania. Due to the nature
of the space, the discussion has not been documented, but you can find the
presentation, with background on the situation and open questions, on Commons
<https://commons.wikimedia.org/wiki/File:Wikimania_2019_%E2%80%93_Do_we_need_a_global_dispute_resolution_committee%3F.pdf>.
Maybe it can give some ideas on how to proceed with this.

Best,
Martin/DerHexer


On Sun, 24 May 2020 at 06:19, Aron Demian <
aronmanni...@gmail.com> wrote:

> On Sun, 24 May 2020 at 04:25, AntiCompositeNumber <
> anticompositenum...@gmail.com> wrote:
>
> > Would it be fair to say that:
> >  - Enforcement of a universal code of conduct would happen through a
> > fair, clearly-defined process without significant bias and with
> > significant community oversight and input
> > - Universal code of conduct enforcement actions would be appealable
> > through a fair, clearly-defined process with significant community
> > oversight that allowed statements from involved parties and uninvolved
> > community members
> > - To ensure proper community oversight, code of conduct enforcement
> > actions and appeals would be made as public as possible as often as
> > possible (excepting issues where public disclosure would harm privacy
> > or safety)
> >
> > AntiComposite
> >
>
> Yes! These are fundamental requirements that need to be met by the process
> that will be implemented in the second phase (Aug - end of 2020).
> It seems there will be an opportunity to incorporate these requirements:
>
> The second phase, outlining clear enforcement pathways, and
> > *refined with** broad input from the Wikimedia communities*, will be
> > presented to the Board
> > for ratification by the end of 2020;
>
>
> I'd add a few more points:
> - To handle workload and different languages, local boards should be
> selected as the first step of the process, with possible escalation to a
> global board if necessary (e.g. for conflict-of-interest reasons).
> - To minimize bias the boards should consist of people from different
> areas. As long as the local DR processes remain operational (ANI and the
> likes), there should be a clear separation of powers: CoC board members
> should not be involved with local DR to avoid concentration of power. Being
> an admin should not be a requirement, in fact adminship and dispute
> resolution should be separate roles, as the latter requires specific
> training or experience, which is not part of the requirements to be admin.
> - There should be at least 2 independent global boards so one can review
> the other's decisions and handle appeals. Cases should be evaluated by the
> board that has more members unrelated to the involved parties.
> - Functionaries and board members should be regularly reviewed and terms
> limited to a few years.
>
> About the DR process:
> - Most of our communication is publicly visible on-wiki, therefore the
> cases should be resolved in public. Transparency is crucial for community
> review and a great learning opportunity about dispute resolution.
> - Privately handled cases should only happen when all parties agree to
> it, so one party can't use "privacy" as a means to avoid the burden of
> proof. Non-public evidence should only be taken into account if there is a
> very strong justification, proportional to the sanction that comes from it.
> - Reports, however, should be created privately and published only when the
> case opens. Before the case opens the reporter might seek advice and help
> to create the report from people they trust. I've outlined a process draft
> for this in the context of the User Reporting System
> <
> https://meta.wikimedia.org/wiki/Talk:Community_health_initiative/User_reporting_system_consultation_2019#Factual,_evidence_based_reporting_tool_-_draft,_proposal
> >
> .
> - Reports should be treated with respect, as the personal experience of a
> person. Nobody should be sanctioned for what a report contains, whether the
> boards, or the community finds that true or false, as that would be a
> deterrent to reporting influential users, who made a mistake or lost their
> way.
> - The focus should be on dispute *resolution. *Disputes and the resulting
> reports often start with disagreements, not bad intent towards each other.
> Mediation is an effective approach to finding a mutually agreeable
> resolution in these situations. Such 

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-24 Thread effe iets anders
How is a one-off ban comparable in any way with a structured effort to
develop a policy in consultation with the community, and then implement it
together?

Lodewijk

> On Fri, May 22, 2020 at 3:59 PM María Sefidari 
> wrote:
>
> >  Hello everyone,
> >
> > Today, the Wikimedia Foundation Board of Trustees unanimously passed a
> > resolution and published a statement[1] regarding the urgent need to make
> > our movement more safe and inclusive by addressing harassment and
> > incivility on Wikimedia projects. The statement builds on prior
> statements
> > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > universal code of conduct, and directs the Wikimedia Foundation to
> rapidly
> > and substantively address these challenges in complement with existing
> > community processes.
> >
> > This includes developing sustainable practices and tools that eliminate
> > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > respectful discourse, reduce harms to participants, protect the projects
> > from disinformation and bad actors, and promote trust in our projects.
> >
> > Over the past nearly twenty years, the movement has taken a number of
> > unique and sometimes extraordinary steps to create an environment unlike
> > anything else online: a place to share knowledge, to learn, and to
> > collaborate together. In order for the movement to continue to thrive and
> > make progress to our mission, it is essential to build a culture that is
> > welcoming and inclusive.
> >
> > Research has consistently shown that members of our communities have been
> > subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
> > Wikimedia 2030 movement strategy recommendations have also identified the
> > safety of our Wikimedia spaces as a core issue to address if we are to
> > reach the 2030 goals, with concrete recommendations which include a
> > universal code of conduct, pathways for users to privately report
> > incidents, and a baseline of community responsibilities.[6]
> >
> > While the movement has made progress in addressing harassment and toxic
> > behavior, we recognize there is still much more to do. The Board’s
> > resolution and statement today is a step toward establishing clear,
> > consistent guidelines around acceptable behavior on our projects, and
> > guiding the Wikimedia Foundation in supporting the movement’s ability to
> > ensure a healthy environment for those who participate in our projects.
> >
> > * Developing and introducing, in close consultation with volunteer
> > contributor communities, a universal code of conduct that will be a
> binding
> > minimum set of standards across all Wikimedia projects;
> >
> > * Taking actions to ban, sanction, or otherwise limit the access of
> > Wikimedia movement participants who do not comply with these policies and
> > the Terms of Use;
> >
> > * Working with community functionaries to create and refine a retroactive
> > review process for cases brought by involved parties, excluding those
> cases
> > which pose legal or other severe risks; and
> >
> > * Significantly increasing support for and collaboration with community
> > functionaries primarily enforcing such compliance in a way that
> prioritizes
> > the personal safety of these functionaries.
> >
> > Together, we have made our movement what it is today. In this same way,
> we
> > must all be responsible for building the positive community culture of
> the
> > future, and accountable for stopping harassment and toxic behavior on our
> > sites.
> >
> > We have also made this statement available on Meta-Wiki for translation
> and
> > wider distribution.[1]
> >
> > On behalf of the Board,
> > María, Board Chair
> >
> > [1] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_2020_-_Board_of_Trustees_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
> >
> > [2] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/November_2016_-_Statement_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
> >
> > [3] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Archives/2019#Board_statement_posted_at_Community_response_to_the_Wikimedia_Foundation's_ban_of_Fram
> >
> > [4] https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015
> >
> > [5] https://meta.wikimedia.org/wiki/Community_Insights/2018_Report#Experience_of_harassment_has_not_declined_since_2017_and_appears_to_remain_steady
> >
> > 

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-23 Thread Todd Allen
Worked out great the last time WMF tried to pull something like this,
didn't it?

https://en.wikipedia.org/wiki/Wikipedia:Community_response_to_the_Wikimedia_Foundation%27s_ban_of_Fram


Oh, wait. By "worked out great" I mean "was an unmitigated disaster." One
wonders if the folks at the WMF are capable of learning from mistakes, and
one is not encouraged by the apparent answer.

Todd

On Fri, May 22, 2020 at 3:59 PM María Sefidari  wrote:

>  Hello everyone,
>
> Today, the Wikimedia Foundation Board of Trustees unanimously passed a
> resolution and published a statement[1] regarding the urgent need to make
> our movement more safe and inclusive by addressing harassment and
> incivility on Wikimedia projects. The statement builds on prior statements
> from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> universal code of conduct, and directs the Wikimedia Foundation to rapidly
> and substantively address these challenges in complement with existing
> community processes.
>
> This includes developing sustainable practices and tools that eliminate
> harassment, toxicity, and incivility, promote inclusivity, cultivate
> respectful discourse, reduce harms to participants, protect the projects
> from disinformation and bad actors, and promote trust in our projects.
>
> Over the past nearly twenty years, the movement has taken a number of
> unique and sometimes extraordinary steps to create an environment unlike
> anything else online: a place to share knowledge, to learn, and to
> collaborate together. In order for the movement to continue to thrive and
> make progress to our mission, it is essential to build a culture that is
> welcoming and inclusive.
>
> Research has consistently shown that members of our communities have been
> subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
> Wikimedia 2030 movement strategy recommendations have also identified the
> safety of our Wikimedia spaces as a core issue to address if we are to
> reach the 2030 goals, with concrete recommendations which include a
> universal code of conduct, pathways for users to privately report
> incidents, and a baseline of community responsibilities.[6]
>
> While the movement has made progress in addressing harassment and toxic
> behavior, we recognize there is still much more to do. The Board’s
> resolution and statement today is a step toward establishing clear,
> consistent guidelines around acceptable behavior on our projects, and
> guiding the Wikimedia Foundation in supporting the movement’s ability to
> ensure a healthy environment for those who participate in our projects.
>
> * Developing and introducing, in close consultation with volunteer
> contributor communities, a universal code of conduct that will be a binding
> minimum set of standards across all Wikimedia projects;
>
> * Taking actions to ban, sanction, or otherwise limit the access of
> Wikimedia movement participants who do not comply with these policies and
> the Terms of Use;
>
> * Working with community functionaries to create and refine a retroactive
> review process for cases brought by involved parties, excluding those cases
> which pose legal or other severe risks; and
>
> * Significantly increasing support for and collaboration with community
> functionaries primarily enforcing such compliance in a way that prioritizes
> the personal safety of these functionaries.
>
> Together, we have made our movement what it is today. In this same way, we
> must all be responsible for building the positive community culture of the
> future, and accountable for stopping harassment and toxic behavior on our
> sites.
>
> We have also made this statement available on Meta-Wiki for translation and
> wider distribution.[1]
>
> On behalf of the Board,
> María, Board Chair
>
> [1]
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_2020_-_Board_of_Trustees_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
>
> [2]
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/November_2016_-_Statement_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
>
> [3]
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Archives/2019#Board_statement_posted_at_Community_response_to_the_Wikimedia_Foundation's_ban_of_Fram
>
> [4] https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015
>
> [5]
>
> https://meta.wikimedia.org/wiki/Community_Insights/2018_Report#Experience_of_harassment_has_not_declined_since_2017_and_appears_to_remain_steady
>
> [6]
>
> https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recommendations/Provide_for_Safety_and_Inclusion
>
> == Statement on Healthy Community Culture, Inclusivity, and Safe Spaces ==
>
> Harassment, toxic behavior, and incivility in the Wikimedia movement are
> contrary to our shared values and detrimental to our vision and mission.
> They negatively impact our ability to collect, share, and disseminate free
> 

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-23 Thread Aron Demian
On Sun, 24 May 2020 at 04:25, AntiCompositeNumber <
anticompositenum...@gmail.com> wrote:

> Would it be fair to say that:
>  - Enforcement of a universal code of conduct would happen through a
> fair, clearly-defined process without significant bias and with
> significant community oversight and input
> - Universal code of conduct enforcement actions would be appealable
> through a fair, clearly-defined process with significant community
> oversight that allowed statements from involved parties and uninvolved
> community members
> - To ensure proper community oversight, code of conduct enforcement
> actions and appeals would be made as public as possible as often as
> possible (excepting issues where public disclosure would harm privacy
> or safety)
>
> AntiComposite
>

Yes! These are fundamental requirements that need to be met by the process
that will be implemented in the second phase (Aug - end of 2020).
It seems there will be an opportunity to incorporate these requirements:

The second phase, outlining clear enforcement pathways, and
> refined with *broad input from the Wikimedia communities*, will be
> presented to the Board
> for ratification by the end of 2020;


I'd add a few more points:
- To handle workload and different languages, local boards should be
selected as the first step of the process, with possible escalation to a
global board if necessary (e.g. for conflict-of-interest reasons).
- To minimize bias, the boards should consist of people from different
areas. As long as the local DR processes remain operational (ANI and the
like), there should be a clear separation of powers: CoC board members
should not be involved with local DR, to avoid concentration of power. Being
an admin should not be a requirement; in fact, adminship and dispute
resolution should be separate roles, as the latter requires specific
training or experience that is not part of the requirements for adminship.
- There should be at least two independent global boards, so one can review
the other's decisions and handle appeals. Cases should be evaluated by the
board that has more members unrelated to the involved parties.
- Functionaries and board members should be regularly reviewed and terms
limited to a few years.

About the DR process:
- Most of our communication is publicly visible on-wiki; therefore, the
cases should be resolved in public. Transparency is crucial for community
review and a great learning opportunity about dispute resolution.
- Cases should be handled privately only when all parties agree to
it, so one party can't use "privacy" as a means to avoid the burden of
proof. Non-public evidence should only be taken into account if there is a
very strong justification, proportional to the sanction that comes from it.
- Reports, however, should be created privately and published only when the
case opens. Before the case opens the reporter might seek advice and help
to create the report from people they trust. I've outlined a process draft
for this in the context of the User Reporting System.
- Reports should be treated with respect, as the personal experience of a
person. Nobody should be sanctioned for what a report contains, whether the
boards or the community find it true or false, as that would deter people
from reporting influential users who have made a mistake or lost their way.
- The focus should be on dispute *resolution*. Disputes and the resulting
reports often start with disagreements, not bad intent towards each other.
Mediation is an effective approach to finding a mutually agreeable
resolution in these situations. Such resolutions create a more cooperative
environment and allow for personal growth and learning from mistakes.
Mediators should be hired and board members offered mediator training to
support this path.
- When necessary, only the minimal sanctions that prevent the reported
behaviour should be applied, to reduce the abuse potential of blocking. Partial
blocks were a great step in this direction: typical conduct issues should be
addressed early on with minor sanctions, not after years of misconduct,
when a ban becomes warranted. Bans and project-wide blocks should only be
used after numerous escalations and repeated sanctions, or in clear-cut
cases of extreme misconduct.

Dispute resolution is difficult and often requires effort from all parties.
The above approaches are unusual compared to the traditional handling of
disputes, which often results in one-sided sanctioning of the party with
less support from the community. However, adopting new ways of dispute
resolution is necessary to create an inclusive community, where editors are
treated equally and fairly, regardless of their status.

These are just superficial thoughts, which I'll detail in the second phase.

Thanks,
Aron (Demian)

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-23 Thread Gnangarra
I like the concept, as it means the WMF can step up and address the dodgy
corporate players a lot more effectively across all platforms, including
taking big-stick tools to prevent them from whitewashing articles or
providing paid-for services.

On Sun, 24 May 2020 at 10:25, AntiCompositeNumber <
anticompositenum...@gmail.com> wrote:

> While I'm pretty sure that this wasn't the intention, that sounds a
> lot like "ban first and ask questions later".  As Pine noted, this is
> a topic where great care must be taken to communicate intentions
> clearly and diplomatically. This point was likely introduced to
> respond to concerns about unappealable Office Actions. The way it was
> phrased, however, diminishes the point it was trying to make and also
> implies that community input is only applicable after the fact, and
> only from functionaries.
>
> Would it be fair to say that:
> >  - Enforcement of a universal code of conduct would happen through a
> fair, clearly-defined process without significant bias and with
> significant community oversight and input
> - Universal code of conduct enforcement actions would be appealable
> through a fair, clearly-defined process with significant community
> oversight that allowed statements from involved parties and uninvolved
> community members
> - To ensure proper community oversight, code of conduct enforcement
> actions and appeals would be made as public as possible as often as
> possible (excepting issues where public disclosure would harm privacy
> or safety)
>
> AntiComposite
>
> On Fri, May 22, 2020 at 7:52 PM Nataliia Tymkiv 
> wrote:
> >
> > Hello, Dennis!
> >
> > Not at all. What it means is that this is not a process that goes into
> play
> > *before* a decision to act is made, but *after*. It should stand as an
> > option for those who want to ensure that actions taken are fair, as long
> as
> > the case does not relate to legal risks or other severe concerns.
> >
> > Best regards,
> > antanana / Nataliia Tymkiv
> >
> > NOTICE: You may have received this message outside of your normal working
> > hours/days, as I usually can work more as a volunteer during weekend. You
> > should not feel obligated to answer it during your days off. Thank you in
> > advance!
> >
> > On Sat, May 23, 2020, 01:58 Dennis During  wrote:
> >
> > >  "Work with community functionaries to create and refine a retroactive
> > > review process for cases brought by involved parties, excluding those
> cases
> > > which pose legal or other severe risks "
> > >
> > > What does "retroactive review process" mean?
> > >
> > > I hope it doesn't mean applying standards that were not promulgated at
> the
> > > time to past actions and applying severe sanctions to the alleged
> > > perpetrators.
> > >
> > > On Fri, May 22, 2020 at 5:59 PM María Sefidari 
> > > wrote:
> > >
> > > >  Hello everyone,
> > > >
> > > > Today, the Wikimedia Foundation Board of Trustees unanimously passed
> a
> > > > resolution and published a statement[1] regarding the urgent need to
> make
> > > > our movement more safe and inclusive by addressing harassment and
> > > > incivility on Wikimedia projects. The statement builds on prior
> > > statements
> > > > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > > > universal code of conduct, and directs the Wikimedia Foundation to
> > > rapidly
> > > > and substantively address these challenges in complement with
> existing
> > > > community processes.
> > > >
> > > > This includes developing sustainable practices and tools that
> eliminate
> > > > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > > > respectful discourse, reduce harms to participants, protect the
> projects
> > > > from disinformation and bad actors, and promote trust in our
> projects.
> > > >
> > > > Over the past nearly twenty years, the movement has taken a number of
> > > > unique and sometimes extraordinary steps to create an environment
> unlike
> > > > anything else online: a place to share knowledge, to learn, and to
> > > > collaborate together. In order for the movement to continue to
> thrive and
> > > > make progress to our mission, it is essential to build a culture
> that is
> > > > welcoming and inclusive.
> > > >
> > > > Research has consistently shown that members of our communities have
> been
> > > > subject to hostility and toxic behavior in Wikimedia spaces.[4][5]
> The
> > > > Wikimedia 2030 movement strategy recommendations have also
> identified the
> > > > safety of our Wikimedia spaces as a core issue to address if we are
> to
> > > > reach the 2030 goals, with concrete recommendations which include a
> > > > universal code of conduct, pathways for users to privately report
> > > > incidents, and a baseline of community responsibilities.[6]
> > > >
> > > > While the movement has made progress in addressing harassment and
> toxic
> > > > behavior, we recognize there is still much more to do. The Board’s
> > > > resolution and statement today is 

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-23 Thread AntiCompositeNumber
While I'm pretty sure that this wasn't the intention, that sounds a
lot like "ban first and ask questions later".  As Pine noted, this is
a topic where great care must be taken to communicate intentions
clearly and diplomatically. This point was likely introduced to
respond to concerns about unappealable Office Actions. The way it was
phrased, however, diminishes the point it was trying to make and also
implies that community input is only applicable after the fact, and
only from functionaries.

Would it be fair to say that:
 - Enforcement of a universal code of conduct would happen through a
fair, clearly-defined process without significant bias and with
significant community oversight and input
- Universal code of conduct enforcement actions would be appealable
through a fair, clearly-defined process with significant community
oversight that allowed statements from involved parties and uninvolved
community members
- To ensure proper community oversight, code of conduct enforcement
actions and appeals would be made as public as possible as often as
possible (excepting issues where public disclosure would harm privacy
or safety)

AntiComposite

On Fri, May 22, 2020 at 7:52 PM Nataliia Tymkiv  wrote:
>
> Hello, Dennis!
>
> Not at all. What it means is that this is not a process that goes into play
> *before* a decision to act is made, but *after*. It should stand as an
> option for those who want to ensure that actions taken are fair, as long as
> the case does not relate to legal risks or other severe concerns.
>
> Best regards,
> antanana / Nataliia Tymkiv
>
> NOTICE: You may have received this message outside of your normal working
> hours/days, as I usually can work more as a volunteer during weekend. You
> should not feel obligated to answer it during your days off. Thank you in
> advance!
>
> On Sat, May 23, 2020, 01:58 Dennis During  wrote:
>
> >  "Work with community functionaries to create and refine a retroactive
> > review process for cases brought by involved parties, excluding those cases
> > which pose legal or other severe risks "
> >
> > What does "retroactive review process" mean?
> >
> > I hope it doesn't mean applying standards that were not promulgated at the
> > time to past actions and applying severe sanctions to the alleged
> > perpetrators.
> >
> > On Fri, May 22, 2020 at 5:59 PM María Sefidari 
> > wrote:
> >
> > >  Hello everyone,
> > >
> > > Today, the Wikimedia Foundation Board of Trustees unanimously passed a
> > > resolution and published a statement[1] regarding the urgent need to make
> > > our movement more safe and inclusive by addressing harassment and
> > > incivility on Wikimedia projects. The statement builds on prior
> > statements
> > > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > > universal code of conduct, and directs the Wikimedia Foundation to
> > rapidly
> > > and substantively address these challenges in complement with existing
> > > community processes.
> > >
> > > This includes developing sustainable practices and tools that eliminate
> > > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > > respectful discourse, reduce harms to participants, protect the projects
> > > from disinformation and bad actors, and promote trust in our projects.
> > >
> > > Over the past nearly twenty years, the movement has taken a number of
> > > unique and sometimes extraordinary steps to create an environment unlike
> > > anything else online: a place to share knowledge, to learn, and to
> > > collaborate together. In order for the movement to continue to thrive and
> > > make progress to our mission, it is essential to build a culture that is
> > > welcoming and inclusive.
> > >
> > > Research has consistently shown that members of our communities have been
> > > subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
> > > Wikimedia 2030 movement strategy recommendations have also identified the
> > > safety of our Wikimedia spaces as a core issue to address if we are to
> > > reach the 2030 goals, with concrete recommendations which include a
> > > universal code of conduct, pathways for users to privately report
> > > incidents, and a baseline of community responsibilities.[6]
> > >
> > > While the movement has made progress in addressing harassment and toxic
> > > behavior, we recognize there is still much more to do. The Board’s
> > > resolution and statement today is a step toward establishing clear,
> > > consistent guidelines around acceptable behavior on our projects, and
> > > guiding the Wikimedia Foundation in supporting the movement’s ability to
> > > ensure a healthy environment for those who participate in our projects.
> > >
> > > * Developing and introducing, in close consultation with volunteer
> > > contributor communities, a universal code of conduct that will be a
> > binding
> > > minimum set of standards across all Wikimedia projects;
> > >
> > > * Taking actions to ban, sanction, or 

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-22 Thread effe iets anders
Thanks for that clarification! I read that initially as capacity, tools,
training etc. for community functionaries to be better enforcers (maybe I
read it too quickly - I'm still ambivalent about it). Glad I didn't
interpret that correctly!

Best,
Lodewijk

On Fri, May 22, 2020 at 6:50 PM Shani Evenstein  wrote:

> Hi  Lodewijk,
>
> This ecosystem you are describing is exactly what we are hoping for.
>
> And we absolutely agree that what you called "education" is needed.
> We referred to it as "training" and "capacity building" in this sentence
> in the statement:
>
> "To that end, the Board further directs the Foundation, in collaboration
> with the communities, to make additional *investments in Trust & Safety
> capacity*, including but not limited to: development of tools needed to
> assist our volunteers and staff, research to support data-informed
> decisions, development of clear metrics to measure success, *development
> of training tools and materials* (*including building communities’
> capacities around harassment awareness and conflict resolution*), and
> consultations with international experts on harassment, community health
> and children’s rights, as well as additional hiring."
>
> Best,
> Shani.
>
>
>>
>> From: effe iets anders 
>> Date: Sat, May 23, 2020 at 4:26 AM
>> Subject: Re: [Wikimedia-l] Trust and safety on Wikimedia projects
>> To: Wikimedia Mailing List 
>>
>>
>> Thanks for this step - I wish that it wouldn't be necessary. I'm not sure
>> of all the implications, but was mostly wondering: will this be primarily
>> a
>> stick, or is the foundation also going to invest more heavily in carrots
>> and education?
>>
>> I get the impression that we have much progress to make in training,
>> educating and exposing correct behavior (some chapters have made attempts
>> at this). So much of our energy already goes into the bad behavior, that
>> it
>> exhausts many community members. I'm confident that the Trust and Safety
>> team lives through a more extreme version of that daily.
>>
>> I'd wish that we manage to build an ecosystem that encourages good
>> behavior, diverts bad behavior at a very early stage, and removes the bad
>> actors that cannot be corrected. Probably not as popular as punishing
>> people, but hopefully more constructive for the community as a whole.
>>
>> Lodewijk
>>
>> On Fri, May 22, 2020 at 4:52 PM Nataliia Tymkiv 
>> wrote:
>>
>> > Hello, Dennis!
>> >
>> > Not at all. What it means is that this is not a process that goes into
>> play
>> > *before* a decision to act is made, but *after*. It should stand as an
>> > option for those who want to ensure that actions taken are fair, as
>> long as
>> > the case does not relate to legal risks or other severe concerns.
>> >
>> > Best regards,
>> > antanana / Nataliia Tymkiv
>> >
>> > NOTICE: You may have received this message outside of your normal
>> working
>> > hours/days, as I usually can work more as a volunteer during weekend.
>> You
>> > should not feel obligated to answer it during your days off. Thank you
>> in
>> > advance!
>> >
>> > On Sat, May 23, 2020, 01:58 Dennis During  wrote:
>> >
>> > >  "Work with community functionaries to create and refine a retroactive
>> > > review process for cases brought by involved parties, excluding those
>> > cases
>> > > which pose legal or other severe risks "
>> > >
>> > > What does "retroactive review process" mean?
>> > >
>> > > I hope it doesn't mean applying standards that were not promulgated at
>> > the
>> > > time to past actions and applying severe sanctions to the alleged
>> > > perpetrators.
>> > >
>> > > On Fri, May 22, 2020 at 5:59 PM María Sefidari 
>> > > wrote:
>> > >
>> > > >  Hello everyone,
>> > > >
>> > > > Today, the Wikimedia Foundation Board of Trustees unanimously
>> passed a
>> > > > resolution and published a statement[1] regarding the urgent need to
>> > make
>> > > > our movement more safe and inclusive by addressing harassment and
>> > > > incivility on Wikimedia projects. The statement builds on prior
>> > > statements
>> > > > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
>> > > > universal code of conduct, and directs the W

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-22 Thread Shani Evenstein
Hi  Lodewijk,

This ecosystem you are describing is exactly what we are hoping for.

And we absolutely agree that what you called "education" is needed.
We referred to it as "training" and "capacity building" in this sentence in
the statement:

"To that end, the Board further directs the Foundation, in collaboration
with the communities, to make additional *investments in Trust & Safety
capacity*, including but not limited to: development of tools needed to
assist our volunteers and staff, research to support data-informed
decisions, development of clear metrics to measure success, *development of
training tools and materials* (*including building communities’ capacities
around harassment awareness and conflict resolution*), and consultations
with international experts on harassment, community health and children’s
rights, as well as additional hiring."

Best,
Shani.


>
> From: effe iets anders 
> Date: Sat, May 23, 2020 at 4:26 AM
> Subject: Re: [Wikimedia-l] Trust and safety on Wikimedia projects
> To: Wikimedia Mailing List 
>
>
> Thanks for this step - I wish that it wouldn't be necessary. I'm not sure
> of all the implications, but was mostly wondering: will this be primarily a
> stick, or is the foundation also going to invest more heavily in carrots
> and education?
>
> I get the impression that we have much progress to make in training,
> educating and exposing correct behavior (some chapters have made attempts
> at this). So much of our energy already goes into the bad behavior, that it
> exhausts many community members. I'm confident that the Trust and Safety
> team lives through a more extreme version of that daily.
>
> I'd wish that we manage to build an ecosystem that encourages good
> behavior, diverts bad behavior at a very early stage, and removes the bad
> actors that cannot be corrected. Probably not as popular as punishing
> people, but hopefully more constructive for the community as a whole.
>
> Lodewijk
>
> On Fri, May 22, 2020 at 4:52 PM Nataliia Tymkiv 
> wrote:
>
> > Hello, Dennis!
> >
> > Not at all. What it means is that this is not a process that goes into
> play
> > *before* a decision to act is made, but *after*. It should stand as an
> > option for those who want to ensure that actions taken are fair, as long
> as
> > the case does not relate to legal risks or other severe concerns.
> >
> > Best regards,
> > antanana / Nataliia Tymkiv
> >
> > NOTICE: You may have received this message outside of your normal working
> > hours/days, as I usually can work more as a volunteer during weekend. You
> > should not feel obligated to answer it during your days off. Thank you in
> > advance!
> >
> > On Sat, May 23, 2020, 01:58 Dennis During  wrote:
> >
> > >  "Work with community functionaries to create and refine a retroactive
> > > review process for cases brought by involved parties, excluding those
> > cases
> > > which pose legal or other severe risks "
> > >
> > > What does "retroactive review process" mean?
> > >
> > > I hope it doesn't mean applying standards that were not promulgated at
> > the
> > > time to past actions and applying severe sanctions to the alleged
> > > perpetrators.
> > >
> > > On Fri, May 22, 2020 at 5:59 PM María Sefidari 
> > > wrote:
> > >
> > > >  Hello everyone,
> > > >
> > > > Today, the Wikimedia Foundation Board of Trustees unanimously passed
> a
> > > > resolution and published a statement[1] regarding the urgent need to
> > make
> > > > our movement more safe and inclusive by addressing harassment and
> > > > incivility on Wikimedia projects. The statement builds on prior
> > > statements
> > > > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > > > universal code of conduct, and directs the Wikimedia Foundation to
> > > rapidly
> > > > and substantively address these challenges in complement with
> existing
> > > > community processes.
> > > >
> > > > This includes developing sustainable practices and tools that
> eliminate
> > > > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > > > respectful discourse, reduce harms to participants, protect the
> > projects
> > > > from disinformation and bad actors, and promote trust in our
> projects.
> > > >
> > > > Over the past nearly twenty years, the movement has taken a number of
> > > > unique and sometimes extraordinary steps to c

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-22 Thread effe iets anders
Thanks for this step - I wish that it wouldn't be necessary. I'm not sure
of all the implications, but was mostly wondering: will this be primarily a
stick, or is the foundation also going to invest more heavily in carrots
and education?

I get the impression that we have much progress to make in training,
educating and exposing correct behavior (some chapters have made attempts
at this). So much of our energy already goes into the bad behavior, that it
exhausts many community members. I'm confident that the Trust and Safety
team lives through a more extreme version of that daily.

I'd wish that we manage to build an ecosystem that encourages good
behavior, diverts bad behavior at a very early stage, and removes the bad
actors that cannot be corrected. Probably not as popular as punishing
people, but hopefully more constructive for the community as a whole.

Lodewijk

On Fri, May 22, 2020 at 4:52 PM Nataliia Tymkiv 
wrote:

> Hello, Dennis!
>
> Not at all. What it means is that this is not a process that goes into play
> *before* a decision to act is made, but *after*. It should stand as an
> option for those who want to ensure that actions taken are fair, as long as
> the case does not relate to legal risks or other severe concerns.
>
> Best regards,
> antanana / Nataliia Tymkiv
>
> NOTICE: You may have received this message outside of your normal working
> hours/days, as I usually can work more as a volunteer during weekend. You
> should not feel obligated to answer it during your days off. Thank you in
> advance!
>
> On Sat, May 23, 2020, 01:58 Dennis During  wrote:
>
> >  "Work with community functionaries to create and refine a retroactive
> > review process for cases brought by involved parties, excluding those cases
> > which pose legal or other severe risks "
> >
> > What does "retroactive review process" mean?
> >
> > I hope it doesn't mean applying standards that were not promulgated at the
> > time to past actions and applying severe sanctions to the alleged
> > perpetrators.
> >
> > On Fri, May 22, 2020 at 5:59 PM María Sefidari 
> > wrote:
> >
> > >  Hello everyone,
> > >
> > > Today, the Wikimedia Foundation Board of Trustees unanimously passed a
> > > resolution and published a statement[1] regarding the urgent need to make
> > > our movement safer and more inclusive by addressing harassment and
> > > incivility on Wikimedia projects. The statement builds on prior statements
> > > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > > universal code of conduct, and directs the Wikimedia Foundation to rapidly
> > > and substantively address these challenges in complement with existing
> > > community processes.
> > >
> > > This includes developing sustainable practices and tools that eliminate
> > > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > > respectful discourse, reduce harms to participants, protect the projects
> > > from disinformation and bad actors, and promote trust in our projects.
> > >
> > > Over the past nearly twenty years, the movement has taken a number of
> > > unique and sometimes extraordinary steps to create an environment unlike
> > > anything else online: a place to share knowledge, to learn, and to
> > > collaborate together. In order for the movement to continue to thrive and
> > > make progress toward our mission, it is essential to build a culture that
> > > is welcoming and inclusive.
> > >
> > > Research has consistently shown that members of our communities have been
> > > subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
> > > Wikimedia 2030 movement strategy recommendations have also identified the
> > > safety of our Wikimedia spaces as a core issue to address if we are to
> > > reach the 2030 goals, with concrete recommendations which include a
> > > universal code of conduct, pathways for users to privately report
> > > incidents, and a baseline of community responsibilities.[6]
> > >
> > > While the movement has made progress in addressing harassment and toxic
> > > behavior, we recognize there is still much more to do. The Board’s
> > > resolution and statement today is a step toward establishing clear,
> > > consistent guidelines around acceptable behavior on our projects, and
> > > guiding the Wikimedia Foundation in supporting the movement’s ability to
> > > ensure a healthy environment for those who participate in our projects.
> > >
> > > * Developing and introducing, in close consultation with volunteer
> > > contributor communities, a universal code of conduct that will be a
> > > binding minimum set of standards across all Wikimedia projects;
> > >
> > > * Taking actions to ban, sanction, or otherwise limit the access of
> > > Wikimedia movement participants who do not comply with these policies and
> > > the Terms of Use;
> > >
> > > * Working with community functionaries to create and refine a retroactive
> > > review process for cases brought by involved parties, excluding those
> > > cases which pose legal or other severe risks; and

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-22 Thread Nataliia Tymkiv
Hello, Dennis!

Not at all. What it means is that this is not a process that goes into play
*before* a decision to act is made, but *after*. It should stand as an
option for those who want to ensure that actions taken are fair, as long as
the case does not relate to legal risks or other severe concerns.

Best regards,
antanana / Nataliia Tymkiv

NOTICE: You may have received this message outside of your normal working
hours/days, as I usually can work more as a volunteer during weekends. You
should not feel obligated to answer it during your days off. Thank you in
advance!

On Sat, May 23, 2020, 01:58 Dennis During  wrote:

>  "Work with community functionaries to create and refine a retroactive
> review process for cases brought by involved parties, excluding those cases
> which pose legal or other severe risks "
>
> What does "retroactive review process" mean?
>
> I hope it doesn't mean applying standards that were not promulgated at the
> time to past actions and applying severe sanctions to the alleged
> perpetrators.
>
> On Fri, May 22, 2020 at 5:59 PM María Sefidari 
> wrote:
>
> >  Hello everyone,
> >
> > Today, the Wikimedia Foundation Board of Trustees unanimously passed a
> > resolution and published a statement[1] regarding the urgent need to make
> > our movement safer and more inclusive by addressing harassment and
> > incivility on Wikimedia projects. The statement builds on prior statements
> > from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> > universal code of conduct, and directs the Wikimedia Foundation to rapidly
> > and substantively address these challenges in complement with existing
> > community processes.
> >
> > This includes developing sustainable practices and tools that eliminate
> > harassment, toxicity, and incivility, promote inclusivity, cultivate
> > respectful discourse, reduce harms to participants, protect the projects
> > from disinformation and bad actors, and promote trust in our projects.
> >
> > Over the past nearly twenty years, the movement has taken a number of
> > unique and sometimes extraordinary steps to create an environment unlike
> > anything else online: a place to share knowledge, to learn, and to
> > collaborate together. In order for the movement to continue to thrive and
> > make progress toward our mission, it is essential to build a culture that
> > is welcoming and inclusive.
> >
> > Research has consistently shown that members of our communities have been
> > subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
> > Wikimedia 2030 movement strategy recommendations have also identified the
> > safety of our Wikimedia spaces as a core issue to address if we are to
> > reach the 2030 goals, with concrete recommendations which include a
> > universal code of conduct, pathways for users to privately report
> > incidents, and a baseline of community responsibilities.[6]
> >
> > While the movement has made progress in addressing harassment and toxic
> > behavior, we recognize there is still much more to do. The Board’s
> > resolution and statement today is a step toward establishing clear,
> > consistent guidelines around acceptable behavior on our projects, and
> > guiding the Wikimedia Foundation in supporting the movement’s ability to
> > ensure a healthy environment for those who participate in our projects.
> >
> > * Developing and introducing, in close consultation with volunteer
> > contributor communities, a universal code of conduct that will be a
> > binding minimum set of standards across all Wikimedia projects;
> >
> > * Taking actions to ban, sanction, or otherwise limit the access of
> > Wikimedia movement participants who do not comply with these policies and
> > the Terms of Use;
> >
> > * Working with community functionaries to create and refine a retroactive
> > review process for cases brought by involved parties, excluding those
> > cases which pose legal or other severe risks; and
> >
> > * Significantly increasing support for and collaboration with community
> > functionaries primarily enforcing such compliance in a way that prioritizes
> > the personal safety of these functionaries.
> >
> > Together, we have made our movement what it is today. In this same way, we
> > must all be responsible for building the positive community culture of the
> > future, and accountable for stopping harassment and toxic behavior on our
> > sites.
> >
> > We have also made this statement available on Meta-Wiki for translation
> > and wider distribution.[1]
> >
> > On behalf of the Board,
> > María, Board Chair
> >
> > [1]
> >
> > https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_2020_-_Board_of_Trustees_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
> >
> > [2]
> >
> > https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/November_2016_-_Statement_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
> >
> > [3]
> >
> > https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Archives/2019#Board_statement_posted_at_Community_response_to_the_Wikimedia_Foundation's_ban_of_Fram

Re: [Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-22 Thread Dennis During
 "Work with community functionaries to create and refine a retroactive
review process for cases brought by involved parties, excluding those cases
which pose legal or other severe risks "

What does "retroactive review process" mean?

I hope it doesn't mean applying standards that were not promulgated at the
time to past actions and applying severe sanctions to the alleged
perpetrators.

On Fri, May 22, 2020 at 5:59 PM María Sefidari  wrote:

>  Hello everyone,
>
> Today, the Wikimedia Foundation Board of Trustees unanimously passed a
> resolution and published a statement[1] regarding the urgent need to make
> our movement safer and more inclusive by addressing harassment and
> incivility on Wikimedia projects. The statement builds on prior statements
> from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
> universal code of conduct, and directs the Wikimedia Foundation to rapidly
> and substantively address these challenges in complement with existing
> community processes.
>
> This includes developing sustainable practices and tools that eliminate
> harassment, toxicity, and incivility, promote inclusivity, cultivate
> respectful discourse, reduce harms to participants, protect the projects
> from disinformation and bad actors, and promote trust in our projects.
>
> Over the past nearly twenty years, the movement has taken a number of
> unique and sometimes extraordinary steps to create an environment unlike
> anything else online: a place to share knowledge, to learn, and to
> collaborate together. In order for the movement to continue to thrive and
> make progress toward our mission, it is essential to build a culture that is
> welcoming and inclusive.
>
> Research has consistently shown that members of our communities have been
> subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
> Wikimedia 2030 movement strategy recommendations have also identified the
> safety of our Wikimedia spaces as a core issue to address if we are to
> reach the 2030 goals, with concrete recommendations which include a
> universal code of conduct, pathways for users to privately report
> incidents, and a baseline of community responsibilities.[6]
>
> While the movement has made progress in addressing harassment and toxic
> behavior, we recognize there is still much more to do. The Board’s
> resolution and statement today is a step toward establishing clear,
> consistent guidelines around acceptable behavior on our projects, and
> guiding the Wikimedia Foundation in supporting the movement’s ability to
> ensure a healthy environment for those who participate in our projects.
>
> * Developing and introducing, in close consultation with volunteer
> contributor communities, a universal code of conduct that will be a binding
> minimum set of standards across all Wikimedia projects;
>
> * Taking actions to ban, sanction, or otherwise limit the access of
> Wikimedia movement participants who do not comply with these policies and
> the Terms of Use;
>
> * Working with community functionaries to create and refine a retroactive
> review process for cases brought by involved parties, excluding those cases
> which pose legal or other severe risks; and
>
> * Significantly increasing support for and collaboration with community
> functionaries primarily enforcing such compliance in a way that prioritizes
> the personal safety of these functionaries.
>
> Together, we have made our movement what it is today. In this same way, we
> must all be responsible for building the positive community culture of the
> future, and accountable for stopping harassment and toxic behavior on our
> sites.
>
> We have also made this statement available on Meta-Wiki for translation and
> wider distribution.[1]
>
> On behalf of the Board,
> María, Board Chair
>
> [1]
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_2020_-_Board_of_Trustees_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
>
> [2]
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/November_2016_-_Statement_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces
>
> [3]
>
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Archives/2019#Board_statement_posted_at_Community_response_to_the_Wikimedia_Foundation's_ban_of_Fram
>
> [4] https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015
>
> [5]
>
> https://meta.wikimedia.org/wiki/Community_Insights/2018_Report#Experience_of_harassment_has_not_declined_since_2017_and_appears_to_remain_steady
>
> [6]
>
> https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recommendations/Provide_for_Safety_and_Inclusion
>
> == Statement on Healthy Community Culture, Inclusivity, and Safe Spaces ==
>
> Harassment, toxic behavior, and incivility in the Wikimedia movement are
> contrary to our shared values and detrimental to our vision and mission.
> They negatively impact our ability to collect, share, and disseminate free
> 

[Wikimedia-l] Trust and safety on Wikimedia projects

2020-05-22 Thread María Sefidari
 Hello everyone,

Today, the Wikimedia Foundation Board of Trustees unanimously passed a
resolution and published a statement[1] regarding the urgent need to make
our movement safer and more inclusive by addressing harassment and
incivility on Wikimedia projects. The statement builds on prior statements
from 2016 and 2019,[2][3] affirms the forthcoming introduction of a
universal code of conduct, and directs the Wikimedia Foundation to rapidly
and substantively address these challenges in complement with existing
community processes.

This includes developing sustainable practices and tools that eliminate
harassment, toxicity, and incivility, promote inclusivity, cultivate
respectful discourse, reduce harms to participants, protect the projects
from disinformation and bad actors, and promote trust in our projects.

Over the past nearly twenty years, the movement has taken a number of
unique and sometimes extraordinary steps to create an environment unlike
anything else online: a place to share knowledge, to learn, and to
collaborate together. In order for the movement to continue to thrive and
make progress toward our mission, it is essential to build a culture that is
welcoming and inclusive.

Research has consistently shown that members of our communities have been
subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The
Wikimedia 2030 movement strategy recommendations have also identified the
safety of our Wikimedia spaces as a core issue to address if we are to
reach the 2030 goals, with concrete recommendations which include a
universal code of conduct, pathways for users to privately report
incidents, and a baseline of community responsibilities.[6]

While the movement has made progress in addressing harassment and toxic
behavior, we recognize there is still much more to do. The Board’s
resolution and statement today is a step toward establishing clear,
consistent guidelines around acceptable behavior on our projects, and
guiding the Wikimedia Foundation in supporting the movement’s ability to
ensure a healthy environment for those who participate in our projects.

* Developing and introducing, in close consultation with volunteer
contributor communities, a universal code of conduct that will be a binding
minimum set of standards across all Wikimedia projects;

* Taking actions to ban, sanction, or otherwise limit the access of
Wikimedia movement participants who do not comply with these policies and
the Terms of Use;

* Working with community functionaries to create and refine a retroactive
review process for cases brought by involved parties, excluding those cases
which pose legal or other severe risks; and

* Significantly increasing support for and collaboration with community
functionaries primarily enforcing such compliance in a way that prioritizes
the personal safety of these functionaries.

Together, we have made our movement what it is today. In this same way, we
must all be responsible for building the positive community culture of the
future, and accountable for stopping harassment and toxic behavior on our
sites.

We have also made this statement available on Meta-Wiki for translation and
wider distribution.[1]

On behalf of the Board,
María, Board Chair

[1]
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_2020_-_Board_of_Trustees_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces

[2]
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/November_2016_-_Statement_on_Healthy_Community_Culture,_Inclusivity,_and_Safe_Spaces

[3]
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Archives/2019#Board_statement_posted_at_Community_response_to_the_Wikimedia_Foundation's_ban_of_Fram

[4] https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015

[5]
https://meta.wikimedia.org/wiki/Community_Insights/2018_Report#Experience_of_harassment_has_not_declined_since_2017_and_appears_to_remain_steady

[6]
https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recommendations/Provide_for_Safety_and_Inclusion

== Statement on Healthy Community Culture, Inclusivity, and Safe Spaces ==

Harassment, toxic behavior, and incivility in the Wikimedia movement are
contrary to our shared values and detrimental to our vision and mission.
They negatively impact our ability to collect, share, and disseminate free
knowledge, harm the immediate well-being of individual Wikimedians, and
threaten the long-term health and success of the Wikimedia projects. The
Board does not believe we have made enough progress toward creating
welcoming, inclusive, harassment-free spaces in which people can contribute
productively and debate constructively.

In recognition of the urgency of these issues, the Board is directing the
Wikimedia Foundation to directly improve the situation in collaboration
with our communities. This should include developing sustainable practices
and tools that eliminate harassment, toxicity, and incivility,