Folks, I'm exploring the possibility of having Google hire Sage for the
training and then extending an invitation to members of the ASF to join if
they are interested. Will keep you posted on how that conversation goes.




Gris Cuevas Zambrano

Open Source Strategist - Big Data Analytics

+1 (650) 772-2947

1160 N. Mathilda Avenue, Sunnyvale, CA 94089




On Thu, 25 Jul 2019 at 16:25, Sage Sharp <[email protected]> wrote:

> I have a couple of resources that might help the moderation team determine
> whether someone is trolling. There's a list of logical fallacies
> <https://yourlogicalfallacyis.com/> that are often used to derail
> conversations (they want you to buy a poster, but the files are available
> <https://yourlogicalfallacyis.com/system/App/Settings/fallacies_poster_files/000/000/001/original/CriticalThinkingPDFs.zip>
> under a Creative Commons license). There's also a list of traits
> <https://outofthefog.website/traits/> that may indicate someone is being
> emotionally or verbally abusive.
>
> (This next bit is cross-posting about my business; please let me know if
> that is OT for these lists.) I would also highly encourage moderator teams
> or people enforcing a Code of Conduct to take my Incident Response workshop
> <https://otter.technology/code-of-conduct-training/>. We discuss a
> framework for evaluating a report and determining what behavioral
> modification plan and/or consequences are necessary.
>
> One of the more important points from that workshop: What matters as a
> moderator or Code of Conduct committee is that the inappropriate behavior
> stops. Someone's intent does not matter, except to determine if they will
> repeat the behavior.
>
> Sometimes a warning is all that's needed, and then the person can agree to
> change their behavior. Other times more severe consequences may be
> necessary, like temporary or permanent bans. If someone cannot agree to
> change behavior that moderators have determined is inappropriate, and that
> behavior is making community members feel unsafe or unwelcome, then they
> should not be allowed to interact with that community.
>
> With that in mind, trolling is not about intent. Determining a person's
> "intent" to troll will derail the moderation process. What moderators need
> to focus on is how to stop behavior that derails conversations, especially
> behavior that has a negative impact on people from marginalized groups.
>
> Sage Sharp
>
> On Mon, Jul 22, 2019 at 10:36 AM Naomi S <[email protected]> wrote:
>
> > I want the moderation team to be a team of people trusted by the VP D&I
> > to make the call on who is trolling or not. That is subjective, and it
> > is, for sure, going to come down to shared values. But it's important
> > for this initiative that we can assert a coherent set of shared values.
> >
> >
> >
> >
> > On Mon, 22 Jul 2019 at 19:33, Myrle Krantz <[email protected]> wrote:
> >
> > > On Mon, Jul 22, 2019 at 12:25 PM Patricia Shanahan <[email protected]>
> > > wrote:
> > >
> > > > I don't think there is a way, because one person's troll may be
> another
> > > > person's sincerely held and strongly expressed opinion.
> > > >
> > > > My preference would be to drop it completely. The CoC already covers
> > the
> > > > cases that I think should be restricted. We are all adults here. If
> > > > someone wants to make the rest of the mailing list participants think
> > > they are rude and inconsiderate, they should be allowed to do so.
> > > >
> > >
> > > I'd like to hear what the rest of the committee prefers.  I can accept
> > > either approach.
> > >
> > > The options are:
> > > 1.) Continue to block trolling.  Either use the existing wording or
> > > look for a better wording.
> > > 2.) Not block trolling in technical moderation, but fall back to social
> > > moderation.  Some trolling may fall under other rules (for example,
> > > list relevance).
> > >
> > > Best Regards,
> > > Myrle
> > >
> >
>
