Re: [Wikimedia-l] Implementing Katherine's Vision: "Discussing Discussions"

2016-11-23 Thread J.
Hi, y'all.
I apologize that I could not figure out how to snip this message with a
Gmail reply.

Scott: Do not forget "Email this user" (available from user and talk pages)
among our contact methods.
Cheers! Wayne Calhoon
[[User:Checkingfax]]
925-899-4051
Co-coordinator: Bay Area WikiSalon [1][2][3]
[1] Meta planning page: https://meta.wikimedia.org/wiki/Bay_Area_WikiSalon
[2] Forward-facing public page:
https://en.wikipedia.org/wiki/Wikipedia:Bay_Area_WikiSalon
[3] November event page:
https://en.wikipedia.org/wiki/Wikipedia:Bay_Area_WikiSalon_November_2016

Re: [Wikimedia-l] Implementing Katherine's Vision: "Discussing Discussions"

2016-11-19 Thread Pine W
Hi Scott,

Thanks for your enthusiasm about making the wikis friendlier places. A few
points to bring to your attention:

* Patrick Earley in Support & Safety has been researching issues related to
harassment for some time. I suggest that you consult him on possible
approaches.

* I was happy to hear Katherine say in the November WMF Metrics Meeting
that there may be some additional resources put toward projects in this
domain in the next annual plan. (Did I understand that correctly,
Katherine?) You might want to indicate your interest in working on these
kinds of projects to folks in your chain of command who could assign you
related work.

* You might want to connect with people in Research, such as Dario, who
seem to be looking at data related to (un-)friendliness on English
Wikipedia.

* You might also want to talk with the people in Grantmaking (a.k.a.
Community Resources), such as Chris Schilling, to see if they have
suggestions. You might also talk with them about potential funding sources
and grantees for projects.

* The Wikimedian in me feels compelled to make a clarification. The
aspiration to improve the on-wiki climate on our projects is not a novel
vision from Katherine. (No offense to Katherine.) I believe that people
including WMF staff were thinking about this subject prior to 2012, and
possibly much earlier than that. What is new in the past few years is
devoting meaningful amounts of resources to improving the climate in ways
other than fire-fighting immediate problems and having discussions. (Risker
and Maggie probably know more of the history than I do.) The point is,
rather than a culture of civility being a novel vision of one person, this
is a long-held aspiration of many people. I am glad that Katherine, you,
and many other people are engaged in thinking about our online climate and
how to make it better, and I am glad that WMF is investing financial,
technical, and human resources toward this goal.

I'm sorry to say that this is likely to be my last post to public mailing
lists for at least a few days (and possibly much longer) because I've got
other issues to address, but I'm very glad to see your interest in working
to improve the civility of our online spaces. I hope that by improving
the civility of our dialogue, we will help to grow the size and diversity
of our online communities.

Onward and upward,

Pine



Re: [Wikimedia-l] Implementing Katherine's Vision: "Discussing Discussions"

2016-11-18 Thread Pax Ahimsa Gethen
Chris Schilling gave a talk on harassment with regard to June's Inspire
Campaign [1] at yesterday's Metrics & Activities meeting [2]. In it, he
discussed an idea I had about reducing/preventing user-page harassment
[3], which we turned into an RfC [4] and which is now being worked on in
Phabricator [5].


- Pax aka Funcrunch

[1] https://youtu.be/4GHy3BIx3JM?t=16m29s
[2] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_metrics_and_activities_meetings
[3] https://meta.wikimedia.org/wiki/Grants:IdeaLab/Protect_user_space_by_default
[4] https://en.wikipedia.org/wiki/Wikipedia:Requests_for_comment/Protect_user_pages_by_default
[5] https://phabricator.wikimedia.org/T149445


On 11/18/16 11:35 AM, C. Scott Ananian wrote:

A few weeks ago our Executive Director gave a talk on "Privacy and
Harassment on the Internet" at MozFest 2016 in London.  I encourage you to
read the transcript:

https://en.wikisource.org/wiki/Privacy_and_Harassment_on_the_Internet


Katherine argued that the Wikimedia project can take a lead role in
creating a culture of respect and inclusion online.  I whole-heartedly
agree, and I hope you all do too.  She concluded with:

"We have a lot of work to do. I know that. We know that. As Molly’s story

illustrates, we are not there yet."


I'd like to open a broader discussion on how we get "there": how to
build/maintain places where we can get work done and control abuse and
vandalism while still remaining wide open to the universe of differing
viewpoints present in our projects.  We can't afford to create filter
bubbles, but we must be able to provide users safe(r) spaces to work.

By habit I would propose that this be a technical discussion, on specific
tools or features that our platform is currently missing to facilitate
healthy discussions.  But the "filter bubble" is a social problem, not a
technical one.  Our project isn't just a collection of code; it's a
community, a set of norms and habits, and a reflection of the social
process of collaboration.  A graph algorithm might be able to identify a
filter bubble and good UX can make countervailing opinions no more than a
click away, but it takes human will to seek out uncomfortable truth.
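
To make the graph-algorithm half of that concrete, here is a minimal
sketch of modularity-based community detection on an editor-interaction
graph.  The editor names and interaction counts are invented for
illustration, and this is just one plausible formalization of a "filter
bubble", not a settled definition:

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Hypothetical data: editors as nodes, edges weighted by how often
    # two editors replied to each other on talk pages.
    interactions = [
        ("Alice", "Bob", 12), ("Bob", "Carol", 9), ("Alice", "Carol", 7),
        ("Dan", "Erin", 11), ("Erin", "Frank", 8), ("Dan", "Frank", 6),
        ("Carol", "Dan", 1),  # the lone bridge between the two clusters
    ]

    G = nx.Graph()
    for a, b, weight in interactions:
        G.add_edge(a, b, weight=weight)

    # Dense clusters joined by few cross-links are one plausible formal
    # signature of a "filter bubble".
    for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
        print(f"cluster {i}: {sorted(community)}")

The algorithm can only surface the two clusters; crossing between them
remains, as above, a matter of human will.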

So although my endgame is specific engineering tasks, we need to start with
a broader conversation about our work as social creatures.  How do we work
in the projects, how do we communicate among ourselves, and how do we
balance openness and the pursuit of truth with the fight against abuse,
harassment, and bias?

Let's discuss discussions!

Here are some jumping-off points; feel free to contribute your own:

We currently use a mixture of Talk pages, Echo, mailing lists, IRC,
Phabricator, OTRS, Slack, Conpherence, and Google Docs on our projects, with
different logging, publication, privacy/identity, and other
characteristics.  I tried to start cataloging them here:

https://lists.wikimedia.org/pipermail/wikimedia-l/2016-November/085542.html
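
Purely as a sketch of what a machine-readable version of that catalog
might look like (the channels come from the list above, but the property
values are rough guesses for illustration, not authoritative
characterizations):

    # Sketch of a channel catalog; property values are rough guesses.
    CHANNELS = {
        "Talk pages":    {"logged": True,  "public": True,  "identity": "wiki account"},
        "Mailing lists": {"logged": True,  "public": True,  "identity": "email address"},
        "IRC":           {"logged": False, "public": True,  "identity": "nickname"},
        "OTRS":          {"logged": True,  "public": False, "identity": "email address"},
    }

    # e.g. list every channel without a durable public log:
    print([name for name, c in CHANNELS.items() if not c["logged"]])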


Because of this diversity, we lack a unified code of conduct or mechanism
to report/combat harassment and vandalism.

Matt Flaschen replied in the above thread with an update on the Code of
Conduct for technical spaces:

https://lists.wikimedia.org/pipermail/wikimedia-l/2016-November/085542.html

...which should definitely help!  The creation of a centralized reporting
mechanism, in particular, would be most welcome.

I created a proposal for the Wikimedia Developer Summit in January
discussing "safe spaces" on our projects:

https://phabricator.wikimedia.org/T149665

Subscribe/comment/click "award token" to support its inclusion in the dev
summit or to start a conversation there.

I have another, broader proposal as well, on the "future of chat" on our
projects:

https://phabricator.wikimedia.org/T149661

Subscribe/comment/click "award token" there if that angle piques your
interest.

It seems that "groups of users" arise repeatedly as an architectural
meta-concept, whether it's a group of collaborators you want to invite to
an editing session, a group of users you want to block or ban, a group of
users who belong to a particular wikiproject, or who watch a certain page.
We don't really have a first-class representation of that concept in our
code right now.  In previous conversations I've heard that people "don't
want [the projects] to turn into another facebook" and so have pushed
back strongly on the idea of "friend lists" (one type of group of users) --
but inverting the concept to allow WikiProjects to maintain a list of
"members of the wikiproject" is more palatable, more focused on the editing
task.  From a computer science perspective "friend list" and "member of a
wikiproject" might seem identical--they are both lists of users--but from a
social perspective the connotations and focus are significantly different.
But who administers that list of users?
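
To sharpen the "first-class groups" idea, here is a minimal sketch
(types and fields are hypothetical, not an actual MediaWiki schema) in
which a friend list and a WikiProject roster share one representation
and differ only in declared purpose and administration:

    from dataclasses import dataclass, field
    from enum import Enum

    class GroupPurpose(Enum):
        WIKIPROJECT_MEMBERS = "wikiproject-members"
        PAGE_WATCHERS = "page-watchers"
        EDIT_SESSION_INVITEES = "edit-session-invitees"
        BLOCK_LIST = "block-list"

    @dataclass
    class UserGroup:
        name: str
        purpose: GroupPurpose     # the social meaning lives here
        administrators: set       # who may edit membership, made explicit
        members: set = field(default_factory=set)

        def add(self, actor, user):
            if actor not in self.administrators:
                raise PermissionError(f"{actor} may not edit {self.name}")
            self.members.add(user)

    roster = UserGroup(
        name="WikiProject Military history",
        purpose=GroupPurpose.WIKIPROJECT_MEMBERS,
        administrators={"ProjectCoordinator"},
    )
    roster.add("ProjectCoordinator", "Alice")

The question above--who administers the list--then stops being an
afterthought and becomes an explicit, auditable field.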

Perhaps we can build a system which avoids grappling with user groups
entirely.  It was suggested that we might use an ORES-like system to
automatically suggest
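
For context, ORES scores individual revisions over a plain HTTP API.  A
minimal sketch of fetching a "damaging" score for one revision (endpoint
and response shape as deployed around 2016, with a placeholder revision
ID; treat the details as approximate):

    import requests

    # ORES v3 scoring endpoint (circa 2016); revision ID is a placeholder.
    resp = requests.get(
        "https://ores.wikimedia.org/v3/scores/enwiki/",
        params={"models": "damaging", "revids": "123456"},
        timeout=10,
    )
    resp.raise_for_status()
    score = resp.json()["enwiki"]["scores"]["123456"]["damaging"]["score"]
    print(score["prediction"], score["probability"])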