[Wikimedia-l] Re: Sharing the Wikimedia Foundation’s Human Rights Impact Assessment

2022-07-14 Thread rgaines
Thank you for your message, Andreas! I just left you a response on the HRIA 
talk page on Meta: 
https://meta.wikimedia.org/wiki/Talk:Wikimedia_Foundation_Human_Rights_Impact_Assessment


[Wikimedia-l] Re: Sharing the Wikimedia Foundation’s Human Rights Impact Assessment

2022-07-13 Thread Andreas Kolbe
Dear Richard and all,

The recommendations from this 2020 report, which you have now
published, make for interesting reading.

– Some (UCoC, Human Rights Policy) have clearly been implemented since you
received the report two years ago.
– Others (training for admins and rights holders) are in the process of
implementation/community negotiation.
– Some ("audit protocol to assess projects that are at high risk of capture
or government-sponsored disinformation") have been at least partially
implemented (Croatian Wikipedia, disinformation hires; Japanese Wikipedia?).
– Others ("provide access to a geotargeted suicide prevention hotline at
the top of the articles on Suicide Methods") have neither been discussed
(to my knowledge) nor implemented to date.
– Yet others ("develop a Content Oversight Committee (COC) to review
content with a focus on bias and have the ability to make binding editorial
decisions in line with ICCPR 19") have not been discussed, and
implementation status in the various language versions is unknown.

Could you provide an overview here or on Meta as to the status of each of
the priority recommendations?

I append the complete set of priority recommendations below for everybody's
reference.

Best,
Andreas

Article One developed a suite of recommendations to address each category
of salient risks. *We recognize the need to engage and secure input from
Wikimedia’s vast volunteer base and as such recommend that the Foundation
consult with volunteers and other experts to determine the best path
forward.* Priority recommendations include:
Strategies for the Foundation

1. Develop a standalone Human Rights Policy that commits to respecting
all internationally recognized human rights by referencing the
International Bill of Human Rights.

2. Conduct ongoing human rights due diligence to continually assess
risks to rightsholders. A Foundation-level HRIA should be conducted
every three years or whenever significant changes could have an effect
on human rights.

3. Develop rights-compatible channels to address human rights
concerns, including private channels, and ensure alignment with the
UNGPs’ effectiveness criteria.
Harmful Content

1. Develop an audit protocol to assess projects that are at high risk
of capture or government-sponsored disinformation.

2. Develop a Content Oversight Committee (COC) to review content with
a focus on bias and have the ability to make binding editorial
decisions in line with ICCPR 19.

3. Continue efforts outlined in the Knowledge Integrity white paper
to develop: a) a machine-readable representation of knowledge that
exists within Wikimedia projects along with its provenance; b) models
to assess the quality of information provenance; and c) models to
assess content neutrality and bias. Ensure that all AI/ML tools are
designed to detect content and action that would be considered illegal
under international human rights law, and that the response aligns
with the three-part ICCPR test requiring that any restriction on the
right to free expression be legal, proportional, and necessary. [A
minimal sketch of such a statement-with-provenance structure follows
this list.]

4. Provide access to a geotargeted suicide prevention hotline at the
top of the articles on Suicide Methods.
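
To make recommendation 3 above concrete, here is a minimal Python
sketch of what a machine-readable statement with provenance and a toy
provenance-quality score might look like. The schema, field names, and
scoring rule are my own illustrative assumptions, not anything taken
from the white paper or from the Foundation's actual tooling.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Source:
    url: str                  # where the claim is cited from
    publisher: Optional[str]  # e.g. a newspaper or journal
    retrieved: str            # ISO 8601 date the citation was checked

@dataclass
class Statement:
    claim: str        # the assertion made in article text
    article: str      # page the claim appears on
    revision_id: int  # revision that introduced the claim
    sources: List[Source] = field(default_factory=list)

def provenance_score(stmt: Statement) -> float:
    """Toy stand-in for point b): more independent publishers give a
    higher score. A real model would weigh source reliability and
    context, not just count distinct publishers."""
    if not stmt.sources:
        return 0.0
    publishers = {s.publisher for s in stmt.sources if s.publisher}
    return min(1.0, 0.4 + 0.2 * len(publishers))

Under this toy rule, an unsourced claim scores 0.0 and a claim cited
to two independent publishers scores 0.8; the cap at 1.0 and the
constants are arbitrary.
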
Harassment

1. Develop and deploy training programs for admins and volunteers
with advanced rights on detecting and responding to harassment claims.

2. Commission a “social norms marketing” research project to assess
what type of messaging is likely to reduce and prevent harassing
comments and actions.

3. Explore opportunities to rate the toxicity of users, helping to
identify repeat offenders and patterns of harassment. Consider awards
for projects with the lowest toxicity levels. [One illustrative way to
aggregate such ratings is sketched after this list.]

4. Consider developing admin metrics focused on enforcing civility
and applying the forthcoming Universal Code of Conduct (UCoC).

5. Ensure that the UCoC and its accompanying governance mechanism are
reviewed by human rights experts, including experts on free expression
and incitement to violence.
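
On recommendation 3 above, a purely illustrative aggregation: given
per-comment toxicity scores in [0, 1] (for instance from a classifier
such as Jigsaw's Perspective API), flag users whose average score over
a minimum number of comments exceeds a threshold. The cutoff values
below are arbitrary assumptions, not figures from the report.

from collections import defaultdict
from statistics import mean

def flag_repeat_offenders(comments, threshold=0.7, min_comments=5):
    """comments: iterable of (username, toxicity_score) pairs, with
    scores in [0, 1]. Returns users whose average score over at least
    min_comments comments exceeds threshold."""
    per_user = defaultdict(list)
    for user, score in comments:
        per_user[user].append(score)
    return sorted(
        user
        for user, scores in per_user.items()
        if len(scores) >= min_comments and mean(scores) > threshold
    )

Requiring a minimum number of comments avoids flagging a user over a
single misclassified edit; any real deployment would still need human
review of whatever such a metric surfaces.
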
Government surveillance and censorship

1. Continue efforts underway as part of the IP-masking project to
further protect users from public identification.

2. Develop awareness-raising tools and programs for all volunteers
to understand and mitigate risks of engagement. Tools should be made
publicly available and should be translated into languages spoken by
volunteers in higher-risk regions.[1]
Risks to child rights

1. Conduct a child rights impact assessment of Wikimedia projects,
including conducting interviews and focus groups with child
contributors across the globe.

2. Create child safeguarding tools, including child-friendly guidance
on privacy settings, data collection, reporting of grooming attempts,
and the forthcoming UCoC, as well as a “Child’s Guide to Editing
Wikimedia Project” to help advance the right of children to be
civically engaged.
Limitations on knowledge equity

1.