[Wikimedia-l] Re: [Wikimedia Research Showcase] Online Learning

2021-12-15 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Thu, Dec 9, 2021 at 4:06 PM Janna Layton  wrote:

> Hello all,
>
> The next Wikimedia Research Showcase will be held Wednesday, December 15
> at 17:30 UTC (9:30 PT / 12:30 ET / 18:30 CET).
>
> You can view the livestream here: https://youtu.be/HKODaHgmQWw
>
> The Showcase will feature the following talks:
>
> *Latin American Youth and their Information Ecosystem: Finding,
> Evaluating, Creating, and Sharing Content Online*
> The increasingly important role the Internet plays as a core source of
> information in young people's lives, now underscored by the pandemic,
> gives new urgency to the need to better understand young people's
> information habits and attitudes. Answers to questions such as where young
> people go to look for information, what information they decide to trust,
> and how they share the information they find hold important implications
> for the knowledge they obtain, the beliefs they form, and the actions they
> take in areas ranging from personal health to professional employment and
> education.
>
> In this research showcase, we will be summarizing insights from focus
> group interviews in Latin America that offer a window into the experiences
> of young people themselves. Taken together, these perspectives might help
> us to develop a more comprehensive understanding of how young people in
> Latin America use the Internet in general and interact with information
> from online sources in particular.
>
>
> Speakers: Lionel Brossi and Ana María Castillo. Artificial Intelligence
> and Society Hub at University of Chile.
>
> --
>
> Characterizing the Online Learning Landscape: What and How People Learn
> Online
>
> Hundreds of millions of people learn something new online every day.
> Simultaneously, the study of online education has blossomed, with new
> systems, experiments, and observations creating and exploring previously
> unexplored online learning environments. In this talk, I will discuss our
> study, in which we endeavor to characterize this entire landscape of
> online learning experiences using a national survey of 2,260 U.S. adults,
> balanced to match the demographics of the U.S. population. We examine the
> online learning resources that they consult, and we analyze the subjects
> that they pursue using those resources. Furthermore, we compare formal and
> informal online learning experiences on a larger scale than has, to our
> knowledge, been done before, to better understand which subjects people
> seek out for intensive study. We find that there is a core set of online
> learning experiences that are central to other experiences, and these are
> shared by the majority of people who learn online.
>
>
> Speaker: Sean Kross, University of California San Diego
>
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/GDLYXCY5OTGZ5TKV23BWGYVQZOWLCAV3/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] Online Learning

2021-12-13 Thread Janna Layton
Just a reminder that the Research Showcase will be this Wednesday.

On Thu, Dec 9, 2021 at 4:06 PM Janna Layton  wrote:

> Hello all,
>
> The next Wikimedia Research Showcase will be held Wednesday, December 15
> at 17:30 UTC (9:30 PT / 12:30 ET / 18:30 CET).
>
> You can view the livestream here: https://youtu.be/HKODaHgmQWw
>
> The Showcase will feature the following talks:
>
> *Latin American Youth and their Information Ecosystem: Finding,
> Evaluating, Creating, and Sharing Content Online*
> The increasingly important role the Internet plays as a core source of
> information in young people's lives, now underscored by the pandemic,
> gives new urgency to the need to better understand young people's
> information habits and attitudes. Answers to questions such as where young
> people go to look for information, what information they decide to trust,
> and how they share the information they find hold important implications
> for the knowledge they obtain, the beliefs they form, and the actions they
> take in areas ranging from personal health to professional employment and
> education.
>
> In this research showcase, we will be summarizing insights from focus
> group interviews in Latin America that offer a window into the experiences
> of young people themselves. Taken together, these perspectives might help
> us to develop a more comprehensive understanding of how young people in
> Latin America use the Internet in general and interact with information
> from online sources in particular.
>
>
> Speakers: Lionel Brossi and Ana María Castillo. Artificial Intelligence
> and Society Hub at University of Chile.
>
> --
>
> Characterizing the Online Learning Landscape: What and How People Learn
> Online
>
> Hundreds of millions of people learn something new online every day.
> Simultaneously, the study of online education has blossomed, with new
> systems, experiments, and observations creating and exploring previously
> unexplored online learning environments. In this talk, I will discuss our
> study, in which we endeavor to characterize this entire landscape of
> online learning experiences using a national survey of 2,260 U.S. adults,
> balanced to match the demographics of the U.S. population. We examine the
> online learning resources that they consult, and we analyze the subjects
> that they pursue using those resources. Furthermore, we compare formal and
> informal online learning experiences on a larger scale than has, to our
> knowledge, been done before, to better understand which subjects people
> seek out for intensive study. We find that there is a core set of online
> learning experiences that are central to other experiences, and these are
> shared by the majority of people who learn online.
>
>
> Speaker: Sean Kross, University of California San Diego
>
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/KCQX2UGS26W7OOQ25K7YIWSNQ7NETBPP/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] Online Learning

2021-12-09 Thread Janna Layton
Hello all,

The next Wikimedia Research Showcase will be held Wednesday, December 15 at
17:30 UTC (9:30 PT / 12:30 ET / 18:30 CET).

You can view the livestream here: https://youtu.be/HKODaHgmQWw

The Showcase will feature the following talks:

*Latin American Youth and their Information Ecosystem: Finding, Evaluating,
Creating, and Sharing Content Online*
The increasingly important role the Internet plays as a core source of
information in young people's lives, now underscored by the pandemic, gives
new urgency to the need to better understand young people's information
habits and attitudes. Answers to questions such as where young people go to
look for information, what information they decide to trust, and how they
share the information they find hold important implications for the
knowledge they obtain, the beliefs they form, and the actions they take in
areas ranging from personal health to professional employment and education.

In this research showcase, we will be summarizing insights from focus group
interviews in Latin America that offer a window into the experiences of
young people themselves. Taken together, these perspectives might help us
to develop a more comprehensive understanding of how young people in Latin
America use the Internet in general and interact with information from
online sources in particular.


Speakers: Lionel Brossi and Ana María Castillo. Artificial Intelligence and
Society Hub at University of Chile.

--

Characterizing the Online Learning Landscape: What and How People Learn
Online

Hundreds of millions of people learn something new online every day.
Simultaneously, the study of online education has blossomed, with new
systems, experiments, and observations creating and exploring previously
unexplored online learning environments. In this talk, I will discuss our
study, in which we endeavor to characterize this entire landscape of online
learning experiences using a national survey of 2,260 U.S. adults, balanced
to match the demographics of the U.S. population. We examine the online
learning resources that they consult, and we analyze the subjects that they
pursue using those resources. Furthermore, we compare formal and informal
online learning experiences on a larger scale than has, to our knowledge,
been done before, to better understand which subjects people seek out for
intensive study. We find that there is a core set of online learning
experiences that are central to other experiences, and these are shared by
the majority of people who learn online.
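
For readers unfamiliar with demographically balanced surveys, the sketch
below shows one common way a sample can be weighted so that its composition
matches population shares (post-stratification). The column names, age
groups, and shares are invented for illustration and are not the study's
actual procedure.

# Illustrative post-stratification weighting: reweight survey respondents so
# that the weighted sample matches assumed population shares for one
# demographic variable. All names and numbers here are hypothetical.
import pandas as pd

respondents = pd.DataFrame({
    "respondent_id": range(8),
    "age_group": ["18-29", "18-29", "30-44", "30-44", "30-44",
                  "45-64", "45-64", "65+"],
})

# Assumed population shares for the same age groups (must sum to 1).
population_share = {"18-29": 0.20, "30-44": 0.25, "45-64": 0.33, "65+": 0.22}

# Shares observed in the sample.
sample_share = respondents["age_group"].value_counts(normalize=True)

# Weight = population share / sample share; weighted analyses then use these
# weights, e.g. a weighted mean is (outcome * weight).sum() / weight.sum().
respondents["weight"] = respondents["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)
print(respondents)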


Speaker: Sean Kross, University of California San Diego

https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/YSXYPIHAGWCDH5XAOVMUERPRKVHYQCKQ/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] Content Moderation

2021-11-17 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Wed, Nov 10, 2021 at 3:24 PM Janna Layton  wrote:

> Hello all,
>
> The next Wikimedia Research Showcase will be on Wednesday, November 17, at
> 17:30 UTC (9:30am PST/12:30pm EST/ 18:30 CET). The topic is content
> moderation.
>
> Livestream: https://www.youtube.com/watch?v=Rx3xesDkp2o
>
>
> *Amy S. Bruckman (Georgia Institute of Technology, USA)*
> *Is Deplatforming Censorship? What happened when controversial figures
> were deplatformed, with philosophical musings on the nature of free speech*
>
> Abstract: When a controversial figure is deplatformed, what happens to
> their online influence? In this talk, first, I’ll present results from a
> study of the deplatforming from Twitter of three figures who repeatedly
> broke platform rules (Alex Jones, Milo Yiannopoulos, and Owen Benjamin).
> Second, I’ll discuss what happened when this study was on the front page of
> Reddit, and the range of angry reactions from people who say that they’re
> in favor of “free speech.” I’ll explore the nature of free speech, and why
> our current speech regulation framework is fundamentally broken. Finally,
> I’ll conclude with thoughts on the strength of Wikipedia’s model in
> contrast to other platforms, and highlight opportunities for improvement.
>
>
> *Nathan TeBlunthuis (University of Washington / Northwestern University, USA)*
> *Effects of Algorithmic Flagging on Fairness: Quasi-experimental Evidence
> from Wikipedia*
>
> Abstract: Online community moderators often rely on social signals such as
> whether or not a user has an account or a profile page as clues that users
> may cause problems. Reliance on these clues can lead to "overprofiling" bias
> when moderators focus on these signals but overlook the misbehavior of
> others. We propose that algorithmic flagging systems deployed to improve
> the efficiency of moderation work can also make moderation actions more
> fair to these users by reducing reliance on social signals and making norm
> violations by everyone else more visible. We analyze moderator behavior in
> Wikipedia as mediated by RCFilters, a system which displays social signals
> and algorithmic flags, and estimate the causal effect of being flagged on
> moderator actions. We show that algorithmically flagged edits are reverted
> more often, especially those by established editors with positive social
> signals, and that flagging decreases the likelihood that moderation actions
> will be undone. Our results suggest that algorithmic flagging systems can
> lead to increased fairness in some contexts but that the relationship is
> complex and contingent.
>
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/5SKYGLGKKJGHGAHBVMOO4Y2SDBR4ANFF/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] Content Moderation

2021-11-15 Thread Janna Layton
Just a reminder that this event is on Wednesday.

On Wed, Nov 10, 2021 at 3:24 PM Janna Layton  wrote:

> Hello all,
>
> The next Wikimedia Research Showcase will be on Wednesday, November 17, at
> 17:30 UTC (9:30am PST/12:30pm EST/ 18:30 CET). The topic is content
> moderation.
>
> Livestream: https://www.youtube.com/watch?v=Rx3xesDkp2o
>
>
> *Amy S. Bruckman (Georgia Institute of Technology, USA)*
> *Is Deplatforming Censorship? What happened when controversial figures
> were deplatformed, with philosophical musings on the nature of free speech*
>
> Abstract: When a controversial figure is deplatformed, what happens to
> their online influence? In this talk, first, I’ll present results from a
> study of the deplatforming from Twitter of three figures who repeatedly
> broke platform rules (Alex Jones, Milo Yiannopoulos, and Owen Benjamin).
> Second, I’ll discuss what happened when this study was on the front page of
> Reddit, and the range of angry reactions from people who say that they’re
> in favor of “free speech.” I’ll explore the nature of free speech, and why
> our current speech regulation framework is fundamentally broken. Finally,
> I’ll conclude with thoughts on the strength of Wikipedia’s model in
> contrast to other platforms, and highlight opportunities for improvement.
>
>
> *Nathan TeBlunthuis (University of Washington / Northwestern University, USA)*
> *Effects of Algorithmic Flagging on Fairness: Quasi-experimental Evidence
> from Wikipedia*
>
> Abstract: Online community moderators often rely on social signals such as
> whether or not a user has an account or a profile page as clues that users
> may cause problems. Reliance on these clues can lead to "overprofiling" bias
> when moderators focus on these signals but overlook the misbehavior of
> others. We propose that algorithmic flagging systems deployed to improve
> the efficiency of moderation work can also make moderation actions more
> fair to these users by reducing reliance on social signals and making norm
> violations by everyone else more visible. We analyze moderator behavior in
> Wikipedia as mediated by RCFilters, a system which displays social signals
> and algorithmic flags, and estimate the causal effect of being flagged on
> moderator actions. We show that algorithmically flagged edits are reverted
> more often, especially those by established editors with positive social
> signals, and that flagging decreases the likelihood that moderation actions
> will be undone. Our results suggest that algorithmic flagging systems can
> lead to increased fairness in some contexts but that the relationship is
> complex and contingent.
>
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/LC456SE7XB5MZ256HWXFPGKLDBKHWNDB/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] Content Moderation

2021-11-10 Thread Janna Layton
Hello all,

The next Wikimedia Research Showcase will be on Wednesday, November 17, at
17:30 UTC (9:30am PST/12:30pm EST/ 18:30 CET). The topic is content
moderation.

Livestream: https://www.youtube.com/watch?v=Rx3xesDkp2o


*Amy S. Bruckman (Georgia Institute of Technology, USA)*
*Is Deplatforming Censorship? What happened when controversial figures were
deplatformed, with philosophical musings on the nature of free speech*

Abstract: When a controversial figure is deplatformed, what happens to
their online influence? In this talk, first, I’ll present results from a
study of the deplatforming from Twitter of three figures who repeatedly
broke platform rules (Alex Jones, Milo Yiannopoulos, and Owen Benjamin).
Second, I’ll discuss what happened when this study was on the front page of
Reddit, and the range of angry reactions from people who say that they’re
in favor of “free speech.” I’ll explore the nature of free speech, and why
our current speech regulation framework is fundamentally broken. Finally,
I’ll conclude with thoughts on the strength of Wikipedia’s model in
contrast to other platforms, and highlight opportunities for improvement.


*Nathan TeBlunthuis (University of Washington / Northwestern University, USA)*
*Effects of Algorithmic Flagging on Fairness: Quasi-experimental Evidence
from Wikipedia*

Abstract: Online community moderators often rely on social signals such as
whether or not a user has an account or a profile page as clues that users
may cause problems. Reliance on these clues can lead to "overprofiling" bias
when moderators focus on these signals but overlook the misbehavior of
others. We propose that algorithmic flagging systems deployed to improve
the efficiency of moderation work can also make moderation actions more
fair to these users by reducing reliance on social signals and making norm
violations by everyone else more visible. We analyze moderator behavior in
Wikipedia as mediated by RCFilters, a system which displays social signals
and algorithmic flags, and estimate the causal effect of being flagged on
moderator actions. We show that algorithmically flagged edits are reverted
more often, especially those by established editors with positive social
signals, and that flagging decreases the likelihood that moderation actions
will be undone. Our results suggest that algorithmic flagging systems can
lead to increased fairness in some contexts but that the relationship is
complex and contingent.
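
For readers curious what a quasi-experimental estimate of this kind can look
like, the sketch below simulates edit data and fits a local logistic
regression around an assumed flagging-score threshold, so that the
coefficient on the flag indicator approximates the local effect of being
flagged on the probability of revert. The threshold, bandwidth, column
names, and simulated data are all invented; this is not the authors' actual
specification, and in the study itself the flags come from the RCFilters
interface described above rather than from a simulated score.

# Hypothetical sketch of a regression-discontinuity-style estimate of the
# effect of being algorithmically flagged on the probability of revert.
# The threshold, bandwidth, column names, and data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

# Simulated edits: each edit has a model score; edits at or above a
# threshold are flagged in the interface; reverts depend on both.
edits = pd.DataFrame({"score": rng.uniform(0, 1, n)})
THRESHOLD = 0.6
edits["flagged"] = (edits["score"] >= THRESHOLD).astype(int)
logit_p = -2 + 3 * edits["score"] + 0.8 * edits["flagged"]
edits["reverted"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Restrict to a narrow bandwidth around the threshold and center the score,
# so the flag coefficient approximates the local effect of being flagged.
bandwidth = 0.1
local = edits[(edits["score"] - THRESHOLD).abs() <= bandwidth].copy()
local["score_c"] = local["score"] - THRESHOLD

model = smf.logit("reverted ~ flagged + score_c + flagged:score_c", data=local)
result = model.fit(disp=False)
print(result.summary())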

https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/C3JJQNEZVBC7XFOBZNUINTA757WIP6HO/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] Bridging knowledge gaps

2021-10-27 Thread Janna Layton
This event will start in about 30 minutes.

On Thu, Oct 21, 2021 at 10:53 AM Janna Layton  wrote:

> Hi all,
>
> The next Wikimedia Research Showcase will be on October 27, 16:30 UTC (9:30am
> PT/ 12:30pm ET/ 18:30 CEST). The Wikimedia Foundation Research Team will
> present on knowledge gaps.
>
> Livestream: https://www.youtube.com/watch?v=d0Qg98EVmuI
>
> Speaker: Wikimedia Foundation Research Team
>
> Title: Automatic approaches to bridge knowledge gaps in Wikimedia projects
>
> Abstract: In order to advance knowledge equity as part of the Wikimedia
> Movement’s 2030 strategic direction, the Research team at the Wikimedia
> Foundation has been conducting research to “Address Knowledge Gaps” as one
> of its main programs. One core component of this program is to develop
> technologies to bridge knowledge gaps. In this talk, we give an overview of
> how we approach this task using tools from machine learning in four
> different contexts: section alignment in content translation, link
> recommendation in structured editing, image recommendation in multimedia
> knowledge gaps, and the equity of the recommendations themselves. We will
> present how these models can assist contributors in addressing knowledge
> gaps. Finally, we will discuss the impact of these models in applications
> deployed across Wikimedia projects supporting different Product initiatives
> at the Wikimedia Foundation.
>
> More information:
>
> * Section alignment:
> meta:Research:Expanding_Wikipedia_articles_across_languages/Inter_language_approach#Section_Alignment
> <https://meta.wikimedia.org/wiki/Research:Expanding_Wikipedia_articles_across_languages/Inter_language_approach#Section_Alignment>
>
> * Link recommendation:
> meta:Research:Link_recommendation_model_for_add-a-link_structured_task
> <https://meta.wikimedia.org/wiki/Research:Link_recommendation_model_for_add-a-link_structured_task>
>
> * Image recommendation:
> meta:Research:Recommending_Images_to_Wikipedia_Articles
> <https://meta.wikimedia.org/wiki/Research:Recommending_Images_to_Wikipedia_Articles>
>
> * Equity in recommendations:
> meta:Research:Prioritization_of_Wikipedia_Articles/Recommendation
> <https://meta.wikimedia.org/wiki/Research:Prioritization_of_Wikipedia_Articles/Recommendation>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/4A5UHZC75K6EUBZS7IA36OJ2QUOC45AB/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] Bridging knowledge gaps

2021-10-25 Thread Janna Layton
Just a reminder that the below is happening on Wednesday.

On Thu, Oct 21, 2021 at 10:53 AM Janna Layton  wrote:

> Hi all,
>
> The next Wikimedia Research Showcase will be on October 27, 16:30 UTC (9:30am
> PT/ 12:30pm ET/ 18:30 CEST). The Wikimedia Foundation Research Team will
> present on knowledge gaps.
>
> Livestream: https://www.youtube.com/watch?v=d0Qg98EVmuI
>
> Speaker: Wikimedia Foundation Research Team
>
> Title: Automatic approaches to bridge knowledge gaps in Wikimedia projects
>
> Abstract: In order to advance knowledge equity as part of the Wikimedia
> Movement’s 2030 strategic direction, the Research team at the Wikimedia
> Foundation has been conducting research to “Address Knowledge Gaps” as one
> of its main programs. One core component of this program is to develop
> technologies to bridge knowledge gaps. In this talk, we give an overview of
> how we approach this task using tools from machine learning in four
> different contexts: section alignment in content translation, link
> recommendation in structured editing, image recommendation in multimedia
> knowledge gaps, and the equity of the recommendations themselves. We will
> present how these models can assist contributors in addressing knowledge
> gaps. Finally, we will discuss the impact of these models in applications
> deployed across Wikimedia projects supporting different Product initiatives
> at the Wikimedia Foundation.
>
> More information:
>
> * Section alignment:
> meta:Research:Expanding_Wikipedia_articles_across_languages/Inter_language_approach#Section_Alignment
> <https://meta.wikimedia.org/wiki/Research:Expanding_Wikipedia_articles_across_languages/Inter_language_approach#Section_Alignment>
>
> * Link recommendation:
> meta:Research:Link_recommendation_model_for_add-a-link_structured_task
> <https://meta.wikimedia.org/wiki/Research:Link_recommendation_model_for_add-a-link_structured_task>
>
> * Image recommendation:
> meta:Research:Recommending_Images_to_Wikipedia_Articles
> <https://meta.wikimedia.org/wiki/Research:Recommending_Images_to_Wikipedia_Articles>
>
> * Equity in recommendations:
> meta:Research:Prioritization_of_Wikipedia_Articles/Recommendation
> <https://meta.wikimedia.org/wiki/Research:Prioritization_of_Wikipedia_Articles/Recommendation>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/2WIP4RXNXTPH522C77UZUW7PU4JNYOLO/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] Bridging knowledge gaps

2021-10-21 Thread Janna Layton
Hi all,

The next Wikimedia Research Showcase will be on October 27, 16:30 UTC (9:30am
PT/ 12:30pm ET/ 18:30 CEST). The Wikimedia Foundation Research Team will
present on knowledge gaps.

Livestream: https://www.youtube.com/watch?v=d0Qg98EVmuI

Speaker: Wikimedia Foundation Research Team

Title: Automatic approaches to bridge knowledge gaps in Wikimedia projects

Abstract: In order to advance knowledge equity as part of the Wikimedia
Movement’s 2030 strategic direction, the Research team at the Wikimedia
Foundation has been conducting research to “Address Knowledge Gaps” as one
of its main programs. One core component of this program is to develop
technologies to bridge knowledge gaps. In this talk, we give an overview of
how we approach this task using tools from machine learning in four
different contexts: section alignment in content translation, link
recommendation in structured editing, image recommendation in multimedia
knowledge gaps, and the equity of the recommendations themselves. We will
present how these models can assist contributors in addressing knowledge
gaps. Finally, we will discuss the impact of these models in applications
deployed across Wikimedia projects supporting different Product initiatives
at the Wikimedia Foundation.

More information:

* Section alignment:
meta:Research:Expanding_Wikipedia_articles_across_languages/Inter_language_approach#Section_Alignment
<https://meta.wikimedia.org/wiki/Research:Expanding_Wikipedia_articles_across_languages/Inter_language_approach#Section_Alignment>

* Link recommendation:
meta:Research:Link_recommendation_model_for_add-a-link_structured_task
<https://meta.wikimedia.org/wiki/Research:Link_recommendation_model_for_add-a-link_structured_task>

* Image recommendation:
meta:Research:Recommending_Images_to_Wikipedia_Articles
<https://meta.wikimedia.org/wiki/Research:Recommending_Images_to_Wikipedia_Articles>

* Equity in recommendations:
meta:Research:Prioritization_of_Wikipedia_Articles/Recommendation
<https://meta.wikimedia.org/wiki/Research:Prioritization_of_Wikipedia_Articles/Recommendation>

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>

___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/AJX4VDTDQWMPTNNUA2P4CGXURGTZ3ZJQ/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] September 15, 2021: Socialization on Wikipedia

2021-09-15 Thread Janna Layton
This event will be starting in about 30 minutes.

On Thu, Sep 9, 2021 at 1:15 PM Janna Layton  wrote:

> Hello all,
>
> The September Wikimedia Research Showcase will be on September 15 at 16:30
> UTC (9:30am PT/ 12:30pm ET/ 18:30 CEST). The theme will be
> "socialization on Wikipedia" with speakers Rosta Farzan and J. Nathan
> Matias.
>
> Livestream: https://www.youtube.com/watch?v=YVqabVvLIZU
>
> Talk 1
>
> Speaker: Rosta Farzan (School of Computing and Information, University of
> Pittsburgh)
>
> Title: Unlocking the Wikipedia clubhouse to newcomers: results from two
> studies
>
> Abstract: It is no news to any of us that the success of online production
> communities such as Wikipedia relies heavily on a continuous stream of
> newcomers to replace the inevitable high turnover and to bring on board new
> sources of ideas and labor. However, these communities have struggled both
> to attract newcomers, especially from a diverse population of users, and to
> retain them. In this talk, I will present two different approaches to
> engaging new editors in Wikipedia: (1) newcomers joining through the Wiki
> Ed program, an online program in which college students edit Wikipedia
> articles as class assignments; and (2) newcomers joining through a
> Wikipedia Art+Feminism edit-a-thon. I will present how each approach
> incorporated techniques for engaging newcomers and how well each succeeded
> in attracting and retaining them.
>
> More information:
>
>- Bring on Board New Enthusiasts! A Case Study of Impact of Wikipedia
>Art + Feminism Edit-A-Thon Events on Newcomers
><https://link.springer.com/chapter/10.1007/978-3-319-47880-7_2>,
>SocInfo 2016 (pdf
><http://saviaga.com/wp-content/uploads/2016/06/socinfo_ediathons.pdf>)
>- Successful Online Socialization: Lessons from the Wikipedia
>Education Program <https://dl.acm.org/doi/abs/10.1145/3392857>, CSCW
>2020 (pdf
><https://www.cc.gatech.edu/~dyang888/docs/cscw_li_2020_wiki.pdf>)
>
>
> Talk 2
>
> Speaker: J. Nathan Matias <http://natematias.com/> (Citizens and
> Technology Lab <http://citizensandtech.org/>, Cornell University
> Departments of Communication and Information Science)
>
> Title: The Effect of Receiving Appreciation on Wikipedias: A Community
> Co-Designed Field Experiment
>
> Abstract: Can saying “thank you” make online communities stronger & more
> inclusive? Or does thanking others for their voluntary efforts have little
> effect? To ask this question, the Citizens and Technology Lab (CAT Lab)
> organized 344 volunteers to send thanks to Wikipedia contributors across
> the Arabic, German, Polish, and Persian languages. We then observed the
> behavior of 15,558 newcomers and experienced contributors to Wikipedia. On
> average, we found that organizing volunteers to thank others increases
> two-week retention of newcomers and experienced accounts. It also caused
> people to send more thanks to others. This study was a field experiment, a
> randomized trial that sent thanks to some people and not to others. These
> experiments can help answer questions about the impact of community
> practices and platform design. But they can sometimes face community
> mistrust, especially when researchers conduct them without community
> consent. In this talk, learn more about CAT Lab's approach to community-led
> research and discuss open questions about best practices.
>
> More information:
>
>    - Volunteers Thanked Thousands of Wikipedia Editors to Learn the Effects
>    of Receiving Thanks
>    <https://citizensandtech.org/2020/06/effects-of-saying-thanks-on-wikipedia/>,
>    blogpost (in EN, DE, AR, PL, FA) <https://osf.io/ueq5f/>
>    - The Diffusion and Influence of Gratitude Expressions in Large-Scale
>    Cooperation: A Field Experiment in Four Knowledge Networks
>    <https://osf.io/ueq5f/>, paper preprint
>
>
> More information:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/OKCSGSLESF4LPRV67KTKIN2A6OYTJVFB/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] September 15, 2021: Socialization on Wikipedia

2021-09-13 Thread Janna Layton
Reminder that the September Research Showcase is this Wednesday.

On Thu, Sep 9, 2021 at 1:15 PM Janna Layton  wrote:

> Hello all,
>
> The September Wikimedia Research Showcase will be on September 15 at 16:30
> UTC (9:30am PT/ 12:30pm ET/ 18:30 CEST). The theme will be
> "socialization on Wikipedia" with speakers Rosta Farzan and J. Nathan
> Matias.
>
> Livestream: https://www.youtube.com/watch?v=YVqabVvLIZU
>
> Talk 1
>
> Speaker: Rosta Farzan (School of Computing and Information, University of
> Pittsburgh)
>
> Title: Unlocking the Wikipedia clubhouse to newcomers: results from two
> studies
>
> Abstract: It is no news to any of us that the success of online production
> communities such as Wikipedia relies heavily on a continuous stream of
> newcomers to replace the inevitable high turnover and to bring on board new
> sources of ideas and labor. However, these communities have struggled both
> to attract newcomers, especially from a diverse population of users, and to
> retain them. In this talk, I will present two different approaches to
> engaging new editors in Wikipedia: (1) newcomers joining through the Wiki
> Ed program, an online program in which college students edit Wikipedia
> articles as class assignments; and (2) newcomers joining through a
> Wikipedia Art+Feminism edit-a-thon. I will present how each approach
> incorporated techniques for engaging newcomers and how well each succeeded
> in attracting and retaining them.
>
> More information:
>
>- Bring on Board New Enthusiasts! A Case Study of Impact of Wikipedia
>Art + Feminism Edit-A-Thon Events on Newcomers
><https://link.springer.com/chapter/10.1007/978-3-319-47880-7_2>,
>SocInfo 2016 (pdf
><http://saviaga.com/wp-content/uploads/2016/06/socinfo_ediathons.pdf>)
>- Successful Online Socialization: Lessons from the Wikipedia
>Education Program <https://dl.acm.org/doi/abs/10.1145/3392857>, CSCW
>2020 (pdf
><https://www.cc.gatech.edu/~dyang888/docs/cscw_li_2020_wiki.pdf>)
>
>
> Talk 2
>
> Speaker: J. Nathan Matias <http://natematias.com/> (Citizens and
> Technology Lab <http://citizensandtech.org/>, Cornell University
> Departments of Communication and Information Science)
>
> Title: The Effect of Receiving Appreciation on Wikipedias: A Community
> Co-Designed Field Experiment
>
> Abstract: Can saying “thank you” make online communities stronger & more
> inclusive? Or does thanking others for their voluntary efforts have little
> effect? To ask this question, the Citizens and Technology Lab (CAT Lab)
> organized 344 volunteers to send thanks to Wikipedia contributors across
> the Arabic, German, Polish, and Persian languages. We then observed the
> behavior of 15,558 newcomers and experienced contributors to Wikipedia. On
> average, we found that organizing volunteers to thank others increases
> two-week retention of newcomers and experienced accounts. It also caused
> people to send more thanks to others. This study was a field experiment, a
> randomized trial that sent thanks to some people and not to others. These
> experiments can help answer questions about the impact of community
> practices and platform design. But they can sometimes face community
> mistrust, especially when researchers conduct them without community
> consent. In this talk, learn more about CAT Lab's approach to community-led
> research and discuss open questions about best practices.
>
> More information:
>
>    - Volunteers Thanked Thousands of Wikipedia Editors to Learn the Effects
>    of Receiving Thanks
>    <https://citizensandtech.org/2020/06/effects-of-saying-thanks-on-wikipedia/>,
>    blogpost (in EN, DE, AR, PL, FA) <https://osf.io/ueq5f/>
>    - The Diffusion and Influence of Gratitude Expressions in Large-Scale
>    Cooperation: A Field Experiment in Four Knowledge Networks
>    <https://osf.io/ueq5f/>, paper preprint
>
>
> More information:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/FGM24NLCCVBDL4PQIQMPFCPNHJAIM7V5/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] September 15, 2021: Socialization on Wikipedia

2021-09-09 Thread Janna Layton
Hello all,

The September Wikimedia Research Showcase will be on September 15 at 16:30
UTC (9:30am PT/ 12:30pm ET/ 18:30 CEST). The theme will be "socialization
on Wikipedia" with speakers Rosta Farzan and J. Nathan Matias.

Livestream: https://www.youtube.com/watch?v=YVqabVvLIZU

Talk 1

Speaker: Rosta Farzan (School of Computing and Information, University of
Pittsburgh)

Title: Unlocking the Wikipedia clubhouse to newcomers: results from two
studies

Abstract: It is no news to any of us that the success of online production
communities such as Wikipedia relies heavily on a continuous stream of
newcomers to replace the inevitable high turnover and to bring on board new
sources of ideas and labor. However, these communities have struggled both
to attract newcomers, especially from a diverse population of users, and to
retain them. In this talk, I will present two different approaches to
engaging new editors in Wikipedia: (1) newcomers joining through the Wiki
Ed program, an online program in which college students edit Wikipedia
articles as class assignments; and (2) newcomers joining through a
Wikipedia Art+Feminism edit-a-thon. I will present how each approach
incorporated techniques for engaging newcomers and how well each succeeded
in attracting and retaining them.

More information:

   - Bring on Board New Enthusiasts! A Case Study of Impact of Wikipedia
   Art + Feminism Edit-A-Thon Events on Newcomers
   <https://link.springer.com/chapter/10.1007/978-3-319-47880-7_2>, SocInfo
   2016 (pdf
   <http://saviaga.com/wp-content/uploads/2016/06/socinfo_ediathons.pdf>)
   - Successful Online Socialization: Lessons from the Wikipedia Education
   Program <https://dl.acm.org/doi/abs/10.1145/3392857>, CSCW 2020 (pdf
   <https://www.cc.gatech.edu/~dyang888/docs/cscw_li_2020_wiki.pdf>)


Talk 2

Speaker: J. Nathan Matias <http://natematias.com/> (Citizens and Technology
Lab <http://citizensandtech.org/>, Cornell University Departments of
Communication and Information Science)

Title: The Effect of Receiving Appreciation on Wikipedias: A Community
Co-Designed Field Experiment

Abstract: Can saying “thank you” make online communities stronger & more
inclusive? Or does thanking others for their voluntary efforts have little
effect? To ask this question, the Citizens and Technology Lab (CAT Lab)
organized 344 volunteers to send thanks to Wikipedia contributors across
the Arabic, German, Polish, and Persian languages. We then observed the
behavior of 15,558 newcomers and experienced contributors to Wikipedia. On
average, we found that organizing volunteers to thank others increases
two-week retention of newcomers and experienced accounts. It also caused
people to send more thanks to others. This study was a field experiment, a
randomized trial that sent thanks to some people and not to others. These
experiments can help answer questions about the impact of community
practices and platform design. But they can sometimes face community
mistrust, especially when researchers conduct them without community
consent. In this talk, learn more about CAT Lab's approach to community-led
research and discuss open questions about best practices.
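
As a rough illustration of the retention comparison described above, the
sketch below tests the difference in two-week retention between a
hypothetical thanked group and a control group. The counts, group sizes,
and variable names are invented; this is not the study's actual analysis.

# Hypothetical sketch: compare two-week retention between accounts randomly
# assigned to receive thanks and control accounts. Numbers are invented.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

retained = np.array([930, 860])   # accounts still active after two weeks
n_obs = np.array([2000, 2000])    # group sizes: [thanked, control]

# Two-sided z-test for a difference in retention proportions.
z_stat, p_value = proportions_ztest(retained, n_obs)
print("retention (thanked):", retained[0] / n_obs[0])
print("retention (control):", retained[1] / n_obs[1])
print("z =", round(z_stat, 3), "p =", round(p_value, 4))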

More information:

   - Volunteers Thanked Thousands of Wikipedia Editors to Learn the Effects
   of Receiving Thanks
   <https://citizensandtech.org/2020/06/effects-of-saying-thanks-on-wikipedia/>,
   blogpost (in EN, DE, AR, PL, FA) <https://osf.io/ueq5f/>
   - The Diffusion and Influence of Gratitude Expressions in Large-Scale
   Cooperation: A Field Experiment in Four Knowledge Networks
   <https://osf.io/ueq5f/>, paper preprint


More information: https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/TXPUKQWK7CWXYSAKBFPMLTV56ANIKHEI/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] July 21: Effects of campaigns to close content gaps

2021-07-21 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Thu, Jul 15, 2021 at 4:59 PM Janna Layton  wrote:

> Hello all,
>
> The July Research Showcase will take place on July 21, 16:30 UTC (9:30am
> PT/ 12:30pm ET/ 18:30 CEST). The theme is the effects of campaigns to
> close content gaps on Wikipedia, and speakers will be Kai Zhu from McGill
> University and Isabelle Langrock from the University of Pennsylvania.
>
> Livestream: https://www.youtube.com/watch?v=otN3H-hIImQ
>
> Talk 1
> Speaker: Kai Zhu (McGill University, Canada)
> Title: Addressing Information Poverty on Wikipedia
> Abstract: Open collaboration platforms have fundamentally changed the way
> that knowledge is produced, disseminated, and consumed. In these systems,
> contributions arise organically with little to no central governance.
> Although such decentralization provides many benefits, a lack of broad
> oversight and coordination can leave questions of information poverty and
> skewness to the mercy of the system’s natural dynamics. Unfortunately, we
> still lack a basic understanding of the dynamics at play in these systems
> and specifically, how contribution and attention interact and propagate
> through information networks. We leverage a large-scale natural experiment
> to study how exogenous content contributions to Wikipedia articles affect
> the attention that they attract and how that attention spills over to other
> articles in the network. Results reveal that exogenously added content
> leads to significant, substantial, and long-term increases in both content
> consumption and subsequent contributions. Furthermore, we find significant
> attention spillover to downstream hyperlinked articles. Through both
> analytical estimation and empirically informed simulation, we evaluate
> policies to harness this attention contagion to address the problem of
> information poverty and skewness. We find that harnessing attention
> contagion can lead to as much as a twofold increase in the total attention
> flow to clusters of disadvantaged articles. Our findings have important
> policy implications for open collaboration platforms and information
> networks.
>
> Talk 2
> Speaker: Isabelle Langrock (University of Pennsylvania, USA)
> Title: Quantifying and Assessing the Impact of Two Feminist Interventions
> Abstract: Wikipedia has a well-known gender divide affecting its
> biographical content. This bias not only shapes social perceptions of
> knowledge, but it can also propagate beyond the platform as its contents
> are leveraged to correct misinformation, train machine-learning tools, and
> enhance search engine results. What happens when feminist movements
> intervene to try to close existing gaps? In this talk, we present a recent
> study of two popular feminist interventions designed to counteract digital
> knowledge inequality. Our findings show that the interventions are
> successful at adding content about women that would otherwise be missing,
> but they are less successful at addressing several structural biases that
> limit the visibility of women within Wikipedia. We argue for more granular
> and cumulative analysis of gender divides in collaborative environments and
> identify key areas of support that can further aid the feminist movements
> in closing Wikipedia’s gender gaps.
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/BZWN5QP5S7R6G6IT5A6RQJ6YAY5WXQZY/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] July 21: Effects of campaigns to close content gaps

2021-07-19 Thread Janna Layton
Hi all,

Just a reminder for our upcoming Research Showcase.

On Thu, Jul 15, 2021 at 4:59 PM Janna Layton  wrote:

> Hello all,
>
> The July Research Showcase will take place on July 21, 16:30 UTC (9:30am
> PT/ 12:30pm ET/ 18:30 CEST). The theme is the effects of campaigns to
> close content gaps on Wikipedia, and speakers will be Kai Zhu from McGill
> University and Isabelle Langrock from the University of Pennsylvania.
>
> Livestream: https://www.youtube.com/watch?v=otN3H-hIImQ
>
> Talk 1
> Speaker: Kai Zhu (McGill University, Canada)
> Title: Addressing Information Poverty on Wikipedia
> Abstract: Open collaboration platforms have fundamentally changed the way
> that knowledge is produced, disseminated, and consumed. In these systems,
> contributions arise organically with little to no central governance.
> Although such decentralization provides many benefits, a lack of broad
> oversight and coordination can leave questions of information poverty and
> skewness to the mercy of the system’s natural dynamics. Unfortunately, we
> still lack a basic understanding of the dynamics at play in these systems
> and specifically, how contribution and attention interact and propagate
> through information networks. We leverage a large-scale natural experiment
> to study how exogenous content contributions to Wikipedia articles affect
> the attention that they attract and how that attention spills over to other
> articles in the network. Results reveal that exogenously added content
> leads to significant, substantial, and long-term increases in both content
> consumption and subsequent contributions. Furthermore, we find significant
> attention spillover to downstream hyperlinked articles. Through both
> analytical estimation and empirically informed simulation, we evaluate
> policies to harness this attention contagion to address the problem of
> information poverty and skewness. We find that harnessing attention
> contagion can lead to as much as a twofold increase in the total attention
> flow to clusters of disadvantaged articles. Our findings have important
> policy implications for open collaboration platforms and information
> networks.
>
> Talk 2
> Speaker: Isabelle Langrock (University of Pennsylvania, USA)
> Title: Quantifying and Assessing the Impact of Two Feminist Interventions
> Abstract: Wikipedia has a well-known gender divide affecting its
> biographical content. This bias not only shapes social perceptions of
> knowledge, but it can also propagate beyond the platform as its contents
> are leveraged to correct misinformation, train machine-learning tools, and
> enhance search engine results. What happens when feminist movements
> intervene to try to close existing gaps? In this talk, we present a recent
> study of two popular feminist interventions designed to counteract digital
> knowledge inequality. Our findings show that the interventions are
> successful at adding content about women that would otherwise be missing,
> but they are less successful at addressing several structural biases that
> limit the visibility of women within Wikipedia. We argue for more granular
> and cumulative analysis of gender divides in collaborative environments and
> identify key areas of support that can further aid the feminist movements
> in closing Wikipedia’s gender gaps.
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/KALZ7G7ELOEXDMWA44IMJDPG2DLTFSNG/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] July 21: Effects of campaigns to close content gaps

2021-07-15 Thread Janna Layton
Hello all,

The July Research Showcase will take place on July 21, 16:30 UTC (9:30am
PT/ 12:30pm ET/ 18:30 CEST). The theme is the effects of campaigns to
close content gaps on Wikipedia, and speakers will be Kai Zhu from McGill
University and Isabelle Langrock from the University of Pennsylvania.

Livestream: https://www.youtube.com/watch?v=otN3H-hIImQ

Talk 1
Speaker: Kai Zhu (McGill University, Canada)
Title: Addressing Information Poverty on Wikipedia
Abstract: Open collaboration platforms have fundamentally changed the way
that knowledge is produced, disseminated, and consumed. In these systems,
contributions arise organically with little to no central governance.
Although such decentralization provides many benefits, a lack of broad
oversight and coordination can leave questions of information poverty and
skewness to the mercy of the system’s natural dynamics. Unfortunately, we
still lack a basic understanding of the dynamics at play in these systems
and specifically, how contribution and attention interact and propagate
through information networks. We leverage a large-scale natural experiment
to study how exogenous content contributions to Wikipedia articles affect
the attention that they attract and how that attention spills over to other
articles in the network. Results reveal that exogenously added content
leads to significant, substantial, and long-term increases in both content
consumption and subsequent contributions. Furthermore, we find significant
attention spillover to downstream hyperlinked articles. Through both
analytical estimation and empirically informed simulation, we evaluate
policies to harness this attention contagion to address the problem of
information poverty and skewness. We find that harnessing attention
contagion can lead to as much as a twofold increase in the total attention
flow to clusters of disadvantaged articles. Our findings have important
policy implications for open collaboration platforms and information
networks.
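
As a purely illustrative sketch of how the effect of an exogenous content
addition on subsequent attention might be estimated, the code below fits a
difference-in-differences regression on simulated pageview data. The
variable names, numbers, and model are invented and are not the authors'
actual estimation strategy.

# Hypothetical difference-in-differences sketch: does an exogenous content
# addition change an article's subsequent (log) pageviews? Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for article in range(200):
    treated = int(article < 100)        # first half receive added content
    base = rng.normal(6, 1)             # article-specific log-pageview level
    for post in (0, 1):                 # before/after the content addition
        effect = 0.3 * treated * post   # assumed treatment effect
        rows.append({
            "article": article, "treated": treated, "post": post,
            "log_views": base + 0.1 * post + effect + rng.normal(0, 0.2),
        })
panel = pd.DataFrame(rows)

# The coefficient on treated:post is the difference-in-differences estimate.
did = smf.ols("log_views ~ treated + post + treated:post", data=panel).fit()
print(did.params["treated:post"])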

Talk 2
Speaker: Isabelle Langrock (University of Pennsylvania, USA)
Title: Quantifying and Assessing the Impact of Two Feminist Interventions
Abstract: Wikipedia has a well-known gender divide affecting its
biographical content. This bias not only shapes social perceptions of
knowledge, but it can also propagate beyond the platform as its contents
are leveraged to correct misinformation, train machine-learning tools, and
enhance search engine results. What happens when feminist movements
intervene to try to close existing gaps? In this talk, we present a recent
study of two popular feminist interventions designed to counteract digital
knowledge inequality. Our findings show that the interventions are
successful at adding content about women that would otherwise be missing,
but they are less successful at addressing several structural biases that
limit the visibility of women within Wikipedia. We argue for more granular
and cumulative analysis of gender divides in collaborative environments and
identify key areas of support that can further aid the feminist movements
in closing Wikipedia’s gender gaps.

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/WJ5WE7LNAAZTKROQBK255VN52SU2WHOB/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] AI Model Governance: June 23, 2021

2021-06-23 Thread Janna Layton
Reminder: the Research Showcase will be starting in about 30 minutes.

On Mon, Jun 21, 2021 at 11:52 AM Janna Layton  wrote:

> Hello all,
>
> This month's Wikimedia Research Showcase will be held on Wednesday, June
> 23, at 16:30 UTC (9:30am PDT). The theme is "AI model governance" with
> presentations from Haiyi Zhu of Carnegie Mellon University and Andy Craze
> and Chris Albon of the Wikimedia Foundation's Machine Learning Team.
>
> YouTube: https://www.youtube.com/watch?v=USSBuwebWt4
> MediaWiki:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#June_2021
>
> Talk 1
>
> Speaker: Haiyi Zhu (Carnegie Mellon University)
>
> Title: Bridging AI and HCI: Incorporating Human Values into the
> Development of AI Technologies
>
> Abstract: The increasing accuracy and falling costs of AI have stimulated
> the increased use of AI technologies in mainstream user-facing applications
> and services. However, there is a disconnect between mathematically
> rigorous AI approaches and the human stakeholders’ needs, motivations, and
> values, as well as organizational and institutional realities, contexts,
> and constraints; this disconnect is likely to undermine practical
> initiatives and may sometimes lead to negative societal impacts. In this
> presentation, I will discuss my research on incorporating human
> stakeholders’  values and feedback into the creation process of AI
> technologies. I will describe a series of projects in the context of the
> Wikipedia community to illustrate my approach. I hope this presentation
> will contribute to the rich ongoing conversation concerning bridging HCI
> and AI and using HCI methods to address AI challenges.
>
> Talk 2
>
> Speaker: Andy Craze, Chris Albon (Wikimedia Foundation, Machine Learning
> Team)
>
> Title: ML Governance: First Steps
>
> Abstract: The WMF Machine Learning team is upgrading the Foundation's
> infrastructure to support the modern machine learning ecosystem. As part of
> this work, the team seeks to understand its ethical and legal
> responsibilities for developing and hosting predictive models within a
> global context. Drawing from previous WMF research related to ethical &
> human-centered machine learning, the team wishes to begin a series of
> conversations to discuss how we can deploy responsible systems that are
> inclusive to newcomers and non-experts, while upholding our commitment to
> free and open knowledge.
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/AE7XR4HGR6CZKZKZF65YQNJWUCCUANYT/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] AI Model Governance: June 23, 2021

2021-06-21 Thread Janna Layton
Hello all,

This month's Wikimedia Research Showcase will be held on Wednesday, June
23, at 16:30 UTC (9:30am PDT). The theme is "AI model governance" with
presentations from Haiyi Zhu of Carnegie Mellon University and Andy Craze
and Chris Albon of the Wikimedia Foundation's Machine Learning Team.

Youtube: https://www.youtube.com/watch?v=USSBuwebWt4
Mediawiki:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#June_2021

Talk 1

Speaker: Haiyi Zhu (Carnegie Mellon University)

Title: Bridging AI and HCI: Incorporating Human Values into the Development
of AI Technologies

Abstract: The increasing accuracy and falling costs of AI have stimulated
the increased use of AI technologies in mainstream user-facing applications
and services. However, there is a disconnect between mathematically
rigorous AI approaches and the human stakeholders’ needs, motivations, and
values, as well as organizational and institutional realities, contexts,
and constraints; this disconnect is likely to undermine practical
initiatives and may sometimes lead to negative societal impacts. In this
presentation, I will discuss my research on incorporating human
stakeholders’  values and feedback into the creation process of AI
technologies. I will describe a series of projects in the context of the
Wikipedia community to illustrate my approach. I hope this presentation
will contribute to the rich ongoing conversation concerning bridging HCI
and AI and using HCI methods to address AI challenges.

Talk 2

Speaker: Andy Craze, Chris Albon (Wikimedia Foundation, Machine Learning
Team)

Title: ML Governance: First Steps

Abstract: The WMF Machine Learning team is upgrading the Foundation's
infrastructure to support the modern machine learning ecosystem. As part of
this work, the team seeks to understand its ethical and legal
responsibilities for developing and hosting predictive models within a
global context. Drawing from previous WMF research related to ethical &
human-centered machine learning, the team wishes to begin a series of
conversations to discuss how we can deploy responsible systems that are
inclusive to newcomers and non-experts, while upholding our commitment to
free and open knowledge.

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/7EVZEJRSHHV225AIJTWVNZOCP4LF2UHX/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] The Importance and Value of Wikipedia: May 19, 16:30 UTC

2021-05-19 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Thu, May 13, 2021 at 2:37 PM Janna Layton  wrote:

> Hello all,
>
>
> In this month’s showcase, we have two presentations on the importance and
> value of Wikipedia outside of Wikipedia. In the first talk, Nick Vincent
> will present results on the importance of Wikipedia for search engines in
> which they looked at how often and where Wikipedia links appeared in search
> results of Google, Bing, DuckDuckGo, etc. In the second talk, Tiziano
> Piccardi will present a recent study on the value of Wikipedia’s links to
> external websites by quantifying the amount of traffic generated by those
> links and analyzing for which types of articles Wikipedia acts as a
> stepping stone to the intended destination.
>
> Time/date: May 19, 16:30 UTC (9:30am PT/ 12:30pm ET/ 18:30 CET)
>
> Youtube-link: https://www.youtube.com/watch?v=VoX5rFNzkXs
>
>
> Talk 1
>
> Speaker: Nick Vincent (Northwestern University, USA)
>
> Title: The Importance of Wikipedia to Search Engines and Other Systems
>
> Abstract: A growing body of work has highlighted the important role that
> Wikipedia’s volunteer-created content plays in helping search engines
> achieve their core goal of addressing the information needs of hundreds of
> millions of people. In this talk, I will discuss a recent study looking at
> how often, and where, Wikipedia links appear in search engine results. In
> this study, we found that Wikipedia links appeared prominently and
> frequently in Google, Bing, and DuckDuckGo results, though less often for
> searches from a mobile device. I will connect this study to past work
> looking at the value of Wikipedia links to other online platforms, and to
> ongoing discussions around Wikipedia's value as a training source for
> modern AI.
>
> Related paper:
>
>-
>
>A Deeper Investigation of the Importance of Wikipedia Links to Search
>Engine Results. To Appear in CSCW 2021.
>https://nickmvincent.com/static/wikiserp_cscw.pdf (pdf)
>
>
> Talk 2
>
> Speaker: Tiziano Piccardi (EPFL, Switzerland)
>
> Title: On the Value of Wikipedia as a Gateway to the Web
>
> Abstract: By linking to external websites, Wikipedia can act as a gateway
> to the Web. However, little is known about the amount of traffic generated
> by Wikipedia's external links. We fill this gap in a detailed analysis of
> usage logs gathered from Wikipedia users' client devices. We discovered
> that in one month, English Wikipedia generated 43M clicks to external
> websites, with the highest click-through rate on the official links listed
> in the infoboxes. Our analysis highlights that the articles about
> businesses, educational institutions, and websites show the highest
> engagement, and for some content, Wikipedia acts as a stepping stone to the
> intended destination. We conclude our analysis by quantifying the
> hypothetical economic value of the clicks received by external websites. We
> estimate that the respective website owners would need to pay a total of
> $7–13 million per month to obtain the same volume of traffic via sponsored
> search. These findings shed light on Wikipedia's role not only as an
> important source of information but also as a high-traffic gateway to the
> broader Web ecosystem.
>
> Related paper:
>
>-
>
>On the Value of Wikipedia as a Gateway to the Web. WWW 2021.
>https://arxiv.org/pdf/2102.07385 (pdf)
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikimedia Research Showcase] The Importance and Value of Wikipedia: May 19, 16:30 UTC

2021-05-17 Thread Janna Layton
Just a reminder that the May Research Showcase will be this Wednesday.

On Thu, May 13, 2021 at 2:37 PM Janna Layton  wrote:

> Hello all,
>
>
> In this month’s showcase, we have two presentations on the importance and
> value of Wikipedia outside of Wikipedia. In the first talk, Nick Vincent
> will present results on the importance of Wikipedia for search engines in
> which they looked at how often and where Wikipedia links appeared in search
> results of Google, Bing, DuckDuckGo, etc. In the second talk, Tiziano
> Piccardi will present a recent study on the value of Wikipedia’s links to
> external websites by quantifying the amount of traffic generated by those
> links and analyzing for which types of articles Wikipedia acts as a
> stepping stone to the intended destination.
>
> Time/date: May 19, 16:30 UTC (9:30am PT/ 12:30pm ET/ 18:30 CET)
>
> Youtube-link: https://www.youtube.com/watch?v=VoX5rFNzkXs
>
>
> Talk 1
>
> Speaker: Nick Vincent (Northwestern University, USA)
>
> Title: The Importance of Wikipedia to Search Engines and Other Systems
>
> Abstract: A growing body of work has highlighted the important role that
> Wikipedia’s volunteer-created content plays in helping search engines
> achieve their core goal of addressing the information needs of hundreds of
> millions of people. In this talk, I will discuss a recent study looking at
> how often, and where, Wikipedia links appear in search engine results. In
> this study, we found that Wikipedia links appeared prominently and
> frequently in Google, Bing, and DuckDuckGo results, though less often for
> searches from a mobile device. I will connect this study to past work
> looking at the value of Wikipedia links to other online platforms, and to
> ongoing discussions around Wikipedia's value as a training source for
> modern AI.
>
> Related paper:
>
>-
>
>A Deeper Investigation of the Importance of Wikipedia Links to Search
>Engine Results. To Appear in CSCW 2021.
>https://nickmvincent.com/static/wikiserp_cscw.pdf (pdf)
>
>
> Talk 2
>
> Speaker: Tiziano Piccardi (EPFL, Switzerland)
>
> Title: On the Value of Wikipedia as a Gateway to the Web
>
> Abstract: By linking to external websites, Wikipedia can act as a gateway
> to the Web. However, little is known about the amount of traffic generated
> by Wikipedia's external links. We fill this gap in a detailed analysis of
> usage logs gathered from Wikipedia users' client devices. We discovered
> that in one month, English Wikipedia generated 43M clicks to external
> websites, with the highest click-through rate on the official links listed
> in the infoboxes. Our analysis highlights that the articles about
> businesses, educational institutions, and websites show the highest
> engagement, and for some content, Wikipedia acts as a stepping stone to the
> intended destination. We conclude our analysis by quantifying the
> hypothetical economic value of the clicks received by external websites. We
> estimate that the respective website owners would need to pay a total of
> $7–13 million per month to obtain the same volume of traffic via sponsored
> search. These findings shed light on Wikipedia's role not only as an
> important source of information but also as a high-traffic gateway to the
> broader Web ecosystem.
>
> Related paper:
>
>-
>
>On the Value of Wikipedia as a Gateway to the Web. WWW 2021.
>https://arxiv.org/pdf/2102.07385 (pdf)
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] [Wikimedia Research Showcase] The Importance and Value of Wikipedia: May 19, 16:30 UTC

2021-05-13 Thread Janna Layton
Hello all,


In this month’s showcase, we have two presentations on the importance and
value of Wikipedia outside of Wikipedia. In the first talk, Nick Vincent
will present results on the importance of Wikipedia for search engines in
which they looked at how often and where Wikipedia links appeared in search
results of Google, Bing, DuckDuckGo, etc. In the second talk, Tiziano
Piccardi will present a recent study on the value of Wikipedia’s links to
external websites by quantifying the amount of traffic generated by those
links and analyzing for which types of articles Wikipedia acts as a
stepping stone to the intended destination.

Time/date: May 19, 16:30 UTC (9:30am PT/ 12:30pm ET/ 18:30 CET)

Youtube-link: https://www.youtube.com/watch?v=VoX5rFNzkXs


Talk 1

Speaker: Nick Vincent (Northwestern University, USA)

Title: The Importance of Wikipedia to Search Engines and Other Systems

Abstract: A growing body of work has highlighted the important role that
Wikipedia’s volunteer-created content plays in helping search engines
achieve their core goal of addressing the information needs of hundreds of
millions of people. In this talk, I will discuss a recent study looking at
how often, and where, Wikipedia links appear in search engine results. In
this study, we found that Wikipedia links appeared prominently and
frequently in Google, Bing, and DuckDuckGo results, though less often for
searches from a mobile device. I will connect this study to past work
looking at the value of Wikipedia links to other online platforms, and to
ongoing discussions around Wikipedia's value as a training source for
modern AI.

Related paper:

   - A Deeper Investigation of the Importance of Wikipedia Links to Search
     Engine Results. To appear in CSCW 2021.
     https://nickmvincent.com/static/wikiserp_cscw.pdf (pdf)
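
As a loose sketch of this kind of measurement (not the study's actual
data-collection pipeline), the snippet below takes a small set of
pre-collected result URLs per query, invented here as placeholders for
scraped search pages, and reports how often a Wikipedia link appears and at
what rank:

# Toy measurement of Wikipedia's presence in search results.
# `serp_results` stands in for result URLs collected per query; the data
# here are invented and do not come from the study.
from urllib.parse import urlparse

serp_results = {
    "solar eclipse": [
        "https://en.wikipedia.org/wiki/Solar_eclipse",
        "https://www.nasa.gov/eclipse",
        "https://www.timeanddate.com/eclipse/",
    ],
    "pizza near me": [
        "https://www.yelp.com/search",
        "https://www.tripadvisor.com/Restaurants",
        "https://en.wikipedia.org/wiki/Pizza",
    ],
}

def is_wikipedia(url):
    host = urlparse(url).netloc.lower()
    return host.endswith("wikipedia.org")

appearances, first_ranks = 0, []
for query, urls in serp_results.items():
    wiki_ranks = [i + 1 for i, url in enumerate(urls) if is_wikipedia(url)]
    if wiki_ranks:
        appearances += 1
        first_ranks.append(wiki_ranks[0])   # rank of the first Wikipedia result

print(f"Queries with a Wikipedia result: {appearances}/{len(serp_results)}")
if first_ranks:
    print(f"Mean rank of first Wikipedia result: {sum(first_ranks)/len(first_ranks):.1f}")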


Talk 2

Speaker: Tiziano Piccardi (EPFL, Switzerland)

Title: On the Value of Wikipedia as a Gateway to the Web

Abstract: By linking to external websites, Wikipedia can act as a gateway
to the Web. However, little is known about the amount of traffic generated
by Wikipedia's external links. We fill this gap in a detailed analysis of
usage logs gathered from Wikipedia users' client devices. We discovered
that in one month, English Wikipedia generated 43M clicks to external
websites, with the highest click-through rate on the official links listed
in the infoboxes. Our analysis highlights that the articles about
businesses, educational institutions, and websites show the highest
engagement, and for some content, Wikipedia acts as a stepping stone to the
intended destination. We conclude our analysis by quantifying the
hypothetical economic value of the clicks received by external websites. We
estimate that the respective website owners would need to pay a total of
$7–13 million per month to obtain the same volume of traffic via sponsored
search. These findings shed light on Wikipedia's role not only as an
important source of information but also as a high-traffic gateway to the
broader Web ecosystem.

Related paper:

   - On the Value of Wikipedia as a Gateway to the Web. WWW 2021.
     https://arxiv.org/pdf/2102.07385 (pdf)
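
To make the arithmetic behind such an estimate concrete, here is a minimal
sketch with invented placeholder numbers (pageviews, click counts, and an
assumed cost per click; none of these are figures from the paper) that
derives click-through rates and a hypothetical sponsored-search value:

# Back-of-the-envelope estimate of click-through rate and the hypothetical
# cost of buying equivalent traffic via sponsored search.
# All figures are invented placeholders, not values from the paper.

articles = {
    # article: (monthly pageviews, clicks on external links)
    "Some University": (250_000, 12_000),
    "Some Company":    (400_000, 30_000),
    "Some Website":    (150_000, 18_000),
}
assumed_cost_per_click = 0.50   # USD, an assumption for illustration

total_clicks = 0
for article, (views, clicks) in articles.items():
    ctr = clicks / views
    total_clicks += clicks
    print(f"{article}: CTR = {ctr:.1%}")

print(f"Hypothetical monthly sponsored-search value: "
      f"${total_clicks * assumed_cost_per_click:,.0f}")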


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

Re: [Wikimedia-l] [Wikimedia Research Showcase] March 17: Curiosity

2021-03-17 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Thu, Mar 11, 2021 at 12:49 PM Janna Layton  wrote:

> In this showcase, Prof. Danielle Bassett will present recent work studying
> individual and collective curiosity as network building processes using
> Wikipedia.
>
> Date/Time: March 17, 16:30 UTC (9:30am PT/12:30pm ET/17:30 CET)
> Youtube: https://www.youtube.com/watch?v=jw2s_Y4J2tI
>
> Speaker: Danielle Bassett (University of Pennsylvania)
>
> Title: The curious human
>
> Abstract: The human mind is curious. It is strange, remarkable, and
> mystifying; it is eager, probing, questioning. Despite its pervasiveness
> and its relevance for our well-being, scientific studies of human curiosity
> that bridge both the organ of curiosity and the object of curiosity remain
> in their infancy. In this talk, I will integrate historical, philosophical,
> and psychological perspectives with techniques from applied mathematics and
> statistical physics to study individual and collective curiosity. In the
> former, I will evaluate how humans walk on the knowledge network of
> Wikipedia during unconstrained browsing. In doing so, we will capture
> idiosyncratic forms of curiosity that span multiple millennia, cultures,
> languages, and timescales. In the latter, I will consider the fruition of
> collective curiosity in the building of scientific knowledge as encoded in
> Wikipedia. Throughout, I will make a case for the position that individual
> and collective curiosity are both network building processes, providing a
> connective counterpoint to the common acquisitional account of curiosity in
> humans.
>
> Related papers:
>
> Hunters, busybodies, and the knowledge network building associated with
> curiosity. https://doi.org/10.31234/osf.io/undy4
>
> The network structure of scientific revolutions.
> http://arxiv.org/abs/2010.08381
>
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#March_2021
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] March 17: Curiosity

2021-03-15 Thread Janna Layton
Reminder that this will be happening on Wednesday.

On Thu, Mar 11, 2021 at 12:49 PM Janna Layton  wrote:

> In this showcase, Prof. Danielle Bassett will present recent work studying
> individual and collective curiosity as network building processes using
> Wikipedia.
>
> Date/Time: March 17, 16:30 UTC (9:30am PT/12:30pm ET/17:30 CET)
> Youtube: https://www.youtube.com/watch?v=jw2s_Y4J2tI
>
> Speaker: Danielle Bassett (University of Pennsylvania)
>
> Title: The curious human
>
> Abstract: The human mind is curious. It is strange, remarkable, and
> mystifying; it is eager, probing, questioning. Despite its pervasiveness
> and its relevance for our well-being, scientific studies of human curiosity
> that bridge both the organ of curiosity and the object of curiosity remain
> in their infancy. In this talk, I will integrate historical, philosophical,
> and psychological perspectives with techniques from applied mathematics and
> statistical physics to study individual and collective curiosity. In the
> former, I will evaluate how humans walk on the knowledge network of
> Wikipedia during unconstrained browsing. In doing so, we will capture
> idiosyncratic forms of curiosity that span multiple millennia, cultures,
> languages, and timescales. In the latter, I will consider the fruition of
> collective curiosity in the building of scientific knowledge as encoded in
> Wikipedia. Throughout, I will make a case for the position that individual
> and collective curiosity are both network building processes, providing a
> connective counterpoint to the common acquisitional account of curiosity in
> humans.
>
> Related papers:
>
> Hunters, busybodies, and the knowledge network building associated with
> curiosity. https://doi.org/10.31234/osf.io/undy4
>
> The network structure of scientific revolutions.
> http://arxiv.org/abs/2010.08381
>
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#March_2021
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wikimedia-l] [Wikimedia Research Showcase] March 17: Curiosity

2021-03-11 Thread Janna Layton
In this showcase, Prof. Danielle Bassett will present recent work studying
individual and collective curiosity as network building processes using
Wikipedia.

Date/Time: March 17, 16:30 UTC (9:30am PT/12:30pm ET/17:30 CET)
Youtube: https://www.youtube.com/watch?v=jw2s_Y4J2tI

Speaker: Danielle Bassett (University of Pennsylvania)

Title: The curious human

Abstract: The human mind is curious. It is strange, remarkable, and
mystifying; it is eager, probing, questioning. Despite its pervasiveness
and its relevance for our well-being, scientific studies of human curiosity
that bridge both the organ of curiosity and the object of curiosity remain
in their infancy. In this talk, I will integrate historical, philosophical,
and psychological perspectives with techniques from applied mathematics and
statistical physics to study individual and collective curiosity. In the
former, I will evaluate how humans walk on the knowledge network of
Wikipedia during unconstrained browsing. In doing so, we will capture
idiosyncratic forms of curiosity that span multiple millennia, cultures,
languages, and timescales. In the latter, I will consider the fruition of
collective curiosity in the building of scientific knowledge as encoded in
Wikipedia. Throughout, I will make a case for the position that individual
and collective curiosity are both network building processes, providing a
connective counterpoint to the common acquisitional account of curiosity in
humans.

Related papers:

Hunters, busybodies, and the knowledge network building associated with
curiosity. https://doi.org/10.31234/osf.io/undy4

The network structure of scientific revolutions.
http://arxiv.org/abs/2010.08381
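
As a small, purely illustrative sketch of treating browsing as a
network-building process (assumed toy data, not Prof. Bassett's models),
the snippet below turns one hypothetical reading session into a growing
graph of articles and reports how densely the visited pages are connected:

# Build a tiny "knowledge network" from one hypothetical browsing session:
# each visited article is a node, and consecutive visits add an edge.
# The session below is invented for illustration.
from itertools import combinations

session = ["Curiosity", "Dopamine", "Reward system", "Dopamine", "Motivation"]

nodes = set(session)
edges = set()
for a, b in zip(session, session[1:]):
    if a != b:
        edges.add(frozenset((a, b)))

possible_edges = len(list(combinations(nodes, 2)))
density = len(edges) / possible_edges if possible_edges else 0.0

print(f"Articles visited: {len(nodes)}, edges created: {len(edges)}")
print(f"Network density of this session: {density:.2f}")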

https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#March_2021

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] December 16, 2020: Disinformation and Reliability of Sources in Wikipedia

2020-12-16 Thread Janna Layton
This will be starting in about 30 minutes.

On Mon, Dec 14, 2020 at 11:01 AM Janna Layton  wrote:

> Just a reminder that this Showcase will be happening on Wednesday.
>
> On Fri, Dec 11, 2020 at 11:22 AM Janna Layton 
> wrote:
>
>> Hello,
>>
>> The next Research Showcase will be live-streamed on Wednesday, December
>> 16, at 9:30 AM PST/17:30 UTC, and will be on the theme of disinformation
>> and reliability of sources in Wikipedia. In the first talk, Włodzimierz
>> Lewoniewski will present recent work around multilingual approaches for the
>> assessment of content quality and reliability of sources in Wikipedia
>> leveraging machine learning algorithms. In the second talk, Diego
>> Saez-Trumper will give an overview of ongoing work on fighting
>> disinformation in Wikipedia; specifically, the development of tools and
>> datasets aimed at supporting the discovery of suspicious content and
>> improving verifiability.
>>
>> Youtube stream: https://www.youtube.com/watch?v=v9Wcc-TeaEY
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> Talk 1
>>
>> Speaker: Włodzimierz Lewoniewski (Poznań University of Economics and
>> Business, Poland)
>>
>> Title: Quality assessment of Wikipedia and its sources
>>
>> Abstract: Information in Wikipedia can be edited independently in over
>> 300 languages, so the same subject can often be described differently
>> depending on the language edition. To compare information between
>> editions, one usually needs to understand each of the languages
>> involved. We work on solutions, leveraging machine learning and
>> artificial intelligence algorithms, that can help to automate this
>> process. The crucial component, however, is the assessment of article
>> quality; we therefore need to know how to define and extract different
>> quality measures. This presentation briefly introduces some of the
>> recent activities of the Department of Information Systems at Poznań
>> University of Economics and Business related to the quality assessment
>> of multilingual content in Wikipedia. In particular, we demonstrate some
>> of the approaches for the reliability assessment of sources in Wikipedia
>> articles. Such solutions can help to enrich various language editions of
>> Wikipedia and other knowledge bases with information of better quality.
>>
>>
>> Talk 2
>>
>> Speaker: Diego Saez-Trumper (Research, Wikimedia Foundation)
>>
>> Title: Challenges on fighting Disinformation in Wikipedia: Who has the
>> (ground-)truth?
>>
>> Abstract: Unlike the major social media websites, where the fight
>> against disinformation mainly means preventing users from massively
>> replicating fake content, fighting disinformation on Wikipedia requires
>> tools that allow editors to apply the content policies of verifiability,
>> no original research, and neutral point of view. Moreover, while other
>> platforms try to apply automatic fact-checking techniques to verify
>> content, the ground truth for such verification is itself largely drawn
>> from Wikipedia, so for obvious reasons we can't follow the same pipeline
>> for fact-checking content on Wikipedia. In this talk we will explain the
>> ML approach we are developing to build tools that efficiently support
>> Wikipedians in discovering suspicious content, and how we collaborate
>> with external researchers on this task. We will also describe a group of
>> datasets we are preparing to share with the research community in order
>> to produce state-of-the-art algorithms to improve the verifiability of
>> content on Wikipedia.
>>
>> --
>> Janna Layton (she/her)
>> Administrative Associate - Product & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] December 16, 2020: Disinformation and Reliability of Sources in Wikipedia

2020-12-14 Thread Janna Layton
Just a reminder that this Showcase will be happening on Wednesday.

On Fri, Dec 11, 2020 at 11:22 AM Janna Layton  wrote:

> Hello,
>
> The next Research Showcase will be live-streamed on Wednesday, December
> 16, at 9:30 AM PST/17:30 UTC, and will be on the theme of disinformation
> and reliability of sources in Wikipedia. In the first talk, Włodzimierz
> Lewoniewski will present recent work around multilingual approaches for the
> assessment of content quality and reliability of sources in Wikipedia
> leveraging machine learning algorithms. In the second talk, Diego
> Saez-Trumper will give an overview of ongoing work on fighting
> disinformation in Wikipedia; specifically, the development of tools and
> datasets aimed at supporting the discovery of suspicious content and
> improving verifiability.
>
> Youtube stream: https://www.youtube.com/watch?v=v9Wcc-TeaEY
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> Talk 1
>
> Speaker: Włodzimierz Lewoniewski (Poznań University of Economics and
> Business, Poland)
>
> Title: Quality assessment of Wikipedia and its sources
>
> Abstract: Information in Wikipedia can be edited independently in over
> 300 languages, so the same subject can often be described differently
> depending on the language edition. To compare information between
> editions, one usually needs to understand each of the languages
> involved. We work on solutions, leveraging machine learning and
> artificial intelligence algorithms, that can help to automate this
> process. The crucial component, however, is the assessment of article
> quality; we therefore need to know how to define and extract different
> quality measures. This presentation briefly introduces some of the
> recent activities of the Department of Information Systems at Poznań
> University of Economics and Business related to the quality assessment
> of multilingual content in Wikipedia. In particular, we demonstrate some
> of the approaches for the reliability assessment of sources in Wikipedia
> articles. Such solutions can help to enrich various language editions of
> Wikipedia and other knowledge bases with information of better quality.
>
>
> Talk 2
>
> Speaker: Diego Saez-Trumper (Research, Wikimedia Foundation)
>
> Title: Challenges on fighting Disinformation in Wikipedia: Who has the
> (ground-)truth?
>
> Abstract: Unlike the major social media websites, where the fight
> against disinformation mainly means preventing users from massively
> replicating fake content, fighting disinformation on Wikipedia requires
> tools that allow editors to apply the content policies of verifiability,
> no original research, and neutral point of view. Moreover, while other
> platforms try to apply automatic fact-checking techniques to verify
> content, the ground truth for such verification is itself largely drawn
> from Wikipedia, so for obvious reasons we can't follow the same pipeline
> for fact-checking content on Wikipedia. In this talk we will explain the
> ML approach we are developing to build tools that efficiently support
> Wikipedians in discovering suspicious content, and how we collaborate
> with external researchers on this task. We will also describe a group of
> datasets we are preparing to share with the research community in order
> to produce state-of-the-art algorithms to improve the verifiability of
> content on Wikipedia.
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wikimedia-l] [Wikimedia Research Showcase] December 16, 2020: Disinformation and Reliability of Sources in Wikipedia

2020-12-11 Thread Janna Layton
Hello,

The next Research Showcase will be live-streamed on Wednesday, December 16,
at 9:30 AM PST/17:30 UTC, and will be on the theme of disinformation and
reliability of sources in Wikipedia. In the first talk, Włodzimierz
Lewoniewski will present recent work around multilingual approaches for the
assessment of content quality and reliability of sources in Wikipedia
leveraging machine learning algorithms. In the second talk, Diego
Saez-Trumper will give an overview of ongoing work on fighting
disinformation in Wikipedia; specifically, the development of tools and
datasets aimed at supporting the discovery of suspicious content and
improving verifiability.

Youtube stream: https://www.youtube.com/watch?v=v9Wcc-TeaEY

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

Talk 1

Speaker: Włodzimierz Lewoniewski (Poznań University of Economics and
Business, Poland)

Title: Quality assessment of Wikipedia and its sources

Abstract: Information in Wikipedia can be edited independently in over
300 languages, so the same subject can often be described differently
depending on the language edition. To compare information between
editions, one usually needs to understand each of the languages
involved. We work on solutions, leveraging machine learning and
artificial intelligence algorithms, that can help to automate this
process. The crucial component, however, is the assessment of article
quality; we therefore need to know how to define and extract different
quality measures. This presentation briefly introduces some of the
recent activities of the Department of Information Systems at Poznań
University of Economics and Business related to the quality assessment
of multilingual content in Wikipedia. In particular, we demonstrate some
of the approaches for the reliability assessment of sources in Wikipedia
articles. Such solutions can help to enrich various language editions of
Wikipedia and other knowledge bases with information of better quality.
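
As a loose illustration of the kind of measures such assessments can build
on (a naive sketch with arbitrary weights, not the department's actual
features or models), the snippet below extracts a few surface signals from
raw wikitext:

# Naive quality signals extracted from raw wikitext. Real quality models
# use many more features and learned weights; these three signals and the
# weighting below are placeholders for illustration.
import re

def quality_signals(wikitext):
    return {
        "length_chars": len(wikitext),
        "references": len(re.findall(r"<ref[ >]", wikitext)),
        "sections": len(re.findall(r"^==+[^=].*?==+\s*$", wikitext, re.MULTILINE)),
    }

def toy_quality_score(signals):
    # Arbitrary illustrative weighting, not a trained model.
    return (min(signals["length_chars"] / 10_000, 1.0)
            + min(signals["references"] / 20, 1.0)
            + min(signals["sections"] / 10, 1.0)) / 3

sample = "Intro sentence.<ref>Source</ref>\n== History ==\nText.<ref>Other</ref>\n"
signals = quality_signals(sample)
print(signals, f"toy score = {toy_quality_score(signals):.2f}")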


Talk 2

Speaker: Diego Saez-Trumper (Research, Wikimedia Foundation)

Title: Challenges on fighting Disinformation in Wikipedia: Who has the
(ground-)truth?

Abstract: Unlike the major social media websites, where the fight
against disinformation mainly means preventing users from massively
replicating fake content, fighting disinformation on Wikipedia requires
tools that allow editors to apply the content policies of verifiability,
no original research, and neutral point of view. Moreover, while other
platforms try to apply automatic fact-checking techniques to verify
content, the ground truth for such verification is itself largely drawn
from Wikipedia, so for obvious reasons we can't follow the same pipeline
for fact-checking content on Wikipedia. In this talk we will explain the
ML approach we are developing to build tools that efficiently support
Wikipedians in discovering suspicious content, and how we collaborate
with external researchers on this task. We will also describe a group of
datasets we are preparing to share with the research community in order
to produce state-of-the-art algorithms to improve the verifiability of
content on Wikipedia.
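
Purely to illustrate, at the simplest level, what surfacing suspicious or
unverifiable content can mean (a toy heuristic on an invented paragraph,
not the ML models described in the talk), the snippet below flags sentences
that carry no <ref> tag:

# Toy heuristic: flag sentences with no <ref> tag as candidates for a
# "citation needed" check. The work described in the talk uses ML models,
# not this rule; the paragraph below is invented.
import re

paragraph = (
    "The river is 300 km long.<ref>Atlas, 2019</ref> "
    "It is the cleanest river in the region. "
    "Its basin covers 12,000 km2.<ref>Survey, 2020</ref>"
)

# Split after a closing </ref> or after sentence-ending punctuation,
# whenever the next sentence starts with a capital letter.
sentences = re.split(r"(?<=</ref>)\s+(?=[A-Z])|(?<=[.!?])\s+(?=[A-Z])", paragraph)

for sentence in sentences:
    status = "ok" if "<ref" in sentence else "FLAG: no citation"
    print(f"[{status}] {sentence}")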

-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] November 18, 2020: Interpersonal Communication Between Editors

2020-11-18 Thread Janna Layton
The Research Showcase will be starting in 30 minutes.

On Mon, Nov 16, 2020 at 11:57 AM Janna Layton  wrote:

> Hello, everyone,
>
> The next Research Showcase will be live-streamed on Wednesday, November
> 18, at 9:30 AM PST/17:30 UTC, and will be on the theme of interpersonal
> communication between editors. Interpersonal communication, for example via
> talk pages, plays a crucial role for editors to coordinate their efforts in
> online collaborative communities. For this month’s showcase we have invited
> 2 speakers sharing their research on getting a deeper understanding of
> interpersonal communication on Wikipedia. In the first talk, Anna Rader
> will give an overview on editors’ communication networks and patterns, and
> the different types of dynamics commonly found in the way that users
> interact. In the second talk, Sneha Narayan presents recent work
> investigating whether easier interpersonal communication leads to enhanced
> productivity and newcomer participation across more than 200 wikis.
>
> YouTube stream: https://www.youtube.com/watch?v=G35OEDJ53bY
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Talk before you type - Interpersonal communication on Wikipedia
>
> By Dr Anna Rader, Research Consultant
>
> Formally, the work of Wikipedia’s community of volunteers is asynchronous
> and anarchic: around the world, editors labor individually and in
> disorganized ways on the collective project. Yet this work is also
> underscored by informal and vibrant interpersonal communication: in the
> lively exchanges of talk pages and the labor-sharing of editorial networks,
> anonymous strangers communicate their intentions and coordinate their
> efforts to maintain the world’s largest online encyclopaedia. This working
> paper offers an overview of academic research into editors’ communication
> networks and patterns, with a particular focus on the role of talk pages.
> It considers four communication dynamics of editor interaction:
> cooperation, deliberation, conflict and coordination; and reviews key
> recommendations for enhancing peer-to-peer communication within the
> Wikipedia community.
>
> All Talk - How Increasing Interpersonal Communication on Wikis May Not
> Enhance Productivity
>
> By Sneha Narayan, Assistant Professor, Carleton College
>
> What role does interpersonal communication play in sustaining production
> in online collaborative communities? This paper sheds light on that
> question by examining the impact of a communication feature called "message
> walls" that allows for faster and more intuitive interpersonal
> communication in a population of wikis on Wikia. Using panel data from a
> sample of 275 wiki communities that migrated to message walls and a method
> inspired by regression discontinuity designs, we analyze these transitions
> and estimate the impact of the system's introduction. Although the adoption
> of message walls was associated with increased communication among all
> editors and newcomers, it had little effect on productivity, and was
> further associated with a decrease in article contributions from new
> editors. Our results imply that design changes that make communication
> easier in a social computing system may not always translate to increased
> participation along other dimensions.
>
>-
>
>Paper <https://dl.acm.org/doi/10.1145/3359203>
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wikimedia-l] [Wikimedia Research Showcase] November 18, 2020: Interpersonal Communication Between Editors

2020-11-16 Thread Janna Layton
Hello, everyone,

The next Research Showcase will be live-streamed on Wednesday, November 18,
at 9:30 AM PST/17:30 UTC, and will be on the theme of interpersonal
communication between editors. Interpersonal communication, for example via
talk pages, plays a crucial role for editors to coordinate their efforts in
online collaborative communities. For this month’s showcase we have invited
two speakers to share their research on developing a deeper understanding of
interpersonal communication on Wikipedia. In the first talk, Anna Rader
will give an overview of editors’ communication networks and patterns, and
the different types of dynamics commonly found in the way that users
interact. In the second talk, Sneha Narayan presents recent work
investigating whether easier interpersonal communication leads to enhanced
productivity and newcomer participation across more than 200 wikis.

YouTube stream: https://www.youtube.com/watch?v=G35OEDJ53bY

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Talk before you type - Interpersonal communication on Wikipedia

By Dr Anna Rader, Research Consultant

Formally, the work of Wikipedia’s community of volunteers is asynchronous
and anarchic: around the world, editors labor individually and in
disorganized ways on the collective project. Yet this work is also
underscored by informal and vibrant interpersonal communication: in the
lively exchanges of talk pages and the labor-sharing of editorial networks,
anonymous strangers communicate their intentions and coordinate their
efforts to maintain the world’s largest online encyclopaedia. This working
paper offers an overview of academic research into editors’ communication
networks and patterns, with a particular focus on the role of talk pages.
It considers four communication dynamics of editor interaction:
cooperation, deliberation, conflict and coordination; and reviews key
recommendations for enhancing peer-to-peer communication within the
Wikipedia community.

All Talk - How Increasing Interpersonal Communication on Wikis May Not
Enhance Productivity

By Sneha Narayan, Assistant Professor, Carleton College

What role does interpersonal communication play in sustaining production in
online collaborative communities? This paper sheds light on that question
by examining the impact of a communication feature called "message walls"
that allows for faster and more intuitive interpersonal communication in a
population of wikis on Wikia. Using panel data from a sample of 275 wiki
communities that migrated to message walls and a method inspired by
regression discontinuity designs, we analyze these transitions and estimate
the impact of the system's introduction. Although the adoption of message
walls was associated with increased communication among all editors and
newcomers, it had little effect on productivity, and was further associated
with a decrease in article contributions from new editors. Our results
imply that design changes that make communication easier in a social
computing system may not always translate to increased participation along
other dimensions.

   - Paper <https://dl.acm.org/doi/10.1145/3359203>
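
For readers curious what an estimate "inspired by regression discontinuity
designs" can look like in its simplest form, here is a hedged sketch on
invented monthly counts for a single wiki: it fits a time trend plus a
post-migration jump and reads off the jump. The published study pools 275
wikis and is considerably more careful than this.

# Simplified interrupted-time-series flavour of the idea: regress a monthly
# outcome on a time trend and a post-migration indicator, and read off the
# estimated jump at the migration. Data are invented; the paper's design
# pools 275 wikis and handles many issues ignored here.
import numpy as np

months = np.arange(12)                      # 0..11
migration_month = 6                         # message walls adopted here
post = (months >= migration_month).astype(float)

# Invented outcome: mild upward trend, a jump of ~8 at migration, noise.
rng = np.random.default_rng(0)
outcome = 50 + 1.2 * months + 8 * post + rng.normal(0, 2, size=12)

# Design matrix: intercept, time trend, post indicator.
X = np.column_stack([np.ones_like(months, dtype=float), months, post])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(f"Estimated jump at migration: {coef[2]:.2f} (true simulated jump: 8)")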


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] September 23, 2020: Knowledge Gaps

2020-09-23 Thread Janna Layton
This will be starting in about 15 minutes.

On Mon, Sep 21, 2020 at 8:38 PM Janna Layton  wrote:

> Hi all,
>
> Just a reminder that the next Research Showcase will be live-streamed on
> Wednesday, September 23, at 9:30 AM PDT/16:30 UTC, and will be on the theme
> of knowledge gaps. The Research Team will give an overview on the first
> draft of the taxonomy of knowledge gaps in Wikimedia projects. The taxonomy
> is a first milestone towards developing a framework to understand and
> measure knowledge gaps with the goal of capturing the multi-dimensional
> aspect of knowledge gaps and inform long-term decision making. The first
> draft of the taxonomy is now published and we seek your feedback to improve
> it.
>
> YouTube stream: https://www.youtube.com/watch?v=GJDsKPsz64o
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> A first draft of the knowledge gaps taxonomy for Wikimedia projects
>
> By the Wikimedia Foundation Research Team
> <https://research.wikimedia.org/>
>
> In response to Wikimedia Movement’s 2030 strategic direction
> <https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20>,
> the Research team <https://research.wikimedia.org/team.html> at the
> Wikimedia Foundation is developing a framework to understand and measure
> knowledge gaps. The goal is to capture the multi-dimensional aspect of
> knowledge gaps and inform long-term decision making. The first milestone
> was to develop a taxonomy of knowledge gaps which offers a grouping and
> descriptions of the different Wikimedia knowledge gaps. The first draft of
> the taxonomy is now published <https://arxiv.org/abs/2008.12314> and we
> seek your feedback to improve it. In this talk, we will give an overview
> of the first draft of the taxonomy of knowledge gaps in Wikimedia
> projects. Following that, we will host an extended Q&A in which we would
> like to get your feedback and discuss with you the taxonomy and knowledge
> gaps more generally.
>
>    -
>
>More information:
>https://meta.wikimedia.org/wiki/Research:Knowledge_Gaps_Index/Taxonomy
>
>
> On Thu, Sep 17, 2020 at 3:27 PM Janna Layton 
> wrote:
>
>> Hi all,
>>
>>
>> The next Research Showcase will be live-streamed on Wednesday, September
>> 23, at 9:30 AM PDT/16:30 UTC, and will be on the theme of knowledge gaps.
>> Miriam Redi will give an overview on the first draft of the taxonomy of
>> knowledge gaps in Wikimedia projects. The taxonomy is a first milestone
>> towards developing a framework to understand and measure knowledge gaps
>> with the goal of capturing the multi-dimensional aspect of knowledge gaps
>> and informing long-term decision making.
>>
>> YouTube stream: https://www.youtube.com/watch?v=GJDsKPsz64o
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentation:
>>
>> A first draft of the knowledge gaps taxonomy for Wikimedia projects
>>
>> By the Wikimedia Foundation Research Team
>> <https://research.wikimedia.org/>
>>
>> In response to Wikimedia Movement’s 2030 strategic direction
>> <https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20>,
>> the Research team <https://research.wikimedia.org/team.html> at the
>> Wikimedia Foundation is developing a framework to understand and measure
>> knowledge gaps. The goal is to capture the multi-dimensional aspect of
>> knowledge gaps and inform long-term decision making. The first milestone
>> was to develop a taxonomy of knowledge gaps which offers a grouping and
>> descriptions of the different Wikimedia knowledge gaps. The first draft of
>> the taxonomy is now published <https://arxiv.org/abs/2008.12314> and we
>> seek your feedback to improve it. In this talk, we will give an overview
>> of the first draft of the taxonomy of knowledge gaps in Wikimedia
>> projects. Following that, we will host an extended Q&A in which we would
>> like to get your feedback and discuss with you the taxonomy and knowledge
>> gaps more generally.
>>
>>- More information:
>>https://meta.wikimedia.org/wiki/Research:Knowledge_Gaps_Index/Taxonomy
>>
>>
>> --
>> Janna Layton (she/her)
>> Administrative Associate - Product & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>

Re: [Wikimedia-l] [Wikimedia Research Showcase] September 23, 2020: Knowledge Gaps

2020-09-21 Thread Janna Layton
Hi all,

Just a reminder that the next Research Showcase will be live-streamed on
Wednesday, September 23, at 9:30 AM PDT/16:30 UTC, and will be on the theme
of knowledge gaps. The Research Team will give an overview on the first
draft of the taxonomy of knowledge gaps in Wikimedia projects. The taxonomy
is a first milestone towards developing a framework to understand and
measure knowledge gaps with the goal of capturing the multi-dimensional
aspect of knowledge gaps and informing long-term decision making. The first
draft of the taxonomy is now published and we seek your feedback to improve
it.

YouTube stream: https://www.youtube.com/watch?v=GJDsKPsz64o

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentation:

A first draft of the knowledge gaps taxonomy for Wikimedia projects

By the Wikimedia Foundation Research Team <https://research.wikimedia.org/>

In response to Wikimedia Movement’s 2030 strategic direction
<https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20>,
the Research
team <https://research.wikimedia.org/team.html> at the Wikimedia Foundation
is developing a framework to understand and measure knowledge gaps. The
goal is to capture the multi-dimensional aspect of knowledge gaps and
inform long-term decision making. The first milestone was to develop a
taxonomy of knowledge gaps which offers a grouping and descriptions of the
different Wikimedia knowledge gaps. The first draft of the taxonomy is now
published <https://arxiv.org/abs/2008.12314> and we seek your feedback to
improve it. In this talk, we will give an overview of the first draft of
the taxonomy of knowledge gaps in Wikimedia projects. Following that, we
will host an extended Q&A in which we would like to get your feedback and
discuss with you the taxonomy and knowledge gaps more generally.

   - More information:
   https://meta.wikimedia.org/wiki/Research:Knowledge_Gaps_Index/Taxonomy


On Thu, Sep 17, 2020 at 3:27 PM Janna Layton  wrote:

> Hi all,
>
>
> The next Research Showcase will be live-streamed on Wednesday, September
> 23, at 9:30 AM PDT/16:30 UTC, and will be on the theme of knowledge gaps.
> Miriam Redi will give an overview on the first draft of the taxonomy of
> knowledge gaps in Wikimedia projects. The taxonomy is a first milestone
> towards developing a framework to understand and measure knowledge gaps
> with the goal of capturing the multi-dimensional aspect of knowledge gaps
> and informing long-term decision making.
>
> YouTube stream: https://www.youtube.com/watch?v=GJDsKPsz64o
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> A first draft of the knowledge gaps taxonomy for Wikimedia projects
>
> By the Wikimedia Foundation Research Team
> <https://research.wikimedia.org/>
>
> In response to Wikimedia Movement’s 2030 strategic direction
> <https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20>,
> the Research team <https://research.wikimedia.org/team.html> at the
> Wikimedia Foundation is developing a framework to understand and measure
> knowledge gaps. The goal is to capture the multi-dimensional aspect of
> knowledge gaps and inform long-term decision making. The first milestone
> was to develop a taxonomy of knowledge gaps which offers a grouping and
> descriptions of the different Wikimedia knowledge gaps. The first draft of
> the taxonomy is now published <https://arxiv.org/abs/2008.12314> and we
> seek your feedback to improve it. In this talk, we will give an overview
> of the first draft of the taxonomy of knowledge gaps in Wikimedia
> projects. Following that, we will host an extended Q&A in which we would
> like to get your feedback and discuss with you the taxonomy and knowledge
> gaps more generally.
>
>- More information:
>https://meta.wikimedia.org/wiki/Research:Knowledge_Gaps_Index/Taxonomy
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wikimedia-l] [Wikimedia Research Showcase] September 23, 2020: Knowledge Gaps

2020-09-17 Thread Janna Layton
Hi all,


The next Research Showcase will be live-streamed on Wednesday, September
23, at 9:30 AM PDT/16:30 UTC, and will be on the theme of knowledge gaps.
Miriam Redi will give an overview on the first draft of the taxonomy of
knowledge gaps in Wikimedia projects. The taxonomy is a first milestone
towards developing a framework to understand and measure knowledge gaps
with the goal of capturing the multi-dimensional aspect of knowledge gaps
and informing long-term decision making.

YouTube stream: https://www.youtube.com/watch?v=GJDsKPsz64o

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentation:

A first draft of the knowledge gaps taxonomy for Wikimedia projects

By the Wikimedia Foundation Research Team <https://research.wikimedia.org/>

In response to Wikimedia Movement’s 2030 strategic direction
<https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20>,
the Research
team <https://research.wikimedia.org/team.html> at the Wikimedia Foundation
is developing a framework to understand and measure knowledge gaps. The
goal is to capture the multi-dimensional aspect of knowledge gaps and
inform long-term decision making. The first milestone was to develop a
taxonomy of knowledge gaps that groups and describes the different
Wikimedia knowledge gaps. The first draft of the taxonomy is now
published <https://arxiv.org/abs/2008.12314> and we seek your feedback to
improve it. In this talk, we will give an overview of the first draft of
the taxonomy of knowledge gaps in Wikimedia projects. Following that, we
will host an extended Q&A in which we would like to get your feedback and
discuss with you the taxonomy and knowledge gaps more generally.

   - More information:
   https://meta.wikimedia.org/wiki/Research:Knowledge_Gaps_Index/Taxonomy
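
For readers who want a concrete picture of what such a taxonomy looks like
as data, here is a minimal Python sketch. The facet and gap names below are
illustrative placeholders only, not the draft's actual groupings; see the
linked paper and Meta page for those.

# Minimal sketch of a knowledge-gap taxonomy as nested data.
# Facet and gap names are placeholders, not the draft's actual groupings.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Gap:
    name: str
    description: str

@dataclass
class Facet:
    name: str
    gaps: List[Gap] = field(default_factory=list)

taxonomy = [
    Facet("readers", [Gap("geography", "where readers come from"),
                      Gap("language", "which languages readers can read")]),
    Facet("contributors", [Gap("gender", "who edits the projects")]),
    Facet("content", [Gap("topic coverage", "which subjects are well covered")]),
]

for facet in taxonomy:
    print(facet.name, "->", [g.name for g in facet.gaps])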


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] August 19, 2020: Readership and Navigation

2020-08-19 Thread Janna Layton
The Research Showcase will start in about 30 minutes.

On Thu, Aug 13, 2020 at 9:50 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, August 19,
> at 9:30 AM PDT/16:30 UTC, and will be on the theme of readership and
> navigation.
>
> YouTube stream: https://www.youtube.com/watch?v=MeUl0zjHdF8
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> What matters to us most and why? Studying popularity and attention
> dynamics via Wikipedia navigation data.
>
> By Taha Yasseri (University College Dublin), Patrick Gildersleve (Oxford
> Internet Institute)
>
> While Wikipedia research initially focused largely on editorial behaviour and
> content, researchers soon realized the value of navigation data, both as a
> reflection of readers' interests and, more generally, as a proxy for the
> behaviour of online information seekers. In this talk we will report on
> various projects in which we utilized pageview statistics or readers'
> navigation data to study: movies' financial success [1], electoral popularity
> [2], disaster-triggered collective attention [3] and collective memory [4],
> general navigation patterns and article typology [5], and attention patterns
> in relation to news breakouts.
>
>-
>
>[1] Early Prediction of Movie Box Office Success Based on Wikipedia
>Activity Big Data. PLoS One (2013).
>https://doi.org/10.1371/journal.pone.0071226
>-
>
>[2] Wikipedia traffic data and electoral prediction: towards
>theoretically informed models. EPJ Data Science (2016).
>https://doi.org/10.1140/epjds/s13688-016-0083-3
>-
>
>[3] Dynamics and biases of online attention: the case of aircraft
>crashes. Royal Society Open Science (2016).
>https://doi.org/10.1098/rsos.160460
>-
>
>[4] The memory remains: Understanding collective memory in the digital
>age. Science Advances (2018). https://doi.org/10.1126/sciadv.1602368
>-
>
>[5] Inspiration, captivation, and misdirection: Emergent properties in
>networks of online navigation. Springer (2018).
>https://ora.ox.ac.uk/objects/uuid:73baed3c-d3fe-4200-8e90-2d80b11f21cf
>
>
>
> Query for Architecture, Click through Military. Comparing the Roles of
> Search and Navigation on Wikipedia
>
> By Dimitar Dimitrov (GESIS - Leibniz Institute for the Social Sciences)
>
> As one of the richest sources of encyclopedic information on the Web,
> Wikipedia generates an enormous amount of traffic. In this paper, we study
> large-scale article access data of the English Wikipedia in order to
> compare articles with respect to the two main paradigms of information
> seeking, i.e., search by formulating a query, and navigation by following
> hyperlinks. To this end, we propose and employ two main metrics, namely (i)
> searchshare -- the relative amount of views an article received by search
> --, and (ii) resistance -- the ability of an article to relay traffic to
> other Wikipedia articles -- to characterize articles. We demonstrate how
> articles in distinct topical categories differ substantially in terms of
> these properties. For example, architecture-related articles are often
> accessed through search and are simultaneously a "dead end" for traffic,
> whereas historical articles about military events are mainly navigated. We
> further link traffic differences to varying network, content, and editing
> activity features. Lastly, we measure the impact of the article properties
> by modeling access behavior on articles with a gradient boosting approach.
> The results of this paper constitute a step towards understanding human
> information seeking behavior on the Web.
>
>
>-
>
>Different Topic, Different Traffic: How Search and Navigation
>Interplay on Wikipedia. Journal of Web Science (2019).
>https://doi.org/10.34962/jws-71
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] August 19, 2020: Readership and Navigation

2020-08-17 Thread Janna Layton
Just a reminder that this will be happening on Wednesday!

On Thu, Aug 13, 2020 at 9:50 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, August 19,
> at 9:30 AM PDT/16:30 UTC, and will be on the theme of readership and
> navigation.
>
> YouTube stream: https://www.youtube.com/watch?v=MeUl0zjHdF8
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> What matters to us most and why? Studying popularity and attention
> dynamics via Wikipedia navigation data.
>
> By Taha Yasseri (University College Dublin), Patrick Gildersleve (Oxford
> Internet Institute)
>
> While Wikipedia research initially focused largely on editorial behaviour and
> content, researchers soon realized the value of navigation data, both as a
> reflection of readers' interests and, more generally, as a proxy for the
> behaviour of online information seekers. In this talk we will report on
> various projects in which we utilized pageview statistics or readers'
> navigation data to study: movies' financial success [1], electoral popularity
> [2], disaster-triggered collective attention [3] and collective memory [4],
> general navigation patterns and article typology [5], and attention patterns
> in relation to news breakouts.
>
>-
>
>[1] Early Prediction of Movie Box Office Success Based on Wikipedia
>Activity Big Data. PLoS One (2013).
>https://doi.org/10.1371/journal.pone.0071226
>-
>
>[2] Wikipedia traffic data and electoral prediction: towards
>theoretically informed models. EPJ Data Science (2016).
>https://doi.org/10.1140/epjds/s13688-016-0083-3
>-
>
>[3] Dynamics and biases of online attention: the case of aircraft
>crashes. Royal Society Open Science (2016).
>https://doi.org/10.1098/rsos.160460
>-
>
>[4] The memory remains: Understanding collective memory in the digital
>age. Science Advances (2018). https://doi.org/10.1126/sciadv.1602368
>-
>
>[5] Inspiration, captivation, and misdirection: Emergent properties in
>networks of online navigation. Springer (2018).
>https://ora.ox.ac.uk/objects/uuid:73baed3c-d3fe-4200-8e90-2d80b11f21cf
>
>
>
> Query for Architecture, Click through Military. Comparing the Roles of
> Search and Navigation on Wikipedia
>
> By Dimitar Dimitrov (GESIS - Leibniz Institute for the Social Sciences)
>
> As one of the richest sources of encyclopedic information on the Web,
> Wikipedia generates an enormous amount of traffic. In this paper, we study
> large-scale article access data of the English Wikipedia in order to
> compare articles with respect to the two main paradigms of information
> seeking, i.e., search by formulating a query, and navigation by following
> hyperlinks. To this end, we propose and employ two main metrics, namely (i)
> searchshare -- the relative amount of views an article received by search
> --, and (ii) resistance -- the ability of an article to relay traffic to
> other Wikipedia articles -- to characterize articles. We demonstrate how
> articles in distinct topical categories differ substantially in terms of
> these properties. For example, architecture-related articles are often
> accessed through search and are simultaneously a "dead end" for traffic,
> whereas historical articles about military events are mainly navigated. We
> further link traffic differences to varying network, content, and editing
> activity features. Lastly, we measure the impact of the article properties
> by modeling access behavior on articles with a gradient boosting approach.
> The results of this paper constitute a step towards understanding human
> information seeking behavior on the Web.
>
>
>-
>
>Different Topic, Different Traffic: How Search and Navigation
>Interplay on Wikipedia. Journal of Web Science (2019).
>https://doi.org/10.34962/jws-71
>
>
> --
> Janna Layton (she/her)
> Administrative Associate - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wikimedia-l] [Wikimedia Research Showcase] August 19, 2020: Readership and Navigation

2020-08-13 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, August 19,
at 9:30 AM PDT/16:30 UTC, and will be on the theme of readership and
navigation.

YouTube stream: https://www.youtube.com/watch?v=MeUl0zjHdF8

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

What matters to us most and why? Studying popularity and attention dynamics
via Wikipedia navigation data.

By Taha Yasseri (University College Dublin), Patrick Gildersleve (Oxford
Internet Institute)

While Wikipedia research initially focused largely on editorial behaviour and
content, researchers soon realized the value of navigation data, both as a
reflection of readers' interests and, more generally, as a proxy for the
behaviour of online information seekers. In this talk we will report on
various projects in which we utilized pageview statistics or readers'
navigation data to study: movies' financial success [1], electoral popularity
[2], disaster-triggered collective attention [3] and collective memory [4],
general navigation patterns and article typology [5], and attention patterns
in relation to news breakouts.

   -

   [1] Early Prediction of Movie Box Office Success Based on Wikipedia
   Activity Big Data. PLoS One (2013).
   https://doi.org/10.1371/journal.pone.0071226
   -

   [2] Wikipedia traffic data and electoral prediction: towards
   theoretically informed models. EPJ Data Science (2016).
   https://doi.org/10.1140/epjds/s13688-016-0083-3
   -

   [3] Dynamics and biases of online attention: the case of aircraft
   crashes. Royal Society Open Science (2016).
   https://doi.org/10.1098/rsos.160460
   -

   [4] The memory remains: Understanding collective memory in the digital
   age. Science Advances (2018). https://doi.org/10.1126/sciadv.1602368
   -

   [5] Inspiration, captivation, and misdirection: Emergent properties in
   networks of online navigation. Springer (2018).
   https://ora.ox.ac.uk/objects/uuid:73baed3c-d3fe-4200-8e90-2d80b11f21cf



Query for Architecture, Click through Military. Comparing the Roles of
Search and Navigation on Wikipedia

By Dimitar Dimitrov (GESIS - Leibniz Institute for the Social Sciences)

As one of the richest sources of encyclopedic information on the Web,
Wikipedia generates an enormous amount of traffic. In this paper, we study
large-scale article access data of the English Wikipedia in order to
compare articles with respect to the two main paradigms of information
seeking, i.e., search by formulating a query, and navigation by following
hyperlinks. To this end, we propose and employ two main metrics, namely (i)
searchshare -- the relative amount of views an article received by search
--, and (ii) resistance -- the ability of an article to relay traffic to
other Wikipedia articles -- to characterize articles. We demonstrate how
articles in distinct topical categories differ substantially in terms of
these properties. For example, architecture-related articles are often
accessed through search and are simultaneously a "dead end" for traffic,
whereas historical articles about military events are mainly navigated. We
further link traffic differences to varying network, content, and editing
activity features. Lastly, we measure the impact of the article properties
by modeling access behavior on articles with a gradient boosting approach.
The results of this paper constitute a step towards understanding human
information seeking behavior on the Web.
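
To make the two metrics concrete, here is a minimal Python sketch on made-up
per-article counts; it is not the paper's code or data, and the paper's exact
definitions (computed from large-scale access logs) differ in detail.

# Minimal sketch of the two metrics described above, on hypothetical counts.
# 'search_views'        = pageviews arriving from external search engines
# 'total_views'         = all pageviews of the article
# 'internal_clicks_out' = clicks from the article to other Wikipedia articles
articles = {
    "Gothic architecture": {"search_views": 8000, "total_views": 10000, "internal_clicks_out": 900},
    "Battle of Waterloo":  {"search_views": 3000, "total_views": 10000, "internal_clicks_out": 6500},
}

for title, c in articles.items():
    searchshare = c["search_views"] / c["total_views"]        # share of views arriving via search
    relay_rate = c["internal_clicks_out"] / c["total_views"]  # rough proxy for how much traffic is relayed
    print(f"{title}: searchshare={searchshare:.2f}, relay_rate={relay_rate:.2f}")

On these toy numbers the architecture article is search-heavy and passes on
little traffic, while the military-history article relays most of its views,
mirroring the pattern described above.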


   -

   Different Topic, Different Traffic: How Search and Navigation Interplay
   on Wikipedia. Journal of Web Science (2019).
   https://doi.org/10.34962/jws-71


-- 
Janna Layton (she/her)
Administrative Associate - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] [Wikimedia Research Showcase] July 15, 2020: Medical Knowledge on Wikipedia

2020-07-15 Thread Janna Layton
Hi all, this will be starting in about 15 minutes.

On Mon, Jul 13, 2020 at 2:20 PM Janna Layton  wrote:

> Hello all,
>
> Reminder that this very topical Research Showcase will be happening on
> Wednesday.
>
> On Thu, Jul 9, 2020 at 12:26 PM Janna Layton 
> wrote:
>
>> Hi all,
>>
>> The next Research Showcase will be live-streamed on Wednesday, July 15,
>> at 9:30 AM PDT/16:30 UTC.
>>
>> Wikipedia is one of the most important online resources for health
>> information. This has been especially highlighted during the Covid-19
>> pandemic: since the beginning of the year, more than 5,000 articles related
>> to Covid-19 have been created, receiving more than 400M pageviews.
>> Therefore, for this month’s showcase our two invited speakers will help us
>> get a better understanding of the state of medical knowledge in Wikipedia.
>> In the first talk, Denise Smith will give an overview on how Wikipedia's
>> health content is used by different audiences (public, students, or
>> practitioners). In the second talk, Giovanni Colavizza will present results
>> on how editors on Wikipedia find, select, and integrate scientific
>> information on Covid-19 into Wikipedia articles.
>>
>> YouTube stream: https://www.youtube.com/watch?v=qIV26lWrD9c
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentations:
>>
>> Wikipedia for health information - Situating Wikipedia as a health
>> information resource
>>
>> By Denise Smith (McMaster University, Health Sciences Library & Western
>> University, Faculty of Information & Media Studies)
>>
>> Wikipedia is the most frequently accessed web site for health
>> information, but the various ways users engage with Wikipedia’s health
>> content have not been thoroughly investigated or reported. This talk will
>> summarize the findings of a comprehensive literature review published in
>> February. It explores all the contexts in which Wikipedia’s health content
>> is used that have been reported in academic literature. The talk will focus
>> on the findings reported in this paper, the potential impact of this study
>> in health and medical librarianship, the practice of medicine, and medical
>> or health education.
>>
>>
>>-
>>
>>D.A. Smith (2020). "Situating Wikipedia as a health information
>>resource in various contexts: A scoping review". PLoS ONE.
>>https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0228786
>>
>>
>>
>> COVID-19 research in Wikipedia
>>
>> By Giovanni Colavizza (University of Amsterdam, Netherlands)
>>
>> Wikipedia is one of the main sources of free knowledge on the Web. During
>> the first few months of the pandemic, over 4,500 new Wikipedia pages on
>> COVID-19 have been created and have accumulated close to 250M pageviews by
>> early April 2020. At the same time, an unprecedented number of scientific
>> articles on COVID-19 and the ongoing pandemic have been published online.
>> Wikipedia’s contents are based on reliable sources, primarily scientific
>> literature. Given its public function, it is crucial for Wikipedia to rely
>> on representative and reliable scientific results, especially so in a time
>> of crisis. We assess the coverage of COVID-19-related research in Wikipedia
>> via citations. We find that Wikipedia editors are integrating new research
>> at an unprecedentedly fast pace. While doing so, they are able to provide a
>> largely representative coverage of COVID-19-related research. We show that
>> all the main topics discussed in this literature are proportionally
>> represented in Wikipedia, after accounting for article-level effects. We
>> further use regression analyses to model citations from Wikipedia and show
>> that, despite the pressure to keep up with novel results, Wikipedia editors
>> rely on literature which is highly cited, widely shared on social media,
>> and has been peer-reviewed.
>>
>>
>>-
>>
>>G. Colavizza (2020). "COVID-19 research in Wikipedia". bioRxiv.
>>    https://www.biorxiv.org/content/10.1101/2020.05.10.087643v2
>>
>>
>> --
>> Janna Layton (she/her)
>> Administrative Assistant - Product & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>> --
>> Janna Layton (she/her)
>> Administrative Assistant - Product & 

Re: [Wikimedia-l] [Wikimedia Research Showcase] July 15, 2020: Medical Knowledge on Wikipedia

2020-07-13 Thread Janna Layton
Hello all,

Reminder that this very topical Research Showcase will be happening on
Wednesday.

On Thu, Jul 9, 2020 at 12:26 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, July 15, at
> 9:30 AM PDT/16:30 UTC.
>
> Wikipedia is one of the most important online resources for health
> information. This has been especially highlighted during the Covid-19
> pandemic: since the beginning of the year, more than 5,000 articles related
> to Covid-19 have been created, receiving more than 400M pageviews.
> Therefore, for this month’s showcase our two invited speakers will help us
> get a better understanding of the state of medical knowledge in Wikipedia.
> In the first talk, Denise Smith will give an overview on how Wikipedia's
> health content is used by different audiences (public, students, or
> practitioners). In the second talk, Giovanni Colavizza will present results
> on how editors on Wikipedia find, select, and integrate scientific
> information on Covid-19 into Wikipedia articles.
>
> YouTube stream: https://www.youtube.com/watch?v=qIV26lWrD9c
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Wikipedia for health information - Situating Wikipedia as a health
> information resource
>
> By Denise Smith (McMaster University, Health Sciences Library & Western
> University, Faculty of Information & Media Studies)
>
> Wikipedia is the most frequently accessed web site for health information,
> but the various ways users engage with Wikipedia’s health content have not
> been thoroughly investigated or reported. This talk will summarize the
> findings of a comprehensive literature review published in February. It
> explores all the contexts in which Wikipedia’s health content is used that
> have been reported in academic literature. The talk will focus on the
> findings reported in this paper, the potential impact of this study in
> health and medical librarianship, the practice of medicine, and medical or
> health education.
>
>
>-
>
>D.A. Smith (2020). "Situating Wikipedia as a health information
>resource in various contexts: A scoping review". PLoS ONE.
>https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0228786
>
>
>
> COVID-19 research in Wikipedia
>
> By Giovanni Colavizza (University of Amsterdam, Netherlands)
>
> Wikipedia is one of the main sources of free knowledge on the Web. During
> the first few months of the pandemic, over 4,500 new Wikipedia pages on
> COVID-19 have been created and have accumulated close to 250M pageviews by
> early April 2020. At the same time, an unprecedented number of scientific
> articles on COVID-19 and the ongoing pandemic have been published online.
> Wikipedia’s contents are based on reliable sources, primarily scientific
> literature. Given its public function, it is crucial for Wikipedia to rely
> on representative and reliable scientific results, especially so in a time
> of crisis. We assess the coverage of COVID-19-related research in Wikipedia
> via citations. We find that Wikipedia editors are integrating new research
> at an unprecedentedly fast pace. While doing so, they are able to provide a
> largely representative coverage of COVID-19-related research. We show that
> all the main topics discussed in this literature are proportionally
> represented in Wikipedia, after accounting for article-level effects. We
> further use regression analyses to model citations from Wikipedia and show
> that, despite the pressure to keep up with novel results, Wikipedia editors
> rely on literature which is highly cited, widely shared on social media,
> and has been peer-reviewed.
>
>
>-
>
>G. Colavizza (2020). "COVID-19 research in Wikipedia". bioRxiv.
>https://www.biorxiv.org/content/10.1101/2020.05.10.087643v2
>
>
> --
> Janna Layton (she/her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>
> --
> Janna Layton (she/her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she/her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] July 15, 2020: Medical Knowledge on Wikipedia

2020-07-09 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, July 15, at
9:30 AM PDT/16:30 UTC.

Wikipedia is one of the most important online resources for health
information. This has been especially highlighted during the Covid-19
pandemic: since the beginning of the year, more than 5,000 articles related
to Covid-19 have been created, receiving more than 400M pageviews.
Therefore, for this month’s showcase our two invited speakers will help us
get a better understanding of the state of medical knowledge in Wikipedia.
In the first talk, Denise Smith will give an overview on how Wikipedia's
health content is used by different audiences (public, students, or
practitioners). In the second talk, Giovanni Colavizza will present results
on how editors on Wikipedia find, select, and integrate scientific
information on Covid-19 into Wikipedia articles.

YouTube stream: https://www.youtube.com/watch?v=qIV26lWrD9c

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Wikipedia for health information - Situating Wikipedia as a health
information resource

By Denise Smith (McMaster University, Health Sciences Library & Western
University, Faculty of Information & Media Studies)

Wikipedia is the most frequently accessed web site for health information,
but the various ways users engage with Wikipedia’s health content have not
been thoroughly investigated or reported. This talk will summarize the
findings of a comprehensive literature review published in February. It
explores all the contexts in which Wikipedia’s health content is used that
have been reported in academic literature. The talk will focus on the
findings reported in this paper, the potential impact of this study in
health and medical librarianship, the practice of medicine, and medical or
health education.


   -

   D.A. Smith (2020). "Situating Wikipedia as a health information resource
   in various contexts: A scoping review". PLoS ONE.
   https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0228786



COVID-19 research in Wikipedia

By Giovanni Colavizza (University of Amsterdam, Netherlands)

Wikipedia is one of the main sources of free knowledge on the Web. During
the first few months of the pandemic, over 4,500 new Wikipedia pages on
COVID-19 have been created and have accumulated close to 250M pageviews by
early April 2020. At the same time, an unprecedented number of scientific
articles on COVID-19 and the ongoing pandemic have been published online.
Wikipedia’s contents are based on reliable sources, primarily scientific
literature. Given its public function, it is crucial for Wikipedia to rely
on representative and reliable scientific results, especially so in a time
of crisis. We assess the coverage of COVID-19-related research in Wikipedia
via citations. We find that Wikipedia editors are integrating new research
at an unprecedentedly fast pace. While doing so, they are able to provide a
largely representative coverage of COVID-19-related research. We show that
all the main topics discussed in this literature are proportionally
represented in Wikipedia, after accounting for article-level effects. We
further use regression analyses to model citations from Wikipedia and show
that, despite the pressure to keep up with novel results, Wikipedia editors
rely on literature which is highly cited, widely shared on social media,
and has been peer-reviewed.
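
As a rough illustration of the kind of regression analysis mentioned above
(not the paper's actual model or data), the following Python sketch fits a
logistic regression predicting whether a paper is cited from Wikipedia, using
three hypothetical features: scholarly citations, social-media shares, and
peer-review status. All numbers are synthetic.

# Hedged sketch of a regression of the kind described above, on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
citations = rng.poisson(20, n)         # hypothetical scholarly citation counts
shares = rng.poisson(5, n)             # hypothetical social-media shares
peer_reviewed = rng.integers(0, 2, n)  # 1 = peer-reviewed
# Synthetic outcome loosely following the reported direction of effects.
logit = 0.05 * citations + 0.1 * shares + 1.0 * peer_reviewed - 3.0
cited_in_wikipedia = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([citations, shares, peer_reviewed])
model = LogisticRegression().fit(X, cited_in_wikipedia)
print(dict(zip(["citations", "shares", "peer_reviewed"], model.coef_[0].round(2))))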


   -

   G. Colavizza (2020). "COVID-19 research in Wikipedia". bioRxiv.
   https://www.biorxiv.org/content/10.1101/2020.05.10.087643v2


-- 
Janna Layton (she/her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>

-- 
Janna Layton (she/her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] June 17, 2020: Credibility and Verifiability

2020-06-16 Thread Janna Layton
Reminder that this is happening tomorrow!

On Thu, Jun 11, 2020 at 3:24 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, June 17, at
> 9:30 AM PDT/16:30 UTC.
>
> In the era of 'information explosion,' we often strive to stay informed and
> relevant too quickly, and hence run the risk of consuming false or
> distorted facts. This month, our invited speakers will help us
> understand these dynamics, especially in the context of Wikipedia's content
> and readership. First, Connie will talk about an initiative she's been
> leading to source and rank credible information from the news, and its
> overlap with Wikipedia. In the second talk, Tiziano will present his recent
> work on quantifying and understanding how the readers of Wikipedia interact
> with an article's citations to verify specific claims.
>
> YouTube stream: https://www.youtube.com/watch?v=GS9Jc3IFhVQ
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
>
> Today’s News, Tomorrow’s Reference, and The Problem of Information
> Reliability - An Introduction to NewsQ
>
> By: Connie Moon Sehat, NewsQ, Hacks/Hackers
>
> The effort to make Wikipedia more reliable is related to the larger
> challenges facing the information ecosystem overall. These challenges
> include the discovery of and accessibility to reliable news amid the
> transformation of news distribution through platform and social media
> products. Connie will present some of the challenges related to the ranking
> and recommendation of news that are addressed by the NewsQ Initiative, a
> collaboration between the Tow-Knight Center for Entrepreneurial Journalism
> at the Craig Newmark Graduate School of Journalism and Hacks/Hackers. In
> addition, she’ll share some of the ways that the project intersects with
> Wikipedia, such as supporting research around the US Perennial Sources list
> (
> https://en.wikipedia.org/wiki/Wikipedia:Reliable_sources/Perennial_sources
> ).
>
> Related resources
>
>-
>
>NewsQ Initiative site (https://newsq.net/)
>
>
>-
>
>DUE JUNE 15 (Please apply if interested!): Social Science Research
>Council Call for Papers, “News Quality in the Platform Era”
>
> https://www.ssrc.org/programs/component/media-democracy/news-quality-in-the-platform-era/
>
>
>-
>
>M. Bhuiyan, A. Zhang, C. Sehat, T. Mitra, 2020. Investigating "Who" in
>the Crowdsourcing of News Credibility, C+J 2020 (
>
> https://cpb-us-w2.wpmucdn.com/express.northeastern.edu/dist/d/53/files/2020/02/CJ_2020_paper_32.pdf
>)
>
>
>
>
> Quantifying Engagement with Citations on Wikipedia
>
> By: Tiziano Piccardi, EPFL
>
> Wikipedia, the free online encyclopedia that anyone can edit, is one of
> the most visited sites on the Web and a common source of information for
> many users. As an encyclopedia, Wikipedia is not a source of original
> information, but was conceived as a gateway to secondary sources: according
> to Wikipedia's guidelines, facts must be backed up by reliable sources that
> reflect the full spectrum of views on the topic. Although citations lie at
> the very heart of Wikipedia, little is known about how users interact with
> them. To close this gap, we built client-side instrumentation for logging
> all interactions with links leading from English Wikipedia articles to
> cited references for one month and conducted the first analysis of readers'
> interaction with citations on Wikipedia. We find that overall engagement
> with citations is low: about one in 300 page views results in a reference
> click (0.29% overall; 0.56% on desktop; 0.13% on mobile). Matched
> observational studies of the factors associated with reference clicking
> reveal that clicks occur more frequently on shorter pages and on pages of
> lower quality, suggesting that references are consulted more commonly when
> Wikipedia itself does not contain the information sought by the user.
> Moreover, we observe that recent content, open access sources, and
> references about life events (births, deaths, marriages, etc) are
> particularly popular. Taken together, our findings open the door to a
> deeper understanding of Wikipedia's role in a global information economy
> where reliability is ever less certain, and source attribution ever more
> vital.
>
> Paper: https://arxiv.org/abs/2001.08614
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>

[Wikimedia-l] [Wikimedia Research Showcase] June 17, 2020: Credibility and Verifiability

2020-06-11 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, June 17, at
9:30 AM PDT/16:30 UTC.

In the era of 'information explosion,' we often strive to stay informed and
relevant too quickly, and hence run the risk of consuming false or
distorted facts. This month, our invited speakers will help us
understand these dynamics, especially in the context of Wikipedia's content
and readership. First, Connie will talk about an initiative she's been
leading to source and rank credible information from the news, and its
overlap with Wikipedia. In the second talk, Tiziano will present his recent
work on quantifying and understanding how the readers of Wikipedia interact
with an article's citations to verify specific claims.

YouTube stream: https://www.youtube.com/watch?v=GS9Jc3IFhVQ

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:


Today’s News, Tomorrow’s Reference, and The Problem of Information
Reliability - An Introduction to NewsQ

By: Connie Moon Sehat, NewsQ, Hacks/Hackers

The effort to make Wikipedia more reliable is related to the larger
challenges facing the information ecosystem overall. These challenges
include the discovery of and accessibility to reliable news amid the
transformation of news distribution through platform and social media
products. Connie will present some of the challenges related to the ranking
and recommendation of news that are addressed by the NewsQ Initiative, a
collaboration between the Tow-Knight Center for Entrepreneurial Journalism
at the Craig Newmark Graduate School of Journalism and Hacks/Hackers. In
addition, she’ll share some of the ways that the project intersects with
Wikipedia, such as supporting research around the US Perennial Sources list
(https://en.wikipedia.org/wiki/Wikipedia:Reliable_sources/Perennial_sources
).

Related resources

   -

   NewsQ Initiative site (https://newsq.net/)


   -

   DUE JUNE 15 (Please apply if interested!): Social Science Research
   Council Call for Papers, “News Quality in the Platform Era”
   
https://www.ssrc.org/programs/component/media-democracy/news-quality-in-the-platform-era/


   -

   M. Bhuiyan, A. Zhang, C. Sehat, T. Mitra, 2020. Investigating "Who" in
   the Crowdsourcing of News Credibility, C+J 2020 (
   
https://cpb-us-w2.wpmucdn.com/express.northeastern.edu/dist/d/53/files/2020/02/CJ_2020_paper_32.pdf
   )




Quantifying Engagement with Citations on Wikipedia

By: Tiziano Piccardi, EPFL

Wikipedia, the free online encyclopedia that anyone can edit, is one of the
most visited sites on the Web and a common source of information for many
users. As an encyclopedia, Wikipedia is not a source of original
information, but was conceived as a gateway to secondary sources: according
to Wikipedia's guidelines, facts must be backed up by reliable sources that
reflect the full spectrum of views on the topic. Although citations lie at
the very heart of Wikipedia, little is known about how users interact with
them. To close this gap, we built client-side instrumentation for logging
all interactions with links leading from English Wikipedia articles to
cited references for one month and conducted the first analysis of readers'
interaction with citations on Wikipedia. We find that overall engagement
with citations is low: about one in 300 page views results in a reference
click (0.29% overall; 0.56% on desktop; 0.13% on mobile). Matched
observational studies of the factors associated with reference clicking
reveal that clicks occur more frequently on shorter pages and on pages of
lower quality, suggesting that references are consulted more commonly when
Wikipedia itself does not contain the information sought by the user.
Moreover, we observe that recent content, open access sources, and
references about life events (births, deaths, marriages, etc) are
particularly popular. Taken together, our findings open the door to a
deeper understanding of Wikipedia's role in a global information economy
where reliability is ever less certain, and source attribution ever more
vital.

Paper: https://arxiv.org/abs/2001.08614
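
For readers who want to see how the headline engagement numbers combine, here
is a back-of-the-envelope Python sketch on made-up pageview volumes chosen
only to roughly reproduce the reported rates (0.56% desktop, 0.13% mobile,
0.29% overall); the real analysis is based on the instrumented logs described
in the paper.

# Illustrative counts only; volumes are hypothetical, rates match the abstract.
events = {
    "desktop": {"pageviews": 1_000_000, "reference_clicks": 5_600},  # ~0.56%
    "mobile":  {"pageviews": 1_700_000, "reference_clicks": 2_210},  # ~0.13%
}

total_views = sum(p["pageviews"] for p in events.values())
total_clicks = sum(p["reference_clicks"] for p in events.values())

for platform, p in events.items():
    print(f"{platform}: {100 * p['reference_clicks'] / p['pageviews']:.2f}% of page views")
print(f"overall: {100 * total_clicks / total_views:.2f}% "
      f"(~1 reference click per {round(total_views / total_clicks)} page views)")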


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] May 20, 2020: Human in the Loop Machine Learning

2020-05-20 Thread Janna Layton
May's Research Showcase will be starting in 30 minutes!

On Fri, May 15, 2020 at 1:04 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, May 20, at
> 9:30 AM PDT/16:30 UTC.
>
> This month we will learn about recent research on machine learning systems
> that rely on human supervision for their learning and optimization -- a
> research area commonly referred to as Human-in-the-Loop ML. In the first
> talk, Jie Yang will present a computational framework that relies on
> crowdsourcing to identify influencers in Social Networks (Twitter) by
> selectively obtaining labeled data. In the second talk, Estelle Smith will
> discuss the role of the community in maintaining ORES, the machine learning
> system used to predict quality in numerous Wikipedia applications.
>
> YouTube stream: https://www.youtube.com/watch?v=8nDiu2ebdOI
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> *OpenCrowd: A Human-AI Collaborative Approach for Finding Social
> Influencers via Open-Ended Answers Aggregation*
>
> By: Jie Yang, Amazon (current), Delft University of Technology (starting
> soon)
>
> Finding social influencers is a fundamental task in many online
> applications ranging from brand marketing to opinion mining. Existing
> methods heavily rely on the availability of expert labels, whose collection
> is usually a laborious process even for domain experts. Using open-ended
> questions, crowdsourcing provides a cost-effective way to find a large
> number of social influencers in a short time. Individual crowd workers,
> however, only possess fragmented knowledge that is often of low quality. To
> tackle those issues, we present OpenCrowd, a unified Bayesian framework
> that seamlessly incorporates machine learning and crowdsourcing for
> effectively finding social influencers. To infer a set of influencers,
> OpenCrowd bootstraps the learning process using a small number of expert
> labels and then jointly learns a feature-based answer quality model and the
> reliability of the workers. Model parameters and worker reliability are
> updated iteratively, allowing their learning processes to benefit from each
> other until an agreement on the quality of the answers is reached. We
> derive a principled optimization algorithm based on variational inference
> with efficient updating rules for learning OpenCrowd parameters.
> Experimental results on finding social influencers in different domains
> show that our approach substantially improves the state of the art by 11.5%
> AUC. Moreover, we empirically show that our approach is particularly useful
> in finding micro-influencers, who are very directly engaged with smaller
> audiences.
>
> Paper: https://dl.acm.org/doi/fullHtml/10.1145/3366423.3380254
>
> *Keeping Community in the Machine-Learning Loop*
>
> By:  C. Estelle Smith, MS, PhD Candidate, GroupLens Research Lab at the
> University of Minnesota
>
> On Wikipedia, sophisticated algorithmic tools are used to assess the
> quality of edits and take corrective actions. However, algorithms can fail
> to solve the problems they were designed for if they conflict with the
> values of communities who use them. In this study, we take a
> Value-Sensitive Algorithm Design approach to understanding a
> community-created and -maintained machine learning-based algorithm called
> the Objective Revision Evaluation System (ORES)—a quality prediction system
> used in numerous Wikipedia applications and contexts. Five major values
> converged across stakeholder groups that ORES (and its dependent
> applications) should: (1) reduce the effort of community maintenance, (2)
> maintain human judgement as the final authority, (3) support differing
> peoples’ differing workflows, (4) encourage positive engagement with
> diverse editor groups, and (5) establish trustworthiness of people and
> algorithms within the community. We reveal tensions between these values
> and discuss implications for future research to improve algorithms like
> ORES.
>
> Paper:
> https://commons.wikimedia.org/wiki/File:Keeping_Community_in_the_Loop-_Understanding_Wikipedia_Stakeholder_Values_for_Machine_Learning-Based_Systems.pdf
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list

Re: [Wikimedia-l] [Wikimedia Research Showcase] May 20, 2020: Human in the Loop Machine Learning

2020-05-18 Thread Janna Layton
Just a reminder that this is happening on Wednesday. There will be a Q&A,
which might be especially nice if you're in an area that's still socially
distancing. ;)

On Fri, May 15, 2020 at 1:04 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, May 20, at
> 9:30 AM PDT/16:30 UTC.
>
> This month we will learn about recent research on machine learning systems
> that rely on human supervision for their learning and optimization -- a
> research area commonly referred to as Human-in-the-Loop ML. In the first
> talk, Jie Yang will present a computational framework that relies on
> crowdsourcing to identify influencers in Social Networks (Twitter) by
> selectively obtaining labeled data. In the second talk, Estelle Smith will
> discuss the role of the community in maintaining ORES, the machine learning
> system used to predict quality in numerous Wikipedia applications.
>
> YouTube stream: https://www.youtube.com/watch?v=8nDiu2ebdOI
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> *OpenCrowd: A Human-AI Collaborative Approach for Finding Social
> Influencers via Open-Ended Answers Aggregation*
>
> By: Jie Yang, Amazon (current), Delft University of Technology (starting
> soon)
>
> Finding social influencers is a fundamental task in many online
> applications ranging from brand marketing to opinion mining. Existing
> methods heavily rely on the availability of expert labels, whose collection
> is usually a laborious process even for domain experts. Using open-ended
> questions, crowdsourcing provides a cost-effective way to find a large
> number of social influencers in a short time. Individual crowd workers,
> however, only possess fragmented knowledge that is often of low quality. To
> tackle those issues, we present OpenCrowd, a unified Bayesian framework
> that seamlessly incorporates machine learning and crowdsourcing for
> effectively finding social influencers. To infer a set of influencers,
> OpenCrowd bootstraps the learning process using a small number of expert
> labels and then jointly learns a feature-based answer quality model and the
> reliability of the workers. Model parameters and worker reliability are
> updated iteratively, allowing their learning processes to benefit from each
> other until an agreement on the quality of the answers is reached. We
> derive a principled optimization algorithm based on variational inference
> with efficient updating rules for learning OpenCrowd parameters.
> Experimental results on finding social influencers in different domains
> show that our approach substantially improves the state of the art by 11.5%
> AUC. Moreover, we empirically show that our approach is particularly useful
> in finding micro-influencers, who are very directly engaged with smaller
> audiences.
>
> Paper: https://dl.acm.org/doi/fullHtml/10.1145/3366423.3380254
>
> *Keeping Community in the Machine-Learning Loop*
>
> By:  C. Estelle Smith, MS, PhD Candidate, GroupLens Research Lab at the
> University of Minnesota
>
> On Wikipedia, sophisticated algorithmic tools are used to assess the
> quality of edits and take corrective actions. However, algorithms can fail
> to solve the problems they were designed for if they conflict with the
> values of communities who use them. In this study, we take a
> Value-Sensitive Algorithm Design approach to understanding a
> community-created and -maintained machine learning-based algorithm called
> the Objective Revision Evaluation System (ORES)—a quality prediction system
> used in numerous Wikipedia applications and contexts. Five major values
> converged across stakeholder groups that ORES (and its dependent
> applications) should: (1) reduce the effort of community maintenance, (2)
> maintain human judgement as the final authority, (3) support differing
> peoples’ differing workflows, (4) encourage positive engagement with
> diverse editor groups, and (5) establish trustworthiness of people and
> algorithms within the community. We reveal tensions between these values
> and discuss implications for future research to improve algorithms like
> ORES.
>
> Paper:
> https://commons.wikimedia.org/wiki/File:Keeping_Community_in_the_Loop-_Understanding_Wikipedia_Stakeholder_Values_for_Machine_Learning-Based_Systems.pdf
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundat

[Wikimedia-l] [Wikimedia Research Showcase] May 20, 2020: Human in the Loop Machine Learning

2020-05-15 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, May 20, at
9:30 AM PDT/16:30 UTC.

This month we will learn about recent research on machine learning systems
that rely on human supervision for their learning and optimization -- a
research area commonly referred to as Human-in-the-Loop ML. In the first
talk, Jie Yang will present a computational framework that relies on
crowdsourcing to identify influencers in Social Networks (Twitter) by
selectively obtaining labeled data. In the second talk, Estelle Smith will
discuss the role of the community in maintaining ORES, the machine learning
system used to predict quality in numerous Wikipedia applications.

YouTube stream: https://www.youtube.com/watch?v=8nDiu2ebdOI

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

*OpenCrowd: A Human-AI Collaborative Approach for Finding Social
Influencers via Open-Ended Answers Aggregation*

By: Jie Yang, Amazon (current), Delft University of Technology (starting
soon)

Finding social influencers is a fundamental task in many online
applications ranging from brand marketing to opinion mining. Existing
methods heavily rely on the availability of expert labels, whose collection
is usually a laborious process even for domain experts. Using open-ended
questions, crowdsourcing provides a cost-effective way to find a large
number of social influencers in a short time. Individual crowd workers,
however, only possess fragmented knowledge that is often of low quality. To
tackle those issues, we present OpenCrowd, a unified Bayesian framework
that seamlessly incorporates machine learning and crowdsourcing for
effectively finding social influencers. To infer a set of influencers,
OpenCrowd bootstraps the learning process using a small number of expert
labels and then jointly learns a feature-based answer quality model and the
reliability of the workers. Model parameters and worker reliability are
updated iteratively, allowing their learning processes to benefit from each
other until an agreement on the quality of the answers is reached. We
derive a principled optimization algorithm based on variational inference
with efficient updating rules for learning OpenCrowd parameters.
Experimental results on finding social influencers in different domains
show that our approach substantially improves the state of the art by 11.5%
AUC. Moreover, we empirically show that our approach is particularly useful
in finding micro-influencers, who are very directly engaged with smaller
audiences.

Paper: https://dl.acm.org/doi/fullHtml/10.1145/3366423.3380254
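
To give a feel for the "iterative updating" idea in the abstract, here is a
deliberately simplified Python sketch. It is not OpenCrowd itself (no answer
features, no Bayesian or variational machinery): it just alternates between
reliability-weighted answer scores and per-worker reliability estimates on
toy data.

# Toy alternation between answer-quality scores and worker reliability.
votes = {  # candidate -> {worker: 0/1 answer}, hypothetical data
    "cand_A": {"w1": 1, "w2": 1, "w3": 0},
    "cand_B": {"w1": 0, "w2": 1, "w3": 0},
    "cand_C": {"w1": 1, "w2": 1, "w3": 1},
}
workers = {w for ans in votes.values() for w in ans}
reliability = {w: 0.5 for w in workers}

for _ in range(10):  # iterate until the two estimates roughly agree
    # (a) score each candidate by a reliability-weighted vote
    quality = {
        c: sum(reliability[w] * a for w, a in ans.items()) / sum(reliability[w] for w in ans)
        for c, ans in votes.items()
    }
    # (b) re-estimate each worker's reliability as agreement with the scores
    for w in workers:
        pairs = [(a, quality[c]) for c, ans in votes.items() for ww, a in ans.items() if ww == w]
        reliability[w] = sum(1 - abs(a - q) for a, q in pairs) / len(pairs)

print({c: round(q, 2) for c, q in quality.items()})
print({w: round(r, 2) for w, r in reliability.items()})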

*Keeping Community in the Machine-Learning Loop*

By:  C. Estelle Smith, MS, PhD Candidate, GroupLens Research Lab at the
University of Minnesota

On Wikipedia, sophisticated algorithmic tools are used to assess the
quality of edits and take corrective actions. However, algorithms can fail
to solve the problems they were designed for if they conflict with the
values of communities who use them. In this study, we take a
Value-Sensitive Algorithm Design approach to understanding a
community-created and -maintained machine learning-based algorithm called
the Objective Revision Evaluation System (ORES)—a quality prediction system
used in numerous Wikipedia applications and contexts. Five major values
converged across stakeholder groups that ORES (and its dependent
applications) should: (1) reduce the effort of community maintenance, (2)
maintain human judgement as the final authority, (3) support differing
peoples’ differing workflows, (4) encourage positive engagement with
diverse editor groups, and (5) establish trustworthiness of people and
algorithms within the community. We reveal tensions between these values
and discuss implications for future research to improve algorithms like
ORES.

Paper:
https://commons.wikimedia.org/wiki/File:Keeping_Community_in_the_Loop-_Understanding_Wikipedia_Stakeholder_Values_for_Machine_Learning-Based_Systems.pdf

-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] March 18, 2020: Topic Modeling

2020-03-18 Thread Janna Layton
The Research Showcase, and that coveted human interaction via a Q&A, is
starting in about 30 minutes!

On Thu, Mar 12, 2020 at 12:29 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, March 18,
> at 9:30 AM PDT/16:30 UTC. We’ll have a presentation on topic modeling by
> Jordan Boyd-Graber. A question-and-answer session will follow.
>
> YouTube stream: https://www.youtube.com/watch?v=fiD9QTHNVVM
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> Big Data Analysis with Topic Models: Evaluation, Interaction, and
> Multilingual Extensions
>
> By: Jordan Boyd-Graber, University of Maryland
>
> A common information need is to understand large, unstructured datasets:
> millions of e-mails during e-discovery, a decade's worth of science
> correspondence, or a day's tweets. In the last decade, topic models have
> become a common tool for navigating such datasets even across languages.
> This talk investigates the foundational research that allows successful
> tools for these data exploration tasks: how to know when you have an
> effective model of the dataset; how to correct bad models; how to measure
> topic model effectiveness; and how to detect framing and spin using these
> techniques. After introducing topic models, I argue why traditional
> measures of topic model quality---borrowed from machine learning---are
> inconsistent with how topic models are actually used. In response, I
> describe interactive topic modeling, a technique that enables users to
> impart their insights and preferences to models in a principled,
> interactive way. I will then address measuring topic model effectiveness in
> real-world tasks.
>
> Overview of topic models:
> https://mimno.infosci.cornell.edu/papers/2017_fntir_tm_applications.pdf
>
> Topic model evaluation: http://umiacs.umd.edu/~jbg//docs/nips2009-rtl.pdf
>
> Interactive topic modeling:
> http://umiacs.umd.edu/~jbg//docs/2014_mlj_itm.pdf
> Topic Models for Categorization:
> http://users.umiacs.umd.edu/~jbg//docs/2016_acl_doclabel.pdf
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] March 18, 2020: Topic Modeling

2020-03-16 Thread Janna Layton
Hi all,

Just a reminder that this fully remote event is happening on Wednesday!

On Thu, Mar 12, 2020 at 12:29 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, March 18,
> at 9:30 AM PDT/16:30 UTC. We’ll have a presentation on topic modeling by
> Jordan Boyd-Graber. A question-and-answer session will follow.
>
> YouTube stream: https://www.youtube.com/watch?v=fiD9QTHNVVM
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> Big Data Analysis with Topic Models: Evaluation, Interaction, and
> Multilingual Extensions
>
> By: Jordan Boyd-Graber, University of Maryland
>
> A common information need is to understand large, unstructured datasets:
> millions of e-mails during e-discovery, a decade's worth of science
> correspondence, or a day's tweets. In the last decade, topic models have
> become a common tool for navigating such datasets even across languages.
> This talk investigates the foundational research that allows successful
> tools for these data exploration tasks: how to know when you have an
> effective model of the dataset; how to correct bad models; how to measure
> topic model effectiveness; and how to detect framing and spin using these
> techniques. After introducing topic models, I argue why traditional
> measures of topic model quality---borrowed from machine learning---are
> inconsistent with how topic models are actually used. In response, I
> describe interactive topic modeling, a technique that enables users to
> impart their insights and preferences to models in a principled,
> interactive way. I will then address measuring topic model effectiveness in
> real-world tasks.
>
> Overview of topic models:
> https://mimno.infosci.cornell.edu/papers/2017_fntir_tm_applications.pdf
>
> Topic model evaluation: http://umiacs.umd.edu/~jbg//docs/nips2009-rtl.pdf
>
> Interactive topic modeling:
> http://umiacs.umd.edu/~jbg//docs/2014_mlj_itm.pdf
> Topic Models for Categorization:
> http://users.umiacs.umd.edu/~jbg//docs/2016_acl_doclabel.pdf
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] March 18, 2020: Topic Modeling

2020-03-12 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, March 18, at
9:30 AM PDT/16:30 UTC. We’ll have a presentation on topic modeling by
Jordan Boyd-Graber. A question-and-answer session will follow.

YouTube stream: https://www.youtube.com/watch?v=fiD9QTHNVVM

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentation:

Big Data Analysis with Topic Models: Evaluation, Interaction, and
Multilingual Extensions

By: Jordan Boyd-Graber, University of Maryland

A common information need is to understand large, unstructured datasets:
millions of e-mails during e-discovery, a decade's worth of science
correspondence, or a day's tweets. In the last decade, topic models have
become a common tool for navigating such datasets, even across languages.
This talk investigates the foundational research that enables successful
tools for these data exploration tasks: how to know when you have an
effective model of the dataset; how to correct bad models; how to measure
topic model effectiveness; and how to detect framing and spin using these
techniques. After introducing topic models, I argue that traditional
measures of topic model quality, borrowed from machine learning, are
inconsistent with how topic models are actually used. In response, I
describe interactive topic modeling, a technique that enables users to
impart their insights and preferences to models in a principled,
interactive way. I will then address measuring topic model effectiveness
in real-world tasks.

Overview of topic models:
https://mimno.infosci.cornell.edu/papers/2017_fntir_tm_applications.pdf

Topic model evaluation: http://umiacs.umd.edu/~jbg//docs/nips2009-rtl.pdf

Interactive topic modeling:
http://umiacs.umd.edu/~jbg//docs/2014_mlj_itm.pdf
Topic Models for Categorization:
http://users.umiacs.umd.edu/~jbg//docs/2016_acl_doclabel.pdf
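
For readers unfamiliar with topic models, here is a minimal, illustrative
sketch (Python with scikit-learn, and toy documents invented for the
example; it is not the speaker's code, data, or evaluation setup) of
fitting a small model and listing the top words per topic:

    # Illustrative sketch only: fit a tiny topic model with scikit-learn's
    # LatentDirichletAllocation and print the top words of each topic.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [  # toy stand-ins for e-mails, letters, or tweets
        "court filing email discovery evidence lawyer",
        "telescope observation star galaxy spectrum",
        "election vote candidate campaign poll",
        "lawyer court judge trial evidence",
        "galaxy star supernova telescope light",
        "campaign poll vote debate candidate",
    ]

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(docs)          # document-term counts
    lda = LatentDirichletAllocation(n_components=3, random_state=0)
    lda.fit(X)

    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[::-1][:4]]
        print(f"topic {k}: {' '.join(top)}")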

-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] February 19, 2020: The Humans and Bots of Wikipedia and Wikidata

2020-02-19 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes! Details below.

On Thu, Feb 13, 2020 at 1:32 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, February
> 19, at 9:30 AM PST/17:30 UTC. We’ll have presentations from Jeffrey V.
> Nickerson on human/machine collaboration on Wikipedia, and Lucie-Aimée
> Kaffee on human/machine collaboration on Wikidata. A question-and-answer
> session will follow.
>
> YouTube stream: https://www.youtube.com/watch?v=fj0z20PuGIk
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Autonomous tools and the design of work
>
> By Jeffrey V. Nickerson, Stevens Institute of Technology
>
> Bots and other software tools that exhibit autonomy can appear in an
> organization to be more like employees than commodities. As a result,
> humans delegate to machines. Sometimes the machines turn and delegate part
> of the work back to humans. This talk will discuss how the design of human
> work is changing, drawing on a recent study of editors and bots in
> Wikipedia, as well as a study of game and chip designers. The Wikipedia bot
> ecosystem, and how bots evolve, will be discussed. Humans are working
> together with machines in complex configurations; this puts constraints on
> not only the machines but also the humans. Both software and human skills
> change as a result. Paper
> <https://dl.acm.org/doi/pdf/10.1145/3359317?download=true>
>
>
> When Humans and Machines Collaborate: Cross-lingual Label Editing in
> Wikidata
>
> By Lucie-Aimée Kaffee, University of Southampton
>
> The quality and maintainability of any knowledge graph are strongly
> influenced by the way it is created. In the case of Wikidata, the knowledge
> graph is created and maintained by a hybrid approach of human editing
> supported by automated tools. We analyse the editing of natural language
> data, i.e. labels. Labels are the entry point for humans to understand the
> information, and therefore need to be carefully maintained. Wikidata is a
> good example of a hybrid multilingual knowledge graph, as it has a large
> and active community of humans and bots working together, covering over 300
> languages. In this work, we analyse the different editor groups and how
> they interact with the different language data to understand the provenance
> of the current label data. This presentation is based on the paper “When
> Humans and Machines Collaborate: Cross-lingual Label Editing in Wikidata”,
> published in OpenSym 2019 in collaboration with Kemele M. Endris and Elena
> Simperl. Paper
> <https://opensym.org/wp-content/uploads/2019/08/os19-paper-A16-kaffee.pdf>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] February 19, 2020: The Humans and Bots of Wikipedia and Wikidata

2020-02-18 Thread Janna Layton
Just a reminder that this Research Showcase will be happening tomorrow.

On Thu, Feb 13, 2020 at 1:32 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, February
> 19, at 9:30 AM PST/17:30 UTC. We’ll have presentations from Jeffrey V.
> Nickerson on human/machine collaboration on Wikipedia, and Lucie-Aimée
> Kaffee on human/machine collaboration on Wikidata. A question-and-answer
> session will follow.
>
> YouTube stream: https://www.youtube.com/watch?v=fj0z20PuGIk
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Autonomous tools and the design of work
>
> By Jeffrey V. Nickerson, Stevens Institute of Technology
>
> Bots and other software tools that exhibit autonomy can appear in an
> organization to be more like employees than commodities. As a result,
> humans delegate to machines. Sometimes the machines turn and delegate part
> of the work back to humans. This talk will discuss how the design of human
> work is changing, drawing on a recent study of editors and bots in
> Wikipedia, as well as a study of game and chip designers. The Wikipedia bot
> ecosystem, and how bots evolve, will be discussed. Humans are working
> together with machines in complex configurations; this puts constraints on
> not only the machines but also the humans. Both software and human skills
> change as a result. Paper
> <https://dl.acm.org/doi/pdf/10.1145/3359317?download=true>
>
>
> When Humans and Machines Collaborate: Cross-lingual Label Editing in
> Wikidata
>
> By Lucie-Aimée Kaffee, University of Southampton
>
> The quality and maintainability of any knowledge graph are strongly
> influenced by the way it is created. In the case of Wikidata, the knowledge
> graph is created and maintained by a hybrid approach of human editing
> supported by automated tools. We analyse the editing of natural language
> data, i.e. labels. Labels are the entry point for humans to understand the
> information, and therefore need to be carefully maintained. Wikidata is a
> good example of a hybrid multilingual knowledge graph, as it has a large
> and active community of humans and bots working together, covering over 300
> languages. In this work, we analyse the different editor groups and how
> they interact with the different language data to understand the provenance
> of the current label data. This presentation is based on the paper “When
> Humans and Machines Collaborate: Cross-lingual Label Editing in Wikidata”,
> published in OpenSym 2019 in collaboration with Kemele M. Endris and Elena
> Simperl. Paper
> <https://opensym.org/wp-content/uploads/2019/08/os19-paper-A16-kaffee.pdf>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] February 19, 2020: The Humans and Bots of Wikipedia and Wikidata

2020-02-13 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, February 19,
at 9:30 AM PST/17:30 UTC. We’ll have presentations from Jeffrey V.
Nickerson on human/machine collaboration on Wikipedia, and Lucie-Aimée
Kaffee on human/machine collaboration on Wikidata. A question-and-answer
session will follow.

YouTube stream: https://www.youtube.com/watch?v=fj0z20PuGIk

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Autonomous tools and the design of work

By Jeffrey V. Nickerson, Stevens Institute of Technology

Bots and other software tools that exhibit autonomy can appear in an
organization to be more like employees than commodities. As a result,
humans delegate to machines. Sometimes the machines turn and delegate part
of the work back to humans. This talk will discuss how the design of human
work is changing, drawing on a recent study of editors and bots in
Wikipedia, as well as a study of game and chip designers. The Wikipedia bot
ecosystem, and how bots evolve, will be discussed. Humans are working
together with machines in complex configurations; this puts constraints on
not only the machines but also the humans. Both software and human skills
change as a result. Paper
<https://dl.acm.org/doi/pdf/10.1145/3359317?download=true>


When Humans and Machines Collaborate: Cross-lingual Label Editing in
Wikidata

By Lucie-Aimée Kaffee, University of Southampton

The quality and maintainability of any knowledge graph are strongly
influenced by the way it is created. In the case of Wikidata, the knowledge
graph is created and maintained by a hybrid approach of human editing
supported by automated tools. We analyse the editing of natural language
data, i.e. labels. Labels are the entry point for humans to understand the
information, and therefore need to be carefully maintained. Wikidata is a
good example of a hybrid multilingual knowledge graph, as it has a large
and active community of humans and bots working together, covering over 300
languages. In this work, we analyse the different editor groups and how
they interact with the different language data to understand the provenance
of the current label data. This presentation is based on the paper “When
Humans and Machines Collaborate: Cross-lingual Label Editing in Wikidata”,
published in OpenSym 2019 in collaboration with Kemele M. Endris and Elena
Simperl. Paper
<https://opensym.org/wp-content/uploads/2019/08/os19-paper-A16-kaffee.pdf>
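
As a small illustration of the kind of label data discussed above, the
sketch below fetches one item's labels in a few languages and notes which
are missing. It assumes Python with the requests library and the public
wbgetentities endpoint of the Wikidata API; the item Q42 and the language
sample are arbitrary, and this is not the authors' analysis pipeline:

    # Hedged sketch: fetch labels for one Wikidata item in a few languages
    # via the public API and report which languages lack a label.
    import requests

    LANGS = ["en", "de", "ar", "sw"]          # arbitrary sample of languages
    resp = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbgetentities",
            "ids": "Q42",                     # arbitrary example item
            "props": "labels",
            "languages": "|".join(LANGS),
            "format": "json",
        },
        timeout=30,
    )
    labels = resp.json()["entities"]["Q42"].get("labels", {})
    for lang in LANGS:
        print(lang, labels.get(lang, {}).get("value", "<no label>"))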


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] December 18, 2019 at 9:30 AM PST, 17:30 UTC

2019-12-18 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Mon, Dec 16, 2019 at 4:38 PM Janna Layton  wrote:

> Just a reminder that this Showcase on knowledge bases and the 2016 US
> election will be on Wednesday.
>
> On Thu, Dec 12, 2019 at 9:37 AM Janna Layton 
> wrote:
>
>> Hello everyone,
>>
>> The next Research Showcase will be live-streamed on Wednesday, December
>> 18, at 9:30 AM PST/17:30 UTC. We’ll have a presentation from Fabian
>> Suchanek on incomplete knowledge bases and one from Brian Keegan about
>> Wikipedia and the 2016 US Presidential election.
>>
>> YouTube stream: https://www.youtube.com/watch?v=b4VrphM_TTA
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentations:
>>
>> Making Knowledge Bases More Complete
>>
>> By Fabian Suchanek, Télécom Paris, Institut Polytechnique de Paris
>>
>> A Knowledge Base (KB) is a computer-readable collection of facts about
>> the world (examples are Wikidata, DBpedia, and YAGO). The problem is that
>> these KBs are often missing entities or facts. In this talk, I present some
>> new methods to combat this incompleteness. I will also quickly talk about
>> some other research projects we are currently pursuing, including a new
>> version of YAGO. Publications <https://suchanek.name/work/publications/>
>>
>>
>> The Dynamics of Peer-Produced Political Information During the 2016 U.S.
>> Presidential Campaign
>>
>> By Brian Keegan, Ph.D., Assistant Professor, Department of Information
>> Science, University of Colorado Boulder
>>
>> Wikipedia plays a crucial role in online information seeking, and its
>> editors have a remarkable capacity to rapidly revise its content in
>> response to current events. How did the production and consumption of
>> political information on Wikipedia mirror the dynamics of the 2016 U.S.
>> Presidential campaign? Drawing on systems justification theory and methods
>> for measuring the enthusiasm gap among voters, this paper quantitatively
>> analyzes the candidates' biographical and related articles and their
>> editors. Information production and consumption patterns match major events
>> over the course of the campaign, but Trump-related articles show
>> consistently higher levels of engagement than Clinton-related articles.
>> Analysis of the editors' participation and backgrounds shows analogous
>> shifts in the composition and durability of the collaborations around each
>> candidate. The implications for using Wikipedia to monitor political
>> engagement are discussed. Paper
>> <http://www.brianckeegan.com/papers/CSCW_2019_Elections.pdf>
>>
>>
>> --
>> Janna Layton (she, her)
>> Administrative Assistant - Product & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] December 18, 2019 at 9:30 AM PST, 17:30 UTC

2019-12-16 Thread Janna Layton
Just a reminder that this Showcase on knowledge bases and the 2016 US
election will be on Wednesday.

On Thu, Dec 12, 2019 at 9:37 AM Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase will be live-streamed on Wednesday, December
> 18, at 9:30 AM PST/17:30 UTC. We’ll have a presentation from Fabian
> Suchanek on incomplete knowledge bases and one from Brian Keegan about
> Wikipedia and the 2016 US Presidential election.
>
> YouTube stream: https://www.youtube.com/watch?v=b4VrphM_TTA
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Making Knowledge Bases More Complete
>
> By Fabian Suchanek, Télécom Paris, Institut Polytechnique de Paris
>
> A Knowledge Base (KB) is a computer-readable collection of facts about the
> world (examples are Wikidata, DBpedia, and YAGO). The problem is that these
> KBs are often missing entities or facts. In this talk, I present some new
> methods to combat this incompleteness. I will also quickly talk about some
> other research projects we are currently pursuing, including a new version
> of YAGO. Publications <https://suchanek.name/work/publications/>
>
>
> The Dynamics of Peer-Produced Political Information During the 2016 U.S.
> Presidential Campaign
>
> By Brian Keegan, Ph.D., Assistant Professor, Department of Information
> Science, University of Colorado Boulder
>
> Wikipedia plays a crucial role in online information seeking, and its
> editors have a remarkable capacity to rapidly revise its content in
> response to current events. How did the production and consumption of
> political information on Wikipedia mirror the dynamics of the 2016 U.S.
> Presidential campaign? Drawing on systems justification theory and methods
> for measuring the enthusiasm gap among voters, this paper quantitatively
> analyzes the candidates' biographical and related articles and their
> editors. Information production and consumption patterns match major events
> over the course of the campaign, but Trump-related articles show
> consistently higher levels of engagement than Clinton-related articles.
> Analysis of the editors' participation and backgrounds shows analogous
> shifts in the composition and durability of the collaborations around each
> candidate. The implications for using Wikipedia to monitor political
> engagement are discussed. Paper
> <http://www.brianckeegan.com/papers/CSCW_2019_Elections.pdf>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] December 18, 2019 at 9:30 AM PST, 17:30 UTC

2019-12-12 Thread Janna Layton
Hello everyone,

The next Research Showcase will be live-streamed on Wednesday, December 18,
at 9:30 AM PST/17:30 UTC. We’ll have a presentation from Fabian Suchanek on
incomplete knowledge bases and one from Brian Keegan about Wikipedia and
the 2016 US Presidential election.

YouTube stream: https://www.youtube.com/watch?v=b4VrphM_TTA

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Making Knowledge Bases More Complete

By Fabian Suchanek, Télécom Paris, Institut Polytechnique de Paris

A Knowledge Base (KB) is a computer-readable collection of facts about the
world (examples are Wikidata, DBpedia, and YAGO). The problem is that these
KBs are often missing entities or facts. In this talk, I present some new
methods to combat this incompleteness. I will also quickly talk about some
other research projects we are currently pursuing, including a new version
of YAGO. Publications <https://suchanek.name/work/publications/>
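
By way of illustration only (this is not one of the speaker's methods), one
very simple form of KB completion infers missing facts from rules over
existing triples, for example by treating a relation such as "spouse" as
symmetric. The toy sketch below, in Python, assumes a tiny in-memory set of
(subject, relation, object) triples invented for the example:

    # Toy sketch of rule-based KB completion: add the inverse of symmetric
    # relations when it is missing. Triples and relation names are invented.
    triples = {
        ("Marie", "spouse", "Pierre"),
        ("Ada", "bornIn", "London"),
        ("Pierre", "colleague", "Marie"),
    }
    SYMMETRIC = {"spouse", "colleague"}

    inferred = {
        (o, r, s)
        for (s, r, o) in triples
        if r in SYMMETRIC and (o, r, s) not in triples
    }
    print("inferred facts:", inferred)   # e.g. ('Pierre', 'spouse', 'Marie')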


The Dynamics of Peer-Produced Political Information During the 2016 U.S.
Presidential Campaign

By Brian Keegan, Ph.D., Assistant Professor, Department of Information
Science, University of Colorado Boulder

Wikipedia plays a crucial role in online information seeking, and its
editors have a remarkable capacity to rapidly revise its content in
response to current events. How did the production and consumption of
political information on Wikipedia mirror the dynamics of the 2016 U.S.
Presidential campaign? Drawing on systems justification theory and methods
for measuring the enthusiasm gap among voters, this paper quantitatively
analyzes the candidates' biographical and related articles and their
editors. Information production and consumption patterns match major events
over the course of the campaign, but Trump-related articles show
consistently higher levels of engagement than Clinton-related articles.
Analysis of the editors' participation and backgrounds shows analogous
shifts in the composition and durability of the collaborations around each
candidate. The implications for using Wikipedia to monitor political
engagement are discussed. Paper
<http://www.brianckeegan.com/papers/CSCW_2019_Elections.pdf>
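
The talk's consumption measures rest on page view data. As a rough
illustration (not the paper's pipeline), the sketch below pulls daily page
views for one article, assuming Python with requests and the per-article
endpoint layout of the public Wikimedia pageviews REST API; the article and
date range are chosen only for the example:

    # Hedged sketch: daily page views for one article from the public
    # Wikimedia pageviews REST API (per-article endpoint).
    import requests

    article = "Hillary_Clinton"              # example article title
    url = (
        "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        f"en.wikipedia/all-access/all-agents/{article}/daily/2016090100/2016110800"
    )
    resp = requests.get(url, headers={"User-Agent": "showcase-example/0.1"}, timeout=30)
    for item in resp.json().get("items", [])[:5]:
        print(item["timestamp"], item["views"])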


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] November 20, 2019 at 9:30 AM PST, 17:30 UTC

2019-11-18 Thread Janna Layton
Hello all,

Reminder that the Research Showcase will be this Wednesday. Details below.

On Fri, Nov 15, 2019 at 12:22 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed on Wednesday, November
> 20, 2019, at 9:30 AM PST/17:30 UTC. We’ll have a presentation from Martin
> Potthast of Leipzig University on text reuse in Wikipedia and another
> presentation from the Wikimedia Foundation’s Isaac Johnson on the
> demographics and interests of Wikipedia’s readers.
>
> YouTube stream: https://www.youtube.com/watch?v=tIko_V1k09s
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Wikipedia Text Reuse: Within and Without
>
> By Martin Potthast, Leipzig University
>
> We study text reuse related to Wikipedia at scale by compiling the first
> corpus of text reuse cases within Wikipedia as well as without (i.e., reuse
> of Wikipedia text in a sample of the Common Crawl). To discover reuse
> beyond verbatim copy and paste, we employ state-of-the-art text reuse
> detection technology, scaling it for the first time to process the entire
> Wikipedia as part of a distributed retrieval pipeline. We further report on
> a pilot analysis of the 100 million reuse cases inside, and the 1.6 million
> reuse cases outside Wikipedia that we discovered. Text reuse inside
> Wikipedia gives rise to new tasks such as article template induction,
> fixing quality flaws, or complementing Wikipedia’s ontology. Text reuse
> outside Wikipedia yields a tangible metric for the emerging field of
> quantifying Wikipedia’s influence on the web. To foster future research
> into these tasks, and for reproducibility’s sake, the Wikipedia text reuse
> corpus and the retrieval pipeline are made freely available. Paper
> <https://webis.de/publications.html#?q=wikipedia%20ecir%202019>, Demo
> <https://demo.webis.de/wikipedia-text-reuse/>
>
>
> Characterizing Wikipedia Reader Demographics and Interests
>
> By Isaac Johnson, Wikimedia Foundation
>
> Building on two past surveys on the motivation and needs of Wikipedia
> readers (Why We Read Wikipedia
> <https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#November_2016>;
> Why the World Reads Wikipedia
> <https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#December_2018>),
> we examine the relationship between Wikipedia reader demographics and their
> interests and needs. Specifically, we run surveys in thirteen different
> languages that ask readers three questions about their motivation for
> reading Wikipedia (motivation, needs, and familiarity) and five questions
> about their demographics (age, gender, education, locale, and native
> language). We link these survey results with the respondents' reading
> sessions -- i.e. sequence of Wikipedia page views -- to gain a more
> fine-grained understanding of how a reader's context relates to their
> activity on Wikipedia. We find that readers have a diversity of backgrounds
> but that the high-level needs of readers do not correlate strongly with
> individual demographics. We also find, however, that there are
> relationships between demographics and specific topic interests that are
> consistent across many cultures and languages. This work provides insights
> into the reach of various Wikipedia language editions and the relationship
> between content or contributor gaps and reader gaps. See the meta page
> <https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Reader_Behaviour/Demographics_and_Wikipedia_use_cases#Reader_Surveys>
> for more details.
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] November 20, 2019 at 9:30 AM PST, 17:30 UTC

2019-11-15 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed on Wednesday, November 20,
2019, at 9:30 AM PST/17:30 UTC. We’ll have a presentation from Martin
Potthast of Leipzig University on text reuse in Wikipedia and another
presentation from the Wikimedia Foundation’s Isaac Johnson on the
demographics and interests of Wikipedia’s readers.

YouTube stream: https://www.youtube.com/watch?v=tIko_V1k09s

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Wikipedia Text Reuse: Within and Without

By Martin Potthast, Leipzig University

We study text reuse related to Wikipedia at scale by compiling the first
corpus of text reuse cases within Wikipedia as well as without (i.e., reuse
of Wikipedia text in a sample of the Common Crawl). To discover reuse
beyond verbatim copy and paste, we employ state-of-the-art text reuse
detection technology, scaling it for the first time to process the entire
Wikipedia as part of a distributed retrieval pipeline. We further report on
a pilot analysis of the 100 million reuse cases inside, and the 1.6 million
reuse cases outside Wikipedia that we discovered. Text reuse inside
Wikipedia gives rise to new tasks such as article template induction,
fixing quality flaws, or complementing Wikipedia’s ontology. Text reuse
outside Wikipedia yields a tangible metric for the emerging field of
quantifying Wikipedia’s influence on the web. To foster future research
into these tasks, and for reproducibility’s sake, the Wikipedia text reuse
corpus and the retrieval pipeline are made freely available. Paper
<https://webis.de/publications.html#?q=wikipedia%20ecir%202019>, Demo
<https://demo.webis.de/wikipedia-text-reuse/>
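
To make the notion of non-verbatim reuse slightly more concrete, here is a
minimal sketch of one standard building block, word n-gram shingling with
Jaccard similarity, in plain Python. The two passages are invented, and the
pipeline described in the paper is far more sophisticated than this:

    # Minimal sketch of a standard text-reuse building block: compare two
    # passages by the Jaccard overlap of their word 3-gram "shingles".
    def shingles(text, n=3):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    wiki = "the quick brown fox jumps over the lazy dog near the river bank"
    page = "a quick brown fox jumps over the lazy dog by the river"
    print(round(jaccard(wiki, page), 2))   # rough reuse signal between passages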


Characterizing Wikipedia Reader Demographics and Interests

By Isaac Johnson, Wikimedia Foundation

Building on two past surveys on the motivation and needs of Wikipedia
readers (Why We Read Wikipedia
<https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#November_2016>; Why
the World Reads Wikipedia
<https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#December_2018>),
we examine the relationship between Wikipedia reader demographics and their
interests and needs. Specifically, we run surveys in thirteen different
languages that ask readers three questions about their motivation for
reading Wikipedia (motivation, needs, and familiarity) and five questions
about their demographics (age, gender, education, locale, and native
language). We link these survey results with the respondents' reading
sessions -- i.e., sequences of Wikipedia page views -- to gain a more
fine-grained understanding of how a reader's context relates to their
activity on Wikipedia. We find that readers have a diversity of backgrounds
but that the high-level needs of readers do not correlate strongly with
individual demographics. We also find, however, that there are
relationships between demographics and specific topic interests that are
consistent across many cultures and languages. This work provides insights
into the reach of various Wikipedia language editions and the relationship
between content or contributor gaps and reader gaps. See the meta page
<https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Reader_Behaviour/Demographics_and_Wikipedia_use_cases#Reader_Surveys>
for more details.
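
As a purely hypothetical illustration of linking survey answers to reading
activity (column names, values, and groupings are invented and are not the
study's actual pipeline), one could join the two tables and aggregate with
pandas:

    # Hypothetical sketch: join survey responses to per-respondent session
    # totals and summarise activity by a demographic column. Data is invented.
    import pandas as pd

    survey = pd.DataFrame({
        "respondent_id": [1, 2, 3],
        "age_group": ["18-24", "25-34", "18-24"],
        "motivation": ["work/school", "intrinsic learning", "media coverage"],
    })
    sessions = pd.DataFrame({
        "respondent_id": [1, 1, 2, 3, 3, 3],
        "pages_viewed": [4, 2, 7, 1, 3, 5],
    })

    per_reader = sessions.groupby("respondent_id")["pages_viewed"].sum().reset_index()
    linked = survey.merge(per_reader, on="respondent_id", how="left")
    print(linked.groupby("age_group")["pages_viewed"].mean())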

-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] October 16, 2019 at 9:30 AM PDT, 16:30 UTC

2019-10-16 Thread Janna Layton
Reminder that the Research Showcase is happening in 30 minutes.

On Tue, Oct 15, 2019 at 1:48 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed next Wednesday, October
> 16, at 9:30 AM PDT/16:30 UTC. We’ll have two timely presentations from
> Fabrício Benevenuto and Francesca Spezzano about disinformation and how to
> counter it.
>
> YouTube stream: https://www.youtube.com/watch?v=KZ35weAVlIU
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Elections Without Fake: Deploying Real Systems to Counter Misinformation
> Campaigns
>
> By Fabrício Benevenuto, Computer Science Department, Universidade Federal
> de Minas Gerais (UFMG), Brazil
>
> The political debate and electoral dispute in the online space during the
> 2018 Brazilian elections were marked by an information war. In order to
> mitigate the misinformation problem, we created the project Elections
> Without Fake <http://www.eleicoes-sem-fake.dcc.ufmg.br/> and developed a
> few technological solutions capable of reducing the abuse of misinformation
> campaigns in the online space. In particular, we created a system to monitor
> public groups on WhatsApp and a system to monitor ads on Facebook. Our
> systems proved to be fundamental for fact-checking and investigative
> journalism, and are currently used by over 150 journalists from news
> organizations and various fact-checking agencies.
>
>
> Protecting Wikipedia from Disinformation: Detecting Malicious Editors and
> Pages to Protect
>
> By Francesca Spezzano, Computer Science Department, Boise State University
>
> Wikipedia is based on the idea that anyone can make edits in order to
> create reliable and crowd-sourced content. Yet with the cover of internet
> anonymity, some users make changes to the online encyclopedia that do not
> align with Wikipedia’s intended uses. In this talk, we present different
> forms of disinformation on Wikipedia, including vandalism and spam, and
> introduce the mechanisms that Wikipedia implements to protect its
> integrity, such as blocking malicious editors and protecting pages. Next,
> we provide an overview of effective algorithms, based on user editing
> behavior, that we have developed to detect malicious editors and pages to
> protect across multiple languages.
>
>
> On Thu, Oct 10, 2019 at 2:57 PM Janna Layton 
> wrote:
>
>> Hi all,
>>
>> The next Research Showcase will be live-streamed next Wednesday, October
>> 16, at 9:30 AM PDT/16:30 UTC.
>>
>> YouTube stream: https://www.youtube.com/watch?v=KZ35weAVlIU
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past Research Showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentations:
>>
>> Elections Without Fake: Deploying Real Systems to Counter Misinformation
>> Campaigns
>>
>> By Fabrício Benevenuto, Computer Science Department, Universidade Federal
>> de Minas Gerais (UFMG), Brazil
>>
>> The political debate and electoral dispute in the online space during the
>> 2018 Brazilian elections were marked by an information war. In order to
>> mitigate the misinformation problem, we created the project Elections
>> Without Fake <http://www.eleicoes-sem-fake.dcc.ufmg.br/> and developed a
>> few technological solutions capable of reducing the abuse of misinformation
>> campaigns in the online space. In particular, we created a system to monitor
>> public groups on WhatsApp and a system to monitor ads on Facebook. Our
>> systems proved to be fundamental for fact-checking and investigative
>> journalism, and are currently used by over 150 journalists from news
>> organizations and various fact-checking agencies.
>>
>> More info on second talk by Francesca Spezzano to come
>>
>>
>> --
>> Janna Layton (she, her)
>> Administrative Assistant - Product & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] October 16, 2019 at 9:30 AM PDT, 16:30 UTC

2019-10-15 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed next Wednesday, October
16, at 9:30 AM PDT/16:30 UTC. We’ll have two timely presentations from
Fabrício Benevenuto and Francesca Spezzano about disinformation and how to
counter it.

YouTube stream: https://www.youtube.com/watch?v=KZ35weAVlIU

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Elections Without Fake: Deploying Real Systems to Counter Misinformation
Campaigns

By Fabrício Benevenuto, Computer Science Department, Universidade Federal
de Minas Gerais (UFMG), Brazil

The political debate and electoral dispute in the online space during the
2018 Brazilian elections were marked by an information war. In order to
mitigate the misinformation problem, we created the project Elections
Without Fake <http://www.eleicoes-sem-fake.dcc.ufmg.br/> and developed a
few technological solutions capable of reducing the abuse of misinformation
campaigns in the online space. In particular, we created a system to monitor
public groups on WhatsApp and a system to monitor ads on Facebook. Our
systems proved to be fundamental for fact-checking and investigative
journalism, and are currently used by over 150 journalists from news
organizations and various fact-checking agencies.


Protecting Wikipedia from Disinformation: Detecting Malicious Editors and
Pages to Protect

By Francesca Spezzano, Computer Science Department, Boise State University

Wikipedia is based on the idea that anyone can make edits in order to
create reliable and crowd-sourced content. Yet with the cover of internet
anonymity, some users make changes to the online encyclopedia that do not
align with Wikipedia’s intended uses. In this talk, we present different
forms of disinformation on Wikipedia, including vandalism and spam, and
introduce the mechanisms that Wikipedia implements to protect its
integrity, such as blocking malicious editors and protecting pages. Next,
we provide an overview of effective algorithms, based on user editing
behavior, that we have developed to detect malicious editors and pages to
protect across multiple languages.
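
As a generic illustration of behavior-based classification (the features,
toy data, labels, and model choice below are invented for the example and
are not the speaker's algorithms), such a detector might be sketched like
this in Python with scikit-learn:

    # Hypothetical sketch of classifying editors from behavioral features.
    # Feature names, values, and labels are invented; not the speaker's models.
    from sklearn.ensemble import RandomForestClassifier

    # columns: [edits_per_day, fraction_reverted, distinct_pages, talk_edits]
    X_train = [
        [1.0, 0.05, 12, 3],
        [40.0, 0.80, 200, 0],
        [3.0, 0.10, 25, 5],
        [55.0, 0.90, 310, 1],
    ]
    y_train = [0, 1, 0, 1]                   # 0 = benign, 1 = malicious (toy labels)

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
    print(clf.predict([[30.0, 0.75, 150, 0]]))   # classify an unseen editor profile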


On Thu, Oct 10, 2019 at 2:57 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed next Wednesday, October
> 16, at 9:30 AM PDT/16:30 UTC.
>
> YouTube stream: https://www.youtube.com/watch?v=KZ35weAVlIU
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past Research Showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Elections Without Fake: Deploying Real Systems to Counter Misinformation
> Campaigns
>
> By Fabrício Benevenuto, Computer Science Department, Universidade Federal
> de Minas Gerais (UFMG), Brazil
>
> The political debate and electoral dispute in the online space during the
> 2018 Brazilian elections were marked by an information war. In order to
> mitigate the misinformation problem, we created the project Elections
> Without Fake <http://www.eleicoes-sem-fake.dcc.ufmg.br/> and developed a
> few technological solutions capable of reducing the abuse of misinformation
> campaigns in the online space. In particular, we created a system to monitor
> public groups on WhatsApp and a system to monitor ads on Facebook. Our
> systems proved to be fundamental for fact-checking and investigative
> journalism, and are currently used by over 150 journalists from news
> organizations and various fact-checking agencies.
>
> More info on second talk by Francesca Spezzano to come
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] October 16, 2019 at 9:30 AM PDT, 16:30 UTC

2019-10-10 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed next Wednesday, October
16, at 9:30 AM PDT/16:30 UTC.

YouTube stream: https://www.youtube.com/watch?v=KZ35weAVlIU

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past Research Showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Elections Without Fake: Deploying Real Systems to Counter Misinformation
Campaigns

By Fabrício Benevenuto, Computer Science Department, Universidade Federal
de Minas Gerais (UFMG), Brazil

The political debate and electoral dispute in the online space during the
2018 Brazilian elections were marked by an information war. In order to
mitigate the misinformation problem, we created the project Elections
Without Fake <http://www.eleicoes-sem-fake.dcc.ufmg.br/> and developed a
few technological solutions capable of reducing the abuse of misinformation
campaigns in the online space. In particular, we created a system to monitor
public groups on WhatsApp and a system to monitor ads on Facebook. Our
systems proved to be fundamental for fact-checking and investigative
journalism, and are currently used by over 150 journalists from news
organizations and various fact-checking agencies.

More info on second talk by Francesca Spezzano to come


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Earlier time! September 18, 2019 at 9:30 AM PT, 16:30 UTC

2019-09-18 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes.

On Mon, Sep 16, 2019 at 10:59 AM Janna Layton  wrote:

> Hello everyone,
>
> This is just a reminder that the Research Showcase will be this Wednesday,
> with Miriam Redi and Jonathan Morgan from the Foundation presenting.
>
> On Wed, Sep 11, 2019 at 3:10 PM Janna Layton 
> wrote:
>
>> Hello everyone,
>>
>> The next Research Showcase will be live-streamed next Wednesday,
>> September 18, at 9:30 AM PT/16:30 UTC. This will be the new time going
>> forward for Research Showcases in order to give more access to other
>> timezones.
>>
>> YouTube stream: https://www.youtube.com/watch?v=fDhAnHrkBks
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentations:
>>
>> Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's
>> Verifiability
>>
>> By Miriam Redi, Research, Wikimedia Foundation
>>
>> Among Wikipedia's core guiding principles, verifiability policies have a
>> particularly important role. Verifiability requires that information
>> included in a Wikipedia article be corroborated against reliable secondary
>> sources. Because of the manual labor needed to curate and fact-check
>> Wikipedia at scale, however, its contents do not always evenly comply with
>> these policies. Citations (i.e., references to external sources) may not
>> conform to verifiability requirements or may be missing altogether,
>> potentially weakening the reliability of specific topic areas of the free
>> encyclopedia. In this project
>> <https://meta.wikimedia.org/wiki/Research:Identification_of_Unsourced_Statements>,
>> we aimed to provide an empirical characterization of the reasons why and
>> how Wikipedia cites external sources to comply with its own verifiability
>> guidelines. First, we constructed a taxonomy of reasons why inline
>> citations are required by collecting labeled data from editors of multiple
>> Wikipedia language editions. We then collected a large-scale crowdsourced
>> dataset of Wikipedia sentences annotated with categories derived from this
>> taxonomy. Finally, we designed and evaluated algorithmic models to
>> determine if a statement requires a citation, and to predict the citation
>> reason based on our taxonomy. We evaluated the robustness of such models
>> across different classes of Wikipedia articles of varying quality, as well
>> as on an additional dataset of claims annotated for fact-checking purposes.
>>
>> Redi, M., Fetahu, B., Morgan, J., & Taraborelli, D. (2019, May). Citation
>> Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability.
>> In The World Wide Web Conference (pp. 1567-1578). ACM.
>> https://arxiv.org/abs/1902.6
>>
>>
>> Patrolling on Wikipedia
>>
>> By Jonathan T. Morgan, Research, Wikimedia Foundation
>>
>> I will present initial findings from an ongoing research study
>> <https://meta.wikimedia.org/wiki/Research:Patrolling_on_Wikipedia> of
>> patrolling workflows on Wikimedia projects. Editors patrol recent pages and
>> edits to ensure that Wikimedia projects maintain high quality as new
>> content comes in. Patrollers revert vandalism and review newly-created
>> articles and article drafts. Patrolling of new pages and edits is vital
>> work. In addition to making sure that new content conforms to Wikipedia
>> project policies, patrollers are the first line of defense against
>> disinformation, copyright infringement, libel and slander, personal
>> threats, and other forms of vandalism on Wikimedia projects. This research
>> project is focused on understanding the needs, priorities, and workflows of
>> editors who patrol new content on Wikimedia projects. The findings of this
>> research can inform the development of better patrolling tools as well as
>> non-technological interventions intended to support patrollers and the
>> activity of patrolling.
>>
>> --
>> Janna Layton (she, her)
>> Administrative Assistant - Product & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Earlier time! September 18, 2019 at 9:30 AM PT, 16:30 UTC

2019-09-16 Thread Janna Layton
Hello everyone,

This is just a reminder that the Research Showcase will be this Wednesday,
with Miriam Redi and Jonathan Morgan from the Foundation presenting.

On Wed, Sep 11, 2019 at 3:10 PM Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase will be live-streamed next Wednesday, September
> 18, at 9:30 AM PT/16:30 UTC. This will be the new time going forward for
> Research Showcases in order to give more access to other timezones.
>
> YouTube stream: https://www.youtube.com/watch?v=fDhAnHrkBks
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's
> Verifiability
>
> By Miriam Redi, Research, Wikimedia Foundation
>
> Among Wikipedia's core guiding principles, verifiability policies have a
> particularly important role. Verifiability requires that information
> included in a Wikipedia article be corroborated against reliable secondary
> sources. Because of the manual labor needed to curate and fact-check
> Wikipedia at scale, however, its contents do not always evenly comply with
> these policies. Citations (i.e., references to external sources) may not
> conform to verifiability requirements or may be missing altogether,
> potentially weakening the reliability of specific topic areas of the free
> encyclopedia. In this project
> <https://meta.wikimedia.org/wiki/Research:Identification_of_Unsourced_Statements>,
> we aimed to provide an empirical characterization of the reasons why and
> how Wikipedia cites external sources to comply with its own verifiability
> guidelines. First, we constructed a taxonomy of reasons why inline
> citations are required by collecting labeled data from editors of multiple
> Wikipedia language editions. We then collected a large-scale crowdsourced
> dataset of Wikipedia sentences annotated with categories derived from this
> taxonomy. Finally, we designed and evaluated algorithmic models to
> determine if a statement requires a citation, and to predict the citation
> reason based on our taxonomy. We evaluated the robustness of such models
> across different classes of Wikipedia articles of varying quality, as well
> as on an additional dataset of claims annotated for fact-checking purposes.
>
> Redi, M., Fetahu, B., Morgan, J., & Taraborelli, D. (2019, May). Citation
> Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability.
> In The World Wide Web Conference (pp. 1567-1578). ACM.
> https://arxiv.org/abs/1902.6
>
>
> Patrolling on Wikipedia
>
> By Jonathan T. Morgan, Research, Wikimedia Foundation
>
> I will present initial findings from an ongoing research study
> <https://meta.wikimedia.org/wiki/Research:Patrolling_on_Wikipedia> of
> patrolling workflows on Wikimedia projects. Editors patrol recent pages and
> edits to ensure that Wikimedia projects maintain high quality as new
> content comes in. Patrollers revert vandalism and review newly-created
> articles and article drafts. Patrolling of new pages and edits is vital
> work. In addition to making sure that new content conforms to Wikipedia
> project policies, patrollers are the first line of defense against
> disinformation, copyright infringement, libel and slander, personal
> threats, and other forms of vandalism on Wikimedia projects. This research
> project is focused on understanding the needs, priorities, and workflows of
> editors who patrol new content on Wikimedia projects. The findings of this
> research can inform the development of better patrolling tools as well as
> non-technological interventions intended to support patrollers and the
> activity of patrolling.
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Product & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] Earlier time! September 18, 2019 at 9:30 AM PT, 16:30 UTC

2019-09-11 Thread Janna Layton
Hello everyone,

The next Research Showcase will be live-streamed next Wednesday, September
18, at 9:30 AM PT/16:30 UTC. This will be the new time going forward for
Research Showcases in order to give more access to other timezones.

YouTube stream: https://www.youtube.com/watch?v=fDhAnHrkBks

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's
Verifiability

By Miriam Redi, Research, Wikimedia Foundation

Among Wikipedia's core guiding principles, verifiability policies have a
particularly important role. Verifiability requires that information
included in a Wikipedia article be corroborated against reliable secondary
sources. Because of the manual labor needed to curate and fact-check
Wikipedia at scale, however, its contents do not always evenly comply with
these policies. Citations (i.e., references to external sources) may not
conform to verifiability requirements or may be missing altogether,
potentially weakening the reliability of specific topic areas of the free
encyclopedia. In this project
<https://meta.wikimedia.org/wiki/Research:Identification_of_Unsourced_Statements>,
we aimed to provide an empirical characterization of the reasons why and
how Wikipedia cites external sources to comply with its own verifiability
guidelines. First, we constructed a taxonomy of reasons why inline
citations are required by collecting labeled data from editors of multiple
Wikipedia language editions. We then collected a large-scale crowdsourced
dataset of Wikipedia sentences annotated with categories derived from this
taxonomy. Finally, we designed and evaluated algorithmic models to
determine if a statement requires a citation, and to predict the citation
reason based on our taxonomy. We evaluated the robustness of such models
across different classes of Wikipedia articles of varying quality, as well
as on an additional dataset of claims annotated for fact-checking purposes.

Redi, M., Fetahu, B., Morgan, J., & Taraborelli, D. (2019, May). Citation
Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability.
In The World Wide Web Conference (pp. 1567-1578). ACM.
https://arxiv.org/abs/1902.6
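
For intuition about the "does this sentence need a citation?" step only
(the training sentences below are invented, and the paper's models and
features differ), a bare-bones text classifier could be sketched in Python
with scikit-learn as follows:

    # Bare-bones sketch of a "citation needed?" sentence classifier using a
    # TF-IDF bag of words and logistic regression. Sentences are invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    sentences = [
        "The study reported a 40 percent increase in participation.",
        "She was born in 1967 in Valparaiso.",
        "The sky is blue.",
        "Water freezes at zero degrees Celsius.",
    ]
    needs_citation = [1, 1, 0, 0]            # toy labels

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(sentences, needs_citation)
    print(model.predict(["The film grossed over 200 million dollars worldwide."]))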


Patrolling on Wikipedia

By Jonathan T. Morgan, Research, Wikimedia Foundation

I will present initial findings from an ongoing research study
<https://meta.wikimedia.org/wiki/Research:Patrolling_on_Wikipedia> of
patrolling workflows on Wikimedia projects. Editors patrol recent pages and
edits to ensure that Wikimedia projects maintain high quality as new
content comes in. Patrollers revert vandalism and review newly-created
articles and article drafts. Patrolling of new pages and edits is vital
work. In addition to making sure that new content conforms to Wikipedia
project policies, patrollers are the first line of defense against
disinformation, copyright infringement, libel and slander, personal
threats, and other forms of vandalism on Wikimedia projects. This research
project is focused on understanding the needs, priorities, and workflows of
editors who patrol new content on Wikimedia projects. The findings of this
research can inform the development of better patrolling tools as well as
non-technological interventions intended to support patrollers and the
activity of patrolling.

-- 
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] July 17, 2019 at 11:30 AM PDT, 18:30 UTC

2019-07-17 Thread Janna Layton
Hi everyone,

This month's Research Showcase, with presentations about civility and
rationales in Wikipedia, will be starting in about 30 minutes.

On Thu, Jul 11, 2019 at 4:45 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed next Wednesday, July 17,
> at 11:30 AM PDT/18:30 UTC.
>
> YouTube stream: https://www.youtube.com/watch?v=i9vvwV5KfW4
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Characterizing Incivility on Wikipedia
>
> Elizabeth Whittaker, University of Michigan School of Information
>
> In a society whose citizens have a variety of viewpoints, there is a
> question of how citizens can govern themselves in ways that allow these
> viewpoints to co-exist. Online deliberation has been posited as a problem
> solving mechanism in this context, and civility can be thought of as a
> mechanism that facilitates this deliberation. Civility can thus be thought
> of as a method of interaction that encourages collaboration, while
> incivility disrupts collaboration. However, it is important to note that
> the nature of online civility is shaped by its history and the technical
> architecture scaffolding it. Civility as a concept has been used both to
> promote equal deliberation and to exclude the marginalized from
> deliberation, so we should be careful to ensure that our conceptualizations
> of incivility reflect what we intend them to in order to avoid
> unintentionally reinforcing inequality.
>
> To this end, we examined Wikipedia editors’ perceptions of interactions
> that disrupt collaboration through 15 semi-structured interviews. Wikipedia
> is a highly deliberative platform, as editors need to reach consensus about
> what will appear on the article page, a process that often involves
> deliberation to coordinate, and any disruption to this process should be
> apparent. We found that incivility on Wikipedia typically occurs in one of
> three ways: through weaponization of Wikipedia’s policies, weaponization of
> Wikipedia’s technical features, and through more typical vitriolic content.
> These methods of incivility were gendered, and had the practical effect of
> discouraging women from editing. We implicate this pattern as one of the
> underlying causes of Wikipedia’s gender gap.
>
> Hidden Gems in the Wikipedia Discussions: The Wikipedians’ Rationales
>
> Lu Xiao, Syracuse University School of Information Studies
>
> I will present a series of completed and ongoing studies that are aimed at
> understanding the role of the Wikipedians’ rationales in Wikipedia
> discussions. We define a rationale as one’s justification of her viewpoint
> and suggestions. Our studies demonstrate the potential of leveraging the
> Wikipedians’ rationales in discussions as resources for future
> decision-making and as resources for eliciting knowledge about the
> community’s norms, practices and policies. Viewed as rich digital traces in
> these environments, we consider them to be beneficial for the community
> members, such as helping newcomers familiarize themselves with the commonly
> accepted justificatory reasoning styles. We call for more research
> attention to the discussion content from this rationale study perspective.
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] July 17, 2019 at 11:30 AM PDT, 18:30 UTC

2019-07-15 Thread Janna Layton
Just a reminder that the Research Showcase is this week!

On Thu, Jul 11, 2019 at 4:45 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed next Wednesday, July 17,
> at 11:30 AM PDT/18:30 UTC.
>
> YouTube stream: https://www.youtube.com/watch?v=i9vvwV5KfW4
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Characterizing Incivility on Wikipedia
>
> Elizabeth Whittaker, University of Michigan School of Information
>
> In a society whose citizens have a variety of viewpoints, there is a
> question of how citizens can govern themselves in ways that allow these
> viewpoints to co-exist. Online deliberation has been posited as a problem
> solving mechanism in this context, and civility can be thought of as a
> mechanism that facilitates this deliberation. Civility can thus be thought
> of as a method of interaction that encourages collaboration, while
> incivility disrupts collaboration. However, it is important to note that
> the nature of online civility is shaped by its history and the technical
> architecture scaffolding it. Civility as a concept has been used both to
> promote equal deliberation and to exclude the marginalized from
> deliberation, so we should be careful to ensure that our conceptualizations
> of incivility reflect what we intend them to in order to avoid
> unintentionally reinforcing inequality.
>
> To this end, we examined Wikipedia editors’ perceptions of interactions
> that disrupt collaboration through 15 semi-structured interviews. Wikipedia
> is a highly deliberative platform, as editors need to reach consensus about
> what will appear on the article page, a process that often involves
> deliberation to coordinate, and any disruption to this process should be
> apparent. We found that incivility on Wikipedia typically occurs in one of
> three ways: through weaponization of Wikipedia’s policies, weaponization of
> Wikipedia’s technical features, and through more typical vitriolic content.
> These methods of incivility were gendered, and had the practical effect of
> discouraging women from editing. We implicate this pattern as one of the
> underlying causes of Wikipedia’s gender gap.
>
> Hidden Gems in the Wikipedia Discussions: The Wikipedians’ Rationales
>
> Lu Xiao, Syracuse University School of Information Studies
>
> I will present a series of completed and ongoing studies that are aimed at
> understanding the role of the Wikipedians’ rationales in Wikipedia
> discussions. We define a rationale as one’s justification of her viewpoint
> and suggestions. Our studies demonstrate the potential of leveraging the
> Wikipedians’ rationales in discussions as resources for future
> decision-making and as resources for eliciting knowledge about the
> community’s norms, practices and policies. Viewed as rich digital traces in
> these environments, we consider them to be beneficial for the community
> members, such as helping newcomers familiarize themselves with the commonly
> accepted justificatory reasoning styles. We call for more research
> attention to the discussion content from this rationale study perspective.
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] July 17, 2019 at 11:30 AM PDT, 18:30 UTC

2019-07-11 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed next Wednesday, July 17,
at 11:30 AM PDT/18:30 UTC.

YouTube stream: https://www.youtube.com/watch?v=i9vvwV5KfW4

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Characterizing Incivility on Wikipedia

Elizabeth Whittaker, University of Michigan School of Information

In a society whose citizens have a variety of viewpoints, there is a
question of how citizens can govern themselves in ways that allow these
viewpoints to co-exist. Online deliberation has been posited as a problem
solving mechanism in this context, and civility can be thought of as a
mechanism that facilitates this deliberation. Civility can thus be thought
of as a method of interaction that encourages collaboration, while
incivility disrupts collaboration. However, it is important to note that
the nature of online civility is shaped by its history and the technical
architecture scaffolding it. Civility as a concept has been used both to
promote equal deliberation and to exclude the marginalized from
deliberation, so we should be careful to ensure that our conceptualizations
of incivility reflect what we intend them to in order to avoid
unintentionally reinforcing inequality.

To this end, we examined Wikipedia editors’ perceptions of interactions
that disrupt collaboration through 15 semi-structured interviews. Wikipedia
is a highly deliberative platform, as editors need to reach consensus about
what will appear on the article page, a process that often involves
deliberation to coordinate, and any disruption to this process should be
apparent. We found that incivility on Wikipedia typically occurs in one of
three ways: through weaponization of Wikipedia’s policies, weaponization of
Wikipedia’s technical features, and through more typical vitriolic content.
These methods of incivility were gendered, and had the practical effect of
discouraging women from editing. We implicate this pattern as one of the
underlying causes of Wikipedia’s gender gap.

Hidden Gems in the Wikipedia Discussions: The Wikipedians’ Rationales

Lu Xiao, Syracuse University School of Information Studies

I will present a series of completed and ongoing studies that are aimed at
understanding the role of the Wikipedians’ rationales in Wikipedia
discussions. We define a rationale as one’s justification of her viewpoint
and suggestions. Our studies demonstrate the potential of leveraging the
Wikipedians’ rationales in discussions as resources for future
decision-making and as resources for eliciting knowledge about the
community’s norms, practices and policies. Viewed as rich digital traces in
these environments, we consider them to be beneficial for the community
members, such as helping newcomers familiarize themselves with the commonly
accepted justificatory reasoning styles. We call for more research
attention to the discussion content from this rationale study perspective.

-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] June 26, 2019 at 11:30 AM PST, 19:30 UTC

2019-06-26 Thread Janna Layton
Hello everyone,

Just a reminder that this event will be happening in about half an hour!
Here's the YouTube link again: https://www.youtube.com/watch?v=WiUfpmeJG7E

On Tue, Jun 25, 2019 at 9:14 AM Janna Layton  wrote:

> Time correction:
>
> The next Research Showcase will be live-streamed next Wednesday, June 26,
> at *11:30 AM PDT/18:30 UTC*.
>
> On Mon, Jun 24, 2019 at 4:11 PM Janna Layton 
> wrote:
>
>> Hi all,
>>
>> The next Research Showcase will be live-streamed this Wednesday, June 26,
>> at 11:30 AM PST/19:30 UTC. We will have three presentations this showcase,
>> all relating to Wikipedia blocks.
>>
>> YouTube stream: https://www.youtube.com/watch?v=WiUfpmeJG7E
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentations:
>>
>> Trajectories of Blocked Community Members: Redemption, Recidivism and
>> Departure
>>
>> By Jonathan Chang, Cornell University
>>
>> Community norm violations can impair constructive communication and
>> collaboration online. As a defense mechanism, community moderators often
>> address such transgressions by temporarily blocking the perpetrator. Such
>> actions, however, come with the cost of potentially alienating community
>> members. Given this tradeoff, it is essential to understand to what extent,
>> and in which situations, this common moderation practice is effective in
>> reinforcing community rules. In this work, we introduce a computational
>> framework for studying the future behavior of blocked users on Wikipedia.
>> After their block expires, they can take several distinct paths: they can
>> reform and adhere to the rules, but they can also recidivate, or
>> straight-out abandon the community. We reveal that these trajectories are
>> tied to factors rooted both in the characteristics of the blocked
>> individual and in whether they perceived the block to be fair and
>> justified. Based on these insights, we formulate a series of prediction
>> tasks aiming to determine which of these paths a user is likely to take
>> after being blocked for their first offense, and demonstrate the
>> feasibility of these new tasks. Overall, this work builds towards a more
>> nuanced approach to moderation by highlighting the tradeoffs that are in
>> play.
>>
>>
>> Automatic Detection of Online Abuse in Wikipedia
>>
>> By Lane Rasberry, University of Virginia
>>
>> Researchers analyzed all English Wikipedia blocks prior to 2018 using
>> machine learning. With insights gained, the researchers examined all
>> English Wikipedia users who are not blocked against the identified
>> characteristics of blocked users. The results were a ranked set of
>> predictions of users who are not blocked, but who have a history of conduct
>> similar to that of blocked users. This research and process models a system
>> for the use of computing to aid human moderators in identifying conduct on
>> English Wikipedia which merits a block.
>>
>> Project page:
>> https://meta.wikimedia.org/wiki/University_of_Virginia/Automatic_Detection_of_Online_Abuse
>>
>> Video: https://www.youtube.com/watch?v=AIhdb4-hKBo
>>
>>
>> First Insights from Partial Blocks in Wikimedia Wikis
>>
>> By Morten Warncke-Wang, Wikimedia Foundation
>>
>> The Anti-Harassment Tools team at the Wikimedia Foundation released the
>> partial block feature in early 2019. Where previously blocks on Wikimedia
>> wikis were sitewide (users were blocked from editing an entire wiki),
>> partial blocks make it possible to block users from editing specific pages
>> and/or namespaces. The Italian Wikipedia was the first wiki to start using
>> this feature, and it has since been rolled out to other wikis as well. In
>> this presentation, we will look at how this feature has been used in the
>> first few months since release.
>>
>>
>> --
>> Janna Layton (she, her)
>> Administrative Assistant - Audiences & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] June 26, 2019 at 11:30 AM PST, 19:30 UTC

2019-06-25 Thread Janna Layton
Time correction:

The next Research Showcase will be live-streamed next Wednesday, June 26,
at *11:30 AM PDT/18:30 UTC*.

On Mon, Jun 24, 2019 at 4:11 PM Janna Layton  wrote:

> Hi all,
>
> The next Research Showcase will be live-streamed this Wednesday, June 26,
> at 11:30 AM PST/19:30 UTC. We will have three presentations this showcase,
> all relating to Wikipedia blocks.
>
> YouTube stream: https://www.youtube.com/watch?v=WiUfpmeJG7E
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> Trajectories of Blocked Community Members: Redemption, Recidivism and
> Departure
>
> By Jonathan Chang, Cornell University
>
> Community norm violations can impair constructive communication and
> collaboration online. As a defense mechanism, community moderators often
> address such transgressions by temporarily blocking the perpetrator. Such
> actions, however, come with the cost of potentially alienating community
> members. Given this tradeoff, it is essential to understand to what extent,
> and in which situations, this common moderation practice is effective in
> reinforcing community rules. In this work, we introduce a computational
> framework for studying the future behavior of blocked users on Wikipedia.
> After their block expires, they can take several distinct paths: they can
> reform and adhere to the rules, but they can also recidivate, or
> straight-out abandon the community. We reveal that these trajectories are
> tied to factors rooted both in the characteristics of the blocked
> individual and in whether they perceived the block to be fair and
> justified. Based on these insights, we formulate a series of prediction
> tasks aiming to determine which of these paths a user is likely to take
> after being blocked for their first offense, and demonstrate the
> feasibility of these new tasks. Overall, this work builds towards a more
> nuanced approach to moderation by highlighting the tradeoffs that are in
> play.
>
>
> Automatic Detection of Online Abuse in Wikipedia
>
> By Lane Rasberry, University of Virginia
>
> Researchers analyzed all English Wikipedia blocks prior to 2018 using
> machine learning. With insights gained, the researchers examined all
> English Wikipedia users who are not blocked against the identified
> characteristics of blocked users. The results were a ranked set of
> predictions of users who are not blocked, but who have a history of conduct
> similar to that of blocked users. This research and process models a system
> for the use of computing to aid human moderators in identifying conduct on
> English Wikipedia which merits a block.
>
> Project page:
> https://meta.wikimedia.org/wiki/University_of_Virginia/Automatic_Detection_of_Online_Abuse
>
> Video: https://www.youtube.com/watch?v=AIhdb4-hKBo
>
>
> First Insights from Partial Blocks in Wikimedia Wikis
>
> By Morten Warncke-Wang, Wikimedia Foundation
>
> The Anti-Harassment Tools team at the Wikimedia Foundation released the
> partial block feature in early 2019. Where previously blocks on Wikimedia
> wikis were sitewide (users were blocked from editing an entire wiki),
> partial blocks make it possible to block users from editing specific pages
> and/or namespaces. The Italian Wikipedia was the first wiki to start using
> this feature, and it has since been rolled out to other wikis as well. In
> this presentation, we will look at how this feature has been used in the
> first few months since release.
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] June 26, 2019 at 11:30 AM PST, 19:30 UTC

2019-06-24 Thread Janna Layton
Hi all,

The next Research Showcase will be live-streamed this Wednesday, June 26,
at 11:30 AM PST/19:30 UTC. We will have three presentations this showcase,
all relating to Wikipedia blocks.

YouTube stream: https://www.youtube.com/watch?v=WiUfpmeJG7E

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Trajectories of Blocked Community Members: Redemption, Recidivism and
Departure

By Jonathan Chang, Cornell University

Community norm violations can impair constructive communication and
collaboration online. As a defense mechanism, community moderators often
address such transgressions by temporarily blocking the perpetrator. Such
actions, however, come with the cost of potentially alienating community
members. Given this tradeoff, it is essential to understand to what extent,
and in which situations, this common moderation practice is effective in
reinforcing community rules. In this work, we introduce a computational
framework for studying the future behavior of blocked users on Wikipedia.
After their block expires, they can take several distinct paths: they can
reform and adhere to the rules, but they can also recidivate, or
straight-out abandon the community. We reveal that these trajectories are
tied to factors rooted both in the characteristics of the blocked
individual and in whether they perceived the block to be fair and
justified. Based on these insights, we formulate a series of prediction
tasks aiming to determine which of these paths a user is likely to take
after being blocked for their first offense, and demonstrate the
feasibility of these new tasks. Overall, this work builds towards a more
nuanced approach to moderation by highlighting the tradeoffs that are in
play.
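
As a concrete illustration of the three paths named above, here is a tiny
Python sketch that labels a blocked editor's trajectory from post-block
activity. The record fields and the 90-day window are assumptions made for
this example, not the definitions used in the study.

    # Toy labeling of post-block trajectories: departure, recidivism, or
    # redemption. Field names and thresholds are illustrative assumptions.
    def trajectory(user):
        if user["edits_after_block"] == 0:
            return "departure"     # never edits again after the block expires
        if user["reblocked_within_90d"]:
            return "recidivism"    # offends again and is re-blocked
        return "redemption"        # keeps editing without further blocks

    users = [
        {"name": "A", "edits_after_block": 0, "reblocked_within_90d": False},
        {"name": "B", "edits_after_block": 52, "reblocked_within_90d": True},
        {"name": "C", "edits_after_block": 17, "reblocked_within_90d": False},
    ]
    for u in users:
        print(u["name"], trajectory(u))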


Automatic Detection of Online Abuse in Wikipedia

By Lane Rasberry, University of Virginia

Researchers analyzed all English Wikipedia blocks prior to 2018 using
machine learning. With insights gained, the researchers examined all
English Wikipedia users who are not blocked against the identified
characteristics of blocked users. The results were a ranked set of
predictions of users who are not blocked, but who have a history of conduct
similar to that of blocked users. This research and process models a system
for the use of computing to aid human moderators in identifying conduct on
English Wikipedia which merits a block.

Project page:
https://meta.wikimedia.org/wiki/University_of_Virginia/Automatic_Detection_of_Online_Abuse

Video: https://www.youtube.com/watch?v=AIhdb4-hKBo
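
The general setup described above can be pictured with a short Python sketch:
fit a classifier on conduct features of previously blocked accounts versus
never-blocked accounts, then rank the never-blocked accounts by predicted
block probability. The feature names, numbers, and model choice are invented
for illustration; they are not the project's actual features or pipeline.

    # Train on blocked (1) vs. never-blocked (0) accounts, then rank the
    # never-blocked accounts by how closely their conduct resembles that of
    # blocked accounts. All feature values here are invented.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # columns: reverted-edit rate, warnings received, talk-page conflicts
    X_train = np.array([[0.40, 5, 9], [0.02, 0, 1],
                        [0.55, 8, 12], [0.01, 1, 0]])
    y_train = np.array([1, 0, 1, 0])

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    unblocked = {"UserX": [0.35, 4, 7], "UserY": [0.03, 0, 2]}
    scores = clf.predict_proba(np.array(list(unblocked.values())))[:, 1]
    ranking = sorted(zip(unblocked, scores), key=lambda kv: -kv[1])
    print(ranking)  # accounts most similar in conduct to blocked accounts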


First Insights from Partial Blocks in Wikimedia Wikis

By Morten Warncke-Wang, Wikimedia Foundation

The Anti-Harassment Tools team at the Wikimedia Foundation released the
partial block feature in early 2019. Where previously blocks on Wikimedia
wikis were sitewide (users were blocked from editing an entire wiki),
partial blocks make it possible to block users from editing specific pages
and/or namespaces. The Italian Wikipedia was the first wiki to start using
this feature, and it has since been rolled out to other wikis as well. In
this presentation, we will look at how this feature has been used in the
first few months since release.
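
To make the descriptive question concrete, here is a small Python sketch that
tallies sitewide versus partial blocks, and what the partial blocks restrict,
over a hypothetical export of block log entries. The record format is an
assumed simplification, not the actual log schema.

    # Count sitewide vs. partial blocks and what partial blocks restrict.
    # The dictionaries below are an assumed, simplified export format.
    from collections import Counter

    blocks = [
        {"sitewide": True, "pages": [], "namespaces": []},
        {"sitewide": False, "pages": ["Some article"], "namespaces": []},
        {"sitewide": False, "pages": [], "namespaces": ["Talk"]},
        {"sitewide": False, "pages": ["Other page"], "namespaces": ["Project"]},
    ]

    kinds = Counter()
    for b in blocks:
        if b["sitewide"]:
            kinds["sitewide"] += 1
        elif b["pages"] and b["namespaces"]:
            kinds["partial: pages and namespaces"] += 1
        elif b["pages"]:
            kinds["partial: pages only"] += 1
        else:
            kinds["partial: namespaces only"] += 1
    print(kinds)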


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] April 17 at 11:30 AM PDT, 19:30 UTC

2019-04-17 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes. Topics today
are WikiProject and the "Thanks" feature. Info below:

On Mon, Apr 15, 2019 at 2:57 PM Janna Layton  wrote:

> Just a reminder that this Showcase will be happening on Wednesday.
>
> On Thu, Apr 11, 2019 at 4:49 PM Janna Layton 
> wrote:
>
>> Hello, everyone,
>>
>> The next Research Showcase, “Group Membership and Contributions to Public
>> Information Goods: The Case of WikiProject” and “Thanks for Stopping By:
>> A Study of ‘Thanks’ Usage on Wikimedia,” will be live-streamed next
>> Wednesday, April 17, 2019, at 11:30 AM PDT/19:30 UTC.
>>
>> YouTube stream: https://www.youtube.com/watch?v=zmb5LoJzOoE
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentations:
>>
>>
>>
>> Group Membership and Contributions to Public Information Goods: The Case
>> of WikiProject
>>
>> By Ark Fangzhou Zhang
>>
>> Abstract:
>>
>> We investigate the effects of group identity on contribution behavior on
>> the English Wikipedia, the largest online encyclopedia that gives free
>> access to the public. Using an instrumental variable approach that exploits
>> the variations in one’s exposure to WikiProject, we find that joining a
>> WikiProject has a significant impact on one’s level of contribution, with
>> an average increase of 79 revisions or 8,672 characters per month. To
>> uncover the potential mechanism underlying the treatment effect, we use the
>> size of a WikiProject's home page as a proxy for the number of
>> recommendations from the project. The results show that the users who join a
>> WikiProject with more recommendations significantly increase their
>> contribution to articles under the joined project, but not to articles
>> under other projects.
>>
>>
>>
>> Thanks for Stopping By: A Study of ‘Thanks’ Usage on Wikimedia
>>
>> By Swati Goel
>>
>> Abstract:
>>
>> The Thanks feature on Wikipedia, also known as "Thanks," is a tool with
>> which editors can quickly and easily send one another positive feedback. The
>> aim of this project is to better understand this feature: its scope, the
>> characteristics of a typical "Thanks" interaction, and the effects of
>> receiving a thank on individual editors. We study the motivational impacts
>> of "Thanks" because maintaining editor engagement is a central problem for
>> crowdsourced repositories of knowledge such as Wikimedia. Our main findings
>> are that most editors have not been exposed to the Thanks feature (meaning
>> they have never given nor received a thank), thanks are typically sent
>> upwards (from less experienced to more experienced editors), and receiving
>> a thank is correlated with having high levels of editor engagement. Though
>> the prevalence of "Thanks" usage varies by editor experience, the impact of
>> receiving a thank seems mostly consistent for all users. We empirically
>> demonstrate that receiving a thank has a strong positive effect on
>> short-term editor activity across the board and provide preliminary
>> evidence that thanks could compound to have long-term effects as well.
>>
>>
>> --
>> Janna Layton (she, her)
>> Administrative Assistant - Audiences & Technology
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] April 17 at 11:30 AM PDT, 19:30 UTC

2019-04-15 Thread Janna Layton
Just a reminder that this Showcase will be happening on Wednesday.

On Thu, Apr 11, 2019 at 4:49 PM Janna Layton  wrote:

> Hello, everyone,
>
> The next Research Showcase, “Group Membership and Contributions to Public
> Information Goods: The Case of WikiProject” and “Thanks for Stopping By:
> A Study of ‘Thanks’ Usage on Wikimedia,” will be live-streamed next
> Wednesday, April 17, 2019, at 11:30 AM PDT/19:30 UTC.
>
> YouTube stream: https://www.youtube.com/watch?v=zmb5LoJzOoE
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
>
>
> Group Membership and Contributions to Public Information Goods: The Case
> of WikiProject
>
> By Ark Fangzhou Zhang
>
> Abstract:
>
> We investigate the effects of group identity on contribution behavior on
> the English Wikipedia, the largest online encyclopedia that gives free
> access to the public. Using an instrumental variable approach that exploits
> the variations in one’s exposure to WikiProject, we find that joining a
> WikiProject has a significant impact on one’s level of contribution, with
> an average increase of 79 revisions or 8,672 characters per month. To
> uncover the potential mechanism underlying the treatment effect, we use the
> size of a WikiProject's home page as a proxy for the number of
> recommendations from the project. The results show that the users who join a
> WikiProject with more recommendations significantly increase their
> contribution to articles under the joined project, but not to articles
> under other projects.
>
>
>
> Thanks for Stopping By: A Study of ‘Thanks’ Usage on Wikimedia
>
> By Swati Goel
>
> Abstract:
>
> The Thanks feature on Wikipedia, also known as "Thanks," is a tool with
> which editors can quickly and easily send one another positive feedback. The
> aim of this project is to better understand this feature: its scope, the
> characteristics of a typical "Thanks" interaction, and the effects of
> receiving a thank on individual editors. We study the motivational impacts
> of "Thanks" because maintaining editor engagement is a central problem for
> crowdsourced repositories of knowledge such as Wikimedia. Our main findings
> are that most editors have not been exposed to the Thanks feature (meaning
> they have never given nor received a thank), thanks are typically sent
> upwards (from less experienced to more experienced editors), and receiving
> a thank is correlated with having high levels of editor engagement. Though
> the prevalence of "Thanks" usage varies by editor experience, the impact of
> receiving a thank seems mostly consistent for all users. We empirically
> demonstrate that receiving a thank has a strong positive effect on
> short-term editor activity across the board and provide preliminary
> evidence that thanks could compound to have long-term effects as well.
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] April 17 at 11:30 AM PDT, 19:30 UTC

2019-04-11 Thread Janna Layton
Hello, everyone,

The next Research Showcase, “Group Membership and Contributions to Public
Information Goods: The Case of WikiProject” and “Thanks for Stopping By: A
Study of ‘Thanks’ Usage on Wikimedia,” will be live-streamed next
Wednesday, April 17, 2019, at 11:30 AM PDT/19:30 UTC.

YouTube stream: https://www.youtube.com/watch?v=zmb5LoJzOoE

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:



Group Membership and Contributions to Public Information Goods: The Case of
WikiProject

By Ark Fangzhou Zhang

Abstract:

We investigate the effects of group identity on contribution behavior on
the English Wikipedia, the largest online encyclopedia that gives free
access to the public. Using an instrumental variable approach that exploits
the variations in one’s exposure to WikiProject, we find that joining a
WikiProject has a significant impact on one’s level of contribution, with
an average increase of 79 revisions or 8,672 characters per month. To
uncover the potential mechanism underlying the treatment effect, we use the
size of a WikiProject's home page as a proxy for the number of
recommendations from the project. The results show that the users who join a
WikiProject with more recommendations significantly increase their
contribution to articles under the joined project, but not to articles
under other projects.
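
For readers unfamiliar with the instrumental-variable approach mentioned in
the abstract, the following Python sketch runs two-stage least squares on
fully synthetic data: exposure plays the role of the instrument, joining a
WikiProject is the endogenous treatment, and an unobserved "ability" term
confounds the naive comparison. Everything here is simulated for
illustration; it is not the paper's data or specification.

    # Two-stage least squares on synthetic data. Stage 1: regress the
    # treatment (joining) on the instrument (exposure). Stage 2: regress the
    # outcome (contributions) on the fitted treatment from stage 1.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    exposure = rng.normal(size=n)            # instrument
    ability = rng.normal(size=n)             # unobserved confounder
    joined = (0.8 * exposure + ability + rng.normal(size=n) > 0).astype(float)
    contributions = 5.0 * joined + 2.0 * ability + rng.normal(size=n)

    def ols(X, y):
        return np.linalg.lstsq(X, y, rcond=None)[0]

    Z = np.column_stack([np.ones(n), exposure])
    joined_hat = Z @ ols(Z, joined)          # stage 1 fitted values
    X2 = np.column_stack([np.ones(n), joined_hat])
    print("2SLS estimate:", ols(X2, contributions)[1])

    # For contrast, naive OLS is biased upward by the confounder.
    Xn = np.column_stack([np.ones(n), joined])
    print("Naive OLS estimate:", ols(Xn, contributions)[1])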



Thanks for Stopping By: A Study of ‘Thanks’ Usage on Wikimedia

By Swati Goel

Abstract:

The Thanks feature on Wikipedia, also known as "Thanks," is a tool with
which editors can quickly and easily send one another positive feedback. The
aim of this project is to better understand this feature: its scope, the
characteristics of a typical "Thanks" interaction, and the effects of
receiving a thank on individual editors. We study the motivational impacts
of "Thanks" because maintaining editor engagement is a central problem for
crowdsourced repositories of knowledge such as Wikimedia. Our main findings
are that most editors have not been exposed to the Thanks feature (meaning
they have never given nor received a thank), thanks are typically sent
upwards (from less experienced to more experienced editors), and receiving
a thank is correlated with having high levels of editor engagement. Though
the prevalence of "Thanks" usage varies by editor experience, the impact of
receiving a thank seems mostly consistent for all users. We empirically
demonstrate that receiving a thank has a strong positive effect on
short-term editor activity across the board and provide preliminary
evidence that thanks could compound to have long-term effects as well.
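
A minimal sketch of the kind of before/after comparison the abstract
describes, in Python, using invented per-editor edit counts around the first
thank received. The numbers are made up, and the study's analysis is far more
careful than a raw difference in means.

    # Compare mean edit activity in the week before vs. the week after an
    # editor receives their first thank. All numbers are invented.
    editors = [
        {"edits_week_before": 3, "edits_week_after": 7},
        {"edits_week_before": 10, "edits_week_after": 14},
        {"edits_week_before": 0, "edits_week_after": 2},
        {"edits_week_before": 6, "edits_week_after": 5},
    ]
    before = sum(e["edits_week_before"] for e in editors) / len(editors)
    after = sum(e["edits_week_after"] for e in editors) / len(editors)
    print(f"before: {before:.1f}  after: {after:.1f}  "
          f"change: {after - before:+.1f}")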


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] March 20 at 11:30 AM PST, 18:30 UTC

2019-03-20 Thread Janna Layton
Hello everyone,

I'm just sending a reminder that the below Showcase will be starting in
half an hour.

-Janna Layton

Hi all,

The next Research Showcase, “Learning How to Correct a Knowledge Base
from the Edit History” and “TableNet: An Approach for Determining
Fine-grained Relations for Wikipedia Tables” will be live-streamed
this Wednesday, March 20, 2019, at 11:30 AM PST/18:30 UTC (Please note
the change in time in UTC due to daylight saving changes in the U.S.).
The first presentation is about using edit history to automatically
correct constraint violations in Wikidata, and the second is about
interlinking Wikipedia tables.

YouTube stream: https://www.youtube.com/watch?v=6p62PMhkVNM

As usual, you can join the conversation on IRC at #wikimedia-research.
You can also watch our past research showcases at
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase .

This month's presentations:

Learning How to Correct a Knowledge Base from the Edit History

By Thomas Pellissier Tanon (Télécom ParisTech), Camille Bourgaux (DI
ENS, CNRS, ENS, PSL Univ. & Inria), Fabian Suchanek (Télécom
ParisTech), WWW'19.

The curation of Wikidata (and other knowledge bases) is crucial to
keep the data consistent, to fight vandalism and to correct good faith
mistakes. However, manual curation of the data is costly. In this
work, we propose to take advantage of the edit history of the
knowledge base in order to learn how to correct constraint violations
automatically. Our method is based on rule mining, and uses the edits
that solved violations in the past to infer how to solve similar
violations in the present. For example, our system is able to learn
that the value of the [[d:Property:P21|sex or gender]] property
[[d:Q467|woman]] should be replaced by [[d:Q6581072|female]]. We
provide [https://tools.wmflabs.org/wikidata-game/distributed/#game=43
a Wikidata game] that suggests our corrections to the users in order
to improve Wikidata. Both the evaluation of our method on past
corrections, and the Wikidata game statistics show significant
improvements over baselines.
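
A toy Python sketch of the rule-mining idea: look at past edits that resolved
a constraint violation, count how often each (property, old value) pair was
corrected to each new value, and keep the most frequent correction as a
candidate rule. The tuples below are invented stand-ins, and the actual
system mines much richer rules than simple value replacements.

    # Learn "replace X with Y" candidate rules from past violation-fixing
    # edits, keeping the most frequent correction per (property, old value).
    from collections import Counter, defaultdict

    past_fixes = [
        ("P21", "woman", "female"),
        ("P21", "woman", "female"),
        ("P21", "man", "male"),
        ("P21", "woman", "Q6581072"),
    ]

    counts = defaultdict(Counter)
    for prop, old, new in past_fixes:
        counts[(prop, old)][new] += 1

    rules = {key: c.most_common(1)[0][0] for key, c in counts.items()}
    print(rules[("P21", "woman")])  # most frequent correction: "female"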


TableNet: An Approach for Determining Fine-grained Relations for
Wikipedia Tables

By Besnik Fetahu

Wikipedia tables represent an important resource, where information is
organized w.r.t. table schemas consisting of columns. In turn, each
column may contain instance values that point to other Wikipedia
articles or primitive values (e.g. numbers, strings, etc.). In this
work, we focus on the problem of interlinking Wikipedia tables for two
types of table relations: equivalent and subPartOf. Through such
relations, we can further harness semantically related information by
accessing related tables or facts therein. Determining the relation
type of a table pair is not trivial, as it is dependent on the
schemas, the values therein, and the semantic overlap of the cell
values in the corresponding tables. We propose TableNet, an approach
that constructs a knowledge graph of interlinked tables with subPartOf
and equivalent relations. TableNet consists of two main steps: (i) for
any source table we provide an efficient algorithm to find all
candidate related tables with high coverage, and (ii) a neural-based
approach that takes into account the table schemas and the
corresponding table data to determine, with high accuracy, the table
relation for a table pair. We perform an extensive experimental
evaluation on the entire Wikipedia with more than 3.2 million tables.
We show that we retain more than 88% of relevant candidate table
pairs for alignment. Consequently, we are able to align tables with
subPartOf or equivalent relations with an accuracy of 90%.
Comparisons with existing competitors show that TableNet has superior
performance in terms of coverage and alignment accuracy.
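
The following Python sketch illustrates only the first, candidate-generation
step: propose table pairs whose column schemas overlap, using Jaccard
similarity of column names as a crude stand-in for the paper's high-coverage
algorithm. The tables and the 0.3 threshold are invented, and the neural
classifier that assigns equivalent or subPartOf relations is not shown.

    # Candidate generation only: keep table pairs whose column-name sets
    # overlap enough to be worth passing to a relation classifier.
    from itertools import combinations

    tables = {
        "T1": ["Country", "Population", "Capital"],
        "T2": ["Country", "Capital", "Area"],
        "T3": ["Player", "Team", "Goals"],
    }

    def jaccard(a, b):
        a, b = {c.lower() for c in a}, {c.lower() for c in b}
        return len(a & b) / len(a | b)

    candidates = [(s, t, jaccard(cs, ct))
                  for (s, cs), (t, ct) in combinations(tables.items(), 2)
                  if jaccard(cs, ct) >= 0.3]
    print(candidates)  # e.g. [('T1', 'T2', 0.5)]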

-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] February 20 at 11:30 AM PST, 19:30 UTC

2019-02-20 Thread Janna Layton
The Research Showcase will be starting in about 30 minutes! Info below:

On Thu, Feb 14, 2019 at 11:20 AM Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase, “The_Tower_of_Babel.jpg” and “A Warm Welcome,
> Not a Cold Start,” will be live-streamed next Wednesday, February 20, 2019,
> at 11:30 AM PST/19:30 UTC. The first presentation is about how images are
> used across language editions, and the second is about new editors.
>
>
> YouTube stream: https://www.youtube.com/watch?v=_jpJIFXwlEg
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> The_Tower_of_Babel.jpg: Diversity of Visual Encyclopedic Knowledge Across
> Wikipedia Language Editions
>
>
> By Shiqing He (presenting, University of Michigan), Brent Hecht
> (presenting, Northwestern University), Allen Yilun Lin (Northwestern
> University), Eytan Adar (University of Michigan), ICWSM'18.
>
>
> Across all Wikipedia language editions, millions of images augment text in
> critical ways. This visual encyclopedic knowledge is an important form of
> wikiwork for editors, a critical part of reader experience, an emerging
> resource for machine learning, and a lens into cultural differences.
> However, Wikipedia research--and cross-language edition Wikipedia research
> in particular--has thus far been limited to text. In this paper, we assess
> the diversity of visual encyclopedic knowledge across 25 language editions
> and compare our findings to those reported for textual content. Unlike
> text, translation in images is largely unnecessary. Additionally, the
> Wikimedia Foundation, through Wikimedia Commons, has taken steps to
> simplify cross-language image sharing. While we may expect that these
> factors would reduce image diversity, we find that cross-language image
> diversity rivals, and often exceeds, that found in text. We find that
> diversity varies between language pairs and content types, but that many
> images are unique to different language editions. Our findings have
> implications for readers (in what imagery they see), for editors (in
> deciding what images to use), for researchers (who study cultural
> variations), and for machine learning developers (who use Wikipedia for
> training models).
>
>
> A Warm Welcome, Not a Cold Start: Eliciting New Editors' Interests via
> Questionnaires
>
>
> By Ramtin Yazdanian (presenting, Ecole Polytechnique Federale de Lausanne)
>
> Every day, thousands of users sign up as new Wikipedia contributors. Once
> joined, these users have to decide which articles to contribute to, which
> users to reach out to and learn from or collaborate with, etc. Any such
> task is a hard and potentially frustrating one given the sheer size of
> Wikipedia. Supporting newcomers in their first steps by recommending
> articles they would enjoy editing or editors they would enjoy collaborating
> with is thus a promising route toward converting them into long-term
> contributors. Standard recommender systems, however, rely on users'
> histories of previous interactions with the platform. As such, these
> systems cannot make high-quality recommendations to newcomers without any
> previous interactions -- the so-called cold-start problem. Our aim is to
> address the cold-start problem on Wikipedia by developing a method for
> automatically building short questionnaires that, when completed by a newly
> registered Wikipedia user, can be used for a variety of purposes, including
> article recommendations that can help new editors get started. Our
> questionnaires are constructed based on the text of Wikipedia articles as
> well as the history of contributions by the already onboarded Wikipedia
> editors. We have assessed the quality of our questionnaire-based
> recommendations in an offline evaluation using historical data, as well as
> an online evaluation with hundreds of real Wikipedia newcomers, concluding
> that our method provides cohesive, human-readable questions that perform
> well against several baselines. By addressing the cold-start problem, this
> work can help with the sustainable growth and maintenance of Wikipedia's
> diverse editor community.
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] February 20 at 11:30 AM PST, 19:30 UTC

2019-02-19 Thread Janna Layton
This is just a reminder that the event below will be happening tomorrow,
February 20.

On Thu, Feb 14, 2019 at 11:20 AM Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase, “The_Tower_of_Babel.jpg” and “A Warm Welcome,
> Not a Cold Start,” will be live-streamed next Wednesday, February 20, 2019,
> at 11:30 AM PST/19:30 UTC. The first presentation is about how images are
> used across language editions, and the second is about new editors.
>
>
> YouTube stream: https://www.youtube.com/watch?v=_jpJIFXwlEg
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentations:
>
> The_Tower_of_Babel.jpg: Diversity of Visual Encyclopedic Knowledge Across
> Wikipedia Language Editions
>
>
> By Shiqing He (presenting, University of Michigan), Brent Hecht
> (presenting, Northwestern University), Allen Yilun Lin (Northwestern
> University), Eytan Adar (University of Michigan), ICWSM'18.
>
>
> Across all Wikipedia language editions, millions of images augment text in
> critical ways. This visual encyclopedic knowledge is an important form of
> wikiwork for editors, a critical part of reader experience, an emerging
> resource for machine learning, and a lens into cultural differences.
> However, Wikipedia research--and cross-language edition Wikipedia research
> in particular--has thus far been limited to text. In this paper, we assess
> the diversity of visual encyclopedic knowledge across 25 language editions
> and compare our findings to those reported for textual content. Unlike
> text, translation in images is largely unnecessary. Additionally, the
> Wikimedia Foundation, through Wikimedia Commons, has taken steps to
> simplify cross-language image sharing. While we may expect that these
> factors would reduce image diversity, we find that cross-language image
> diversity rivals, and often exceeds, that found in text. We find that
> diversity varies between language pairs and content types, but that many
> images are unique to different language editions. Our findings have
> implications for readers (in what imagery they see), for editors (in
> deciding what images to use), for researchers (who study cultural
> variations), and for machine learning developers (who use Wikipedia for
> training models).
>
>
> A Warm Welcome, Not a Cold Start: Eliciting New Editors' Interests via
> Questionnaires
>
>
> By Ramtin Yazdanian (presenting, Ecole Polytechnique Federale de Lausanne)
>
> Every day, thousands of users sign up as new Wikipedia contributors. Once
> joined, these users have to decide which articles to contribute to, which
> users to reach out to and learn from or collaborate with, etc. Any such
> task is a hard and potentially frustrating one given the sheer size of
> Wikipedia. Supporting newcomers in their first steps by recommending
> articles they would enjoy editing or editors they would enjoy collaborating
> with is thus a promising route toward converting them into long-term
> contributors. Standard recommender systems, however, rely on users'
> histories of previous interactions with the platform. As such, these
> systems cannot make high-quality recommendations to newcomers without any
> previous interactions -- the so-called cold-start problem. Our aim is to
> address the cold-start problem on Wikipedia by developing a method for
> automatically building short questionnaires that, when completed by a newly
> registered Wikipedia user, can be used for a variety of purposes, including
> article recommendations that can help new editors get started. Our
> questionnaires are constructed based on the text of Wikipedia articles as
> well as the history of contributions by the already onboarded Wikipedia
> editors. We have assessed the quality of our questionnaire-based
> recommendations in an offline evaluation using historical data, as well as
> an online evaluation with hundreds of real Wikipedia newcomers, concluding
> that our method provides cohesive, human-readable questions that perform
> well against several baselines. By addressing the cold-start problem, this
> work can help with the sustainable growth and maintenance of Wikipedia's
> diverse editor community.
>
>
> --
> Janna Layton (she, her)
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] February 20 at 11:30 AM PST, 19:30 UTC

2019-02-14 Thread Janna Layton
Hello everyone,

The next Research Showcase, “The_Tower_of_Babel.jpg” and “A Warm Welcome,
Not a Cold Start,” will be live-streamed next Wednesday, February 20, 2019,
at 11:30 AM PST/19:30 UTC. The first presentation is about how images are
used across language editions, and the second is about new editors.


YouTube stream: https://www.youtube.com/watch?v=_jpJIFXwlEg

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

The_Tower_of_Babel.jpg: Diversity of Visual Encyclopedic Knowledge Across
Wikipedia Language Editions


By Shiqing He (presenting, University of Michigan), Brent Hecht
(presenting, Northwestern University), Allen Yilun Lin (Northwestern
University), Eytan Adar (University of Michigan), ICWSM'18.


Across all Wikipedia language editions, millions of images augment text in
critical ways. This visual encyclopedic knowledge is an important form of
wikiwork for editors, a critical part of reader experience, an emerging
resource for machine learning, and a lens into cultural differences.
However, Wikipedia research--and cross-language edition Wikipedia research
in particular--has thus far been limited to text. In this paper, we assess
the diversity of visual encyclopedic knowledge across 25 language editions
and compare our findings to those reported for textual content. Unlike
text, translation in images is largely unnecessary. Additionally, the
Wikimedia Foundation, through Wikimedia Commons, has taken steps to
simplify cross-language image sharing. While we may expect that these
factors would reduce image diversity, we find that cross-language image
diversity rivals, and often exceeds, that found in text. We find that
diversity varies between language pairs and content types, but that many
images are unique to different language editions. Our findings have
implications for readers (in what imagery they see), for editors (in
deciding what images to use), for researchers (who study cultural
variations), and for machine learning developers (who use Wikipedia for
training models).
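
As a toy illustration of one way to quantify cross-language image diversity,
the Python sketch below computes, for a single concept, the share of images
that appear in only one language edition. The file names and editions are
invented, and the paper's diversity measures are more refined than this.

    # Share of images (for one concept) that are unique to a single edition.
    from collections import Counter

    images_by_edition = {
        "en": {"Tower_of_Babel.jpg", "Babel_map.png"},
        "de": {"Tower_of_Babel.jpg", "Bruegel_detail.jpg"},
        "ja": {"Babel_ukiyoe.jpg"},
    }

    usage = Counter(img for imgs in images_by_edition.values()
                    for img in imgs)
    unique_share = sum(1 for c in usage.values() if c == 1) / len(usage)
    print(f"unique-to-one-edition share: {unique_share:.0%}")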


A Warm Welcome, Not a Cold Start: Eliciting New Editors' Interests via
Questionnaires


By Ramtin Yazdanian (presenting, Ecole Polytechnique Federale de Lausanne)

Every day, thousands of users sign up as new Wikipedia contributors. Once
joined, these users have to decide which articles to contribute to, which
users to reach out to and learn from or collaborate with, etc. Any such
task is a hard and potentially frustrating one given the sheer size of
Wikipedia. Supporting newcomers in their first steps by recommending
articles they would enjoy editing or editors they would enjoy collaborating
with is thus a promising route toward converting them into long-term
contributors. Standard recommender systems, however, rely on users'
histories of previous interactions with the platform. As such, these
systems cannot make high-quality recommendations to newcomers without any
previous interactions -- the so-called cold-start problem. Our aim is to
address the cold-start problem on Wikipedia by developing a method for
automatically building short questionnaires that, when completed by a newly
registered Wikipedia user, can be used for a variety of purposes, including
article recommendations that can help new editors get started. Our
questionnaires are constructed based on the text of Wikipedia articles as
well as the history of contributions by the already onboarded Wikipedia
editors. We have assessed the quality of our questionnaire-based
recommendations in an offline evaluation using historical data, as well as
an online evaluation with hundreds of real Wikipedia newcomers, concluding
that our method provides cohesive, human-readable questions that perform
well against several baselines. By addressing the cold-start problem, this
work can help with the sustainable growth and maintenance of Wikipedia's
diverse editor community.
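
To give a rough flavor of the questionnaire idea, here is a small Python
sketch: derive topics from article text, phrase each topic's top terms as a
question, and recommend articles from the topic a newcomer picks. The topic
model, the toy articles, and the question wording are all assumptions made
for this example, not the authors' method.

    # Build toy questionnaire questions from article text via NMF topics,
    # then recommend articles from the topic the newcomer selects.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import NMF

    articles = {
        "Penicillin": "antibiotic bacteria infection medicine dose",
        "Jazz": "music saxophone improvisation swing band",
        "Vaccine": "immune medicine disease trial dose",
        "Blues": "music guitar rhythm band singer",
    }

    vec = TfidfVectorizer()
    X = vec.fit_transform(articles.values())
    nmf = NMF(n_components=2, random_state=0)
    doc_topics = nmf.fit_transform(X)
    terms = vec.get_feature_names_out()

    for k, weights in enumerate(nmf.components_):
        top = [terms[i] for i in weights.argsort()[-3:][::-1]]
        print(f"Q{k + 1}: Interested in articles about {', '.join(top)}?")

    picked = 0                              # the topic the newcomer chose
    titles = list(articles)
    order = doc_topics[:, picked].argsort()[::-1]
    print("Recommended:", [titles[i] for i in order[:2]])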


-- 
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday, January 16 at 11:30 AM PST, 19:30 UTC

2019-01-16 Thread Janna Layton
Reminder that this event is starting in about 30 minutes!

YouTube stream: https://www.youtube.com/watch?v=Fc51jE_KNTc

As usual, you can join the conversation on IRC at #wikimedia-research.

This month's presentation:

*Understanding participation in Wikipedia: Studies on the relationship
between new editors’ motivations and activity*
By Martina Balestra, New York University

Peer production communities like Wikipedia often struggle to retain
contributors beyond their initial engagement. Theory suggests this may be
related to their levels of motivation, though prior studies either center
on contributors’ activity or use cross-sectional survey methods, and
overlook accompanying changes in motivation. In this talk, I will present a
series of studies aimed at filling this gap. We begin by looking at how
Wikipedia editors’ early motivations influence the activities that they
come to engage in, and how these motivations change over the first three
months of participation in Wikipedia. We then look at the relationship
between editing activity and intrinsic motivation specifically over time.
We find that new editors’ early motivations are predictive of their future
activity, but that these motivations tend to change with time. Moreover,
newcomers’ intrinsic motivation is reinforced by the amount of activity
they engage in over time: editors who had a high level of intrinsic
motivation entered a virtuous cycle where the more they edited the more
motivated they became, whereas those who initially had low intrinsic
motivation entered a vicious cycle. Our findings shed new light on the
importance of early experiences and reveal that the relationship between
motivation and activity is more complex than previously understood.

*Geography and knowledge. Reviving an old relationship with Wiki Atlas*
By Anastasios Noulas, New York University

Wiki Atlas <https://www.wiki-atlas.org/> is an interactive cartography
tool. The tool renders Wikipedia content in a 3-dimensional, web-based
cartographic environment. The map acts as a medium that enables the
discovery and exploration of articles in a manner that explicitly
associates geography and information. In its current prototype form, a
Wikipedia article is represented on the map as a 3D element whose height
property is proportional to the number of views the article has on the
website. This property enables the discovery of relevant content in a
manner that reflects the significance of the target element, as measured by
the collective attention of the site’s audience.
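
As a concrete illustration of the "height proportional to pageviews" rendering
described above, the short sketch below scales invented monthly view counts
into 3D bar heights; the counts, the maximum height, and the function names are
assumptions for the example and do not come from Wiki Atlas itself.

```python
# Illustrative only: mapping monthly pageview counts to 3D bar heights,
# as the Wiki Atlas prototype is described as doing. The counts and the
# scaling constant here are made up for the example.
pageviews = {
    "Statue of Liberty": 412_000,
    "Central Park": 188_000,
    "Flatiron Building": 54_000,
}

MAX_HEIGHT_METERS = 500.0          # tallest bar rendered on the map
peak = max(pageviews.values())

def bar_height(views: int) -> float:
    """Height proportional to pageviews, scaled to the tallest bar."""
    return MAX_HEIGHT_METERS * views / peak

for title, views in pageviews.items():
    print(f"{title:20s} {views:>8,} views -> {bar_height(views):6.1f} m")
```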

On Thu, Jan 10, 2019 at 10:49 AM Janna Layton  wrote:

> Hello, everyone,
>
> The next Research Showcase, *Understanding participation in Wikipedia*,
> will be live-streamed next Wednesday, January 16, at 11:30 AM PST/19:30
> UTC. This presentation is about new editors.
>
> YouTube stream: https://www.youtube.com/watch?v=Fc51jE_KNTc
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> *Understanding participation in Wikipedia: Studies on the relationship
> between new editors’ motivations and activity*
>
> By Martina Balestra, New York University
>
> Peer production communities like Wikipedia often struggle to retain
> contributors beyond their initial engagement. Theory suggests this may be
> related to their levels of motivation, though prior studies either center
> on contributors’ activity or use cross-sectional survey methods, and
> overlook accompanied changes in motivation. In this talk, I will present a
> series of studies aimed at filling this gap. We begin by looking at how
> Wikipedia editors’ early motivations influence the activities that they
> come to engage in, and how these motivations change over the first three
> months of participation in Wikipedia. We then look at the relationship
> between editing activity and intrinsic motivation specifically over time.
> We find that new editors’ early motivations are predictive of their future
> activity, but that these motivations tend to change with time. Moreover,
> newcomers’ intrinsic motivation is reinforced by the amount of activity
> they engage in over time: editors who had a high level of intrinsic
> motivation entered a virtuous cycle where the more they edited the more
> motivated they became, whereas those who initially had low intrinsic
> motivation entered a vicious cycle. Our findings shed new light on the
> importance of early experiences and reveal that the relationship between
> motivation and activity is more complex than previously understood.
>
> --
> Janna Layton
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday, January 16 at 11:30 AM PST, 19:30 UTC

2019-01-14 Thread Janna Layton
Hello all,

Just a reminder about our Research Showcase this Wednesday, and an update
that we now have a 2nd presenter:

*Geography and knowledge. Reviving an old relationship with Wiki Atlas*
By *Anastasios Noulas, New York University*

Wiki Atlas <https://www.wiki-atlas.org/> is an
interactive cartography tool. The tool renders Wikipedia content in a
3-dimensional, web-based cartographic environment. The map acts as a medium
that enables the discovery and exploration of articles in a manner that
explicitly associates geography and information. In its current prototype
form, a Wikipedia article is represented on the map as a 3D element whose
height property is proportional to the number of views the article has on
the website. This property enables the discovery of relevant content in a
manner that reflects the significance of the target element, as measured by
the collective attention of the site’s audience.

On Thu, Jan 10, 2019 at 10:49 AM Janna Layton  wrote:

> Hello, everyone,
>
> The next Research Showcase, *Understanding participation in Wikipedia*,
> will be live-streamed next Wednesday, January 16, at 11:30 AM PST/19:30
> UTC. This presentation is about new editors.
>
> YouTube stream: https://www.youtube.com/watch?v=Fc51jE_KNTc
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> *Understanding participation in Wikipedia: Studies on the relationship
> between new editors’ motivations and activity*
>
> By Martina Balestra, New York University
>
> Peer production communities like Wikipedia often struggle to retain
> contributors beyond their initial engagement. Theory suggests this may be
> related to their levels of motivation, though prior studies either center
> on contributors’ activity or use cross-sectional survey methods, and
> overlook accompanied changes in motivation. In this talk, I will present a
> series of studies aimed at filling this gap. We begin by looking at how
> Wikipedia editors’ early motivations influence the activities that they
> come to engage in, and how these motivations change over the first three
> months of participation in Wikipedia. We then look at the relationship
> between editing activity and intrinsic motivation specifically over time.
> We find that new editors’ early motivations are predictive of their future
> activity, but that these motivations tend to change with time. Moreover,
> newcomers’ intrinsic motivation is reinforced by the amount of activity
> they engage in over time: editors who had a high level of intrinsic
> motivation entered a virtuous cycle where the more they edited the more
> motivated they became, whereas those who initially had low intrinsic
> motivation entered a vicious cycle. Our findings shed new light on the
> importance of early experiences and reveal that the relationship between
> motivation and activity is more complex than previously understood.
>
> --
> Janna Layton
> Administrative Assistant - Audiences & Technology
> Wikimedia Foundation <https://wikimediafoundation.org/>
>


-- 
Janna Layton
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] Wednesday, January 16 at 11:30 AM PST, 19:30 UTC

2019-01-10 Thread Janna Layton
Hello, everyone,

The next Research Showcase, *Understanding participation in Wikipedia*,
will be live-streamed next Wednesday, January 16, at 11:30 AM PST/19:30
UTC. This presentation is about new editors.

YouTube stream: https://www.youtube.com/watch?v=Fc51jE_KNTc

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentation:

*Understanding participation in Wikipedia: Studies on the relationship
between new editors’ motivations and activity*

By Martina Balestra, New York University

Peer production communities like Wikipedia often struggle to retain
contributors beyond their initial engagement. Theory suggests this may be
related to their levels of motivation, though prior studies either center
on contributors’ activity or use cross-sectional survey methods, and
overlook accompanying changes in motivation. In this talk, I will present a
series of studies aimed at filling this gap. We begin by looking at how
Wikipedia editors’ early motivations influence the activities that they
come to engage in, and how these motivations change over the first three
months of participation in Wikipedia. We then look at the relationship
between editing activity and intrinsic motivation specifically over time.
We find that new editors’ early motivations are predictive of their future
activity, but that these motivations tend to change with time. Moreover,
newcomers’ intrinsic motivation is reinforced by the amount of activity
they engage in over time: editors who had a high level of intrinsic
motivation entered a virtuous cycle where the more they edited the more
motivated they became, whereas those who initially had low intrinsic
motivation entered a vicious cycle. Our findings shed new light on the
importance of early experiences and reveal that the relationship between
motivation and activity is more complex than previously understood.

-- 
Janna Layton
Administrative Assistant - Audiences & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday, December 12 at 11:30 AM PST, 19:30 UTC

2018-12-12 Thread Janna Layton
This event will start in 30 minutes.

On Thu, Dec 6, 2018 at 1:43 PM Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase, *Why the World Reads Wikipedia*, will be
> live-streamed this Wednesday, December 12, 2018, at 11:30 AM PST/19:30 UTC.
> This presentation is about Wikipedia usage across languages.
>
> YouTube stream: https://www.youtube.com/watch?v=RKMFvi_CCB0
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> *Why the World Reads Wikipedia*
>
> By Florian Lemmerich, RWTH Aachen University; Diego Sáez-Trumper,
> Wikimedia Foundation; Robert West, EPFL; and Leila Zia, Wikimedia Foundation
>
> So far, little is known about why users across the world read Wikipedia's
> various language editions. To bridge this gap, we conducted a comparative
> study by combining a large-scale survey of Wikipedia readers across 14
> language editions with a log-based analysis of user activity. For analysis,
> we proceeded in three steps: First, we analyzed the survey results to
> compare the prevalence of Wikipedia use cases across languages, discovering
> commonalities, but also substantial differences, among Wikipedia languages
> with respect to their usage. Second, we matched survey responses to the
> respondents' traces in Wikipedia's server logs to characterize behavioral
> patterns associated with specific use cases, finding that distinctive
> patterns consistently mark certain use cases across language editions.
> Third, we could show that certain Wikipedia use cases are more common in
> countries with certain socio-economic characteristics; e.g., in-depth
> reading of Wikipedia articles is substantially more common in countries
> with a low Human Development Index. The outcomes of this study provide a
> deeper understanding of Wikipedia readership in a wide range of languages,
> which is important for Wikipedia editors, developers, and the reusers of
> Wikipedia content.
>
> --
> Janna Layton
> Administrative Assistant - Audiences & Technology
>
> Wikimedia Foundation
> 1 Montgomery St. Suite 1600
> San Francisco, CA 94104
>


-- 
Janna Layton
Administrative Assistant - Audiences & Technology

Wikimedia Foundation
1 Montgomery St. Suite 1600
San Francisco, CA 94104
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday, December 12 at 11:30 AM PST, 19:30 UTC

2018-12-10 Thread Janna Layton
Hello again, everyone. Just a reminder that *Why the World Reads Wikipedia*,
this month's Research Showcase, will be livestreaming this Wednesday. More
info below.
On Thu, Dec 6, 2018 at 1:43 PM Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase, *Why the World Reads Wikipedia*, will be
> live-streamed this Wednesday, December 12, 2018, at 11:30 AM PST/19:30 UTC.
> This presentation is about Wikipedia usage across languages.
>
> YouTube stream: https://www.youtube.com/watch?v=RKMFvi_CCB0
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> *Why the World Reads Wikipedia*
>
> By Florian Lemmerich, RWTH Aachen University; Diego Sáez-Trumper,
> Wikimedia Foundation; Robert West, EPFL; and Leila Zia, Wikimedia Foundation
>
> So far, little is known about why users across the world read Wikipedia's
> various language editions. To bridge this gap, we conducted a comparative
> study by combining a large-scale survey of Wikipedia readers across 14
> language editions with a log-based analysis of user activity. For analysis,
> we proceeded in three steps: First, we analyzed the survey results to
> compare the prevalence of Wikipedia use cases across languages, discovering
> commonalities, but also substantial differences, among Wikipedia languages
> with respect to their usage. Second, we matched survey responses to the
> respondents' traces in Wikipedia's server logs to characterize behavioral
> patterns associated with specific use cases, finding that distinctive
> patterns consistently mark certain use cases across language editions.
> Third, we could show that certain Wikipedia use cases are more common in
> countries with certain socio-economic characteristics; e.g., in-depth
> reading of Wikipedia articles is substantially more common in countries
> with a low Human Development Index. The outcomes of this study provide a
> deeper understanding of Wikipedia readership in a wide range of languages,
> which is important for Wikipedia editors, developers, and the reusers of
> Wikipedia content.
>
> --
> Janna Layton
> Administrative Assistant - Audiences & Technology
>
> Wikimedia Foundation
> 1 Montgomery St. Suite 1600
> San Francisco, CA 94104
>


-- 
Janna Layton
Administrative Assistant - Audiences & Technology

Wikimedia Foundation
1 Montgomery St. Suite 1600
San Francisco, CA 94104
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday October 17, 2018 at 11:30 AM (PST) 18:30 UTC

2018-10-17 Thread Janna Layton
Just a reminder that this event will be starting in about 30 minutes.

YouTube stream: https://www.youtube.com/watch?v=UJrJLWuNvXo

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentation:

*"Welcome" Changes? Descriptive and Injunctive Norms in a Wikipedia
Sub-Community*

*By Jonathan T. Morgan, Wikimedia Foundation and Anna Filippova, GitHub*

Open online communities rely on social norms for behavior regulation, group
cohesion, and sustainability. Research on the role of social norms online
has mainly focused on one source of influence at a time, making it
difficult to separate different normative influences and understand their
interactions. In this study, we use the Focus Theory to examine
interactions between several sources of normative influence in a Wikipedia
sub-community: local descriptive norms, local injunctive norms, and norms
imported from similar sub-communities. We find that exposure to injunctive
norms has a stronger effect than descriptive norms, that the likelihood of
performing a behavior is higher when both injunctive and descriptive norms
are congruent, and that conflicting social norms may negatively impact
pro-normative behavior. We contextualize these findings through member
interviews, and discuss their implications for both future research on
normative influence in online groups and the design of systems that support
open collaboration.


*The pipeline of online participation inequalities: The case of Wikipedia
Editing*

*By Aaron Shaw, Northwestern University and Eszter Hargittai, University of
Zurich*

Participatory platforms like the Wikimedia projects have unique potential
to facilitate more equitable knowledge production. However, digital
inequalities such as the Wikipedia gender gap undermine this democratizing
potential. In this talk, I present new research in which Eszter Hargittai
and I conceptualize a "pipeline" of online participation and model distinct
levels of awareness and behaviors necessary to become a contributor to the
participatory web. We test the theory in the case of Wikipedia editing,
using new survey data from a diverse, national sample of adult internet
users in the U.S.

The results show that Wikipedia participation consistently reflects
inequalities of education and internet experiences and skills. We find that
the gender gap only emerges later in the pipeline whereas gaps along racial
and socioeconomic lines explain variations earlier in the pipeline. Our
findings underscore the multidimensionality of digital inequalities and
suggest new pathways toward closing knowledge gaps by highlighting the
importance of education and Internet skills.

We conclude that future research and interventions to overcome digital
participation gaps should not focus exclusively on gender or class
differences in content creation, but expand to address multiple aspects of
digital inequality across pipelines of participation. In particular, when
it comes to overcoming gender gaps in the case of Wikipedia, our results
suggest that continued emphasis on recruiting female editors should include
efforts to disseminate the knowledge that Wikipedia can be edited. Our
findings support broader efforts to overcome knowledge- and skill-based
barriers to entry among potential contributors to the open web.
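
For readers curious how a "pipeline" of participation can be modeled, the
sketch below fits a separate logistic regression at each stage, restricted to
respondents who cleared the previous stage, so that a gap (for instance by
gender) can show up late in the pipeline while education and skill gaps show up
early. The survey data, stage definitions, and coefficients are synthetic and
invented for illustration; they are not the authors' data or results.

```python
# Sketch of stage-by-stage "pipeline" modeling on synthetic survey data.
# Each stage is modeled only among respondents who reached the prior stage.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000

# Invented respondent attributes (standardized where continuous).
gender_female = rng.integers(0, 2, n)
education = rng.normal(0, 1, n)
internet_skill = rng.normal(0, 1, n)

def simulate_stage(intercept, b_female, b_edu, b_skill, at_risk):
    """Bernoulli outcome for respondents still in the pipeline."""
    logit = (intercept + b_female * gender_female
             + b_edu * education + b_skill * internet_skill)
    p = 1 / (1 + np.exp(-logit))
    return (rng.random(n) < p) & at_risk

everyone = np.ones(n, dtype=bool)
# Early stage driven by education and skills; later stage adds a gender gap.
knows_editable = simulate_stage(0.2, 0.0, 0.6, 0.8, everyone)
has_edited = simulate_stage(-1.0, -0.7, 0.3, 0.4, knows_editable)

predictors = np.column_stack([gender_female, education, internet_skill])
for name, outcome, at_risk in [
    ("knows Wikipedia can be edited", knows_editable, everyone),
    ("has actually edited", has_edited, knows_editable),
]:
    X = sm.add_constant(predictors[at_risk])
    fit = sm.Logit(outcome[at_risk].astype(int), X).fit(disp=0)
    print(f"\nStage: {name} (n = {at_risk.sum()})")
    print("  coefficients [const, female, education, skill]:",
          np.round(fit.params, 2))
```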


On Mon, Oct 15, 2018 at 11:35 AM, Janna Layton 
wrote:

> Hello all,
>
> Just a reminder that the Research Showcase (info below) will be this
> Wednesday.
>
> On Thu, Oct 11, 2018 at 3:04 PM, Janna Layton 
> wrote:
>
>> Hello everyone,
>>
>> The next Research Showcase will be live-streamed this Wednesday, October
>> 17, 2018 at 11:30 AM (PST) 18:30 UTC.
>>
>> YouTube stream: https://www.youtube.com/watch?v=UJrJLWuNvXo
>>
>> As usual, you can join the conversation on IRC at #wikimedia-research.
>> You can also watch our past research showcases here:
>> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>>
>> This month's presentation:
>>
>> *"Welcome" Changes? Descriptive and Injunctive Norms in a Wikipedia
>> Sub-Community*
>>
>> *By Jonathan T. Morgan, Wikimedia Foundation and Anna Filippova, GitHub*
>>
>> Open online communities rely on social norms for behavior regulation,
>> group cohesion, and sustainability. Research on the role of social norms
>> online has mainly focused on one source of influence at a time, making it
>> difficult to separate different normative influences and understand their
>> interactions. In this study, we use the Focus Theory to examine
>> interactions between several sources of normative influence in a Wikipedia
>> sub-community: local descriptive norms, local injunctive norms, and nor

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday October 17, 2018 at 11:30 AM (PST) 18:30 UTC

2018-10-15 Thread Janna Layton
Hello all,

Just a reminder that the Research Showcase (info below) will be this
Wednesday.

On Thu, Oct 11, 2018 at 3:04 PM, Janna Layton  wrote:

> Hello everyone,
>
> The next Research Showcase will be live-streamed this Wednesday, October
> 17, 2018 at 11:30 AM (PST) 18:30 UTC.
>
> YouTube stream: https://www.youtube.com/watch?v=UJrJLWuNvXo
>
> As usual, you can join the conversation on IRC at #wikimedia-research. You
> can also watch our past research showcases here:
> https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
>
> This month's presentation:
>
> *"Welcome" Changes? Descriptive and Injunctive Norms in a Wikipedia
> Sub-Community*
>
> *By Jonathan T. Morgan, Wikimedia Foundation and Anna Filippova, GitHub*
>
> Open online communities rely on social norms for behavior regulation,
> group cohesion, and sustainability. Research on the role of social norms
> online has mainly focused on one source of influence at a time, making it
> difficult to separate different normative influences and understand their
> interactions. In this study, we use the Focus Theory to examine
> interactions between several sources of normative influence in a Wikipedia
> sub-community: local descriptive norms, local injunctive norms, and norms
> imported from similar sub-communities. We find that exposure to injunctive
> norms has a stronger effect than descriptive norms, that the likelihood of
> performing a behavior is higher when both injunctive and descriptive norms
> are congruent, and that conflicting social norms may negatively impact
> pro-normative behavior. We contextualize these findings through member
> interviews, and discuss their implications for both future research on
> normative influence in online groups and the design of systems that support
> open collaboration.
>
>
> *The pipeline of online participation inequalities: The case of Wikipedia
> Editing*
>
> *By Aaron Shaw, Northwestern University and Eszter Hargittai, University
> of Zurich*
>
> Participatory platforms like the Wikimedia projects have unique potential
> to facilitate more equitable knowledge production. However, digital
> inequalities such as the Wikipedia gender gap undermine this democratizing
> potential. In this talk, I present new research in which Eszter Hargittai
> and I conceptualize a "pipeline" of online participation and model distinct
> levels of awareness and behaviors necessary to become a contributor to the
> participatory web. We test the theory in the case of Wikipedia editing,
> using new survey data from a diverse, national sample of adult internet
> users in the U.S.
>
> The results show that Wikipedia participation consistently reflects
> inequalities of education and internet experiences and skills. We find that
> the gender gap only emerges later in the pipeline whereas gaps along racial
> and socioeconomic lines explain variations earlier in the pipeline. Our
> findings underscore the multidimensionality of digital inequalities and
> suggest new pathways toward closing knowledge gaps by highlighting the
> importance of education and Internet skills.
>
> We conclude that future research and interventions to overcome digital
> participation gaps should not focus exclusively on gender or class
> differences in content creation, but expand to address multiple aspects of
> digital inequality across pipelines of participation. In particular, when
> it comes to overcoming gender gaps in the case of Wikipedia, our results
> suggest that continued emphasis on recruiting female editors should include
> efforts to disseminate the knowledge that Wikipedia can be edited. Our
> findings support broader efforts to overcome knowledge- and skill-based
> barriers to entry among potential contributors to the open web.
>
>
> --
> Janna Layton
> Administrative Assistant - Audiences & Technology
>
> Wikimedia Foundation
> 1 Montgomery St. Suite 1600
> San Francisco, CA 94104
>



-- 
Janna Layton
Administrative Assistant - Audiences & Technology

Wikimedia Foundation
1 Montgomery St. Suite 1600
San Francisco, CA 94104
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

[Wikimedia-l] [Wikimedia Research Showcase] Wednesday October 17, 2018 at 11:30 AM (PST) 18:30 UTC

2018-10-11 Thread Janna Layton
Hello everyone,

The next Research Showcase will be live-streamed this Wednesday, October
17, 2018 at 11:30 AM (PST) 18:30 UTC.

YouTube stream: https://www.youtube.com/watch?v=UJrJLWuNvXo

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentation:

*"Welcome" Changes? Descriptive and Injunctive Norms in a Wikipedia
Sub-Community*

*By Jonathan T. Morgan, Wikimedia Foundation and Anna Filippova, GitHub*

Open online communities rely on social norms for behavior regulation, group
cohesion, and sustainability. Research on the role of social norms online
has mainly focused on one source of influence at a time, making it
difficult to separate different normative influences and understand their
interactions. In this study, we use the Focus Theory to examine
interactions between several sources of normative influence in a Wikipedia
sub-community: local descriptive norms, local injunctive norms, and norms
imported from similar sub-communities. We find that exposure to injunctive
norms has a stronger effect than descriptive norms, that the likelihood of
performing a behavior is higher when both injunctive and descriptive norms
are congruent, and that conflicting social norms may negatively impact
pro-normative behavior. We contextualize these findings through member
interviews, and discuss their implications for both future research on
normative influence in online groups and the design of systems that support
open collaboration.


*The pipeline of online participation inequalities: The case of Wikipedia
Editing*

*By Aaron Shaw, Northwestern University and Eszter Hargittai, University of
Zurich*

Participatory platforms like the Wikimedia projects have unique potential
to facilitate more equitable knowledge production. However, digital
inequalities such as the Wikipedia gender gap undermine this democratizing
potential. In this talk, I present new research in which Eszter Hargittai
and I conceptualize a "pipeline" of online participation and model distinct
levels of awareness and behaviors necessary to become a contributor to the
participatory web. We test the theory in the case of Wikipedia editing,
using new survey data from a diverse, national sample of adult internet
users in the U.S.

The results show that Wikipedia participation consistently reflects
inequalities of education and internet experiences and skills. We find that
the gender gap only emerges later in the pipeline whereas gaps along racial
and socioeconomic lines explain variations earlier in the pipeline. Our
findings underscore the multidimensionality of digital inequalities and
suggest new pathways toward closing knowledge gaps by highlighting the
importance of education and Internet skills.

We conclude that future research and interventions to overcome digital
participation gaps should not focus exclusively on gender or class
differences in content creation, but expand to address multiple aspects of
digital inequality across pipelines of participation. In particular, when
it comes to overcoming gender gaps in the case of Wikipedia, our results
suggest that continued emphasis on recruiting female editors should include
efforts to disseminate the knowledge that Wikipedia can be edited. Our
findings support broader efforts to overcome knowledge- and skill-based
barriers to entry among potential contributors to the open web.


-- 
Janna Layton
Administrative Assistant - Audiences & Technology

Wikimedia Foundation
1 Montgomery St. Suite 1600
San Francisco, CA 94104
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

Re: [Wikimedia-l] [Wikimedia Research Showcase] Wednesday September 19, 2018 at 11:30 AM (PDT) 18:30 UTC

2018-09-19 Thread Janna Layton
Hello everyone,

Just a reminder that the Wikimedia Research Showcase will be happening in
about 30 minutes.

Thank you!

On Thu, Sep 13, 2018 at 1:43 PM, Sarah R  wrote:

> Hi Everyone,
>
> The next Wikimedia Research Showcase will be live-streamed Wednesday,
> September 19 2018 at 11:30 AM (PDT) 18:30 UTC.
>
> YouTube stream: https://www.youtube.com/watch?v=OY8vZ6wES9o
>
> As usual, you can join the conversation on IRC at #wikimedia-research.
> And, you can watch our past research showcases here.
> <https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#Upcoming_Showcase>
>
> Hope to see you there!
>
> This month's presentations are:
>
> *The impact of news exposure on collective attention in the United States
> during the 2016 Zika epidemic*
>
> By *Michele Tizzoni, André Panisson, Daniela Paolotti, Ciro Cattuto*
>
> In recent years, many studies have drawn attention
> to the important role of collective awareness and human behaviour during
> epidemic outbreaks. A number of modelling efforts have investigated the
> interaction between the disease transmission dynamics and human behaviour
> change mediated by news coverage and by information spreading in the
> population. Yet, given the scarcity of data on public awareness during an
> epidemic, few studies have relied on empirical data. Here, we use
> fine-grained, geo-referenced data from three online sources - Wikipedia,
> the GDELT Project and the Internet Archive - to quantify population-scale
> information seeking about the 2016 Zika virus epidemic in the U.S.,
> explicitly linking such behavioural signal to epidemiological data.
> Geo-localized Wikipedia pageview data reveal that visiting patterns of
> Zika-related pages in Wikipedia were highly synchronized across the United
> States and largely explained by exposure to national television broadcast.
> Contrary to the assumption of some theoretical models, news volume and
> Wikipedia visiting patterns were not significantly correlated with the
> magnitude or the extent of the epidemic. Attention to Zika, in terms of
> Zika-related Wikipedia pageviews, was high at the beginning of the
> outbreak, when public health agencies raised an international alert and
> triggered media coverage, but subsequently exhibited an activity profile
> that suggests nonlinear dependencies and memory effects in the relationship
> between information seeking, media pressure, and disease dynamics. This
> calls for a new and more general modelling framework to describe the
> interaction between media exposure, public awareness, and disease dynamics
> during epidemic outbreaks.
>
>
> *Deliberation and resolution on Wikipedia: A case study of requests for
> comments*
>
> By *Amy Zhang, Jane Im*
>
> Resolving disputes in a timely manner is
> crucial for any online production group. We present an analysis of Requests
> for Comments (RfCs), one of the main vehicles on Wikipedia for formally
> resolving a policy or content dispute. We collected an exhaustive dataset
> of 7,316 RfCs on English Wikipedia over the course of 7 years and conducted
> a qualitative and quantitative analysis into what issues affect the RfC
> process. Our analysis was informed by 10 interviews with frequent RfC
> closers. We found that a major issue affecting the RfC process is the
> prevalence of RfCs that could have benefited from formal closure but that
> linger indefinitely without one, with factors including participants'
> interest and expertise impacting the likelihood of resolution. From these
> findings, we developed a model that predicts whether an RfC will be
> resolved with a formal closure.
>
> --
> Sarah R. Rodlund
> Technical Writer, Developer Advocacy
> <https://meta.wikimedia.org/wiki/Developer_Advocacy>
> srodl...@wikimedia.org
>
>
>
>
>


-- 
Janna Layton
Administrative Assistant - Audiences & Technology
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>