A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.
This paper [1] is thoroughly structured and combines the theory of web genres with dialogue theory to examine Wikipedia talk pages. Since Wikipedia is a web genre, "Wikicussions" (as the authors call them) form a subgenre. Within this framework, the paper examines talk pages further, including the quality of cooperation between Wikipedia users, which can be linked to social differentiation in the roles and statuses of Wikipedians (content- vs. administration-oriented users). These group-related processes can be seen as a mediating layer between external parameters (system requirements for Wikipedia's user community) and the structure and dynamics of Wikipedia's subgenres.
The authors argue that, unlike face-to-face dialogue, Wikicussions stand out due to a publicly available common ground (a concept derived from dialogue theory), which may explain the structures they found.
The paper is enriched with a number of high-quality figures that support and underpin the findings.
Our intuition might tell us that government censorship causes reduced access to online information. But recent research indicates that the effect can be exactly the opposite. Using data gathered from Wikipedia page views and other sources, researchers William Hobbs and Margaret Roberts found that:
“[...] citizens accustomed to acquiring this [forbidden] information will be incentivized to learn methods of censorship evasion [...] millions of Chinese users acquire[d] virtual private networks, and subsequently [...] began browsing blocked political pages on Wikipedia, following Chinese political activists on Twitter, and discussing highly politicized topics such as opposition protests in Hong Kong.” [2]: 1
Specifically, the authors studied the impact of a block of Instagram in China on September 29, 2014, following protests in Hong Kong, on Chinese Wikipedia pages that were already blocked in the country. (This predates the 2015 total block of the Chinese Wikipedia and the switch of all Wikimedia sites to full encryption with HTTPS around the same time, which made such per-page blocking impossible.) An examination of the censored Chinese Wikipedia pages with the largest increases in views "shows that new viewers accessed pages that had long been censored including those related to the 1989 Tiananmen Square protests", [2]: 12 i.e. "viewing patterns that would be more typical of new users who had just jumped the firewall, rather than of old VPN users who had presumably consumed this information long ago." [2]: 11 Here is an excerpt of the full list examined in the research, the top 10 for the second day of the block, linked here to their English Wikipedia equivalents:
The researchers propose to name this phenomenon the "gateway effect", a "mechanism through which repression can backfire inadvertently, without political or strategic motivation", [2]: 3 because it incentivizes people to learn how to evade censorship and thus "have more, not less, access to information and begin engaging in conversations, social media sites, and networks that have long been off-limits to them." [2]: 15 They distinguish it from the Streisand effect, where individuals specifically seek out information that is being hidden.
The second author of the study, Margaret Roberts, is also the author of Censored: Distraction and Diversion Inside China's Great Firewall (Princeton University Press, 2018; print ISBN 978-0-691-17886-8, e-book 978-1-400-89005-7).
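Readers who want to explore similar pageview data can use the Wikimedia REST API's per-article pageviews endpoint. Note that this API's coverage only begins in mid-2015, so the study's 2014 data necessarily came from other sources; the sketch below merely illustrates the request format, and the article title and date range are illustrative placeholders:

```python
from urllib.parse import quote

def pageviews_url(project, title, start, end):
    """Build a Wikimedia REST API URL for daily per-article pageviews.

    project: e.g. "zh.wikipedia.org"; start/end: dates as YYYYMMDD strings.
    """
    base = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"
    # Percent-encode the title so non-ASCII page names survive in the URL path
    return (f"{base}/{project}/all-access/user/"
            f"{quote(title, safe='')}/daily/{start}/{end}")

# Illustrative query: views of a Chinese Wikipedia article over a date range
url = pageviews_url("zh.wikipedia.org", "六四事件", "20150901", "20150910")
```

Fetching the resulting URL (e.g. with `urllib.request.urlopen`) returns JSON with one entry per day.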
This study was able to "characterize" the interests of Wikipedia editors and their social media activity on Twitter, in order to facilitate:
“[...] building rich user profiles, which can be conveniently used in order to provide personalized contents and offers” and “[...] profiling, i.e., the detection of the user's core interests and, therefore, allows for product and service recommendations far more tailored than those stemming from other (usually) extemporary actions on the Internet, like flight ticket purchases and hotel reservations. In this light, it is important to notice that such a profiling potential associated to social login remains nowadays largely unused and enabling its exploitation is one of the main goals of the present work.” [3]
See the community-curated research events page on Meta-wiki for other upcoming conferences and events, including submission deadlines.
Recent presentations at the monthly Research showcase hosted by the Wikimedia Foundation included the following:
Antisocial behavior in online social systems includes harassment and personal attacks. A new paper [4] by seven researchers from Cornell University, Jigsaw, and the Wikimedia Foundation examines whether such undesirable negative exchanges can be predicted from the very start of a conversation, which could help prevent a discussion from deteriorating. One of the authors also gave an interview published on the Wikimedia Foundation's blog, [supp 1] and the paper was covered in popular media; see In the media § In brief.
From the announcement (by Aaron Halfaker):
“ORES is an open, transparent, and auditable machine prediction platform for Wikipedians to help them do their work. It's currently used in 33 different Wikimedia projects to measure the quality of content, detect vandalism, recommend changes to articles, and to identify good faith newcomers. The primary way that Wikipedians use ORES' predictions is through the tools developed by volunteers. These JavaScript gadgets, MediaWiki extensions, and web-based tools make up a complex ecosystem of Wikipedian processes – encoded into software.”
The presentation covered "three key tools that Wikipedians have developed that make use of ORES": Wikidata's damage detection models, exposed through Recent Changes; Spanish Wikipedia's PatruBOT; and WikiEdu tools from User:Ragesoss that incorporate article quality models.
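As a rough illustration of how such tools consume ORES predictions, the public ORES service exposes a scoring endpoint that returns per-model probabilities for a revision. The sketch below is not taken from the presentation; the revision IDs and the decision threshold are illustrative assumptions:

```python
def ores_scores_url(context, revids, models=("damaging", "goodfaith")):
    """Build an ORES v3 scores URL for one or more revision IDs.

    context: a wiki identifier such as "eswiki" or "wikidatawiki".
    """
    base = "https://ores.wikimedia.org/v3/scores"
    return (f"{base}/{context}/?models={'|'.join(models)}"
            f"&revids={'|'.join(str(r) for r in revids)}")

def is_probably_damaging(score, threshold=0.9):
    """Interpret a 'damaging' model score dict from the ORES response.

    The 0.9 threshold is an illustrative choice, not one used by any
    particular tool mentioned in the presentation.
    """
    return score["probability"]["true"] >= threshold

# Illustrative request for two (hypothetical) revision IDs
url = ores_scores_url("eswiki", [12345678, 12345679])
```

A patrolling tool would fetch this URL, parse the JSON response, and surface or revert revisions whose damaging probability exceeds its chosen threshold.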
Other recent publications that could not be covered in time for this issue include the items listed below. Contributions are always welcome for reviewing or summarizing newly published research.
Discuss this story