A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.
A special issue of the American Behavioral Scientist is devoted to "open collaboration".
A fourth paper in this special issue, titled "The Rise and Decline of an Open Collaboration System: How Wikipedia’s Reaction to Popularity Is Causing Its Decline", attracted considerable media attention this month, starting with an article in USA Today. It was already reviewed in the September issue of this research report.
While the size, growth rate, editorial workflow, and topical coverage of Wikipedia have been studied extensively, little work has been done on understanding public attention to Wikipedia articles. A working paper by a team from the Barcelona Media Foundation and the University of Twente, posted on arXiv just before Christmas,[4] analyses and models the number of clicks on featured articles promoted to the Wikipedia Main Page.
A total of 684 featured articles are considered, and their page view statistics are rescaled by the average circadian view rate extracted from a larger set of 871,395 articles over a period of 844 days. The four-day lifetime of a promoted article on the Main Page is characterised by four phases: a very rapid growth in the number of article clicks just after the article appears on the Main Page, followed by a fairly homogeneous period over the rest of the first day of promotion. When the article is replaced by a new featured article and moved to the "recently featured" part of the Main Page, the rate of clicks drops dramatically; finally, a fourth, flat phase follows during the remaining three days at this location.
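The rescaling step can be illustrated in a few lines. The circadian profile below is synthetic (the paper derives it from the 871,395-article sample, which is not reproduced here); the sketch only shows the idea of dividing raw hourly views by the shared hour-of-day rhythm so that daily traffic fluctuations do not mask the promotion effect.

```python
import numpy as np

rng = np.random.default_rng(0)

hours = np.arange(96)  # a four-day window at hourly resolution

# Synthetic circadian profile: site-wide traffic peaking in the afternoon.
circadian = 1.0 + 0.5 * np.sin(2 * np.pi * ((hours % 24) - 14) / 24)

# Synthetic raw views for one promoted article: a decaying burst after
# promotion, modulated by the circadian rhythm, with Poisson noise.
burst = 5000 * np.exp(-hours / 12) + 500
raw_views = rng.poisson(burst * circadian)

# Rescaled series: divide out the average daily rhythm.
rescaled = raw_views / circadian
```

With the rhythm divided out, the remaining shape of the curve reflects the promotion dynamics (burst, plateau, drop, flat tail) rather than the time of day.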
In the next step, the authors introduce a rather intuitive model based on a few parameters to describe the full four-day cycle in a mathematical framework. The model is calibrated on a set of 100 featured articles and then used to predict the number of page hits for the rest of the sample, given each article's number of clicks in the first hour of promotion. The model is relatively accurate in predicting the number of clicks, and its accuracy can be improved further by feeding the model the number of clicks at the end of the first day rather than the first hour after promotion. While the paper is very clear in describing its methodology, it stops short of discussing or providing a deeper understanding of the social mechanisms of popularity and public attention, despite the authors' repeated references to them.
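The paper's parametric model is not reproduced in this summary; the sketch below, on synthetic data, only illustrates the calibrate-then-predict workflow described above, under the simplifying assumption that total four-day clicks are roughly proportional to first-hour clicks.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic sample of 684 "articles": total four-day clicks roughly
# proportional to first-hour clicks, with multiplicative noise.
n = 684
first_hour = rng.uniform(500, 5000, n)
total = 40 * first_hour * rng.lognormal(0.0, 0.2, n)

# Calibrate on the first 100 articles: estimate the mean
# total-to-first-hour ratio.
ratio = np.mean(total[:100] / first_hour[:100])

# Predict totals for the remaining articles from first-hour clicks alone.
predicted = ratio * first_hour[100:]
rel_err = np.abs(predicted - total[100:]) / total[100:]
```

Using a later observation point (end of the first day instead of the first hour) shrinks the unexplained noise, which is the intuition behind the accuracy improvement the authors report.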
A paper in Information Processing and Management titled "College students’ credibility judgments and heuristics concerning Wikipedia"[5] used the theory of bounded rationality and a heuristic-systematic model to analyze American college students’ credibility judgments and heuristics concerning Wikipedia. Not surprisingly, the authors observe that students used a heuristic (a mental shortcut, such as "an article with a long list of references is more credible than one with a short list") in assessing the credibility of Wikipedia. Students (regardless of their knowledge) were much more likely to focus on the number of references than on their quality, and the same article would be seen as more or less credible depending on how many references it had. The authors conclude that educators need to teach students to judge the quality of Wikipedia articles in ways that go beyond checking whether (and how many) references an article has. The authors recommend that Wikipedia make its own assessments (such as the Featured Article star, currently visible only as a small bronze star icon in the top right-hand corner of the article’s page) much more prominent. (This reviewer strongly agrees with this conclusion, but unfortunately the last community discussion appears to have achieved little.)
More interestingly, the authors also find that people with more knowledge found Wikipedia more credible, suggesting that people with less knowledge may be more uneasy with Wikipedia. The authors suggest that the perceived reliability of Wikipedia would increase if more professional associations implemented programs such as the Association for Psychological Science's Wikipedia Initiative. In addition to getting experts more involved in Wikipedia content creation, the authors suggest that it may be a good idea for "professional associations themselves [to] provide their own endorsement for the quality of articles in their fields."
The authors also note that peer endorsement is an important factor in credibility, and that the Wikipedia:Article Feedback Tool is a step in the right direction, as it provides another credibility assessment for readers. They note, however, that compared to similar tools implemented on other sites (such as Amazon), "Wikipedia readers need to click on 'View Page Rating,' which requires one more step to find out that information. The average reader may not be inclined to do so. It would be useful to display ratings without clicking".
{Directed by, Produced by, Written by, Starring}." The authors report that their clustering algorithm, "WIClust", performed successfully on a sample of "48,000 infoboxes spanning 862 infobox templates", and that in some cases it corrects shortcomings of
DBpedia, e.g. by discovering "that the templates Infobox Movie, Bond film, Japanese film, Chinese film, and Korean film belong to the same group as Infobox Film."
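The details of WIClust are not given in this summary; the sketch below only illustrates the general idea of grouping infobox templates whose attribute sets overlap strongly. The template names echo the example above, but the attribute sets, the Jaccard measure, the greedy grouping, and the threshold are all this reviewer's assumptions for illustration.

```python
# Hypothetical attribute sets per infobox template.
templates = {
    "Infobox Film":   {"Directed by", "Produced by", "Written by", "Starring"},
    "Bond film":      {"Directed by", "Produced by", "Starring", "Music by"},
    "Japanese film":  {"Directed by", "Written by", "Starring", "Released"},
    "Infobox Person": {"Born", "Died", "Occupation", "Nationality"},
}

def jaccard(a, b):
    """Overlap of two attribute sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def group_templates(templates, threshold=0.3):
    """Greedy single-link grouping: a template joins the first existing
    group containing a sufficiently similar member, else starts a new one."""
    groups = []
    for name, attrs in templates.items():
        for group in groups:
            if any(jaccard(attrs, templates[m]) >= threshold for m in group):
                group.append(name)
                break
        else:
            groups.append([name])
    return groups

groups = group_templates(templates)
# The three film templates share most attributes, so they end up in one
# group, while "Infobox Person" forms its own.
```

Attribute-set similarity alone is a crude proxy; the appeal of the approach is that it needs no ontology, which is how it can catch groupings that DBpedia's hand-built mappings miss.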