Creative Commons (CC) is currently working on version 4.0 of its suite of copyright licenses, which include the CC-BY-SA and CC-BY licenses used by the Wikimedia projects. Wikimedia adopted BY-SA-3.0 in 2009, and we hope that the 4.0 version will be superior for all license users, including Wikimedia. But to meet its goals, CC needs your input into the revision process.
The CC wiki lists five ambitious goals for the revision:
Wikipedia was launched in January 2001, almost two years before CC published its first licenses. All Wikipedias were initially licensed under the GFDL, a Free Software Foundation (FSF) license intended for software documentation; the main advantage was its "copyleft" terms, which allow any user to reuse and remix GFDL works as long as the result is shared under the same license.
But before Wikipedia, GFDL had not been widely used for cultural works outside the realm of free software, and some of its requirements weren't well-suited for the uses people were making of freely licensed content. Other licenses existed, but were incompatible with the GFDL and with each other.
Meanwhile, CC quickly rose to prominence, gaining wide adoption among communities of creators, including other wiki projects such as Wikitravel and WikiEducator. Many Wikipedia users were already choosing to dual-license their contributions under both GFDL and one or more of the CC licenses (Wikinews was already using the non-copyleft CC-BY license). Wikimedia worked with CC and the FSF to bring the two licenses into closer harmony, ultimately leading to the release of GFDL version 1.3, which allowed collaborative works licensed under it to be relicensed under CC-BY-SA. Wikimedia held a successful community referendum on adopting 1.3, and began dual licensing with the CC-BY-SA-3.0 in June 2009.
CC published the 3.0 license suite in early 2007. Over the past five years, those licenses have been widely used for works that are free to share without all of the restrictions of standard copyright. They've been adopted by cultural institutions, national and local governments, media-hosting websites, educational projects, and popular artists. Wikimedia is one of the largest and most prominent users, with a community whose goal of sharing knowledge freely and openly is closely aligned with CC's mission, so the needs of the Wikimedia communities are an important consideration for CC.
In the past several years, use by the Wikimedia communities and others has revealed opportunities for improvement. For example, the specific requirements for attribution have proved difficult to follow, even for the most diligent, good-faith reusers. Many users have been concerned that the licenses don't adequately address database rights, moral rights, and copyright-like rights, to ensure they create the right expectations for both licensors and reusers. And while CC licenses have been officially "ported" to many jurisdictions to make them more closely aligned with local laws, the international (formerly "unported") license is in wide use globally; to make it as good a legal tool as possible for a worldwide community of users, it needs revision to better address the legal requirements of all national jurisdictions.
All of this is done with the responsibilities of license stewardship in mind: the new version must continue to uphold the expectations of those who use the licenses to extend the commons. CC has been actively consulting with organizations such as Wikimedia, the Free Software Foundation, and the Open Knowledge Foundation to ensure that changes to the licenses don't inadvertently harm the freedoms those licenses are intended to protect in the first place.
CC general counsel Diane Peters explained the goals in more detail in her blog post following last year's CC Global Summit.
To achieve these ends, the CC community is currently discussing several open questions on its mailing lists (community and licenses) and wiki. Many members of the Wikimedia communities have already contributed to those discussions, including individual volunteers and Wikimedians who are part of CC's international affiliate teams. The first public draft is now open for comment and discussion. Throughout the drafting process, CC will make more focused calls for input, asking specific questions. (The most recent call was five open questions on attribution here.)
Wikimedia has already been involved in the drafting process. I attended the CC Global Summit last September on behalf of Wikimedia and began talking to the CC legal team about the variety of issues Wikimedia faces with licenses. Wikimedia's Legal and Community Advocacy team (especially legal counsel Michelle Paulson) has been giving input on the process since the announcement in September.
But for the licenses to be suitable for diverse uses, it takes more than just a few heads coming together. Copyright mavens outside the US are especially needed to look at jurisdiction-specific issues to ensure the licenses are valid worldwide. Many of the open questions depend on knowledge of a wide range of community practices. Do you work with print reusers, GLAMs or other national institutions, or mirrors and forks of Wikimedia content? Do you handle photo submission requests, or use freely-licensed photos in MediaWiki skins? Every volunteer has a particular area of expertise that is difficult for others to know about without your help. Where do you see the greatest opportunities for improvement in the licenses, to best encourage sharing and reuse?
Even if you're not a licensing expert, you can help by sharing the calls for comment with parts of the community who would be interested and haven't seen it yet, and by translating the calls for information and posting them on your language's community forums.
According to the draft timeline, the second draft will be published next month, with another comment period before the third draft in September; by that stage, the process should be nearly complete. Final comments will be taken after the third draft, and if all goes as scheduled, the final version of the licenses will be published sometime around December 2012. (The earlier that proposed changes are discussed, the more likely it is that they can be addressed and potentially included!)
After the final revision is published, Wikimedia will begin a process of deciding whether to adopt the later version of the CC-BY-SA license as the primary license for its projects. With board, staff, and community input from the earliest stages, we hope this will be a smooth process, and that potential problems will be raised and discussed well before the final draft is published.
By taking a legal counsel job with CC and joining its small legal team, I'm thrilled to have the chance to work on these issues full-time. The most frequent question put to me about the job is "will you have to leave Wikimedia?" I'm happy to say that the answer is no. Instead, I'm looking forward to using my knowledge of Wikimedia and its legal and strategic challenges to help CC achieve its goals of creating infrastructure for sharing knowledge and culture.
One challenge I'll have is being clear about whom I'm speaking for when talking about licensing. (Here, I have my Wikimedia hat on!) I'll also recuse myself from board decisions involving CC and CC licensing. But in practical terms, I'm hoping to face very few actual conflicts: one of the most rewarding things about being part of Wikimedia is that I think Wikimedia's goals really do serve the public interest, and I think the same of CC. This revision is intended to be the last for a long while; at stake are powerful long-term effects on the ability to share and reuse material in the commons all over the world.
Reader comments
The Commons Picture of the Year Committee has just announced the winner of the sixth annual Wikimedia Commons POTY contest: Lake Bondhus Norway 2862, shot by German Wikipedian Heinrich Pniok using a Canon EOS 5D Mark II with 24 mm focal length, then digitally retouched. Known as User:Alchemist-hp on WMF projects, Heinrich is a familiar participant at the featured-picture processes on Commons and the English and German Wikipedias, and has gifted us an array of fine pictures of the chemical elements, inorganic compounds, minerals, insects and animals, and plants, landscapes, and places.
Heinrich told the Signpost he made the picture from three single images with different exposures to produce a more realistic dynamic range. "The eyes can see better than the best camera," he says, "but not if I can use good software to achieve a similar dynamic view. I tested a lot of different software to be able to produce pictures like Lake Bondhus. Photomatix Pro is my favoured tool for making HDR/tone mapping, or put simply, images in which you blend different exposures." Ironically, Heinrich's capturing of how the unusual scene appeared to his eyes – by the use of varying focus throughout the image and by digital retouching – led to a few opposes among many positive reviewers at the Commons featured-picture nomination page. Reviewer George Chernilevsky commented that the effect is "mystical", to which Heinrich replied that the place itself was mystical (not just his image of it). Heinrich told us that on 23 July last year he and his wife went on "a two and a half hour walking tour along a road about 50 km southeast of Bergen, Norway's second-largest city. We had mixed weather that day, both sunny and rainy. When we arrived at this place we were very happy and surprised to find such a beautiful scene: dreamlike and mystical, with a fantastic light." (Zoomable Google Map.)
With 143 votes, Lake Bondhus was the stand-out over editors' second and third choices, with 118 and 57 votes respectively. Why was it so popular? One Commons editor gave this explanation: "Take a look at the composition: the glacier angle reversing into the angle of the boat; the seemingly random scattering of the rocks in the water, counterposed with the rocky foreground; the rather elegant line of posts; the binary reflection of clouds, rocks, and mountains, and the variety of textures. The most striking aspect is the serenity of the boat and the water versus the ragged clouds that seem to impinge on the scene." Lake Bondhus is a featured picture on the German Wikipedia, and appeared on the main page of Commons on 15 May.
The people's second choice was a self-portrait by NASA flight engineer Tracy Caldwell Dyson in the Cupola module of the International Space Station during Expedition 24, taken 11 September 2010 using a Nikon D2X with 16 mm focal length, from a distance of just over a metre. The image won high praise from reviewers at the English Wikipedia's featured-picture nomination page, despite a few queries about EV (encyclopedic value). The photographer has completed three spacewalks, is a private pilot and a former track-and-field athlete, and surprisingly, is lead vocalist for the all-astronaut band Max Q.
The third choice was an image by francophone Belgian Wikimedian Luc Viatour, whose photography comprises a stunning variety of subjects, from the astronomical to landscapes, wildlife, and buildings – amply demonstrated in Luc's gallery of his Commons featured pictures. Cueva de los Verdes (Spanish for "the Verdes' cave", named after the former owners, the Verdes family) is a lava tube and tourist attraction in the Canary Islands, off the west coast of Africa. The cave was created around 3,000 years ago by lava flows from the nearby volcano Monte Corona, flowing across the Malpaís de la Corona toward the sea. When the lava drained away, the solidified upper part remained to form the roof of the caves, which extend for 7.5 kilometres. In earlier centuries, islanders hid in this cave to protect themselves from pirates and slave raiders. Luc's images have been finalists in the competition for five years in a row. He told the Signpost he took the photo during his vacation on the island in 2011. "I used a Nikon D3s, 14–24 mm f/2.8, tripod. The water you see was fresh, and with the artificial lighting gave a beautiful reflection of the cave ceiling."
Wikimania, the annual international Wikimedia community conference, will be held in Washington DC on 12–14 July. This will be the first time since Boston in 2006 that the conference has been held in the US.
The pre-program will start with Wikimania Takes Manhattan, 6–9 July, and will reach a local peak with the next wave of New York's Wiknic events, the Wiki World's Fair, on 7 July on Governors Island in New York Harbor.
On 10–11 July, MediaWiki hackers, Toolserver users, gadget developers and others will meet for the annual Wikimania Hackathon and will revisit issues they looked at during their Berlin meeting earlier this month. Alongside the technology event, the Ada Initiative will host a camp to promote women’s participation in open technology, and the Wikimedia chapters will meet to finally work out the basics of their new umbrella organization, the Wikimedia Chapters Association. On the eve of the main event, Google will host a reception of its own.
During the four-day conference the schedule will cover a wide range of issues, sorted in thematic categories; these will include chapters, education, GLAM, and technology and infrastructure. In addition to the main schedule, the National Archives and other local institutions will offer tours, and Wikimedians will meet with library representatives to work on collaborative outreach projects (Wiki Loves Libraries).
On 15 July, an unconference will take place and the WMF US education program working group will look at how to reform collaborative projects with US universities. Online registration is open until 23:59 EDT, 4 July; on-site registration will be available.
On 20 June, a move aiming at reform of the Requests for Adminship process (RfA) got under way, thereby reviving last year's reform efforts, which delivered among other things advice for candidates but did not make it to a Request for Comment (RfC).
Three main problem areas to be tackled on-wiki have been identified so far: unearthing qualified candidates; dealing with SNOW and NOTNOW candidacies; and the difficulty of finding consensus in an RfC.
Ten proposals have recently been published to overhaul the RfA process, most of them focusing on procedural rather than technical remedies. Under consideration are expert committees, empowered to select administrators in place of the current polling method, and remodelling the RfA process by adding extra stages or splitting it in two.
On 24 June, Jc37 proposed a technical solution through the creation of a new user group. He pointed out that this new set of user rights, intended to promote content-related admin activities, would reduce backlogs in areas such as AfD and CfD. Tools related to the management of user behaviour, like blocking and protecting, would not be part of this package.
Excising these tools, so the argument goes, could turn down the volume in RfAs so that candidates are assessed on the merits of their "understanding of how to determine consensus in discussions, various content-related policies and guidelines, and also on the trust requisite with only the particular tools they would be receiving". The proposal prompted wide-ranging discussions, and commands some support.
Users interested in contributing to the ongoing debates are listed here.
Jimmy Wales has called on the United Kingdom's Home Secretary, Theresa May, to stop the extradition of Richard O'Dwyer to the United States for his alleged breach of American copyright law.
O'Dwyer is being charged by the American federal government with criminal copyright infringement related to his former websites TVShack.net and TVShack.cc. The prosecutors allege that he was "involved in the illegal distribution of copyrighted movies and television programs over the Internet". As O'Dwyer resides in the United Kingdom, the United States' Justice Department asked for his extradition in May 2011 under the UK's Extradition Act 2003. The case rests on murky legal ground, however; O'Dwyer's defense team argues that American laws should not apply to a website hosted in the UK. They also argued that his TVShack websites "simply provided a link" to the content, rather than actually hosting and curating the offending material—essentially, they believe that the site functioned as an online service provider as envisioned under the American 1998 Digital Millennium Copyright Act.
Describing O'Dwyer as a "clean-cut, geeky kid" and "precisely the kind of person one can imagine launching the next big thing on the internet", Wales sees O'Dwyer's fight against extradition as another battle between the large television/film industry (Wales' "content industry") and the wider public. Previous battles included the popular movement against two proposed American laws, the PROTECT IP Act and the Stop Online Piracy Act, also known as PIPA and SOPA, respectively. Actions taken to protest the bills included the blackout of several major websites, including Wikipedia, on 18 January 2012 (see previous Signpost coverage: 16 January, 23 January). Wales called O'Dwyer the "human face" of this war, and warned that "if he's extradited and convicted, he will bear the human cost." (More information in the Guardian; Wales' change.org petition.)
On 18 June, the Washington Post reported on a study by Northwestern University's Shane Greenstein and the University of Southern California's Feng Zhu, "Collective Intelligence and Neutral Point of View: The Case of Wikipedia", which examined the viability of Linus' Law ("Given enough eyeballs, all bugs are shallow") through the case study of Wikipedia articles on American federal politics. They chose the topic because it would be an area "where Linus' Law would face challenges due to the presence of controversial topics and lack of verified and/or lack of objective information."
The Post claimed that the results showed that "only a handful of [Wikipedia articles] were politically neutral," though the study itself was more positive, concluding that "Wikipedia's entries lack much slant and contain less bias than observed earlier." The pair came to this conclusion by analyzing a decade's worth of Wikipedia articles on American politics. The study noted that while a large number of users sought to remove bias from the articles, most articles receive little attention from most users and, more often than not, they retain their political bias, which will often be that of the original contributor. (See also the review of an earlier version of the paper in the Signpost's "Recent research" section: "Given enough eyeballs, do articles become neutral?") If these accusations are true, Wikipedia is breaking its own commitment to a neutral point of view.
The pair used a technical index to determine the political slant of articles, measuring how often each of one thousand phrases was used. These phrases were drawn from all of the remarks made in 2005 by both Democrats and Republicans, the two main American political parties. Essentially, the index assumes that an article written from a Democrat's point of view will include phrases like 'civil rights' and 'trade deficit' more often, whereas an article with a Republican bias will favor 'economic growth' and 'illegal immigration'. However, the Post notes that "the vocabulary of partisans has doubtlessly shifted somewhat since 2005."
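The phrase-frequency approach described above can be sketched in a few lines. This is a toy illustration only: the phrase lists below are invented stand-ins, while the actual study used roughly a thousand phrases derived from 2005 congressional speech and a more sophisticated weighting.

```python
# Hypothetical partisan phrase lists (illustrative only; the real index
# used ~1,000 phrases derived from 2005 congressional remarks).
DEM_PHRASES = ["civil rights", "trade deficit"]
REP_PHRASES = ["economic growth", "illegal immigration"]

def slant(text):
    """Crude slant score in [-1, 1]: negative leans Democratic,
    positive leans Republican, 0 means no partisan phrases found."""
    t = text.lower()
    dem = sum(t.count(p) for p in DEM_PHRASES)
    rep = sum(t.count(p) for p in REP_PHRASES)
    total = dem + rep
    return 0.0 if total == 0 else (rep - dem) / total
```

An article mentioning only "civil rights" and the "trade deficit" would score -1.0 under this toy measure, while one discussing only "economic growth" would score 1.0.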
Accusations that Wikipedia is politically biased are not new. Early versions of Wikipedia were seen as very liberal, and in 2006 the American PBS (Public Broadcasting Service) ran an article stating that, according to conservative blogger Robert Cox of the National Debate, Wikipedia had 'a liberal bias in many hot-button topic entries'. Jimmy Wales replied that this was thanks to Wikipedia's global community, and that this tendency was natural when the "international community of English speakers is slightly more liberal than the U.S. population." When asked if he felt this affected the site's goal, he said that "the idea that neutrality can only be achieved if we have some exact demographic matchup to United States of America is preposterous" and that Wikipedia should have a view that would be interpreted as neutral worldwide, not just in the US. (See previous Signpost coverage; more information from PBS.)
It should be noted that many of these posts originate from American sources regarding articles on American politics—yet the US political system is much more conservative than that of other English-speaking countries. For example, the national health service supported by all major parties in countries such as the UK and Canada has faced vociferous opposition in the US. Therefore, what may seem neutral in some countries could seem left-wing in the US.
"Dynamics of Conflicts in Wikipedia" [1] develops an interesting "measure of controversiality", something that might be of interest to editors at large if it were a more widely popularized and dynamically updated statistic. The paper analyzes patterns of edit warring over Wikipedia articles. The authors conclude that edit warriors are usually willing to reach consensus, and that the rare cases of never-ending warring are those that continually attract new editors who have not yet joined the consensus.
The authors' decision to exclude from the study articles with under 100 edits because they are "evidently conflict-free" is questionable. Articles with fewer than 100 edits have been subject to clear, if not overly long, edit warring. A recent example is Concerns and controversies related to UEFA Euro 2012. It is also unfortunate that "memory effects" – a term mentioned only in the abstract and lead, and which the authors suggest is significant in understanding the conflict dynamic – is not explained in the article. The term "memory", by itself, appears four times in the body, but is not operationalized anywhere.
A press release accompanied the paper, entitled " Wikipedia 'edit wars' show dynamics of conflict emergence and resolution". An MSNBC tech news headline misleadingly, but sensationally, summarized it as " Wikipedia is editorial warzone, says study".
In a recent blog post by Wibidata, an analytics startup based in San Francisco, the authors set out to shed light on the often-quoted claim that most of Wikipedia was written by a small number of editors, noting other editorial patterns along the way. [2] Using the entire revision history of English Wikipedia (they wanted to show that their platform can scale), the authors looked at the distribution of edits across editor cohorts, grouped by number of total edits. They found that, from a pure count perspective, the most active 1% of editors had contributed over 50% of the total edits. (See original plot here.)
In response to the suggestion that the strongly skewed distribution of edits might just be due to a core set of editors who primarily make only minor formatting modifications, they looked at the net number of characters contributed by each editor. Grouping editors by total number of edits as before, they showed an even more strongly skewed distribution, with the top 1% contributing well over 100% of the total number of characters on Wikipedia (i.e. an amount of text that is larger than the current Wikipedia) and the bottom 95% of editors deleting more on average than they contributed (original plot). Next, the authors separated logged-in users from non-logged-in "users" (identified only by IP addresses) and recomputed the distribution of net character contributions. By edit-count cohort, logged-in users tended to contribute significantly more than their anonymous counterparts, and non-logged-in users tended to delete significantly more (original plot).
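The cohort analysis described above can be sketched as follows. The edit log and the two-cohort split here are invented for illustration; the actual analysis processed the full revision history and used finer-grained edit-count cohorts.

```python
from collections import defaultdict

# Synthetic edit log: (editor, net_characters) — negative means a net deletion.
edits = [
    ("alice", 1200), ("alice", 800), ("alice", -50), ("alice", 2000),
    ("bob", 40), ("bob", -60),
    ("1.2.3.4", -30),  # non-logged-in user, identified by IP only
]

def cohort_net_chars(edits, threshold=3):
    """Split editors into high-activity (>= threshold edits) and
    low-activity cohorts, returning each cohort's total net characters."""
    per_editor = defaultdict(lambda: [0, 0])  # editor -> [edit_count, net_chars]
    for editor, delta in edits:
        per_editor[editor][0] += 1
        per_editor[editor][1] += delta
    totals = {"high": 0, "low": 0}
    for count, net in per_editor.values():
        totals["high" if count >= threshold else "low"] += net
    return totals
```

With this synthetic data the high-activity cohort nets +3,950 characters while the low-activity cohort nets -50, mirroring the pattern the blog post reports.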
In summary, low-activity and new editors, along with anonymous users, tend to delete more than they contribute; this reinforces the notion that Wikipedia is largely the product of a small number of core editors.
Published in the proceedings of *SEM, a computational semantics conference, researchers from the University of North Texas and Ohio University looked into the nature of interlingual links on Wikipedia, both reviewing the quality of existing links and exploring possibilities for automatic link discovery. [3] The researchers took the directed graph of interlingual links on Wikipedia and used the lens of set-theoretic operations to structure an evaluation of existing links and to build a system for automatic link creation. For example, they suggest that the properties of symmetry and transitivity should hold for the relation of interlingual linking. This means that if there is an interlingual link from language A to B, there should also be a link from B to A, and if there is a link from language A to B, and language B to C, then there should be a link from language A to C. (This assumption is routinely made by the many existing interwiki bots.) They further refine the notion of transitivity by grouping article pairs by the number of transitive 'hops' required to connect a candidate article pair.
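The symmetry and transitivity properties described above can be sketched over a toy link graph. The language–article pairs below are invented for illustration; the study worked over Wikipedia's full interlingual link graph.

```python
# Interlingual links as a directed graph: (lang, title) -> set of (lang, title).
links = {
    ("en", "Berlin"): {("de", "Berlin")},
    ("de", "Berlin"): {("en", "Berlin"), ("fr", "Berlin")},
    ("fr", "Berlin"): set(),
}

def missing_symmetric(links):
    """Pairs (A, B) where A links to B but B does not link back to A."""
    return {(src, dst)
            for src, dsts in links.items()
            for dst in dsts
            if src not in links.get(dst, set())}

def implied_transitive(links):
    """One-hop transitive candidates: A->B and B->C imply a missing A->C."""
    implied = set()
    for a, bs in links.items():
        for b in bs:
            for c in links.get(b, set()):
                if c != a and c not in bs:
                    implied.add((a, c))
    return implied
```

In this toy graph, the de→fr link has no fr→de counterpart (a symmetry violation), and en→de plus de→fr imply a candidate en→fr link (a one-hop transitive gap) – exactly the kinds of repairs interwiki bots routinely make.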
Their methodology revolves around the creation of a sizeable annotated gold data set. Using these labels, they first evaluated the quality of existing links, finding that between one third and one half of them fail their criteria for legitimate translations. They then evaluated the quality of various implied links. For example, reverse links where they do not already exist satisfy their criteria for faithful translation only 68% of the time.
The gold data set was used to train a boosted decision-tree classifier for selecting good candidate pairs of articles. They used various network topology features to encode the information in interlingual links for a given topic and found that they can significantly beat the baseline, which uses only the presence of direct links (73.97% compared with 69.35% accuracy).
Various conference papers and posters from the upcoming "Wikipedia Academy" (hosted by the German Wikimedia chapter from June 29 to July 1 in Berlin) are already available online. A brief overview of those which are presenting new research about Wikipedia:
Researcher Felipe Ortega blogged [16] about a new parser for Wikipedia dumps, to be integrated into "WikiDAT (Wikipedia Data Analysis Toolkit) ... a new integrated framework to facilitate the analysis of Wikipedia data using Python, MySQL and R. Following the pragmatic paradigm 'avoid reinventing the wheel', WikiDAT integrates some of the most efficient approaches for Wikipedia data analysis found in libre software code up to now", which will be featured in a workshop at the conference.
The open-access journal "Digithum" (subtitled "The Humanities in the Digital Era") has published a special issue containing five papers about Wikipedia from various disciplines, with a multilingual emphasis (including research about non-English Wikipedias, and Catalan and Spanish versions of the papers alongside the English versions):
This week, we spent some time with WikiProject Athletics, which covers a variety of athletic competitions including running, jumping, and throwing. Started in May 2009, WikiProject Athletics is relatively young among the sport projects. It is home to three Featured Articles, four Featured Lists, and 18 Good Articles. The project maintains the Athletics Portal and various lists of articles that either do not exist or need considerable improvement. We interviewed Trackinfo and project founder Sillyfolkboy (SFB).
What motivated you to join WikiProject Athletics? Have you coached or competed in any athletic events?
Are some aspects of athletics better covered on Wikipedia than others? Are there any glaring holes in Wikipedia's coverage of athletics?
Most of the project's Featured and Good Articles are biographies of athletes. What are some challenges faced by editors trying to improve athletics-related topics to FA or GA status?
How difficult is it to obtain images for athletics articles? Are there any specific pictures that the project is searching for?
Does WikiProject Athletics collaborate with any other projects? Are there ways the various sports and games projects could aid and reinforce each other?
Which articles will be the most vital to visitors drawn to Wikipedia after watching media coverage of major athletic events like the upcoming European Athletics Championships or the Summer Olympic Games? What needs to be done to prepare these articles for the spotlight?
What are the project's most pressing needs? How can a new member help today?
Anything else you'd like to add?
WikiProject Athletics is the first in a series of sport-related projects the WikiProject Report will be highlighting in the next two months to celebrate major sporting events and summer pastimes (winter for our friends in the south). Next week's project will show off its need for speed. In the meantime, tune up your engine in the archive.
Eleven featured articles were promoted this week:
Eight featured lists were promoted this week:
Six featured pictures were promoted this week:
The Committee neither closed nor opened any cases, leaving the total at three.
The case concerns alleged misconduct by Fæ, brought by MBisanz. Proposed decisions are due tomorrow (Tuesday 26 June).
In response to a Workshop proposal calling for his desysopping, Fæ's administrator rights were removed at his request on 18 June. He has declared he will not pursue RfA until June 2013, and that if another user nominates him and he feels confident to run, he will launch a reconfirmation RfA rather than requesting the tools back without community process.
Proposed decisions are due by 30 June.
The case, filed by P.T. Aufrette, concerns the suitability of the new move review forum, after a contentious requested move discussion (initiated by the filer) was closed as successful by JHunterJ; the close was a matter of much contention, with allegations that the move was not supported by consensus. After a series of reverts by Deacon of Pndapetzim, Kwamikagami and Gnangarra, the partiality of JHunterJ's decision was discussed, as was the intensity of Deacon of Pndapetzim's academic interests in the topic.
Evidence submissions and proposed decisions are due 28 June and 12 July, respectively.
"There is plenty of evidence that wiki-markup is a substantial barrier that prevents many people from contributing to Wikipedia and our other projects. Formal user tests, direct feedback from new editors, and anecdotal evidence collected over the past several years have made the need for a visual editor clear ... It's the biggest and most important change to our user experience we've ever undertaken."
— The Visual Editor Team, Wikimedia Foundation, November 2011
A second prototype of the "visual" (what you see is what you get) editor being developed by the Wikimedia Foundation went live to MediaWiki.org this week (Wikimedia blog), seven months after the first prototype (see previous Signpost coverage). The project is being assisted by developers for the wiki farm site Wikia, many of whose wikis use an existing, less powerful WYSIWYG editor at present.
Work on the editor had been delayed by a late decision to switch the "behind the scenes" framework used to power it; as such, despite the passage of time, developers aimed only for "feature parity" with last December's prototype, though the newer version does add the ability to save articles after editing, the potential for mobile editing, and integration with browser spell-check features. It is further hoped that the newer framework should allow for all remaining features – including tables, images and reference sections – to be rapidly integrated from now. Nevertheless, publication of details of the new live test version has already provoked a long string of bug reports. It seems likely that the deployment of the visual editor to its first live wiki will be pushed back further, possibly until the late northern autumn.
Just like the first prototype, the most significant limitation with this second demonstration version undoubtedly surrounds its inability to understand potentially difficult wikitext constructs (manual override mode has been limited to administrators during the testing period for precisely this reason). Indeed, it has been this concern over backwards compatibility that has long been seen as the major challenge for developers of WYSIWYG editors. The difference this time, developers say, is that the introduction of the radically improved new parser will make all the difference when it comes to the provision of a truly comprehensive editor. Even so, its deployment will almost certainly be accompanied by the "phasing out" of particularly complex wikitext structures.
Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for several weeks.
In the past several years, use by the Wikimedia communities and others has revealed opportunities for improvement. For example, the specific requirements for attribution have proved difficult to follow, even for the most diligent, good-faith reusers. Many users have been concerned that the licenses don't adequately address database rights, moral rights, and similar copyright-like rights, and so fail to set the right expectations for both licensors and reusers. And while CC licenses have been officially "ported" to many jurisdictions to make them more closely aligned with local laws, the international (formerly "unported") license is in wide use globally; to make it as good a legal tool as possible for a worldwide community of users, it needs revision to better address the legal requirements of all national jurisdictions.
All of this is being done with responsible stewardship of the licenses in mind: the new version must continue to uphold the expectations of those who use the licenses to extend the commons. CC has been actively consulting with organizations such as Wikimedia, the Free Software Foundation, and the Open Knowledge Foundation to ensure that changes to the licenses don't inadvertently harm the freedoms those licenses are intended to protect in the first place.
CC general counsel Diane Peters explained the goals in more detail in her blog post following last year's CC Global Summit.
To achieve these ends, the CC community is currently discussing several open questions on its mailing lists (community and licenses) and wiki. Many members of the Wikimedia communities have already contributed to those discussions, including individual volunteers and Wikimedians who are part of CC's international affiliate teams. The first public draft is now open for comment and discussion. Throughout the drafting process, CC will make more focused calls for input, asking specific questions. (The most recent call was five open questions on attribution here.)
Wikimedia has already been involved in the drafting process. I attended the CC Global Summit last September on behalf of Wikimedia and began talking to the CC legal team about the variety of issues Wikimedia faces with licenses. Wikimedia's Legal and Community Advocacy team (especially legal counsel Michelle Paulson) has been giving input on the process since the announcement in September.
But for the licenses to be suitable for diverse uses, it takes more than just a few heads coming together. Copyright mavens outside the US are especially needed to look at jurisdiction-specific issues to ensure the licenses are valid worldwide. Many of the open questions depend on knowledge of a wide range of community practices. Do you work with print reusers, GLAMs or other national institutions, or mirrors and forks of Wikimedia content? Do you handle photo submission requests, or use freely-licensed photos in MediaWiki skins? Every volunteer has a particular area of expertise that is difficult for others to know about without your help. Where do you see the greatest opportunities for improvement in the licenses, to best encourage sharing and reuse?
Even if you're not a licensing expert, you can help by sharing the calls for comment with parts of the community who would be interested and haven't seen it yet, and by translating the calls for information and posting them on your language's community forums.
According to the draft timeline, the second draft will be published next month, with another comment period before the third draft in September; by that stage, the process should be nearly complete. Final comments will be taken after the third draft, and if all goes as scheduled, the final version of the licenses will be published sometime around December 2012. (The earlier that proposed changes are discussed, the more likely it is that they can be addressed and potentially included!)
After the final revision is published, Wikimedia will begin a process of deciding whether to adopt the later version of the CC-BY-SA license as the primary license for its projects. With board, staff, and community input from the earliest stages, we hope this will be a smooth process, and that potential problems will be raised and discussed well before the final draft is published.
By taking a legal counsel job with CC, joining its small legal team, I'm thrilled to have the chance to work on these issues full-time. The most frequent question put to me about the job is "will you have to leave Wikimedia?" I'm happy to say that the answer is no. Instead, I'm looking forward to using my knowledge of Wikimedia and its legal and strategic challenges to help CC achieve its goals of creating infrastructure for sharing knowledge and culture.
One challenge I'll have is being clear about whom I'm speaking for when I talk about licensing. (Here, I have my Wikimedia hat on!) I'll also recuse myself from board decisions involving CC and CC licensing. But in practical terms, I'm hoping to face very few actual conflicts: one of the most rewarding things about being part of Wikimedia is that I think Wikimedia's goals really do serve the public interest, and I think the same of CC. This licensing process is intended to be the last revision for a long while; at stake are powerful long-term effects on the ability to share and reuse material in the commons all over the world.
Reader comments
The Commons Picture of the Year Committee has just announced the winner of The Sixth Annual Wikimedia Commons POTY Contest: Lake Bondhus Norway 2862, shot by German Wikipedian Heinrich Pniok using a Canon EOS 5D Mark II at 24 mm focal length, then digitally retouched. Known as User:Alchemist-hp on WMF projects, Heinrich is a familiar participant at the featured-picture processes on Commons and the English and German Wikipedias, and has gifted us an array of fine pictures of chemical elements, inorganic compounds, minerals, insects and other animals, plants, landscapes, and places.
Heinrich told the Signpost he made the picture from three single images with different exposures to produce a more realistic dynamic range. "The eyes can see better than the best camera," he says, "but not if I can use good software to achieve a similar dynamic view. I tested a lot of different software to be able to produce pictures like Lake Bondhus. Photomatix Pro is my favoured tool for making HDR/tone mapping, or put simply, images in which you blend different exposures." Ironically, Heinrich's capturing of how the unusual scene appeared to his eyes – through varying focus throughout the image and digital retouching – led to a few opposes among many positive reviews at the Commons featured-picture nomination page. Reviewer George Chernilevsky commented that the effect is "mystical", to which Heinrich replied that the place itself was mystical (not just his image of it).

Heinrich told us that on 23 July last year he and his wife went on "a two and a half hour walking tour along a road about 50 km southeast of Bergen, Norway's second-largest city. We had mixed weather that day, both sunny and rainy. When we arrived at this place we were very happy and surprised to find such a beautiful scene: dreamlike and mystical, with a fantastic light." (Zoomable Google Map.)
With 143 votes, Lake Bondhus was the stand-out over editors' second and third choices, with 118 and 57 votes respectively. Why was it so popular? One Commons editor gave this explanation: "Take a look at the composition: the glacier angle reversing into the angle of the boat; the seemingly random scattering of the rocks in the water, counterposed with the rocky foreground; the rather elegant line of posts; the binary reflection of clouds, rocks, and mountains, and the variety of textures. The most striking aspect is the serenity of the boat and the water versus the ragged clouds that seem to impinge on the scene." Lake Bondhus is a featured picture on the German Wikipedia, and appeared on the main page of Commons on 15 May.
The people's second choice was a self-portrait by NASA flight engineer Tracy Caldwell Dyson in the Cupola module of the International Space Station during Expedition 24, taken 11 September 2010 using a NIKON D2X with 16 mm focal length, from a distance of just over a metre. The image won high praise from reviewers at the English Wikipedia's featured-picture nomination page, despite a few queries about EV (encyclopedic value). The photographer has completed three spacewalks, is a private pilot and a former track-and-field athlete, and surprisingly, is lead vocalist for the all-astronaut band Max Q.
The third choice was an image by francophone Belgian Wikimedian Luc Viatour, whose photography comprises a stunning variety of subjects, from the astronomical to landscapes, wildlife, and buildings – amply demonstrated in Luc's gallery of his Commons featured pictures. Cueva de los Verdes (Spanish for the Verdes' cave, named after the former owners, the Verdes family) is a lava tube and tourist attraction in the Canary Islands, off the west coast of Africa. The cave was created around 3,000 years ago by lava flows from the nearby volcano Monte Corona, flowing across the Malpaís de la Corona toward the sea. When the lava drained away, the solidified upper part remained to form the roof of the caves, which extend for 7.5 kilometres. In earlier centuries, islanders hid in this cave to protect themselves from pirates and slave raiders. Luc's images have been finalists in the competition for five years in a row. He told the Signpost he took the photo during his vacation on the island in 2011. "I used a Nikon D3s, 14–24 mm 2,8, tripod. The water you see was fresh, and with the artificial lighting gave a beautiful reflection of the cave ceiling."
Wikimania, the annual international Wikimedia community conference, will be held in Washington DC on 12–14 July. This will be the first time since Boston in 2006 that the conference has been held in the US.
The pre-program will start with Wikimania Takes Manhattan, 6–9 July, and will reach a local peak with the next wave of New York's Wiknic event, the Wiki World's Fair, on 7 July on Governors Island in New York Harbor.
On 10–11 July, MediaWiki hackers, Toolserver users, gadget developers and others will meet for the annual Wikimania Hackathon and will revisit issues they looked at during their Berlin meeting earlier this month. Alongside the technology event, the Ada Initiative will host a camp to promote women’s participation in open technology, and the Wikimedia chapters will meet to finally work out the basics of their new umbrella organization, the Wikimedia Chapters Association. On the eve of the main event, Google will host a reception of its own.
During the four-day conference the schedule will cover a wide range of issues, sorted in thematic categories; these will include chapters, education, GLAM, and technology and infrastructure. In addition to the main schedule, the National Archives and other local institutions will offer tours, and Wikimedians will meet with library representatives to work on collaborative outreach projects (Wiki loves libraries).
On 15 July, an unconference will take place and the WMF US education program working group will look at how to reform collaborative projects with US universities. Online registration is open until 23:59 EDT, 4 July; on-site registration will be available.
On 20 June, a move aiming at reform of the Requests for Adminship process (RfA) got under way, thereby reviving last year's reform efforts, which delivered among other things advice for candidates but did not make it to a Request for Comment (RfC).
Three main problem areas to be tackled on wiki have been identified so far: unearthing qualified candidates; "snow" and NOTNOW candidacies; and the difficulty of finding consensus in an RfC.
Ten proposals have recently been published to overhaul the RfA process, most of them focusing on procedural rather than technical remedies. Under consideration are expert committees empowered to select administrators in place of the current polling method, and remodelling the RfA process by adding extra stages or dividing it in two.
On 24 June, Jc37 proposed a technical solution: the creation of a new user group. He argued that this new set of user rights, geared to content-related admin activities, would reduce backlogs in areas such as AfD and CfD. Tools related to the management of user behaviour, such as blocking and page protection, would not be part of the package.
Excising these tools, so the argument goes, could turn down the volume in RfAs so that candidates are assessed on the merits of their "understanding of how to determine consensus in discussions, various content-related policies and guidelines, and also on the trust requisite with only the particular tools they would be receiving". The proposal prompted wide-ranging discussions, and commands some support.
For users interested in contributing, the ongoing debates are listed here.
Jimmy Wales has called on the United Kingdom's Home Secretary, Theresa May, to stop the extradition of Richard O'Dwyer to the United States for his alleged breach of American copyright law.
O'Dwyer is being charged by the American federal government with criminal copyright infringement related to his former websites TVShack.net and TVShack.cc. The prosecutors allege that he was "involved in the illegal distribution of copyrighted movies and television programs over the Internet". As O'Dwyer resides in the United Kingdom, the United States' Justice Department asked for his extradition in May 2011 under the UK's Extradition Act 2003. The case rests on murky legal ground, however; O'Dwyer's defense team argues that American laws should not apply to a website hosted in the UK. They also argued that his TVShack websites "simply provided a link" to the content, rather than actually hosting and curating the offending material—essentially, they believe that the site functioned as an online service provider as envisioned under the American 1998 Digital Millennium Copyright Act.
Describing O'Dwyer as a "clean-cut, geeky kid" and "precisely the kind of person one can imagine launching the next big thing on the internet", Wales sees O'Dwyer's fight against extradition as another battle between the large television/film industry (Wales' "content industry") and the wider public. Previous battles included the popular movement against two proposed American laws, the PROTECT IP Act and the Stop Online Piracy Act, also known as PIPA and SOPA, respectively. Actions taken to protest the bills included the blackout of several major websites, including Wikipedia, on 18 January 2012 (see previous Signpost coverage: 16 January, 23 January). Wales called O'Dwyer the "human face" of this war, and warned that "if he's extradited and convicted, he will bear the human cost." (more information in the Guardian; Wales' change.org petition)
On 18 June, the Washington Post reported on a study by Northwestern University's Shane Greenstein and the University of Southern California's Feng Zhu, "Collective Intelligence and Neutral Point of View: The Case of Wikipedia", which examined the viability of Linus' Law ("Given enough eyeballs, all bugs are shallow") through the case study of Wikipedia articles on American federal politics. They chose the topic because it would be an area "where Linus' Law would face challenges due to the presence of controversial topics and lack of verified and/or lack of objective information."
The Post claimed the results showed that "only a handful of [Wikipedia articles] were politically neutral," though the study's authors were more positive, believing that "Wikipedia's entries lack much slant and contain less bias than observed earlier." The pair came to this conclusion by analyzing a decade's worth of Wikipedia articles on American politics. They noted that while a large number of users sought to remove bias from articles, most articles receive little attention from most users and, more often than not, retain their political bias, which is often that of the original contributor. (See also the review of an earlier version of the paper in the Signpost's "Recent research" section: "Given enough eyeballs, do articles become neutral?") If these findings hold, Wikipedia is falling short of its own commitment to a neutral point of view.
The pair determined the political slant of articles using an index that measures how often each of one thousand phrases appears. The phrases were taken from all of the remarks made by Democrats and Republicans, the two main American political parties, in 2005. Essentially, the index uses the logic that an article written from a Democrat's point of view will include phrases like 'civil rights' and 'trade deficit' more often, as opposed to an article with a Republican bias, which would favour 'economic growth' and 'illegal immigration'. However, the Post notes that "the vocabulary of partisans has doubtlessly shifted somewhat since 2005."
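The logic of such a phrase-frequency index can be sketched in a few lines of Python. The phrase lists and scoring below are invented for illustration; the study's actual index, built from one thousand congressional phrases, is considerably more sophisticated:

```python
# Toy phrase-frequency slant score, in the spirit of the index described
# above. The phrase lists are illustrative stand-ins, not the study's
# actual one thousand congressional phrases.
DEMOCRAT_PHRASES = ["civil rights", "trade deficit"]
REPUBLICAN_PHRASES = ["economic growth", "illegal immigration"]

def slant_score(text: str) -> float:
    """Return a score in [-1, 1]: -1 leans Democratic, +1 Republican, 0 neutral."""
    text = text.lower()
    d = sum(text.count(p) for p in DEMOCRAT_PHRASES)
    r = sum(text.count(p) for p in REPUBLICAN_PHRASES)
    total = d + r
    return 0.0 if total == 0 else (r - d) / total

print(slant_score("The bill promotes economic growth and curbs illegal immigration."))
# → 1.0 (only Republican-coded phrases appear)
```

On this toy scale, a text with no listed phrases scores 0, so the sketch also illustrates the Post's caveat: an index is only as current as its phrase list.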
Accusations that Wikipedia is politically biased are nothing new. Early versions of Wikipedia were seen as very liberal, and in 2006 the American PBS (Public Broadcasting Service) ran an article stating that, according to conservative blogger Robert Cox of the National Debate, Wikipedia had 'a liberal bias in many hot-button topic entries'. Jimmy Wales replied that this was a consequence of Wikipedia's global community, and that the tendency was natural when the "international community of English speakers is slightly more liberal than the U.S. population." When asked whether he felt this affected the site's goal, he said that "the idea that neutrality can only be achieved if we have some exact demographic matchup to United States of America is preposterous" and that Wikipedia should take a view that would be interpreted as neutral worldwide, not just in the US. (see previous Signpost coverage; more information from PBS)
Notably, many of these reports come from American sources and concern articles on American politics—yet the US political system is considerably more conservative than those of other English-speaking countries. For example, the national health services supported by all major parties in countries such as the UK and Canada have faced vociferous opposition in the US. What may seem neutral in some countries could therefore seem left-wing in the US.
"Dynamics of Conflicts in Wikipedia" [1] develops an interesting "measure of controversiality", something that might be of interest to editors at large if it were a more widely popularized and dynamically updated statistic. The paper analyzes patterns of edit warring over Wikipedia articles. The authors conclude that edit warriors are usually willing to reach consensus, and that the rare cases of never-ending warring are those that continually attract new editors who have not yet joined the consensus.
The authors' decision to exclude from the study articles with under 100 edits, on the grounds that such articles are "evidently conflict-free", is questionable: articles with fewer than 100 edits have been subject to clear, if not overly long, edit warring. A recent example is Concerns and controversies related to UEFA Euro 2012. It is also unfortunate that "memory effects" – a term mentioned only in the abstract and lead, which the authors suggest is significant in understanding the conflict dynamics – are not explained in the article. The term "memory", by itself, appears four times in the body, but is not operationalized anywhere.
A press release accompanied the paper, entitled "Wikipedia 'edit wars' show dynamics of conflict emergence and resolution". An MSNBC tech news headline misleadingly, but sensationally, summarized it as "Wikipedia is editorial warzone, says study".
In a recent blog post by Wibidata, an analytics startup based in San Francisco, the authors set out to shed light on the often-quoted claim that most of Wikipedia was written by a small number of editors, noting other editorial patterns along the way. [2] Using the entire revision history of English Wikipedia (they wanted to show that their platform can scale), the authors looked at the distribution of edits across editor cohorts, grouped by number of total edits. They found that from a pure count perspective, the most active 1% of editors had contributed over 50% of the total edits. (see original plot here)
In response to the suggestion that the strongly skewed distribution of edits might just be due to a core set of editors who primarily make only minor formatting modifications, they looked at the net number of characters contributed by each editor. Grouping editors by total number of edits as before, they found an even more strongly skewed distribution, with the top 1% contributing well over 100% of the total number of characters on Wikipedia (i.e. an amount of text larger than the current Wikipedia) and the bottom 95% of editors deleting more on average than they contributed (original plot). Next, the authors separated logged-in users from non-logged-in "users" (identified only by IP addresses) and recomputed the distribution of net character contributions. By edit-count cohort, logged-in users tended to contribute significantly more than their anonymous counterparts, and non-logged-in users tended to delete significantly more (original plot).
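The cohort arithmetic described above is simple to reproduce. Here is a minimal sketch with synthetic edit counts (the real analysis ran over the full English Wikipedia revision history on Wibidata's platform):

```python
# Sketch of the cohort computation described above: what share of all edits
# (or net characters) comes from the most active 1% of editors?
# The edit counts below are synthetic, purely for illustration.
def top_share(edit_counts, fraction=0.01):
    """Fraction of the total contributed by the top `fraction` of editors."""
    ranked = sorted(edit_counts, reverse=True)
    k = max(1, int(len(ranked) * fraction))  # at least one editor in the top group
    return sum(ranked[:k]) / sum(ranked)

# Toy population: 99 editors with one edit each, one editor with 200 edits.
counts = [1] * 99 + [200]
print(round(top_share(counts), 3))  # → 0.669 (200 of 299 edits)
```

With net character counts in place of edit counts, negative values for deleting cohorts are what allow the top group's share to exceed 100% of the total, as the blog post reports.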
In summary, low-activity and new editors, along with anonymous users, tend to delete more than they contribute; this reinforces the notion that Wikipedia is largely the product of a small number of core editors.
In a paper published in the proceedings of *SEM, a computational semantics conference, researchers from the University of North Texas and Ohio University looked into the nature of interlingual links on Wikipedia, both reviewing the quality of existing links and exploring possibilities for automatic link discovery. [3] The researchers took the directed graph of interlingual links on Wikipedia and used the lens of set-theoretic operations both to structure an evaluation of existing links and to build a system for automatic link creation. For example, they suggest that the properties of symmetry and transitivity should hold for the relation of interlingual linking: if there is a link from language A to B, there should also be a link from B to A; and if there are links from A to B and from B to C, there should be a link from A to C. (This assumption is routinely made by the many existing interwiki bots.) They further refine the notion of transitivity by grouping article pairs by the number of transitive 'hops' required to connect a candidate pair.
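The symmetry and transitivity properties can be checked mechanically on a link graph. A minimal sketch, using a toy set of interlanguage links invented for illustration (not the researchers' actual data or code):

```python
# Sketch of the symmetry and transitivity checks described above, run on a
# toy interlanguage-link graph. Keys are (language, title) pairs; values are
# the pages they link to. The link data is invented for illustration.
links = {
    ("en", "Berlin"): {("de", "Berlin"), ("fr", "Berlin")},
    ("de", "Berlin"): {("en", "Berlin")},
    ("fr", "Berlin"): set(),
}

def missing_symmetric(links):
    """Existing links A→B whose reverse link B→A is absent."""
    return {(a, b) for a, targets in links.items()
            for b in targets if a not in links.get(b, set())}

def missing_transitive(links):
    """Implied one-hop links (A→B→C) with no direct A→C link."""
    implied = set()
    for a, targets in links.items():
        for b in targets:
            for c in links.get(b, set()):
                if c != a and c not in targets:
                    implied.add((a, c))
    return implied

print(missing_symmetric(links))   # en→fr has no fr→en reverse
print(missing_transitive(links))  # de→en→fr implies a missing de→fr link
```

Candidates surfaced this way are exactly the "implied links" the researchers then evaluate against their annotated gold data.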
Their methodology revolves around the creation of a sizeable annotated gold data set. Using these labels, they first evaluated the quality of existing links, finding that between one third and one half fail their criteria for legitimate translations. They then evaluated the quality of various implied links; for example, reverse links that do not already exist satisfy their criteria for faithful translation only 68% of the time.
The gold data set was used to train a boosted decision-tree classifier for selecting good candidate pairs of articles. They used various network topology features to encode the information in interlingual links for a given topic and found that they can significantly beat the baseline, which uses only the presence of direct links (73.97% compared with 69.35% accuracy).
Various conference papers and posters from the upcoming "Wikipedia Academy" (hosted by the German Wikimedia chapter from June 29 to July 1 in Berlin) are already available online. A brief overview of those presenting new research about Wikipedia:
Researcher Felipe Ortega blogged [16] about a new parser for Wikipedia dumps, to be integrated into "WikiDAT (Wikipedia Data Analysis Toolkit) ... a new integrated framework to facilitate the analysis of Wikipedia data using Python, MySQL and R. Following the pragmatic paradigm 'avoid reinventing the wheel', WikiDAT integrates some of the most efficient approaches for Wikipedia data analysis found in libre software code up to now", which will be featured in a workshop at the conference.
The open-access journal "Digithum" (subtitled "The Humanities in the Digital Era") has published a special issue containing five papers about Wikipedia from various disciplines, with a multilingual emphasis (including research about non-English Wikipedias, and Catalan and Spanish versions of the papers alongside the English versions):
This week, we spent some time with WikiProject Athletics, which covers a variety of athletic competitions, including running, jumping, and throwing events. Started in May 2009, WikiProject Athletics is relatively young among the sport projects. It is home to 3 Featured Articles, 4 Featured Lists, and 18 Good Articles. The project maintains the Athletics Portal and various lists of articles that either do not exist or need considerable improvement. We interviewed Trackinfo and project founder Sillyfolkboy (SFB).
What motivated you to join WikiProject Athletics? Have you coached or competed in any athletic events?
Are some aspects of athletics better covered on Wikipedia than others? Are there any glaring holes in Wikipedia's coverage of athletics?
Most of the project's Featured and Good Articles are biographies of athletes. What are some challenges faced by editors trying to improve athletics-related topics to FA or GA status?
How difficult is it to obtain images for athletics articles? Are there any specific pictures that the project is searching for?
Does WikiProject Athletics collaborate with any other projects? Are there ways the various sports and games projects could aid and reinforce each other?
Which articles will be the most vital to visitors drawn to Wikipedia after watching media coverage of major athletic events like the upcoming European Athletics Championships or the Summer Olympic Games? What needs to be done to prepare these articles for the spotlight?
What are the project's most pressing needs? How can a new member help today?
Anything else you'd like to add?
WikiProject Athletics is the first in a series of sport-related projects the WikiProject Report will be highlighting in the next two months to celebrate major sporting events and summer pastimes (winter for our friends in the south). Next week's project will show off its need for speed. In the meantime, tune up your engine in the archive.
Reader comments
Eleven featured articles were promoted this week:
Eight featured lists were promoted this week:
Six featured pictures were promoted this week:
The Committee neither closed nor opened any cases, leaving the total at three.
The case, brought by MBisanz, concerns alleged misconduct by Fæ. Proposed decisions are due tomorrow (Tuesday 26 June).
In response to a Workshop proposal calling for his desysopping, Fæ's administrator rights were removed at his request on 18 June; he has declared he will not pursue RfA until June 2013, and that should another user nominate him and he feels confident to run, he will launch a reconfirmation RfA rather than requesting the tools back without community process.
Proposed decisions are due by 30 June.
The case, filed by P.T. Aufrette, concerns the suitability of the new move review forum, after a contentious requested move discussion (initiated by the filer) was closed as successful by JHunterJ; the close was a matter of much contention, with allegations that the move was not supported by consensus. After a series of reverts by Deacon of Pndapetzim, Kwamikagami and Gnangarra, the partiality of JHunterJ's decision was discussed, as was the intensity of Deacon of Pndapetzim's academic interests in the topic.
Evidence submissions and proposed decisions are due 28 June and 12 July, respectively.
Reader comments
"There is plenty of evidence that wiki-markup is a substantial barrier that prevents many people from contributing to Wikipedia and our other projects. Formal user tests, direct feedback from new editors, and anecdotal evidence collected over the past several years have made the need for a visual editor clear ... It’s the biggest and most important change to our user experience we’ve ever undertaken."

— The Visual Editor Team, Wikimedia Foundation, November 2011
A second prototype of the "Visual" (what you see is what you get) editor being developed by the Wikimedia Foundation went live on MediaWiki.org this week (Wikimedia blog), seven months after the first prototype (see previous Signpost coverage). The project is being assisted by developers from the wiki farm Wikia, many of whose wikis currently use an existing, less powerful WYSIWYG editor.
Work on the editor had been delayed by a late decision to switch the behind-the-scenes framework used to power it; as a result, despite the passage of time, developers aimed only for "feature parity" with last December's prototype, though the newer version does add the ability to save articles after editing, the potential for mobile editing, and integration with browser spell-check features. It is hoped that the new framework will allow all remaining features – including tables, images and reference sections – to be integrated rapidly from now on. Nevertheless, publication of details of the new live test version has already provoked a long string of bug reports, and it seems likely that the deployment of the visual editor to its first live wiki will be pushed back further, possibly until the late northern autumn.
As with the first prototype, the most significant limitation of this second demonstration version remains its inability to understand potentially difficult wikitext constructs (manual override mode has been limited to administrators during the testing period for precisely this reason). Indeed, this concern over backwards compatibility has long been seen as the major challenge for developers of WYSIWYG editors. The difference this time, developers say, is that the radically improved new parser will make all the difference in providing a truly comprehensive editor. Even so, its deployment will almost certainly be accompanied by the "phasing out" of particularly complex wikitext structures.
Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for several weeks.