News and notes

Where have all the administrators gone?

Record low number of active administrators

Outcomes of English Wikipedia requests for adminship (RfA). Black dots are RfAs that passed.

Eight candidacies for adminship have succeeded so far in 2023, just one more than in 2021, the worst-ever year for RfA. The number of active administrators started 2023 at 500, took a steep dive in mid-February for reasons The Signpost has not been able to determine, touched 452 a couple of times in April, and then held steady around 460 until October.

On October 18, we hit a new record low of 448 active admins; to find the last time the English Wikipedia had fewer than 449 active admins, we have to go back to 2005.[a]

The reasons for this are cloudy and have been covered before by The Signpost, for instance at "Administrator cadre continues to contract – more" in January 2022.

In a recent Administrators' noticeboard discussion titled "Twelve fewer administrators", BeanieFan11 noted that not a single month this year has brought a net gain of administrators, and all but one have seen a net decrease, some of them large. Per the administrators' newsletter (tallied again in the sketch below):

  • January: +3, -11, net -8
  • February: +1, -5, net -4
  • March: +1, -2, net -1
  • April: +1, -1, net 0 (the only month without a negative net)
  • May: +1, -4, net -3
  • June: +1, -3, net -2
  • July: +1, -8, net -7
  • August: +1, -4, net -3
  • September: +2, -4, net -2
  • October: +1, -12, net -11
  • Overall: +13, -53, net -40
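For anyone who wants to re-derive the monthly nets, here is a minimal sketch (our own illustration, not part of the newsletter) that recomputes each month's net change from the gained/lost figures quoted above:

    # Minimal sketch: recompute each month's net change in administrators
    # from the gained/lost figures quoted in the admins' newsletter above.
    # The data literal is our own transcription of those figures.
    monthly = [
        ("January", 3, 11), ("February", 1, 5), ("March", 1, 2),
        ("April", 1, 1), ("May", 1, 4), ("June", 1, 3),
        ("July", 1, 8), ("August", 1, 4), ("September", 2, 4),
        ("October", 1, 12),
    ]

    for month, gained, lost in monthly:
        print(f"{month}: +{gained}, -{lost}, net {gained - lost}")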

At least one of the departed admins was fallout from the WP:ROADS controversy, which ended in a content fork and several departures, including Rschen7754, who resigned as both an administrator and an editor. – B

  1. ^ According to RickBot updates to WP:List of administrators, which started in 2014, and charts at User:Widefox/editors before that. Anomalous data from 11–28 September 2021 are excluded.

Knowledge Equity Fund

The Wikimedia Foundation has published comprehensive notes on Meta from the recent community call about the Knowledge Equity Fund. The notes include a Q&A. The WMF also highlights that the Fund has helped it make new connections:

We contacted user groups and connected the grantees with them geographically or thematically, explaining the objects of the fund. We are also trying to create new synergies between Wikimedia user groups and external groups to increase our impact.

A few examples of connections we made are:

  • Project Multatuli, which we connected with Wikimedia Indonesia
  • Create Caribbean, which we connected with Noircir, Wiki Cari UG, Whose Knowledge, Projet:Université de Guyane and WikiMujeres
  • Black Cultural Archives, which we connected with Noircir, Whose Knowledge and Wikimedia UK
  • Criola, which we connected with Whose Knowledge, WikiMujeres and Mujeres (mulheres) LatinoAmericanas in Wikimedia
  • Data for Black Lives, which we connected with AfroCrowd and Black Lunch Table

Through these connections, we have seen positive synergies within the movement at large.

An ongoing English Wikipedia Village Pump Request for Comment on the controversial fund stands at 35:23 in favour of adopting the following non-binding resolution:

The English Wikipedia community is concerned that the Wikimedia Foundation has found itself engaged in mission creep, and that this has resulted in funds that donors provided in the belief that they would support Wikimedia Projects being allocated to unrelated external organizations, despite urgent need for those funds to address internal deficiencies.

We request that the Wikimedia Foundation reappropriates all money remaining in the Knowledge Equity Fund, and we request that prior to making non-trivial grants that a reasonable individual could consider unrelated to supporting Wikimedia Projects that the Foundation seeks approval from the community.

– AK

Community rejects proposal to create policy about large language models

A request for comment (RfC) to create an English Wikipedia policy or guideline regulating editors' use of large language models (e.g. ChatGPT) was rejected recently. Specifically, the RfC concerned the proposal to elevate the draft page Wikipedia:Large language models (expanded from a much smaller version created in December 2022) to policy or guideline status. As summarized by the closing editor:

There is an overwhelming consensus to not promote. 1 editor would promote to policy, 7 editors prefer guideline, and 30 editors were against promotion. 2 editors were fine with either policy or guideline. [...] The most common and strongest rationale against promotion (articulated by 12 editors, plus 3 others outside of their !votes) was that existing P&Gs [policies and guidelines], particularly the policies against vandalism and policies like WP:V and WP:RS, already cover the issues raised in the proposals. 5 editors would ban LLMs outright. 10-ish editors believed that it was either too soon to promote or that there needed to be some form of improvement. On the one hand, several editors believed that the current proposal was too lax; on the other, some editors felt that it was too harsh, with one editor suggesting that Wikipedia should begin to integrate AI or face replacement by encyclopedias that will. (2 editors made a bet that this wouldn't happen.)
Editors who supported promoting to guideline noted that Wikipedia needs to address the use of LLMs and that the perfect should not be the enemy of the good. However, there was no general agreement on what the "perfect" looked like, and other editors pointed out that promoting would make it much harder to revise or deprecate if consensus still failed to develop.

Similarly, on Wikimedia Commons, a page collecting guidance about AI-generated media (particularly the use of generative AI models such as DALL-E, Stable Diffusion or Midjourney), likewise created in December 2022, is still marked as "a work in progress page", although it appears to have progressed somewhat further toward reflecting community consensus.

In any case, discussions about generative AI are continuing in the Wikimedia movement, including in off-wiki fora such as the "ML, AI and GenAI for Wikimedia Projects" Facebook group and the "Wikimedia AI" group on Telegram (non-public but with a public invite link). At Wikimania 2023, it was the subject of various sessions, including two panels titled "AI advancements and the Wikimedia projects" (video) and "ChatGPT vs. WikiGPT: Challenges and Opportunities in harnessing generative AI for Wikimedia Projects" (video). The September edition of the Wiki Education Foundation's "Speaker Series" likewise had the topic "Wikipedia in a Generative AI World", featuring three speakers including Aaron Halfaker (User:EpochFail, a former research scientist at the Wikimedia Foundation and developer of the AI-based ORES system that is still widely used for vandalism detection and other purposes). – H


Several European regulation efforts may adversely affect Wikimedia projects

In its EU Policy Monitoring Report for September, Wikimedia Europe highlights several legislative efforts that are ongoing on the continent. Some of them raise concerns regarding their possible impact on Wikipedia and other Wikimedia projects:

  • The EMFA (European Media Freedom Act) is "intended to help a pluralistic media landscape", but also contains problematic provisions, e.g. a requirement for online platforms to warn "media providers, who can be media outlets but also individuals, such as journalists [...] ahead of moderating their content and to give them a fast-track channel to contest decisions. Some lawmakers even suggest that online platforms be prohibited from deleting content by media providers before the provider has had a chance to reply. All this is highly problematic, seeing that disinformation is sometimes produced by media providers." Efforts to exempt Wikimedia projects or at least non-profit "online encyclopaedias" succeeded initially but then were in jeopardy again. However, negotiations are expected to continue into 2024.
  • The controversial Regulation to Prevent and Combat Child Sexual Abuse (CSAR) proposed by EU Commissioner Ylva Johansson is reported to have "stalled somewhat" recently. It would cover Wikimedia projects too, "and the Wikimedia Foundation has provided [already in 2022] constructive feedback, outlining some risks and challenges posed by the scanning technologies used. Wikimedia is also criticising the idea to scan direct, interpersonal communication in a general manner and without judicial oversight."
  • In France, the proposed Loi SREN "would introduce some provisions on data retention and user identification, in order to not allow already banned users to re-register. That would require the collection of heaps of data and the compulsory identification of all users. Wikimedia projects are squarely in the scope of this proposal." Initial efforts to "take our projects out of the fireline" have failed.

– H


Brief notes

Tyap Wikimedians at Gurara River