From Wikipedia, the free encyclopedia
File:View of Gaza Strip from Israel - October 2009 (4025037981).jpg, by David Berkowitz, CC BY 2.0
Special report

19-page PDF accuses Wikipedia of bias against Israel, suggests editors be forced to reveal their real names, and demands a new feature allowing people to view the history of Wikipedia articles

Logo of Arabic Wikipedia, showing the Wikipedia "globe" colored to look like the Palestinian flag.
Arabic Wikipedia's logo in solidarity with Palestine has caused controversy

The report

HaeB, sawyer-mcdonell

The World Jewish Congress has published a report titled "The Bias Against Israel on Wikipedia", which has been covered by The Jerusalem Post, Jewish News Syndicate, and Spectator Australia (paywalled). The author, Dr. Shlomit Aharoni Lir, is an academic, described in a related recent publication as "a poet, essayist, lecturer, and gender studies scholar [who] holds a research fellowship at Bar-Ilan University and is a lecturer at Achva College". She had previously published a peer-reviewed paper about gender bias on Wikipedia (Signpost coverage). The present report does not seem intended to be an academic publication, although it has already been used as a citation in the article Wikipedia and the Israeli–Palestinian conflict.

The report criticizes Arabic Wikipedia's "blackout" in solidarity with Palestinians (see Signpost coverage), the English Wikipedia's coverage of the Holocaust, and its general "bias against Israel", which the author argues is exemplified through content and sourcing bias, "deletion attacks", editing restrictions, "selective enforcement" by administrators, and "anti-Israeli editors".

Lauren Dickinson of the Wikimedia Foundation's Communications Department told The Signpost that several staff had reviewed the document, and found that "the WJC report makes a number of unsubstantiated claims of bias on Wikipedia. It lacks adequate references, quotes, links, or other sources to support its purported findings. Further, the report misunderstands Wikipedia's NPOV policy, as well as the importance of anonymity for user privacy on Wikimedia projects."

But what is the deal, really? Let's take a look.

Big if true, but is it true?

JPxG

The problem with this report is that many of its suggested improvements are things we've already been doing (publicly, and in a very prominent way) for decades; and many of the rest are intrusive threats to the personal safety of editors and administrators. This makes it hard to take most of its claims seriously; for example, if somebody thinks there's no way to see who has added text to a Wikipedia article, it seems easier to tell them where the "history" tab is at the top of the page, rather than create an unaccountable editorial council and appoint them to it.

The 19-page report, which focuses on the English Wikipedia, "is based on research, content analysis, and interviews with Israeli Wikipedians"; its overview of challenges to Wikipedia's ideals includes "The Power of the Admins and Beurocrats" [sic], as well as the gender gap (see Signpost coverage). Its citations on the gender gap include a survey taken in 2008 saying that Wikipedia editors were mostly male,[1] and a paper from 2011 which compared Wikipedia to Encyclopedia Britannica on coverage of women in historical biography lists and concluded that Wikipedia had "significantly greater coverage" and its articles were "significantly longer than Britannica articles [...] for every source list".[2] It is somewhat unclear what relation, if any, the gender ratio of biography articles (or indeed of the editoriat) has to Israel and Palestine; while there is indeed a paper from 2014[3] (cf. our coverage) that talks in greater depth about lower participation rates for female editors, and it's uncontroversially true that there exists a gender disparity among Wikipedia editors, it's hard to see what the connection is here. It is a little embarrassing, but so are the MoS arguments, and those don't have anything to do with Gaza either. The closest the report comes to making a connection between the gender gap and Gaza (apart from the alliteration) is to say:

Sure: it is true that we live in a society, and that the biases of that society pose issues for our attempt to write an encyclopedia that is both neutral and based on direct citations to sources written in that society. This is a problem apparently endemic to all encyclopedia writers, even our Britannic forebears; and it is the subject of much ongoing reflection and work, which is altogether good and proper to do. But it is not really clear how this relates to the main claim of the report, which is that Wikipedia is biased against Israel.

Regarding administrators, the report mentions issues with "concealment of decision-makers' actions, alongside the significant authority wielded by anonymous administrators who can delete entries and block participants without accountability". This may come as a surprise to readers, who may justifiably consider Wikipedia to be one of the most transparently operated major sites on the Web, and indeed in the modern history of the Web. Editorial decisions, discussions about those decisions, administrative actions, and the edits themselves are meticulously logged, to the second, in full public view, on a page that automatically generates a display of precisely which changes were made in the edit, or what actions were taken, by whom. When someone is banned from Twitter or Facebook, there is not an up-to-the-second log of which specific person pushed the button; there may be a form letter emailed to them later, but the act itself is not public, nor is it really disclosed at all. On Wikipedia it is; moreover, it can be reviewed and contested publicly. There's a public noticeboard for review of administrative actions; the Arbitration Committee regularly rules on cases of administrator misconduct, often deciding to remove admins. There is a large, highly regimented formal process for deleting articles (and a second one for formally appealing deletion decisions). Indeed, there are incidents in which administrators act beyond the bounds of propriety, but it is not really clear that Wikipedia is a "failed state" in any meaningful way with respect to their actions.

For example: this is a full public log of every formal action that I (Signpost editor and Wikipedia administrator JPxG) have ever taken.

You can see, in this log, that Ayamediainc was indefinitely blocked on March 12, 2024, at 04:45 UTC, for violation of the spam guidelines.

On their user talk page is a publicly viewable template indicating the name of the page in question, and an explanation of the specific way that it was in violation of the policy.

The log for that user account indicates it was created at 02:47 that same day, and two hours later they created a page at User:Ayamediainc/sandbox with the summary "Added sections from Ayamediainc profile and added background information". The log also specifies that this edit tripped two automated alerts, because text on the page matched patterns that are strongly associated with spam. The policies, the user warnings, the block, the reason for the block, and the identity of the blocking administrator (in this case me) are publicly viewable and can be audited by anyone. If the user appealed the block, that too would be a matter of public record, as would the response of whichever separate administrator handled the appeal (which wouldn't be me).

One of the demands I can easily agree with: the "transparent editing history" item, which exhorts Wikipedia to "ensure that all changes to articles are transparent and traceable", which "helps in identifying editors who may consistently introduce bias into articles".

This stance is shared by the Wikipedia community, who implemented it 23 years ago; a link to the (mostly) complete history of every article has been a central element of the top of every Wikipedia page since the year 2002 (and the feature was incomplete and less prominently linked the year before that).

If there is some other website which provides greater transparency into its administrative and editorial decisions, perhaps it would provide a useful model for us to emulate. However, it is hard to come up with one. The histories viewable for every single Wikipedia article track every modification ever made to them, from major copyedits to em-dash fixes, and are permanently attributable to the editors. It's hard to come up with any way to increase the transparency of the process, except for personally doxing the administrators and editors.
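That transparency is not just a tab in the interface: the same revision data is machine-readable through the MediaWiki Action API, a documented public endpoint. The following is a minimal sketch in Python (the article title is an arbitrary example, and the `User-Agent` string is made up for illustration):

```python
# Sketch: pulling a Wikipedia article's public edit history from the
# MediaWiki Action API, using only documented query parameters.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def history_url(title: str, limit: int = 5) -> str:
    """Build a query URL for a page's most recent revisions."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",  # when, who, and their edit summary
        "rvlimit": str(limit),
        "format": "json",
        "formatversion": "2",
    }
    return API + "?" + urllib.parse.urlencode(params)

def fetch_history(title: str, limit: int = 5) -> list:
    """Fetch the revision list (requires network access)."""
    req = urllib.request.Request(
        history_url(title, limit),
        headers={"User-Agent": "signpost-history-sketch/0.1"},  # hypothetical UA
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["query"]["pages"][0]["revisions"]

# Example usage (network required):
#   for rev in fetch_history("Gaza Strip"):
#       print(rev["timestamp"], rev["user"], rev.get("comment", ""))
```

Administrative actions are exposed the same way: swapping `prop=revisions` for `list=logevents` (filtered with `leuser`) returns the kind of block log discussed above.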

This is one of the suggestions vaguely alluded to in the report, and later said explicitly: to this, I may offer the rejoinder that in that block log you can see me issuing blocks to a wide range of people encompassing bored schoolkids, scammers, vandals, and seriously disturbed and hostile individuals who carry decades-long grudges. I am a volunteer who edits for fun. These are not, generally speaking, people I would prefer to know where my family lived, especially not the guy who capped off a decade and change of a Wikipedia harassment career by going to jail for making dozens of graphic death threats to the Merriam-Webster dictionary. That was not a made-up example: this guy is real and he's one of the hundred or so entries on the long-term abuse page. As for editorial integrity, I can definitely imagine what effect it would have on our article about any given scandal of corporate malfeasance or government corruption if somebody were able to instantly file vexatious lawsuits against individual editors. I just cannot imagine it being a good one.

The report says that "network bullying and discriminatory treatment increase when there is no personal responsibility and acting under cover of anonymity is possible". In general, my experience over the last twenty years of hanging out on the Internet suggests that network bullying also increases a lot when every person you ban from a website has trivial access to your home address, including this sicko.

One of the report's proposals for new features that should be implemented is to "host forums and discussions within the Wikipedia community to address concerns about neutrality and gather feedback for policy improvements"; it is not specified how this would fit into the existing directory of centralized discussions, dashboard, project-wide Request for Comment process with fifteen categories, six Village Pumps, and a dozen or so noticeboards, including a neutral-point-of-view noticeboard and dispute resolution noticeboard. It is not explained why these venues are insufficient; none of them are even mentioned.

Generally, a persistent problem with the first part of the report is that it repeatedly claims (either explicitly or through insinuation) that Wikipedia lacks a process to deal with some issue, and gives no evidence to support that claim, when in reality Wikipedia not only has a process, but has had it for a very long time (sometimes more than 20 years) and it forms a central part of the site's administrative apparatus upon which editors spend hundreds of hours daily rigorously documenting all of their actions with direct references to policies and consensus. It seems that the main objective is to establish (or at least repeatedly assert) that Wikipedia lacks self-governance, or that it is unable to handle contentious issues, or that nobody has ever realized until now that it was possible for people to be rude about politics online; then this is used as the basis to propose all sorts of bureaucratic impositions, most of them done by external groups (perhaps including the one that made the report).

This is bad. While it is indeed the case that some people are biased towards one view or another, it seems unlikely that instituting binding top-down procedures (like an official oversight committee to dictate content at the behest of external consulting agencies and lobbyists, as is also suggested in the report) would arrive at a remotely better result. It's also not clear why we lack these: the report seems to be unaware of basic features like page history, which allow anyone to see who's written an article. Simply clicking on the history tab seems like an easier solution than doxing every editor or subjecting sitewide editorial decisions to random external think tanks.

If an unknown detective arrives at the murder scene and demands to be given authority over the investigation, of course there are questions about whether he has jurisdiction, but even before that, it seems relevant to note whether the victim is actually dead. If he's sitting at the dinner table asking what you're doing in the living room, this seems like a significant detail in the murder investigation.

The bias

After this, the report gives a number of specific examples of pages concerning the conflict in Gaza. Frankly, some of this may well be true: political articles are sometimes biased. They tend to be edited by a variety of people with various allegiances, who argue at great length over everything from trivial minutiae to the timelines of major events. It is not clear this can, or should, be fixed. Existing research seems fairly consistent on the idea that conversation and collaboration between people of various perspectives improves the overall quality of articles.

This process does involve a great deal of tedious, unpleasant argument; ask anybody who's edited (or worse, created) a contentious article on a political subject. But ultimately, if an article about an event is unduly biased towards one side, the only solution is to edit it in a way that fixes the problem. This sometimes results in contention, in which case a discussion must be carried out between the people who disagree, and if they cannot resolve it between themselves, there are a variety of ways of seeking external assistance.

There is a broad range of existing venues to which disputes can be brought, and through which they can be addressed. It's true that these processes often take a long time to resolve, and it's true that in the meantime an article can be grotesquely biased. It's even true that an article can stay grotesquely biased for a while. Any editor active on political topics can tell you about their personal Alamo, perhaps several of them, where they showed up armed with reason and common sense, and a dozen idiots showed up armed with idiocy, and they were crushed in ignominious defeat. In fact, maybe I am one of the idiots who ruined your article, and maybe you are one of the idiots who ruined mine. And we are both the idiots for some other third person. It is just an inescapable aspect of living in a society: sometimes people, even people collaborating on a project, have irreconcilable disagreements. This has happened thousands of times, and we have mechanisms for dealing with it; they may not always work perfectly, but it remains to be seen what other way things could possibly be run and work anywhere near as well.

Wikipedia policies and consensus processes are the worst form of collaborative encyclopedia-writing projects, except for all the other ones.

Arbitration Committee grants new editor extended-confirmed status to open case request

JPxG

Subsequent to the publication of this report, on March 20, the Arbitration Committee announced that a user account created that day with zero edits (Mschwartz1) would be granted extended-confirmed status "for the exclusive purpose of participating in a case request about Israel-Palestine". Extended-confirmed status, generally, is given to accounts with over 500 edits that are at least 30 days old (and is currently a prerequisite for any editing activity in the Israel–Palestine area, formally designated a Contentious Topic).

A long discussion ensued at the ArbCom noticeboard's talk page, as well as at the unmentionable BADSITE, in which it was speculated that the account may belong to an employee of the organization that published the report, given how closely the timing aligned with its publication. Arbitrator Barkeep49 said that it "may or may not be a coincidence", explaining that "I can say the conversation with us that led to this grant has been going on since early February." Limited information has been made available about the nature of the editor, although the rare decision to grant EC status to a zero-edit account on the day of its creation based on private correspondence with the Committee beforehand indicates that there is something unusual about the situation.

It remains uncertain whether this account has any relation to the WJC (or to any lobbying organization); commenters at the talk page for the ArbCom noticeboard have questioned whether this unknown party has standing to request a case be opened, whether a disclosure is required per WP:COI, and other issues. A more comprehensive explanation came from arbitrator Primefac:

Mschwartz1's sole edit (on the 26th) was to add this case request against Nishidani (mistakenly putting it at Wikipedia talk:Arbitration Committee instead of Wikipedia:Arbitration/Requests/Case, after which it was closed with instructions on how to post it to the correct board).

As of press time, there seems to be no conclusive evidence either way of who or what this account belongs to, despite many fairly strong opinions and speculations being expressed on the talk page.

See also related earlier coverage: "Does Wikipedia's Gaza coverage show an anti-Israel bias?" ("In the media", November 6, 2023) and "WikiProjects Israel and Palestine" ("WikiProject Report", January 10, 2024)


