From Wikipedia, the free encyclopedia

As Wikipedians we represent the largest and most read encyclopedia ever to have existed. Therefore, it is not self-aggrandizing to state that this makes us a very important group of people. For many, we represent the collective knowledge of the world — a position and responsibility not to be taken lightly. While individual contributions may not be significant, as a collective our work is revolutionary. As a group, we are a highly selected bunch and poorly represent the world at large. We are disproportionately Western, male, highly educated, and drawn from certain professions. This matters because we risk introducing systemic biases into our coverage. However, not all bias is systemic, and not all systemic bias is unique to Wikipedia.

Some bias arises because we are human, and humans are prone to logical fallacies and misconceptions. Other bias is systemic, caused not by the make-up of Wikipedians but by the make-up of the world. Even with perfect proportional representation, some bias would remain.

One strategy that Wikipedia employs to counter bias is analogous to the 'many eyes' principle of software bugs, formulated by Eric Raymond as Linus's law: "given enough eyeballs, all bugs are shallow". The Wikipedia version states: "given a sufficient number of varied viewpoints, all bias is avoided". This often holds, and some of our best articles are on highly controversial subjects, with extensive debate on the talk pages. To an extent it is true that bias can be avoided this way, but many eyes do not necessarily overcome the bias that arises simply because we are human. The best strategy to avoid bias is to make ourselves aware of it. This essay attempts to shed light on some biases we Wikipedians (and our fellow humans) have, and ways to avoid them. It explores how these biases affect Wikipedia, how they synergize with Wikipedia's unique systemic biases, and how they affect the world at large. It lists some common biases as well as strategies to avoid them, and discusses how you can go about making other people aware of bias in their reasoning. I invite other editors to contribute, to avoid any bias introduced by this essay having only one author.

Me biased?

What is bias?

All sources are biased to some extent. Bias arises when we see only a portion of reality, obscuring the Truth. As humans we can never see all of reality, and the reality we perceive always depends on who does the seeing. In fact, the concept of any objective reality or truth is itself debated. That doesn't mean that all sources, or all people, are equally biased: some sources are heavily biased, and some only very slightly. Sources and opinions can be biased in favor or disfavor of different things.

In science

Science as a discipline attempts to minimize bias. It does so by adhering to scientific methodology, which through empirical study and rational argument increases the likelihood that what it finds is close to "the underlying truth/reality". It admits that truth is fickle, and most disciplines accept that no perfect truth can be achieved. Statistics is used to overcome bias in order to catch a gleam of such "underlying truth". Statistics governs most of science, and even in mathematics, number theory has a lively debate over whether something as absolute as the rational numbers exists outside our imaginations — or whether it's all just statistics.

This view that no perfect truth can ever be articulated is not compatible with writing an encyclopedia, or for that matter with living any ordinary life. It is true that I need to eat in order to survive — even though, through sheer luck and quantum fuzziness, I could conceivably go on sustaining myself through osmosis, eating and drinking nothing but air. This is despite my never having really tested the theory. Instead of saying "It is highly likely that I need to eat to survive", I can shorten this to "I need to eat in order to survive". Science operates the same way, stating that things that are very likely (as defined by statistics) are true.
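
As a rough illustration of that last step, here is a minimal sketch in Python; the numbers and the "very likely" threshold are invented for illustration and come from nowhere in the essay:

    # Invented numbers: once the estimated probability of a statement clears
    # some "very likely" threshold, we drop the hedging and assert it as
    # plain fact, the way both science and everyday speech do.
    observed_cases = 100_000   # hypothetical recorded cases of total starvation
    fatal_cases = 100_000      # hypothetical outcome: all ended in death

    p_need_food = fatal_cases / observed_cases  # empirical estimate: 1.0
    very_likely = 0.999                         # arbitrary cut-off for "true"

    if p_need_food >= very_likely:
        print("I need to eat in order to survive.")           # stated as fact
    else:
        print("It is highly likely that I need to eat to survive.")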

In subjective judgement

Bias exists in non-scientific disciplines as well, and is caused by the same phenomenon: only seeing part of the whole. Bias can equally exist in subjective statements. For example, I can say "I like chocolate cake". In this statement I don't see the need to qualify with "but not cake that is laced with arsenic", and I can't be sure that the statement is an absolute truth about all chocolate cakes, which would require adding "I like the types of chocolate cake I know of, but there may be types I don't like". Thus the statement is biased towards my preconceived notion of what a chocolate cake is. Normally when we speak of bias we disregard this, but it is bias nonetheless. So my statements about my likes are true, but not "absolutely true".

Neither do I need to consider what others think about this statement: "but Lisa doesn't agree that I like cake" is just not relevant. So while stating "I like cake" is fairly unbiased, stating "cake is good" is considerably more biased. This is one of many reasons that Wikipedia, to uphold neutrality, discourages anything potentially subjective. Assuming that statements always hold is itself a bias: stating that "most people like cake" may be a slightly biased statement, while stating "everyone likes rock music" is heavily biased.

We need bias to survive. Wikipedia accepts some bias, but not 'too much'. When we strive for neutrality, some of these everyday strategies for rationalizing bias become our downfall. Most of Wikipedia's policies, such as those on weight, false balance and best sources, negotiate bias to find a proper middle ground on which to build an encyclopedia. While adhering to policies and guidelines is a necessity for contribution, we are even better served by making ourselves aware of our biases, both conscious and unconscious; only then can we hope to minimize them. Guidelines have so far done little to address this.

What isn't bias?

Some value statements may be perceived as biased, and may rightfully be biased, but are so central to our mission that it isn't fruitful to discuss or contemplate them. Some biases are necessary for life, such as our biased judgement that living is better than dying. We have no truly "rational" reason to consider life better than death, but the alternative is not viable for our continued existence: if we did not consider life better than death, we would not exist. Such a discussion may be apt at philosophy of death, but not really anywhere else on Wikipedia.

Some of Wikipedia's golden rules are likewise built on value statements where it is a bad idea to contemplate bias. For example, "All people are equal", or "everyone deserves access to Wikipedia". To question these statements with "Everyone is equal except ..." is not a good path to tread. If you cannot agree with these statements you clearly do not belong here, as debating them is antithetical to building an encyclopedia.

Some of our core values fall in between, such as "Knowledge is good". At first glance it may seem an uncontroversial position, but there are plenty of policies that navigate "bad knowledge" or "unwanted knowledge". The guideline on offensive material, and in particular the "principle of least astonishment", treads this middle ground. Similarly, prognostic information about diseases can be placed under the header "prognosis", while being omitted or only touched upon in the lede, when the information is likely to be unwanted (such caution is, for example, not warranted for the common cold). The balance between presenting harmful and beneficial information favors a cautious approach, where those who wish to learn of prognosis can do so under a separate header — and those who wish to avoid such knowledge do not have it thrust in their face.

Systemic bias

We can counter the misrepresentation caused by systemic bias by increasing the number of female editors, editors from developing countries, and editors of color. We can also spread awareness of the problem and encourage editors who care about traditionally male hobbies and topics to also write about traditionally female subjects. Both these strategies are employed, at times very successfully.

We can however never entirely overcome systemic bias; the state of the world forbids us. There are inequalities which we cannot affect: in access to the internet, education, and free time for volunteering; in the mismatch in already existing content; and so on. People value contribution differently, and have different takes on what is required to contribute: there are professors who avoid editing because they feel they lack sufficient knowledge, while some teenagers enthusiastically rewrite articles on complex topics. Even if we level the field with an equal number of men and women, and a proportionally even representation of editors from developed and developing countries — there will still be an absolute mismatch in skills and experiences, and this will continue to cause bias. Projects such as Whose Knowledge? try to counteract this, but for them to succeed we must first collectively admit that there is a problem. This essay argues that we need to go further: we need to look at biases beyond systemic bias, and it is the duty of each Wikipedian to avoid such bias.

Beyond systemic bias

Strategies to avoid bias

The way we write Wikipedia is flawed

It's the only way we can write Wikipedia

The many eyes principle

Writing what you know, a common cause of bias

People most interested in improving an article may have a connection to its subject.

A rule of writing is "Write about what you know". This is generally good advice, but when your aim is neutrality, you need to step beyond what you know. Writing an article on a subject you know a great deal about and adding the sources later is for the most part a bad idea. You need to ask "Why do I know what I know?" and "How do I know what I know?". Sometimes the answer is simple: "I read it in a book" or "I heard it at a lecture". But the follow-up questions, "Why did I read that book?" or "Why did I attend that lecture?", are harder. If the answer is "My local bookstore stocked that book and I found it interesting", then the question turns into "Why do they stock that book?". You don't have to find definitive answers to these questions for them to be useful when thinking of bias. Bias can never be truly overcome, only minimized.

When writing a book we don't always need to reference each statement, but on Wikipedia we often find that we are required to do just that. So if you're like any ordinary editor, when you want to add a statement to an article you will go searching for the book you read it in — or, if you can't find that source, for another source that states the same thing. This is colloquially called "doing research", and Wikipedia depends on it. However, it's not really research: you're looking for a source for a statement you already think is true, and the strategies you employ to find that source are influenced by your already believing it. Not all edits are made like this, but many are — and those that are not are subject to the same biases, just to a lesser degree ("Why did you know to look for that source to write about that topic?").

In an ideal world you would, for each statement, perform an unbiased review of the entire world literature on the subject before adding it to Wikipedia. This doesn't work; had we operated like this, there would be no Wikipedia, nor in fact any knowledge whatsoever. In science we have a saying: "You can find a study to support anything". This doesn't mean that whatever the study concludes is true — and if there are 100 times more studies indicating the opposite, it is likely to be false. If what you "know" is supported only by such a study, you introduce strong bias by uncritically citing it on Wikipedia. Wikipedia is about finding an acceptable balance between cherry-picking and performing entirely unbiased systematic reviews.

These behaviors introduce biases that are not so much systemic to Wikipedia as systemic to being human. (Yes, you can argue that allowing edits by those who do not possess in-depth knowledge of evidence makes it systemic to Wikipedia, but that is beside the point: without those people Wikipedia would not exist.)

This underlies the principle on Wikipedia of preferring secondary sources, such as reviews or books, over primary sources, such as articles of original research or anecdotes. We can often glimpse some semblance of how authoritative a primary source is from secondary measures of impact, such as whether the article is from a highly regarded journal, or whether it is published by a good publisher. ... Increasingly, society is waking up to this, and what we call evidence is being co-opted, with reviews of poor methodology being used to promote certain messages.

Simple questions to ask yourself before writing

There are a handful of questions you can ask yourself before contributing in order to decrease the risk of introducing bias:

  • How and why do I know this? If you're looking for a source for a statement you want to add to Wikipedia, you're going to favor sources that confirm what you already believe.
  • Why did I get hold of this source?
  • Is this the best source for this statement?

Wikipedia will never get rid of these biases, but a good strategy is to ask yourself these questions, and to see if any of your answers indicate one of the following cognitive biases.

Strategies to avoid bias:

  • Feigning disinterest — Asking yourself "If I knew nothing on the topic, where would I start?" can be a good idea, even if you have oodles of good sources at hand. Much like Rawls's original position in political philosophy, this works to diminish some of our bias. If such a search would direct you to the sources you already have at hand, good. If not, or if some other source comes up first, ask "Why don't I use that one?"

If the answer is:

  • "It's no good", ask "Why is it no good?". Preferably you want other sources to describe any potential issues.
  • "I don't have access", ask whether someone else has access. Wikipedians at the {{Wikipedia:WikiProject Resource Exchange/Resource Request|Wikipedia:Resource Exchange]] are very helpful. If your topic is in a certain field, WikiProjects may be helpful, such as Wikipedia:WikiProject Medicine.

Common biases for Wikipedians

Confirmation bias

Wikipedians tend to write about what interests them and what they know. While it is difficult to change what interests you — it is less difficult to approach your interests with a critical eye.

We are likely to have opinions on topics that we know something about. Without getting into the Dunning-Kruger effect, and having opinions on topics we have little knowledge of —

On talk-pages

Confirmation bias arises not only in choosing what information to include in and omit from articles, but also in discussions, especially heated ones. When you disagree with someone, you are far more likely to interpret new facts they raise, or their omissions, as guided by bad faith. Assuming a conflict of interest, ulterior motives or other incriminating circumstances is easy when you disagree; I think you'll find it very rare to question the motives of an editor you agree with. Be careful when accusing someone of such things, as it may harm the quality of the discussion, your odds of coming to an agreement, and Wikipedia at large (it becomes harder to address real cases of bad faith when we accuse everyone of it).

Disconfirmation bias

There can be scientific consensus that something does not work without it ever being tested. Absolute proof is rare or even impossible, and often when scientific consensus deems something ineffective, there need not be studies. We were, for example, pretty sure that the moon was not made out of cheese without actually going there first. Taking surface samples and analyzing them did not strengthen that assertion — it was already a practical certainty. Without doubting the veracity of the statement "the moon is not made of cheese", knowledge of the make-up of moon-rock is irrelevant to the strength of the statement. The same is true for many other practices and statements that are pseudoscientific or adhere to alternative views. Scientific consensus may change — sometimes drastically — but Wikipedia reflects current scientific consensus; it is not a crystal ball which predicts future consensus. Do not attempt to portray the future.
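
To make the moon example concrete, here is a minimal Python sketch of Bayes' theorem with invented probabilities (none of these numbers appear in the essay): when a belief is already a practical certainty, even decisive new evidence barely moves it.

    def posterior(prior, p_evidence_if_true, p_evidence_if_false):
        """P(hypothesis | evidence), by Bayes' theorem."""
        numerator = p_evidence_if_true * prior
        return numerator / (numerator + p_evidence_if_false * (1 - prior))

    prior = 0.999999999         # "the moon is not made of cheese", pre-Apollo
    p_rock_if_not_cheese = 1.0  # rock samples are expected if it isn't cheese
    p_rock_if_cheese = 1e-9     # and essentially impossible if it were

    updated = posterior(prior, p_rock_if_not_cheese, p_rock_if_cheese)
    print(f"before: {prior}  after: {updated}")
    # The probability was already so close to 1 that the update changes
    # nothing in practice, which is why the samples added no strength.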

The bias at work here is called disconfirmation bias, and it relies on logical fallacies such as shifting the goalposts, and ... where ever more proof is required

Backfire effect

It is helpful to know of the backfire effect, both as it affects oneself and as it impacts the judgement of others. In short, it stipulates that when people are presented with facts that disprove a position, their response is to double down, defending the position more strongly.

many caveats, with different studies showing...

A good series of podcasts...

Search bias

Rejection of authority

In circumstances like these we rely, not on ignorance, but on our knowledge, or conviction, that if the result we are concerned about were likely to arise, it would have arisen in some of the test cases. This use of the inability to prove something true supposes that investigators are highly skilled, and that they very probably would have uncovered the evidence sought had that been possible.

— Irving M. Copi, Carl Cohen, and Kenneth McMahon, Introduction to Logic, Fourteenth Edition (2014), p. 132

Similar concepts

Overton window

An illustration of the Overton Window, along with Treviño's degrees of acceptance.

The term comes from political science and public policy, and denotes that within each context there is a "window of discourse", ranging from current policy through popular, sensible, acceptable and radical to unthinkable. These lie on a continuum, akin to a bell curve. This relates very much to the fallacy of the false middle, with the caveat that there is no real middle.

This is well exemplified in politics, where the framing of a debate varies between countries. The phenomenon can also be found in things such as which word choices are acceptable.

While much of this can be put down to systemic bias, not all of it can. Importantly, the most popular position is not by default the correct or least biased position. Additionally, framing debate according to acceptable and unacceptable discourse is not necessarily bad: Wikipedia defines a certain range of acceptable discourse, promoting scientific and sourced material over pseudoscientific and unsourced testimony and opinion.

  • It is a valuable and worthwhile exercise to contemplate where this window lies, both on Wikipedia and in the world at large. What lies outside the window of acceptable discourse is not objectively wrong.

Examples of acceptable framing

The following are examples of how word and image choices vary according to language versions of Wikipedia. Such variations are not limited to language, and these examples are simply a way to illustrate differences in perceived neutrality. Acceptable terminology or coverage similarly varies between different topic areas on a single Wikipedia.

Adolf Hitler's role in the Holocaust

These articles are chosen to illustrate varying perceptions of neutrality, and of what constitutes a high-quality article, on three different Wikipedias. The German and English versions are rated as a Good article or its equivalent. All three articles are heavily edited, far more so than the average article. They should therefore all have been subjected to the "many eyes" principle described above, and it is a struggle to label this variation as systemic bias.

Here, in this thoroughly unscientific comparison, the ledes of the articles on Adolf Hitler in English, German, and Hebrew can be placed along a continuum according to how they describe the acts of the Holocaust and Hitler's role in them.

  • English Wikipedia: "Under Hitler's leadership and racially motivated ideology, the Nazi regime was responsible for the genocide of at least 5.5 million Jews [...]" (2nd degree, active voice)
  • German Wikipedia: "Im Holocaust wurden etwa 5,6 bis 6,3 Millionen Juden, im Porajmos bis zu 500.000 als „asozial“ [...] und [...] ermordet. Hitler autorisierte die wichtigsten Schritte des Judenmordes und ließ sich über den Verlauf informieren." ("In the Holocaust, about 5.6 to 6.3 million Jews, and in the Porajmos up to 500,000, were persecuted and [...] murdered. Hitler authorized the most important steps of the murder of the Jews and was kept informed of its course.") (1st degree, passive voice)
  • Hebrew Wikipedia: "במהלך המלחמה נטבחו באופן ג'נוסיידי מיליוני יהודים ולא-יהודים ביזמתו ובהכוונתו של היטלר, ששאף לממש את עקרונות תורת הגזע." ("During the war, millions of Jews and non-Jews were slaughtered in a genocidal manner at the initiative and direction of Hitler, who strove to realize the principles of racial doctrine.") (1st degree, active voice)

The English version uses "genocide", the German: "murder", and the Hebrew: "slaughter".

We find the same variation in the image choices in the articles on The Holocaust, with the caveats that the English version uses "systematic murder", and that the Hebrew version omits lede images entirely. The gradient of image choices differs markedly, with images showing few to no dead bodies in the English version, slightly more in the German version, and very severe imagery in the Hebrew version.

Alternative medicine

Motivated reasoning

... not really a bias

This is where things such as conflict of interest are relevant

Countering bias

We've all been there: debating with someone we consider biased, who holds an opinion or conception of facts that we believe is not grounded in reality. There are essentially three constructive ways to enter such debates. Each path has unique pitfalls, and there are risks in taking any one of them. This essay argues you should start from option 1 or 2, reserving option 3 as a last resort.

Countering bias is an important skill for a Wikipedian, because we are often met with biased opinions, inaccurate beliefs, and uninformed opinions. You can't write Wikipedia on your own, and countering bias by making friends is better than doing so by making enemies. Friends can help you in the future, while enemies just make everything more difficult. We've all snapped at one time or another, often in the face of "massive ignorance", skipping right to option 3. We may come to regret this, and even if we don't, we're likely to scare off potential new editors. Someone who finds a topic important enough to argue about, even in a biased fashion, can likely become a positive influence if convinced to rethink.

  • Convince them of your position — The most important thing to remember here is that your goal is not to prove your opponent wrong, at least not in the traditional sense. What you want to do is to prove to them that you are right. This approach is rare on the internet, but the scientific literature on "correcting" people who hold inaccurate positions suggests a few things you will want to avoid: triggering the backfire effect, repeating their falsehoods in order to refute them, calling someone's world-view into question, and insulting them. Even if proving yourself right implies proving them wrong, try to focus on the underlying virtue of your position rather than the fallacies of theirs.
  • Reevaluate your own position — Naive realism ... We may be missing important aspects... While Wikipedia doesn't formally respect titles or... (Essjay), it might still be valuable to call your own judgement into question, especially if you are editing in a subject field you are less comfortable with.



The central message of all this advice is "Be humble". Wikipedia is a collaborative project, and some of the most misguided editors can change and ... The author of this text started off... and we have many who started off as trolls and ..., who by simply being exposed to our environment have chosen to reform or... While there is no empirical evidence, these case reports and a general ... suggest that "it's easier to reform a troll or vandal than to recruit a new editor". Someone who takes the time to disrupt Wikipedia obviously thinks Wikipedia is important enough that these actions are worth their time.

While

  • Ignore or report them — Wikipedia:Don't feed the trolls... intentional trolling and unintentional trolling. An example of an unintentional troll is someone who believes in alternative medicine. Despite..