Welcome to our discussion!
When participating in this discussion, please remember the following:
  • Debate ideas, not people.
  • Avoid discussing or linking to specific examples of harassment or user misconduct.
  • To participate privately, contact our team via email. (Your email may be shared internally but never publicly.)


What this discussion is about

This is the start of a discussion on ways to improve tools and workflows for users to report harassment.

Wikimedia communities have acknowledged community health as a top concern, and disputes between contributors that escalate into harassment are a significant part of that concern. According to a survey conducted in early 2017, 73% of Wikipedia volunteers surveyed reported that they had been harassed or bullied on Wikipedia in the previous 12 months. In surveys and community discussions, users have said that responses to behavioral issues are frequently inadequate. While our projects have developed good processes for dealing with issues like vandalism and edit-warring, the existing systems for reporting, managing, and evaluating user incidents do not appear to be effective at addressing harassment and other serious user conduct issues. Issues that reach administrators are often resolved adequately, but many incidents that could benefit from intervention never get administrator attention. Specifically, of 300 users surveyed in 2017, 84% requested better reporting tools, 77% requested better noticeboards, and 75% requested better policies.

Process and timeline

The Wikimedia Foundation Support and Safety and Anti-Harassment Tools teams are committed to providing resources for community-supported change, be that technical development, further research, or coordination support. The first stage of the discussion is to identify shortcomings in current systems for reporting harassment, in order to determine potential improvements that the Anti-Harassment Tools team might build in the later part of 2018.

Identifying problems

In preparing for this discussion, we've been gathering research—data and analysis—to aid in decision making. Previous community discussions, community surveys, and research and reports have been collected in the sidebar for your review. Please let us know about other discussions or research so we can add them to the collection.

From community discussions and community surveys, the Anti-Harassment Tools team and the Support and Safety team have identified shortcomings related to the current methods for reporting harassment. From these we have created a preliminary list of problems that could potentially be addressed with changes to tools and workflows. We invite you to add problems you have noticed to the list for community discussion.

The discussion about this list will help us prioritize which problems our software developers could work on later in 2018.

Problem list

Working draft of a problem list, to be added to during this discussion.

Noticeboards do not facilitate effective case management

  • Basic case management tools are lacking. For example, there is no acknowledgement that an incident is under review, and updates to a case must be made manually.
  • There is no method to triage or prioritize cases.
  • Lack of structure often makes reports difficult to understand or analyze, because information and evidence are incomplete.
  • The open nature of noticeboards encourages uninvolved parties to contribute tendentious or contentious comments.
  • There is no method to assign cases so that people with the right skill set manage them (a sketch of such a case record follows this list).
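
To make these gaps concrete, here is a minimal sketch of the kind of structured case record such a workflow could be built around. This is a hypothetical illustration only: none of these names, fields, or statuses exist in MediaWiki today, and the sketch simply restates the bullets above (automatic acknowledgement, triage priority, assignment, and an automatic audit trail) as a data structure.

```python
# Hypothetical sketch of a structured harassment-report record.
# Nothing here is an existing MediaWiki or Wikimedia API; the names,
# fields, and statuses are assumptions restating the bullets above.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class Status(Enum):
    SUBMITTED = "submitted"        # reporter gets an automatic acknowledgement
    UNDER_REVIEW = "under review"  # visible confirmation that the case is being handled
    RESOLVED = "resolved"
    DECLINED = "declined"


class Priority(Enum):
    LOW = 1
    NORMAL = 2
    URGENT = 3  # e.g. threats, moved to the front of the queue


@dataclass
class HarassmentReport:
    reporter: str
    accused: str
    summary: str                                        # structured prompt, not free-form prose
    evidence: list[str] = field(default_factory=list)   # diff links, required up front
    status: Status = Status.SUBMITTED
    priority: Priority = Priority.NORMAL
    assignee: Optional[str] = None                      # someone with the right skill set
    history: list[str] = field(default_factory=list)    # updates recorded automatically

    def assign(self, admin: str) -> None:
        """Assign the case and log the state change automatically."""
        self.assignee = admin
        self.status = Status.UNDER_REVIEW
        self.history.append(
            f"{datetime.now(timezone.utc).isoformat()}: assigned to {admin}"
        )
```

The point of the sketch is that acknowledgement, prioritization, and assignment become side effects of the data model rather than manual talk-page edits.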

Users underreport behavioral issues

  • Fifty-three percent of English Wikipedia AN/I survey respondents avoided making a report at the Administrators' Noticeboard/Incidents because they were afraid it would not be handled appropriately.

Policy violations are under-enforced

  • Lack of detail in policy elevates the importance of administrator discretion and discourages administrators from taking action against abusive behavior, because of the possibility of losing standing in the community.
  • Many wikis have a high bar for enforcement of behavioral policy. Administrators are frequently hesitant to take action except in straightforward or blatant cases.
  • There is a lack of users with the skill set to manage some types of incidents.

Discussion about Problem list items

A place for general discussion about items on the problem list.

Discussion about Noticeboards

A place for a targeted discussion about the strengths and weaknesses of using the noticeboards for reporting harassment and other serious user conduct issues.

General discussion about improving harassment reporting and workflows

(copied from the discussion on the Signpost talk page about the AN/I research)

  • A bias that I see in the study is that it does not acknowledge Wikipedia's very strange moderation system. Since Wikipedia is almost entirely community based, until a few years ago there was hardly any reporting system for extreme violence, like death threats or suicide talk. Even more recently the most awful kinds of social problems which no other website would tolerate, including problems which are traumatic to even encounter as a third party like sexual abuse, were issues that fell to random community volunteers because of a Wikimedia Foundation practice that there would never be paid staff intervention in any Wikimedia community occurrence. The Wikimedia community still has not established any lasting norms and I think that everything is in transition, and mostly crazy. I do not look at the current state of things and imagine that anyone is lacking for ideas on reforming it if the funding were available and if it were socially appropriate to use the funding to address the madness.
The Administrator noticeboard exists to settle conflicts related to Wikipedia editing. I do not think that there is anyone on the Administrator board that ever wants to address harassment, stalking, violence, sex danger, criminal derangement, or people who are incapable of socializing. If anyone wants to fix the Admin board, it is possible to divide the pool of issues into "what any sane person would say that crowdsourced volunteers should manage" versus "what any sane person would say requires special training to manage, and probably paid staff". We are at an impasse because the Wikimedia Foundation will not hire paid staff to address social misconduct on Wikimedia projects, nor is there any Wikimedia community organization which has ever requested Wikimedia community funding to address harassment directly. I see no fault in the WMF because there is consensus that the WMF not have paid staff engage too much with the Wikimedia community.
I really feel sorry for the admin board and the personal risk that administrators assume in making themselves available. The WP:AfD process is intense, but the community evaluation process demonstrates that the community expects that admins resolve wiki conflicts, and not that they need to perform exorcisms. I think the research shared here has diminished value for not acknowledging a community insight that we already had: the admin board is a last resort and being used as a catch-all because there is no place to kick other problems. This research project begins with the presumption that all problems have to go to the admin board, when actually, the Wikipedia community has always behaved as if the admin board is the place for problems with a wiki nature and that the Wikimedia community's funding pool, either through the WMF or otherwise, will be the part of the process for generating ethical judgements of suspected deviancy beyond the context of the software interface. There should be another place, not the admin board, where the problems which would emotionally damage a normal person to hear should go. I think that the criticism that this research surfaces is too much confused over issues which the admin board does not even want to address. Blue Rasberry (talk) 23:29, 30 April 2018 (UTC)
IMO one of the problems with ANI is that it's been allowed to become run by the comments of too many uninvolved and/or inexperienced users and wannabe admins (what we casually refer to as the peanut gallery). Due to our open access nature, many people think it's cool to be a 'forum moderator'. Kudpung กุดผึ้ง (talk) 02:15, 1 May 2018 (UTC)
Those "peanut gallery" editors cannot run AN/I because they cannot decide & block. Kudpung, good to read how you think about those non-admins. Then, when you handle a case, the reporter and the accused editors (involved by definition) suddenly have become experienced in AN/I business and you do take them serious? How can you ever make a decision when you are this biased re non-admins? - DePiep ( talk) 09:50, 1 May 2018 (UTC) reply
Bluerasberry, your comments are really insightful and have made me re-think what ANI is all about. It's still swirling around in my brain but I think your idea that the admins aren't to blame for not engaging in discussions that are harmful to them is something I'd never considered. It's always been a kind of "what jerks they are for not dealing with this stuff" Zeitgeist. Ideas like this are hard for the community to swallow, though: that the community has limits on its ability to self-regulate or self-resource and may need help from outside. ☆ Bri (talk) 21:07, 1 May 2018 (UTC)
@Bri: We already send certain legal complaints to the Wikimedia Foundation legal team. Lately the Wikimedia Foundation Support and Safety team has started taking serious violent threats. These are the precedents we have for sending some issues to paid external support services. There are some Wikimedia chapters who use their paid staff to resolve some social problems, like addressing people in conflict at events in the role of security officers. The most stressful issues at the admin board are intense harassment happening in the wiki space but unrelated to wiki, and which include suggestion of violence, suggestion of sex negativity, and suggestion of personal threat. Wiki volunteers are happy to moderate wiki disputes but when something is creepy, but not creepy enough to trigger a Wikimedia Foundation response, then there is a service failure. On the creepy danger scale, the ANI board can take anything that ranks 1-2 (2 is slightly concerning) and the WMF will take anything that ranks 8-10 (8 being evidence of threat). 3 is "somewhat concerning" and 7 is "really scary but ambiguous". Volunteers do not come to Wikipedia because they want to deal with problems ranking 3+ on this scale, and yet these kinds of problems fall to ANI and ArbCOM. Way too often, administrators and arbitrators who have elite skills to resolve wiki issues get their time and emotional labor wasted on legal, violent, and harassment issues which require a non-wiki skill set to address. I would not prohibit willing wiki volunteers from taking these issues sometimes, but considering that the role specification for admins and arbs is wiki expertise and not social work, it is not a natural fit to expect expertise with domestic violence, mental health, online stalking, and social deviancy from the people who get appointed based on wiki proficiency. I think that there should be trained staff on these issues. Organizations which have volunteers or staff who regularly expose themselves to trauma need to offer their agents regular access to counseling to debrief and process and get regular reality checks on their personal safety, because by responding they actually get involved in the dangerous situation.
Another big problem with all of this is the lack of visibility. The WMF just went through an entire research project in this and I would say that they have a conflict of interest in this research. It is unfortunate, but historically the WMF has been structured in a way that if they acknowledge that harassment exists then for whatever reason the organization interprets that as a failing of their operations. Of course this is not true and there is no shame in admitting that one is the victim of harassment, because the victim is not to blame. While any and all individuals in the WMF acknowledge problems, collectively the organization has an aversion to identifying them. A premise in this study is that the reports which go to ANI are supposed to go to ANI. This has never been the case - ANI is not a police force and lots of things happen on wiki / online which, if they happened on the street, would result in bystanders calling the police. When an issue is 4+ on the scale of 1-10 for danger, a person would call the police if they witnessed that social transgression in-person in an urban crowd. The on-wiki tolerance for social transgression goes far beyond what is tolerable in person and this is not natural.
I advocate for either Wikimedia chapters who hire special staff or non-wiki nonprofit organizations with expertise in social work to handle these issues. I expect that these issues number in the 1000s/year on wiki globally. If we actually had a reporting system rather than pushing them inappropriately to ANI I think that many would be easier to identify and sort. Blue Rasberry (talk) 21:45, 1 May 2018 (UTC)
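
Read as a workflow rule, the informal 1-10 scale in the comment above describes a routing table with a hole in the middle. A minimal sketch, assuming only the thresholds given there (ANI takes 1-2, the WMF takes 8-10, nothing covers 3-7); the function and venue labels are hypothetical, not any existing process:

```python
# Hypothetical illustration of the service gap described above: on the
# informal 1-10 "danger" scale, volunteer admins at ANI take the low end,
# WMF staff take the high end, and the middle falls to nobody. The
# thresholds come from the comment; everything else is invented.

def route(danger: int) -> str:
    """Return the venue, if any, that currently handles a report of this severity."""
    if not 1 <= danger <= 10:
        raise ValueError("danger must be between 1 and 10")
    if danger <= 2:
        return "ANI (volunteer administrators)"
    if danger >= 8:
        return "WMF Trust & Safety (paid staff)"
    return "no venue: the 3-7 service gap"


if __name__ == "__main__":
    for level in (2, 5, 9):
        print(level, "->", route(level))
```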
  • After reading the report twice, I still felt I was missing something. It is this: questions like "Are you an admin?" and "Did you close any AN/I report last year?". Most survey results are more understandable and logical when assuming (ouch) that most respondents are AN/I-active admins. For example this could clarify why so many respondents want to forbid non-admins to engage, and why so little self-criticism is visible (more below). Also missing is the angle "What do you think about the quality of closures?" (i.e., does the closure reflect the discussion?). My experience is that admins have an enormous leeway in making individual (personal) decisions, covered from criticism by the no-wheel-warring rule and the ~complete absence of any way to appeal. Then 53% is "fearing [a report] would not be handled appropriately", but 'not .. appropriately' is not fleshed out any further.
Tellingly, the call is for "More guidelines" (Harvard says this too), but no hint is made of more reasonable guidelines. Introducing unfair or unbalanced guidelines will not improve the "community health"; it will only let careless admins off the hook (an indicator is the many boomerang references). The survey outcome does not point to this in any way. (And one guideline less could be implemented today: "Personal attacks are allowed at AN/I".)
All in all I get the sense that disproportionately more respondents are admins, and crucial questions are missing, hence the survey is evading the issue of admin conduct at AN/I. - DePiep (talk) 09:43, 1 May 2018 (UTC)
  • Wow. Now that this discussion has arrived at this talkpage (from the Signpost talkpage), I get that the topic is harassment, not AN/I page improvement (i.e., not improvement of the AN/I process itself, but the harassment issues [to be] reported there). Must say, the Signpost header did not help. - DePiep (talk) 18:55, 3 May 2018 (UTC)
    Hello @DePiep:, the AN/I research that was reported by the Signpost covers more than just AN/I harassment cases. Among other things, it examined in general which types of cases work well and which don't at AN/I. So, I don't think the heading was incorrect. I'm sorry if linking the discussion was confusing to you. I copied the discussion here because a good bit of it was closely related to this new discussion we are starting.
    The AN/I research results can be used by the community in various ways beyond improvements to reporting harassment. The Anti-Harassment Tools team and Trust & Safety team are using the AN/I research along with other research to think about potential improvements to reporting and workflows related to harassment and similar serious user misconduct issues. SPoore (WMF), Trust & Safety, Community health initiative (talk) 17:20, 4 May 2018 (UTC)
    What is "AN/I harassment"? - DePiep ( talk) 21:14, 4 May 2018 (UTC) reply
    The Signpost title is: "Admin reports board under criticism". It does not refer to harassment. Also, please don't put the blame on me ("I'm sorry if ... was confusing to you."). Why blame me for the confusion you spread? SPoore (WMF) - DePiep (talk) 20:54, 6 May 2018 (UTC)