This is an essay. It contains the advice or opinions of one or more Wikipedia contributors. This page is not an encyclopedia article, nor is it one of Wikipedia's policies or guidelines, as it has not been thoroughly vetted by the community. Some essays represent widespread norms; others represent only minority viewpoints.
This page in a nutshell: Despite extensive countermeasures, vandalism had overwhelmed Wikipedia by 2009. Blocking vandalism at the source, using sets of usernames or trust codes earned by IP addresses, could reduce Wikipedia edits by 80% and make patrolling roughly five times easier.
The concept that vandalism had won by 2009 was not an idea that some people wanted to accept in February 2009. Consider all the effort that had been expended to analyze the growing range of vandalism and hackings, and imagine all the intense work done to create and monitor bot programs that reverted numerous botched edits. No wonder some people would not, or could not, acknowledge being defeated by vandalism. They had worked so hard that they did not even have time left to admit failure. To admit that all that work had accomplished nothing would be an unbearable reality.
There were numerous ways in which the vandalism slipped past all the massive efforts to thwart it:
Confusion was just too easy to cause, so the botched text was overlooked and left that way for months or years.
Basically, it was just too hard to fix all the problems fast enough:
It is not easy to accept the actual realities of how difficult fixing articles can become, and how long it really takes to spot the "hidden" problems. Even facing the impact requires calculating the thousands of minutes lost:
A problem overlooked for 20 days means that roughly 60 minutes × 24 hours × 20 days = 28,800 minutes passed since the article was botched. If the botched edit took only 30 seconds (half a minute) to make, the damage lasted 2 × 28,800 = 57,600 times as long as the edit that caused it.
To use a Biblical time period: if vandalism is left uncorrected for "40 days and 40 nights", it remained about 115,000 times longer than the initial 30-second edit: 60 minutes × 24 hours × 40 days/nights × (60 ÷ 30 seconds) = 115,200. It took less than a minute to hack, but the damage persisted 115,200 times longer before being corrected.
There were hundreds of thousands of people who could not, or did not, stop and fix the vandalism. When 6 text sections about "Mobile phone" features were axed on 23 December 2008, it took over 20 days to restore that article to being, once again, fully coherent to general readers. Meanwhile, the botched text was "read" by nearly 4,000 readers per day, for a total of about 82,000 page-views over those 3 weeks, while no one fixed that extensive problem.
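The arithmetic behind those figures can be checked in a few lines. This is only a sketch of the essay's own calculations; the 20.5-day figure is an assumption chosen to match the "3 weeks, ~82,000 views" numbers above:

```python
# Rough arithmetic behind the figures above (values taken from the essay).
MINUTES_PER_DAY = 60 * 24

# A 30-second botched edit left uncorrected for 20 days:
damage_minutes_20d = MINUTES_PER_DAY * 20          # 28,800 minutes of damage
edit_minutes = 30 / 60                             # the edit took 0.5 minutes
ratio_20d = damage_minutes_20d / edit_minutes      # 57,600x the edit's duration

# The "40 days and 40 nights" case:
ratio_40d = (MINUTES_PER_DAY * 40) / edit_minutes  # 115,200x

# Page-view impact of the Mobile phone example (~4,000 views/day, ~20.5 days):
total_views = 4000 * 20.5                          # about 82,000 page-views

print(int(ratio_20d), int(ratio_40d), int(total_views))
```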
IMHO, it is time for a wiki 12-step program:
Be BOLD and face reality:
Anonymous editing is just not working to reduce vandalism. It is too difficult to spot all the problems. The definition of insanity is: "Doing the same bots over and over again, and expecting different reverts."
The solution begins with the realization that top management must change policies to deter the rampant vandalism, over 95% of which is caused by unregistered IP-address users.
It is time to find some kind of semi-anonymous user mode, as a compromise, to end this frantic environment in which the edits to many articles are over 90% vandalism and reverts. Perhaps if each user were allowed a set of different usernames, so as to seem anonymous but remain accountable when defacing text, the vandalism could be reduced.
Another tactic, which would allow users to remain anonymous and avoid the bother of logging in, would be to establish "trust levels" for each IP address. Certain trust-code levels would be required to gain edit access to some articles. For example, IP addresses could earn seniority trust codes that rise the longer they edit pages without incident. Edit violations (such as hackings or pranks) would cause those trust levels to drop, but they would be allowed to re-elevate after enough time to "heal the trust". Another option would be to grant some professional facilities higher trust levels for their sets of IP-address ranges. Trust codes could be stored with each edit. Overall, the result would still be the "free encyclopedia that anyone can edit", just not whenever they want to edit it: only after first earning the trust, or logging in, before changing those pages.
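As a thought experiment, the trust-code lifecycle described above might look like the sketch below. Every name, threshold, and penalty here is an invented illustration, not an actual MediaWiki feature:

```python
# Hypothetical sketch of the per-IP "trust code" idea described above.
# The class name, the TRUST_TO_EDIT threshold, the +1 daily gain, and the
# -5 violation penalty are all illustrative assumptions, not real software.

TRUST_TO_EDIT = 10  # minimum trust level needed to edit protected articles


class IPTrust:
    """Tracks an earned trust level for one IP address."""

    def __init__(self, trust: int = 0):
        self.trust = trust

    def record_clean_day(self) -> None:
        """Seniority: trust slowly rises while the IP edits without incident."""
        self.trust += 1

    def record_violation(self) -> None:
        """A hacking or prank drops the trust level sharply (floor of zero)."""
        self.trust = max(0, self.trust - 5)

    def can_edit(self, required: int = TRUST_TO_EDIT) -> bool:
        return self.trust >= required


# Usage: twelve incident-free days earn 12 trust, then one prank costs 5,
# leaving 7 -- below the threshold, so edit access is denied until it "heals".
ip = IPTrust()
for _ in range(12):
    ip.record_clean_day()
ip.record_violation()
print(ip.trust, ip.can_edit())  # prints "7 False"
```

The key design point is that trust decays faster than it accrues, so a single violation costs several incident-free days, while continued good behavior eventually restores edit access.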