This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 | Archive 4
I'm afraid I don't grok the rationale. This just seems like more instruction creep, more cumbersome stuff to juggle around manually, etc. The inert articles that go for months without editing have probably only been touched by a few people in their lifetime, and are likely in need of lots of improvement, so inertness is as bad as instability. The rapidly changing articles tend to have disputes going on (even minor ones) that would prevent stabilization, and these are the articles that get the most attention, so stabilization of slower moving articles won't make WP look any better.
Can someone give a few specific examples of articles which stabilization can help? Phr ( talk) 10:36, 10 July 2006 (UTC)
As the main editor (if not the main vandalism reverter) of Cheese as it stands today, I'd love to see this system at least given a trial there. — Bunchofgrapes ( talk) 03:31, 11 July 2006 (UTC)
Reading through the above, I think some of the disagreement might be a result of folks having different goals in mind. I think clarifying the goals might help. I see two similar, but not identical, goals:
1) Some folks want stable/static versions so that the general public does not see Wikipedia articles containing the ravings of any random lunatic with a web browser.
2) Some folks want stable/static versions so that a subset of Wikipedia articles can be advanced to a pristine state, subject to neither casual vandalism nor degradation of their brilliant prose.
Completely ignoring implementation for the moment, let's imagine there's a Wikipedia that satisfies #1 and think about how different it is from the current Wikipedia. Similarly, a Wikipedia satisfying #2.
I think the crux of #1 is "edits from random lunatics should not be immediately visible". But that's what a wiki is, you say. However, the point is not to make articles uneditable — just to have some mechanism to distinguish a "working" version from a generally "visible" version and have casual viewers directed to the generally "visible" version. Forking a "stable" version is one way to do this, but I don't think it's necessary. The "visible" version could just be a version picked from the history. There could still be an "edit this page" link on the generally visible version. What exactly happens if you click this link can be worked out (if the working version is different, perhaps you're first shown the current working version with or without diffs relative to the "visible" version). It doesn't seem to me that a Wikipedia that works like this would need to be tremendously different from the current one (for example, see my proposal for how I think this could work at Wikipedia talk:Stable versions/archive2#forking considered harmful). However it's implemented, I think one of the primary benefits of this kind of mechanism is that we reduce the incentive to vandalize.
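To make the "pick the visible version from the history" idea concrete, here is a minimal sketch using hypothetical names; it is not how MediaWiki actually stores pages, just an illustration that no fork is needed:

```python
# Minimal sketch only: hypothetical names, not MediaWiki's data model.
class Page:
    def __init__(self, initial_text):
        self.history = [initial_text]   # every saved revision, oldest first
        self.visible_index = 0          # the revision casual readers are shown

    def edit(self, new_text):
        """Anyone can still edit; edits simply extend the history."""
        self.history.append(new_text)

    def mark_visible(self, index):
        """Point readers at a reviewed revision; nothing is forked."""
        self.visible_index = index

    def view(self):
        """What a casual reader sees."""
        return self.history[self.visible_index]

    def working(self):
        """What an editor sees after clicking 'edit this page'."""
        return self.history[-1]


page = Page("Elephants are the largest living land animals.")
page.edit("Elephants are the largest living land animals. ELEPHANTS RULE!!!")
print(page.view())     # readers still see the marked revision
print(page.working())  # editors see the latest working draft
```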
I think the premise behind #2 is that "stable versions should be the result of a strict review/editing process and should therefore not be directly editable". Fundamentally, I think this means either the stable version is permanently static (essentially protected) or must be a fork, with further changes directed to a separate "working" version which might at some future time go through the same review/editing process to become a new "stable" version. However it's done, I think this sort of review/edit process has to be labor-intensive and can only be applied to a small subset of Wikipedia articles. We could choose to implement this right now by simply protecting "stable" articles (with further changes discussed on the talk page).
If these really are two separate goals, I think we could consider splitting this discussion into two separate threads that might lead to two complementary solutions. -- Rick Block ( talk) 19:00, 15 July 2006 (UTC)
I object to the implementation of this proposal, as a number of people have raised serious issues with it which have been totally ignored. Raul654 01:05, 25 July 2006 (UTC)
Those are the first two questions that should be asked whenever the stabilisation of an article is proposed. One of the biggest objections against stable versions is that they restrict the good edits along with the bad. Well, that's why an article should only be stabilised when there is a very low possibility that any large good edits could be made in the near future. That means two things: First, the article must already be very good. Second, the article must be comprehensive. Together, those two conditions ensure that we're dealing with an article that is bound to get mostly bad edits in the future. -- Nikodemos 14:09, 27 July 2006 (UTC)
Since there were no comments when this was posted as a link, I am going to copy the text here in its entirety. -- Nikodemos 14:19, 27 July 2006 (UTC)
One of the assumptions of Wikipedia is that continual editing by multiple users will result in a continual increase in the quality of an article. This has proven true as a general trend. However, I do not believe adequate attention has been given to important exceptions, and, most significantly, to the rate of improvement in article quality.
I have done a bit of research and reflection, and I have reached a conclusion that I call the Average End Quality (AEQ) Hypothesis. It is based on the following observation:
As the quality of an article improves, the rate of improvement declines.
In other words, if Q(t) is the quality of an article at time t, then Q'(t) is positive but downward sloping, or Q''(t) < 0. The graph of quality over time looks something like this:
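Purely as an illustration (my example, not part of the argument above), a saturating exponential is one function with exactly these properties:

$$Q(t) = A\left(1 - e^{-kt}\right), \qquad A, k > 0,$$

$$Q'(t) = A k\, e^{-kt} > 0, \qquad Q''(t) = -A k^{2} e^{-kt} < 0, \qquad \lim_{t \to \infty} Q(t) = A.$$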
Notice that the quality function appears to level off. This is not accidental. It is a well-known fact that not all edits improve the quality of an article; some, in fact, detract from it. Not all of these are simple vandalism that gets reverted as a matter of course. Some are subtle insertions of uncited speculation or POV, reorganizations of material that disrupt the flow of an article, bad grammar, and so on. They are not enough to have a visible effect on the upward trend of quality for new and short articles, but once an article gets very lengthy and detailed it becomes increasingly difficult to copyedit and remove errors buried deep inside the long text. As a result, bad edits are able to balance out good work, and article quality levels off at a value I have termed the Average End Quality (AEQ).
Of course, editing never actually stops. The actual quality of a given article may spike above or dip below the AEQ for various periods of time. But whenever actual quality goes above the AEQ, bad editing gains the upper hand over good editing and drives it back down. Likewise, whenever actual quality goes below the AEQ, good editing gains the upper hand over bad editing and pushes it back up. In other words, if an article gets too good then most editors will declare "mission accomplished" and leave, allowing random bad edits to erode its quality; but if the article gets too bad, a large number of editors will be attracted in an attempt to fix it.
Thus, the quality of most detailed, lengthy articles oscillates around the Average End Quality:
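Taken literally, the feedback just described is a mean-reverting process. One toy formulation (my notation, not the author's) is

$$\frac{dQ}{dt} = -k \bigl( Q(t) - \mathrm{AEQ} \bigr) + \varepsilon(t), \qquad k > 0,$$

where $\varepsilon(t)$ stands for the net effect of individual good and bad edits: quality above the AEQ is pulled back down, and quality below it is pushed back up.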
Some might say that the AEQ is good enough, so there is really no problem. However, this property of Wikipedia results in a lot of wasted effort from editors who work hard to get the quality of an article above the AEQ, only to have it eroded over the next few months. And, in some cases, articles that were once of very high quality have been reduced to near incoherence. Given all this, I propose that the very best articles on Wikipedia be placed under permanent page protection. After all, the whole reason why people are free to edit articles on Wikipedia is that this policy results in an overall increase in article quality. But if we have good reason to believe that it will result in a decrease in quality for articles X, Y and Z, then it is only reasonable to place articles X, Y and Z under some sort of protection.
Such a protection policy should only apply to articles of exceptional quality, and it should not be a blanket ban on all edits; rather, there should be a requirement that any new edits must undergo some review process before being allowed. This could be the first step towards Wikipedia 1.0: At first, only a handful of articles would have the required quality to be protected, but more would accumulate over time.
I am sure this idea can spark endless controversy. Fortunately, however, it is not just a matter of opinion. There is, in fact, a very sure way to tell whether the Average End Quality Hypothesis is true or false. What articles are recognized as the very best on Wikipedia? Featured articles, of course. Let us do a survey of former featured articles and determine whether their quality has increased or decreased on average since the day they were featured. If it has decreased, then it is true that continual editing usually lowers the quality of our best articles, and therefore it is a good idea to implement some sort of protection policy on them. -- Nikodemos 14:22, 27 July 2006 (UTC)
The whole idea of Average End Quality is based on an assumption (and I haven't analyzed the data, but I assume it's an assumption) that the asymptote is a line of slope 0. Couldn't the limit actually be more like ln(x) or ln(ln(x)) - i.e. quality on average still increasing over time, but at a continuously decreasing rate? It's been my experience with the Monty Hall problem article (perhaps one of the most edited featured articles) that its quality does vary, but that it may overall still be improving.
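In symbols, the two possibilities being contrasted here (my notation) are a bounded limit versus unbounded but slowing growth:

$$\text{AEQ hypothesis:}\ \lim_{t \to \infty} Q(t) = \mathrm{AEQ} < \infty, \qquad \text{alternative:}\ Q(t) \approx a \ln(t + 1),\ \ Q'(t) = \frac{a}{t+1} \to 0 \ \text{while}\ Q(t) \to \infty.$$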
I think rather than preventing changes, what we actually need is some way to help ensure that individual changes are positive, not negative. We currently exert no control over changes, willingly accepting both positive and negative ones. The rapid rate of improvement at the beginning of the lifetime of most articles supports this as a reasonable mechanism. As articles improve, I think the issue is that they reach a point where "random" changes aren't necessarily an improvement and accepting any change is no longer the right strategy. We haven't tried it, but I think the next "loosest" strategy is to accept only "reviewed" changes (which is what all this "stable version" stuff is really about). My question is: what is the next, minimally "tighter" strategy? Most of these proposals go from "accept anything" all the way to "committee approves everything", which seems to me to be a way bigger step than is necessary. My thought is the next step should be "accept most". The only issue is that differentially accepting changes means we have to somehow split the notion of editing from the notion of accepting. I think this is the crux of all the stable version proposals, but I'd like to see a mechanism that supports varying degrees of control. The control aspect boils down to: who gets to accept? Again, most of the current proposals go way overboard, changing the current "anyone with a browser" to "only admins". If we explicitly create a new permission level for "edit accepter", we could allow "most" editors to be accepters, slightly moving the paradigm from "accept all changes" to "accept most changes". Would this prevent most "negative" changes? I don't know. But I think it might.
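As a sketch of what splitting editing from accepting could look like (hypothetical names only; this is not MediaWiki code or any existing extension), edits could sit in a queue until someone holding the "accepter" right makes one visible:

```python
# Illustrative sketch only: hypothetical names, not MediaWiki code.
from dataclasses import dataclass, field

@dataclass
class Edit:
    author: str
    text: str
    accepted: bool = False

@dataclass
class Article:
    visible_text: str = ""
    pending: list = field(default_factory=list)

    def submit(self, author, text):
        """Anyone can still edit; the edit just waits in a queue."""
        self.pending.append(Edit(author, text))

    def accept(self, accepter, accepters, index=0):
        """Only users holding the 'accepter' right make an edit visible."""
        if accepter not in accepters:
            raise PermissionError(f"{accepter} does not hold the accepter right")
        edit = self.pending.pop(index)
        edit.accepted = True
        self.visible_text = edit.text


# "Most" editors hold the right, so the step from "accept all changes"
# to "accept most changes" stays as small as possible.
accepters = {"alice", "bob", "carol"}
article = Article(visible_text="Cheese is a food made from milk.")
article.submit("anonymous", "Cheese is a food made from curdled milk.")
article.accept("alice", accepters)
print(article.visible_text)  # the accepted edit is now the visible text
```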
IMO, the goal is not 0 changes (an AEQ asymptote with slope 0), but continual improvement (an asymptote with slope > 0, implying monotonically increasing quality). -- Rick Block ( talk) 14:26, 31 July 2006 (UTC)
Way to be bold! I was on IRC and had a few questions and JesseW was great about explaining the goals and aims of the policy. Per his request, I'll list my questions and comments.
Sorry about all the questions and don't feel at all pressured to answer them. Cheers! hoopydink Conas tá tú? 21:04, 9 July 2006 (UTC)
Since when? Where's the support to roll this out? -- badlydrawnjeff talk 02:17, 2 August 2006 (UTC)
It's wholly unwiki to protect an article. (Yes, it means protecting the article). I might like to see something where there's a notice across the top pointing to a specific version of the article in history as being a 'good version' or something, but NOT ever to have the article itself protected from changes. It defeats the purpose of a wiki. If people are concerned that vandals will just delete or damage the message pointing to the stable version, well, perhaps we should wait for a software implementation rather than protect articles like Cheese before this even has a consensus. Kevin_b_er 01:14, 25 July 2006 (UTC)
As I said above: The purpose of Wikipedia is to provide good information. Nothing more, nothing less. How we do that is entirely up to us and always open to revision. I am sick and tired of fanatical ideological opposition to any restriction on the free editing of articles. If an article is already very good and further editing tends to make it worse, it's time to stop. Wikipedia is not dogma! (hey, that's a catchy slogan) -- Nikodemos 13:39, 27 July 2006 (UTC)
Won't this make Wikipedia obsolete? After all, this IS the internet. When news breaks, I won't be able to edit an article right away; I will have to wait until whenever the development version is made into the stable version (if ever). This is going to slow down Wikipedia. There won't be any more instant gratification. It used to be that if someone saw an error, they could fix it. Just like that, and it was fixed for all to see. Now you submit a correction and have to wait a few weeks to see if it actually makes it into the article. Who likes to wait to see the fruit of their efforts? Wikipedia got past a million articles with the open model, why change now? -- God Ω War 05:20, 2 August 2006 (UTC)
Ladies and gentlemen, I've cranked out the first draft of my proposal. I'm about to get to a couple of the technical details, but the idea is there. I would appreciate feedback from all on the proposal's talk page. Thank you, JDoorjam Talk 05:34, 2 August 2006 (UTC)
I'm wondering what people think about this so far, now that we've had a fair bit of discussion.
What do we need a poll for? You can see what people think from the discussion - a vote won't achieve anything. Worldtraveller 18:16, 14 July 2006 (UTC)
After back and forth discussions with myself, I'm torn! There's so much positive...yet so much negative! — Deckiller 06:19, 2 August 2006 (UTC)
I'm most concerned about this proposal. If it were restricted to a temporary contingency measure for articles that are seriously unstable and important, I might feel better about it. But it's currently framed broadly. Tony 08:07, 2 August 2006 (UTC)
Elephant has apparently been made a 'trial' of this. But if we look, the purpose of it has already been ruined, as the stable version has already been directly edited by an administrator:
Here's the history in case it disappears, for Elephant:
# 02:10, 2 August 2006 Jaranda (Talk | contribs) m (→External links - -Rm spam link)
# 02:00, 2 August 2006 Cyde (Talk | contribs) m (Protected Elephant: Stable version [edit=sysop:move=sysop])
# 02:00, 2 August 2006 Cyde (Talk | contribs) (To view the complete history of this article and its list of editors see...
Doorjam's idea about administrators making little changes to the stable versions has already come to pass in the second attempted trial of this proposal. See "Article stabilization separates administrators even more from other editors" under #Is_anyone_else_worried_about_a_mentality_change_here.3F. Kevin_b_er 02:33, 2 August 2006 (UTC)
Sorry, it's still a new process and not everyone is familiar with all of the rules yet. -- Cyde↔Weys 02:40, 2 August 2006 (UTC)
I've reversed this thing. It's not finding anything like the necessary support either in practice or in discussion. - Splash - tk 03:28, 2 August 2006 (UTC)
Alright, could you guys please get together a few pages that you will "allow" us to run a test on? It'd be great to actually get some real tests in before this goes live in software in a few months. We have over a million articles ... the constant stalemating of these tests on even a single article is going to help no one in the end. -- Cyde↔Weys 03:44, 2 August 2006 (UTC)