This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 5 | ← | Archive 9 | Archive 10 | Archive 11 | Archive 12 | Archive 13 | → | Archive 15
Hey all, I'm kinda new to this bot thing, but I have an idea for a new task, and it might or might not be good. The reason I'm not asking at Bot Requests is that that page and I have some bad history from about 2-3 years ago; I'm nervous about what would happen if I went over there again, so I'm posting it here. It's CHECKWIKI Task 64, "Link equal to linktext" ([[Apple|apple]] --> [[Apple]]). The reason I'm asking is that FrescoBot normally handles Task 64, but looking at its contribs, it only seems to edit every 2-3 days, and only 20-30 articles at a time. If you look here, you can see that there are 2,100+ articles with that error. Fixing 2,100+ articles manually is tedious, so I am asking whether this is a good idea. I know PrimeBOT uses AWB to edit and is run manually by Primefac, so I thought it might be a good idea to do something like that. Please let me know what you think; constructive criticism only, please. Thanks! Yoshi24517 Chat Very Busy 04:05, 17 March 2017 (UTC)
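For illustration, the kind of replacement Task 64 performs can be sketched in a few lines of Python. This is only a sketch of the idea, not FrescoBot's or PrimeBOT's actual code (AWB would express it as a find-and-replace rule instead):

```python
import re

# Sketch of CHECKWIKI error 64 ("Link equal to linktext"): collapse a
# piped wikilink whose target and label are effectively the same.
# Illustration only -- not FrescoBot's or PrimeBOT's actual code.
LINK = re.compile(r"\[\[([^|\[\]]+)\|([^|\[\]]+)\]\]")

def fix_self_piped_links(wikitext: str) -> str:
    def repl(m: re.Match) -> str:
        target, label = m.group(1), m.group(2)
        if target == label:
            return "[[" + target + "]]"
        # MediaWiki treats the first letter case-insensitively, so
        # [[Apple|apple]] can safely become [[apple]].
        if target[1:] == label[1:] and target[:1].lower() == label[:1].lower():
            return "[[" + label + "]]"
        return m.group(0)
    return LINK.sub(repl, wikitext)
```

A real bot run would also need to skip nowiki sections, comments, and links with trailing text, which is part of why these tasks go through BRFA review.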
Hi. I have an idea to create articles about PGC objects. I have a full list, like a table, with the characteristics of these objects. Is that possible? I'm not from English Wikipedia; I'm just asking whether it is possible. Thanks for responding. -- Artificial123 ( talk) 06:59, 17 March 2017 (UTC)
There is currently an effort to identify which WP:CWERRORS should be considered cosmetic and which should not. Help and feedback would be appreciated. Headbomb { t · c · p · b} 12:57, 31 March 2017 (UTC)
In case you haven't noticed, there is currently an RFC on the proposed update to our WP:COSMETICBOT policy. Headbomb { t · c · p · b} 13:00, 31 March 2017 (UTC)
Checklinks is broken.
It was reported a month ago and also (in the wrong place) here. -- Green C 20:28, 24 March 2017 (UTC)
Pinging Dispenser. -- Edgars2007 ( talk/ contribs) 14:39, 31 March 2017 (UTC)
Dispenser has disabled the "save page" portion of Checklinks on enwiki until he can update the code. -- Green C 18:05, 5 April 2017 (UTC)
If you're making a bot-related presentation at Wikimania 2017 in Montreal, advertise it here!
I'll be making at least two, assuming they are accepted, one on Article Alerts, the other on Journals Cited by Wikipedia (and possibly a third one on bots in general). If you are interested in attending, please sign up! Headbomb { t · c · p · b} 12:59, 7 April 2017 (UTC)
Bots Newsletter, April 2017
Greetings! The BAG Newsletter is now the Bots Newsletter, per discussion. As such, we've subscribed all bot operators to the newsletter. You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list. Highlights for this newsletter include:
The Magioladitis ARBCOM case has closed. The remedies of the case include:
We currently have 27 open bot requests at Wikipedia:Bots/Requests for approval, and could use your help processing!
There are multiple ongoing discussions surrounding bot-related matters. In particular:
Several new things are around:
Wikimania 2017 is happening in Montreal, during 9–13 August. If you plan to attend, or give a talk, let us know! Thank you! edited by: Headbomb 11:35, 12 April 2017 (UTC) (You can unsubscribe from future newsletters by removing your name from this list.)
Some old Javascript, some of which has been deprecated for more than five years, is being removed later this month. Some old scripts may need to be updated. If you think that this might be relevant to your code, then please see https://lists.wikimedia.org/pipermail/wikitech-ambassadors/2017-April/001574.html for more details (including links that show how to fix most of the code). Whatamidoing (WMF) ( talk) 19:25, 12 April 2017 (UTC)
Can anybody please assist Cobi ( talk · contribs) in fixing ClueBot III ( talk · contribs)? It is taking too much content when it archives threads; see User talk:ClueBot Commons/Archives/2017/April#Improper archival of heading, part 3. Basically, when archiving a level 4 subsection, the bot assumes that the subsection terminates at the next level 4 heading; if there is an intervening level 3 heading, it archives that too, which is an error. -- Redrose64 🌹 ( talk) 20:36, 27 April 2017 (UTC)
You can preview the upcoming change by adding ooui=1 to the end of an edit URL, for example: https://en.wikipedia.org/?title=Marie_Curie&action=edit&ooui=1 I'm told that Twinkle will probably be okay (no problems in limited testing), but a few other scripts and bots may need to be updated. If you think your bot or script will be broken by this, and you can't figure out how to update it, then please post requests for help either here or at WP:VPT. Whatamidoing (WMF) ( talk) 17:18, 9 May 2017 (UTC)
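For script maintainers who want to test pages against the new editor programmatically, the parameter can be appended with the standard library. The helper name with_ooui here is my own invention, not part of any announced tool:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Hypothetical helper (the name with_ooui is my own) showing how a
# script could append the ooui=1 test parameter to an edit URL.
def with_ooui(url: str) -> str:
    parts = urlsplit(url)
    query = parse_qsl(parts.query, keep_blank_values=True)
    query.append(("ooui", "1"))
    return urlunsplit(parts._replace(query=urlencode(query)))
```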
We had a few more scripts break at fawiki than we had hoped for, so we're slowing this down a bit. I've collected some information, including diffs of some repairs, at mw:OOjs UI/Fixing scripts and gadgets. If you maintain any scripts, or if you depend upon any and aren't certain whether the owner is actively maintaining them, please check that page and start testing (instructions on the page).
Please also share this information with people at other wikis. Whatamidoing (WMF) ( talk) 19:23, 17 May 2017 (UTC)
In followup to something mentioned at Wikipedia:Administrators'_noticeboard/Incidents#Unauthorized bot job, unresponsive operator: would BRFA be the place to request a bot flag for a user account so that it CAN be run in an approved manner?
An additional concern expressed was that the BOT components of it were not compliant with the relevant policy on Cosmetic edits. ShakespeareFan00 ( talk) 21:19, 23 May 2017 (UTC)
Please note a nomination for Bot Approvals Group membership is active. Feel free to comment here. ~ Rob13 Talk 22:46, 26 May 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
The following bots, and their respective operator, have each had no contributions in over two years and are scheduled to be deauthorized in one week per the bot policy activity requirements. If your bot is listed and you wish to retain authorization, please add a note to the table and sign below.
Bot | Edit count | Last edit (YYYYMMDDHHMMSS) | Operator | Operator's last edit (YYYYMMDDHHMMSS) | Notes |
---|---|---|---|---|---|
User:SelketBot | 16870 | 20110624183928 | User:Selket | 20140216162053 | |
User:SkiersBot | 124334 | 20110715052412 | User:Skier_Dude | 20120917042322 | |
User:MartinBotIII | 136346 | 20110731122144 | User:Martinp23 | 20130427212553 | |
User:Kotbot | 157583 | 20110816121147 | User:Kotniski | 20120124000153 | |
User:WalkingSoulBot | 1 | 20110823130647 | User:WalkingSoul | 20110605220714 | |
User:GurchBot | 7421 | 20110919112313 | User:Gurch | 20130804182024 | |
User:MiszaBot | 81480 | 20111013170506 | User:Misza13 | 20150219094323 | |
User:DodoBot | 136137 | 20111126163905 | User:EdoDodo | 20111126164139 | |
User:RaptureBot | 13074 | 20111218221254 | User:FinalRapture | 20111120060515 | |
User:Rfambot | 1774 | 20120213174928 | User:Jennifer Rfm | 20131106230051 | |
User:FlBot | 14324 | 20120217110113 | User:Fl | 20140326014308 | |
User:MessageDeliveryBot | 10187 | 20120605022949 | User:EdoDodo | 20111126164139 | |
User:AlanBOT | 6712 | 20130429203141 | User:ikseevon | 20130429040405 | |
User:MMABot | 5265 | 20130505205805 | User:TreyGeek | 20130628122155 | |
User:LyricsBot | 27368 | 20130921052032 | User:Dcoetzee | 20141003225306 | Operator has been banned; accounts are already globally locked |
User:DyceBot | 45604 | 20140105070113 | User:Dycedarg | 20140315182843 | |
User:HersfoldArbClerkBot | 11398 | 20140110024813 | User:Hersfold | 20140110040539 | |
User:IPLRecordsUpdateBot | 19 | 20140210113220 | User:Jfd34 | 20140420092748 | |
User:Wpp research bot | 3 | 20140328200839 | User:Jantin | 20141222190945 | |
User:AstRoBot | 4229 | 20150125114428 | User:WDGraham | 20150214171645 | |
User:HBC AIV helperbot7 | 253005 | 20150204230319 | User:Wimt | 20150512214048 | |
So I had an idea regarding bot tasks, specifically regarding followup. I've often wondered (even with my own tasks) how many edits were actually made during a bot run, to see whether the initial estimate was at all accurate, and about minor tweaks that were made to the code to improve it along the way.
Would it be reasonable to ask for bot operators to give some sort of "after action report" for one-off bot runs? Primefac ( talk) 14:14, 8 June 2017 (UTC)
Please comment there. Headbomb { t · c · p · b} 17:59, 22 June 2017 (UTC)
I'm hearing that phab:T53736 is being discussed seriously, and that it may affect bots. I don't understand the project yet, but if you're interested in how bots cope with redirects, then please take a look. Whatamidoing (WMF) ( talk) 16:19, 27 June 2017 (UTC)
There is a discussion happening at the Wikipedia:Village pump (proposals) page on the tone of the wording of the InternetArchiveBot messages that are being left on article talk pages. If you are interested, please see that discussion. Thanks! KDS4444 ( talk) 23:43, 29 June 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
User:InternetArchiveBot is a bot. Responding to one of its edits, I posted this on its talkpage. After "saving", my post did not show?! I had to do research to discover that the (regular-looking) page said something like: "Please do not edit this page. Messages left here will likely go unnoticed." In other words: the bot is deaf. (To be clear: talkpage instructions are not defining. For example: we have Redirects.) Why is this bot allowed to operate like this? - DePiep ( talk) 20:40, 28 June 2017 (UTC)
If you don't see any real problem on that page, then why reply here at all (duh)?
An unclosed <div style="display:none"> appears to be why your post doesn't show. Of course, the placement of that may be disputable, as it appears intended to make new entries added at the bottom invisible (except when viewing the talk page source). — Paleo Neonate - 21:34, 28 June 2017 (UTC)
Pages can use __NONEWSECTIONLINK__ when posts are unwanted on the page. Users using the new section link can see their post in preview and then it vanishes on save. The top could also have a source comment saying "DON'T POST TO THIS PAGE. READ THE INSTRUCTIONS." Maybe repeat it in the last section for users who try to edit that and manually add a section heading, or add __NOEDITSECTION__ to prevent that. The unclosed <div style="display:none"> was apparently added deliberately.[1] That does seem a little extreme without an explanation. PrimeHunter ( talk) 22:32, 28 June 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
While we are at it: how and why can such a talkpage get TE-level of protection? - DePiep ( talk) 22:45, 28 June 2017 (UTC)
Because of the unclosed <div style="display:none"> at the bottom of the page, meaning that all new sections would immediately disappear. Thank you for reminding me to extend full protection to the IABot's user page, though. Primefac ( talk) 23:37, 28 June 2017 (UTC)
A <div> tag starts a page division. If there is no corresponding </div> tag, the division ends at the bottom of the page. Any HTML element may be given a style= attribute, and the display:none declaration causes the element not to appear in the formatting structure. So if your browser has an "inspect element" feature that permits toggling styles, it is possible to make the thread in question display by deselecting the display:none declaration. As noted above, editing the page source also shows that it is present. -- Redrose64 🌹 ( talk) 23:53, 28 June 2017 (UTC)
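As a rough sketch of how a maintenance script could flag the situation described above: counting opening and closing div tags is a simplification (real HTML parsing handles nesting, comments, and nowiki sections), but it illustrates the check.

```python
import re

# Approximate check a maintenance script could use to flag an unclosed
# <div> in page source. Counting tags is a simplification, since real
# HTML parsing must handle nesting, comments, and nowiki sections.
def has_unclosed_div(text: str) -> bool:
    opens = len(re.findall(r"<div\b[^>]*>", text, flags=re.IGNORECASE))
    closes = len(re.findall(r"</div\s*>", text, flags=re.IGNORECASE))
    return opens > closes
```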
You all are promoted to level WP:ANI. I am not; I am but an assistant pig-keeper. — Paleo Neonate - 00:50, 29 June 2017 (UTC)
I believe that AWB bots automatically stop if a message is left on the talk page. This is good practice and it probably wouldn't be a good idea to fully protect these talk pages. I see that User:InternetArchiveBot has a shut-off option so this is not a concern in this case. — Martin ( MSGJ · talk) 11:30, 29 June 2017 (UTC)
The real question, aside from the drama above, seems to be: should bot talk pages be allowed to redirect to other pages, or should bot communication be on the bot talk page only? Personally, I'm strongly opposed to this (as above) but welcome constructive discussion to clear the matter up formally. TheMagikCow ( T) ( C) 12:26, 29 June 2017 (UTC)
Speaking as a bot operator, I tried giving blatant warnings on the bot's talk page that the bot won't respond there and that all queries are routed to the bit recycling bin. Even that didn't prevent editors from trying to engage the bot as a user. I filed for full page protection (requiring an admin to edit) and still that doesn't prevent industrious admins from dropping notices where they'll be ignored. There's no good reason (IMO) why IAB uses that unclosed div tag, and it has the side effect of potentially gobbling up other data. I highly recommend (given what I see as the application of full protection) removing that unclosed tag. Hasteur ( talk) 13:53, 29 June 2017 (UTC)
Did you actually see/read/understand the message that said not to post there... before saving? This is a good Q, but not enough. My A:
This is so far out of hand. @ Cyberpower678: In order to get this discussion over with and move on, would you be fine with adding a literal soft redirect to the bot user talk page to make it even more obvious that the editor is intended to comment on your talk page? This is not indicative of any wrongdoing on your part, just the path of least resistance (if you're interested in taking it). ~ Rob13 Talk 01:41, 30 June 2017 (UTC)
Check the edit history here (2013). Maybe we should encourage the bot op to combine changes? I am sure there are more recent examples. -- Magioladitis ( talk) 22:17, 3 July 2017 (UTC)
I have proposed to do what Cydebot does with Yobot, at least at the level of closed XfDs. -- Magioladitis ( talk) 23:35, 3 July 2017 (UTC)
Speed is kind of important for this one too. We want to minimize the time when articles are split between two categories when a category is being renamed. I wouldn't want a bot renaming a 500-article category making an edit every 10 seconds. AWB can do things quickly, so that's not a problem. The timing as Primefac states is kind of important to keep CFD running smoothly. Whenever Cydebot has been down for even a couple days, it's a huge pain to everyone trying to manage that process. ~ Rob13 Talk 18:05, 4 July 2017 (UTC)
Krinkle posted a message to several mailing lists that some of you may want to read. It begins like this:
TL;DR: In April 2017, the jQuery library in MediaWiki was upgraded from 1.x to 3.x (the current stable version), with the jQuery Migrate plugin enabled to ease migration. We temporarily still load jQuery 1.x on Wikimedia wikis. Read about the API changes at https://jquery.com/upgrade-guide/3.0/
The full message can be read here: https://lists.wikimedia.org/pipermail/wikitech-ambassadors/2017-June/001617.html
Whatamidoing (WMF) ( talk) 19:53, 5 July 2017 (UTC)
Unproductive complaining about a problem that doesn't exist. Jc86035 ( talk) Use {{ re|Jc86035}} to reply to me 14:44, 6 July 2017 (UTC)
(moved from WT:BAG#Is this the right forum to report unpermissioned bot operators?)
Is this the right forum to report unpermissioned bot operators? If not, could you redirect me? Thanks. -- Hobbes Goodyear ( talk) 20:59, 2 July 2017 (UTC)
@ JJMC89, Hobbes Goodyear, Magioladitis, Xaosflux, Primefac, BU Rob13, SQL, SkyWarrior, and Cyberpower678: Pinging those involved in previous discussions of which I am aware. ··· 日本穣 · 投稿 · Talk to Nihonjoe · Join WP Japan! 03:37, 3 July 2017 (UTC)
The edits are flagged minor and bot so as to avoid unnecessary recent changes and watchlist impact. Additionally, no edits unrelated to the specific task are being made without explanation. Finally, the bot's user page clearly defines the task with examples. — xaosflux Talk 17:33, 3 July 2017 (UTC)
Is it OK that the bot edits a page multiple times, changing HTTP→HTTPS in its external links one by one? It clutters the page history. Your opinions? -- XXN, 17:00, 3 July 2017 (UTC)
User:Kandymotownie recently made sweeping disruptive edits to high-profile pages such as Barack Obama and Donald Trump, via IABot, adding archive links for hundreds of sources which are still live, so this only creates useless bloat in wikitext. I reverted those but there is similar bot-assisted damage to other pages, mainly about Ghana. This user's talk page is full of warnings that s/he never responds to over several years, indicating a WP:NOTHERE attitude. How was s/he ever authorized to run a bot? In view of the recent disruptive actions, bot credentials should be revoked immediately, and perhaps a stern warning or short block is in order. — JFG talk 05:33, 7 July 2017 (UTC)
Editors are encouraged to add an archive link as a part of each citation, or at least submit the referenced URL for archiving,[note 1] at the same time that a citation is created or updated. See also this how-to guide. If you have a problem with these how-to guides, please take up your issue with the guide pages, not with editors who follow the guides in good faith. – Jonesey95 ( talk) 16:48, 7 July 2017 (UTC)
Hello everyone!!! I just nominated myself for BAG membership. Your participation would be appreciated.
Wikipedia:Bot Approvals Group/nominations/Cyberpower678 3— CYBERPOWER ( Message) 23:51, 9 July 2017 (UTC)
Bots Newsletter, July 2017
Greetings! Here is the 4th issue of the Bots Newsletter (formerly the BAG Newsletter). You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list. Highlights for this newsletter include:
BU Rob13 and Cyberpower678 are now members of the BAG (see RfBAG/BU Rob13 and RfBAG/Cyberpower678 3). BU Rob13 and Cyberpower678 are both administrators; the former operates BU RoBOT which does a plethora of tasks, while the latter operates Cyberbot I (which replaces old bots), Cyberbot II (which does many different things), and InternetArchiveBot which combats link rot. Welcome to the BAG!
We currently have 12 open bot requests at Wikipedia:Bots/Requests for approval, and could use your help processing!
Wikimania 2017 is happening in Montreal, during 9–13 August. If you plan to attend, or give a talk, let us know! Thank you! edited by: Headbomb 17:12, 19 July 2017 (UTC) (You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
Hello all, I'd like to get some additional feedback on Wikipedia:Bots/Requests for approval/Yobot 55, as to whether "any genfixes" is appropriate to bundle in here or not. I'm on the fence: this is a very fine technical task that may already be confusing for some editors to determine what occurred, but I'm also generally in support of not wasting edits and making the page better all at once. Please respond at the BRFA. Thank you, — xaosflux Talk 15:53, 23 July 2017 (UTC)
Right now, while we're in the middle of a huge list of speedy renaming of categories, Cydebot appears to have stopped working. It would be nice if some other bot could help out. עוד מישהו Od Mishehu 02:58, 24 July 2017 (UTC)
As of this particular moment, the category counts for ISBN, PMID, and RFC magic links are 1102, 1189, and 2067, respectively. Obviously RFC has been deemed enough of a problem that manual oversight is required, but I thought I'd mention it. The remainder of the ISBN/PMID pages are either on transcluded pages (i.e. the huge batch of AFD log pages), in userspace (which I have agreed to avoid), or odd cases where manual editing will be required. I don't know what MW's timeframe for turning off magic links is/was, but I think we're at the point where en-wiki can turn them off with little to no issue. I'm not sure if that's something for us specifically to do, but I figured an update on the situation would be helpful.
There are new cases popping up (mostly in the article space) daily, so the bots will probably keep running, but the bulk of the work (249k out of 250k pages) is complete. Primefac ( talk) 12:39, 23 July 2017 (UTC)
The regex \[?\[?OCLC\]?\]?(:| )?(\d+) → {{OCLC|$2}} has worked well for me. Headbomb { t · c · p · b} 12:53, 23 July 2017 (UTC)
I think we should exclude from the list all the "Wikipedia:Articles for creation/..." pages. -- Magioladitis ( talk) 13:50, 23 July 2017 (UTC)
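For anyone wanting to try that OCLC rule locally, here it is translated to Python: AWB's $2 backreference corresponds to \g<2> in re.sub. A sketch for experimentation, not the exact rule file:

```python
import re

# The OCLC find/replace above, tried in Python: AWB's $2 corresponds
# to \g<2> in re.sub. A local sketch, not the exact AWB rule.
OCLC = re.compile(r"\[?\[?OCLC\]?\]?(:| )?(\d+)")

def convert_oclc(text: str) -> str:
    return OCLC.sub(r"{{OCLC|\g<2>}}", text)
```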
I've proposed adding a bot section to the dashboard. Comments welcomed. Headbomb { t · c · p · b} 17:26, 25 July 2017 (UTC)
The new IABot interface tool allows editors to archive all links in an article even when they are not dead; see Al83tito's edit history ( example diff with 563 links added). Unlike other tools that operate on a single page, this is more like unattended AWB with a queue system, giving great power to editors who want to make massive edit runs with little effort. This feature can be run by any editor on demand without prior consensus or approval.
We should have a discussion, because this feature is not being met with complete acceptance. The User talk:Al83tito page has example complaints. There is an open RfC at the Village Pump on doing this for all articles on Wikipedia via bot; this discussion concerns the IABot interface tool, which does the same thing on demand. My opinion is that this feature is powerful and apparently disruptive enough that it should have more community discussion. Do we want to have this feature (archiving live links on demand)? If so, do we want to allow it for mass edits with an AWB-like tool? And if so, do we want an approval mechanism such as AWB's, or bot approval like BOTREQ? Or leave things as they are? @ Cyberpower678, Al83tito, JFG, Lordtobi, and Dhtwiki: -- Green C 14:51, 10 July 2017 (UTC)
@ Andy Dingley: Which bots get a free ride against WP:CITEVAR? Headbomb { t · c · p · b} 01:20, 30 July 2017 (UTC)
We need a centralized debate to define a community guideline about archiving of live sources. However, a number of bot-assisted edits may be due to confusion by users clicking the only checkbox on the IABot page, which says "Add archives to all non-dead references". I have requested a UI improvement at User talk:cyberpower678#IABot: suggest UI clarification. — JFG talk 07:37, 30 July 2017 (UTC)
Bad idea. Have any of you "archive all links" enthusiasts considered whether the Wayback Machine would be able to handle the increased traffic if you replaced 20 million links with archived links? It's been timing out for hours now. A much better way to handle a dead link would be to have a bot ping the two most recent editors on the page with a "dead link" notice, then check back a week or two later to see whether the problem has been rectified (I have no idea whether that's technically feasible). Often the reason for the dead link is that the publisher moved the article, and a search for author and title will provide the new active link. Follow-up on Wayback Machine: Just got this from one of those bot-generated archived links. Space4Time3Continuum2x ( talk) 19:54, 30 July 2017 (UTC)
Good idea. I think a major argument against is the bloating of the code. I have been editing Wikipedia for about 7 months. When I first looked at the code I couldn't believe how messy it was; in my humble opinion it is horrible. I don't think adding archive links will make it appear any more bloated. To reduce the bloating I have discovered 2 templates that I now use for citations. List Defined References moves all of the citation code to the bottom of the page, so at the citation point all you do is call the citation name using Template:R. The wikitext stays tidy and readable and is not susceptible to bloating. All editors need to be educated to use this template. The remaining issues are beyond my level of understanding and I'll leave them for others to discuss. Every link I cite is archived; it would be a hell of a job to go back and recover archived URLs for each citation I have created once the link has died. 8==8 Boneso ( talk) 04:33, 31 July 2017 (UTC)
References
Please take a look at WP:TH#Dead WSJ links and the VPT thread linked from it. We are getting complaints about bot edits made this past March by Bender the Bot 8. DES (talk) DESiegel Contribs 03:20, 27 July 2017 (UTC)
See WP:TH#Dead WSJ links and this edit, where removing the s apparently changed an effectively dead link to a live and correct one. Is this something temporary at wsj.com, or are we going to have to get a bot to undo these changes from March? DES (talk) DESiegel Contribs 02:41, 27 July 2017 (UTC)
@ Bender235: and others: While the debate over which sort of link to use is a valid one, any discussion of this matter should note Wikipedia:Sockpuppet investigations/Nate Speed. – Train2104 ( t • c) 00:38, 2 August 2017 (UTC)
This has just been created. Feel free to be WP:BOLD and add missing terms which you feel would be useful. Headbomb { t · c · p · b} 14:10, 7 August 2017 (UTC)
Ponyo protected the page Kitni Girhain Baaki Hain because of sockpuppetry, but the protection template was removed by MusikBot, which said that the page is unprotected. SahabAli wadia 11:12, 19 August 2017 (UTC)
As soon as a page is moved, Xqbot fixes all the resulting double redirects immediately. Also, the links to the fixed target pages are shown with the prefix "en:" in the edit summaries. I don't like this behavior, because it can lead to serious errors when there is page-move vandalism. The bot should return to its old behavior. GeoffreyT2000 ( talk, contribs) 23:49, 11 August 2017 (UTC)
I am wondering if we have any policy, rules, or consensus on what to do with a bot where a) the bot isn't used by anyone but its operator and b) the operator hasn't edited Wikipedia for anything but the creation of this bot. Basically, a bot which is at best for the convenience of one reader, and at worst not used at all, but still editing every day.
Specifically, I am concerned about the recently approved User:Wiki Feed Bot, operated by User:Fako85. It makes 16 edits a day, to its own space, to subpages of Fako85, and to User:DNNSRNST, which is an editor with one edit (setting up his talk page for this bot, which wasn't even approved at the time). Fako85 has made no edits unrelated to this bot.
The value of having such a bot seems minimal, and I'm not sure that this value is sufficient to outweigh the potential risks (strain on servers? bot account hacking?). Fram ( talk) 07:55, 8 September 2017 (UTC)
Do we know how heavy [a read of a large portion of the recent changes log] is? That is actually a point that I did not check when calling this a low-resource-usage bot. Tigraan Click here to contact me 12:54, 8 September 2017 (UTC)
@ Fram: The BOTREQ contains the text: "Currently Wiki Feed does not use the RCStream. We're considering it, but we need some time to implement this as it requires a fair amount of changes to the system.". Maybe it is wise to ask Fako to switch to EventStreams? ((( The Quixotic Potato))) ( talk) 19:40, 8 September 2017 (UTC)
If I understand the BOTREQ correctly (specifically the edit dated 12:42, 22 July 2017) then the bot will have to check if all images it is using are still usable every 24hrs. Imagine if a lot of people use this bot, then that would mean a massive amount of requests, right? ((( The Quixotic Potato))) ( talk) 20:06, 8 September 2017 (UTC)
For those who have missed it this week, something bot-related. Headbomb { t · c · p · b} 02:52, 11 September 2017 (UTC)
I've spotted User:Bender the Bot, User:KolbertBot and maybe others, helpfully converting HTTP links to HTTPS where sites have begun supporting encrypted connections since links were added to articles. It looks as if this is being done a few websites at a time based on prevalence of links to each site and ease of conversion (obviously much easier all round if http://example.com/page corresponds exactly to https://example.com/page without needing to further amend the URL). Has anyone considered using the rulesets established for HTTPS Everywhere to find many, many more sites that can have link conversion applied, including lots of obscure 'long tail' ones that are never going to get noticed by the bot operators? These rulesets are well tested because they are in daily use by HTTPS Everywhere's userbase, so there shouldn't be too many problems encountered where links are broken by the change, even if relatively complex regular expressions have to be applied rather than straightforwardly adding an 's'. See https://www.eff.org/https-everywhere/atlas/ for a list and https://www.eff.org/https-everywhere/rulesets for more info. If this is too complicated, would it be worth instead (or for starters) plundering the resource that is Chrome's HSTS preload list? Each of the sites on it has committed to serving web content through HTTPS only for the long haul, generally redirecting http:// URLs themselves (but thwarted if someone is intercepting traffic on a user's first visit, hence the need for a preload list shipped with the browser), and may have been considered a high-value target for surveillance/man-in-the-middle by the maintainers of the list. Either way, relevant work is being done in this area by outside parties that bot operators here could piggyback on. Beorhtwulf ( talk) 16:27, 18 September 2017 (UTC)
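The simple per-domain case described above (where only the scheme changes) might look like this in Python. The SAFE_DOMAINS set is an illustrative stand-in for an HSTS preload list or HTTPS Everywhere ruleset, not a real allowlist, and real rulesets also rewrite paths, which this sketch ignores:

```python
import re

# Sketch of scheme-only upgrades driven by a known-good domain list.
# SAFE_DOMAINS is an illustrative stand-in for an HSTS preload list or
# HTTPS Everywhere ruleset, not a real allowlist.
SAFE_DOMAINS = {"example.com", "en.wikipedia.org"}

def upgrade_links(wikitext: str) -> str:
    def repl(m: re.Match) -> str:
        domain = m.group(1).lower()
        ok = domain in SAFE_DOMAINS or any(
            domain.endswith("." + d) for d in SAFE_DOMAINS
        )
        return "https://" + m.group(1) + m.group(2) if ok else m.group(0)
    return re.sub(r"http://([^/\s]+)(\S*)", repl, wikitext)
```

Sites needing more than an added "s" would be handled by per-site rules, as the HTTPS Everywhere rulesets do.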
This discussion was spread across multiple pages, including some village pumps, but organically ended up at a dedicated page. However, we are now at (or beyond?) the stage where this again needs wide publicity and participation. Please see the long discussion at Wikipedia talk:Wikidata/2017 State of affairs#Strategies for improving the descriptions and especially the "Proposal from WMF" subsection (currently the bottom one), where a new magic word is proposed and specific implementations of it discussed.
This is a discussion which will potentially impact all articles and affect the first thing all users get to see in mobile view and on apps (and elsewhere), so getting enough input on this is important. I have posted it at WP:CENT, WP:VPPR and WP:AN. Feel free to drop notes at other places.
I have included this noticeboard because, if this proposal or something similar gets accepted, there probably will need to be one or two big bot runs (and perhaps some clever bot programming) across many or all articles.
Please keep the actual discussion in one place if possible. Fram ( talk) 07:07, 29 September 2017 (UTC)
Hi, I think this got approved too fast, and under a misconception. The BRFA was open for a grand total of 16 minutes, and was closed by a BAG member whose bot's (for lack of a better word) errors are the subject of the task. It was clarified that these are actually supposed to be medium priority lint errors, not high priority like the BRFA states. Pings: @ Cyberpower678 and Nihlus:. Legoktm ( talk) 07:46, 1 October 2017 (UTC)
Bot edits can be hidden from view within the watchlist, and those who do not want to hide bot edits can learn what the bot is doing with one look and ignore the rest. – Nihlus ( talk) 15:50, 1 October 2017 (UTC)
I knew I had seen a discussion about this before: it is at Wikipedia:Bots/Noticeboard/Archive 11#Archiving links not dead - good idea? Most of the discussants there (obviously mostly fans of bots) seemed to approve of archiving all the reference links in an article, even the live ones. Some of us less technically oriented editors think the practice can be damaging to articles. Recent example, which is the reason I am bringing it up: With this recent edit to the article Barack Obama, the IABot v1.5.2 archived 392 references, adding 74,894 bytes to the article, and increasing its already huge size by 22.6%, from 330,241 to 405,135 bytes. Is that really something that people here think is a good outcome? (The other editor reverted at my request.) Does the bot offer the option of archiving only the dead links, as some of us non-techie people have requested? -- MelanieN ( talk) 18:04, 17 September 2017 (UTC)
@ Dhtwiki and Cyberpower678: I posted a question about this at Wikipedia talk:Link rot#Using a tool to archive live links. (Although given that project's enthusiasm about archiving, I wonder if that was kind of like asking an insurance salesman if I need more insurance!) -- MelanieN ( talk) 15:12, 4 October 2017 (UTC)
There is a discussion at Wikipedia talk:Double redirects#The bots should operate with a delay where the input of bot operators, particularly those who operate bots which fix double redirects, would be useful. Note that the section contains multiple ideas (not just the one in the section title), but not yet any firm proposals. Thryduulf ( talk) 16:44, 9 October 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Please see Wikipedia talk:AutoEd#Two bad edit types: changing spaced en dashes to unspaced em dashes, against MOS:DASH; changing page and other numeric ranges like 1901–1911 to 1901–11, which is against the spirit if not letter of MOS:NUM. The fact that the rule to use full numbers was only applied to the "Dates" section at MOSNUM is an oversight, which has been fixed (I expect the fix to stick, because all the reasoning about date ranges also applies to other ranges). The rest of what this tool is doing needs closer examination under the style guidelines and WP:COSMETICBOT. — SMcCandlish ☏ ¢ >ʌⱷ҅ᴥⱷʌ< 23:39, 10 October 2017 (UTC)
Wikipedia_talk:Version_1.0_Editorial_Team/Index#Draft_links may be of interest. If no one more experienced with Wikimedia bots can look into it, I could eventually, but I have not yet looked for its code and have no experience with the APIs involved yet. The previous maintainers are inactive. Thanks, — Paleo Neonate – 01:21, 11 October 2017 (UTC)
This may interest many of you. Please comment. Headbomb { t · c · p · b} 12:29, 11 October 2017 (UTC)
AvicBot used to list pages from Category:Candidates for speedy deletion as abandoned AfC submissions at User:AvicBot/AfCCSD. However, apparently the list stopped being updated because the category was renamed to " Candidates for speedy deletion as abandoned drafts or AfC submissions" per an RfC that expanded the scope of G13. The bot needs to be updated to use the new category name to prevent the list from being permanently empty. GeoffreyT2000 ( talk, contribs) 23:15, 5 October 2017 (UTC)
{{Portal:Current events/{{#time:Y F j|{{{1|{{CURRENTYEAR}}}}}-{{{2|{{CURRENTMONTH}}}}}-{{{3|{{CURRENTDAY}}}}}}}}}
should solve the problem. It's fully protected, otherwise I would do it.
Nihlus 15:48, 13 October 2017 (UTC)
The extra lines have been removed. -- John of Reading ( talk) 06:02, 23 October 2017 (UTC)
Hello there, I was wondering if anyone could direct me as to how to get started using and mining Wikipedia database dumps? I have downloaded the latest pages-articles.XML.bz2 version. The goal is to mine for a particular string in order to figure out the relative need for a bot and to build a list of pages that would need to be edited if the string is present within the namespace. ( Xaosflux sent me here). Thank you for your help. -- TheSandDoctor ( talk) 16:15, 24 October 2017 (UTC)
Hey, all, I was thinking of writing a script to automatically assess G13 speedy deletion requests, after an influx of them today. Basically, I'd write a script that would automatically scan Category:Candidates_for_speedy_deletion#Pages_in_category where, for each page there in the Draft namespace, check to see if the CSD nomination is G13 (probably by testing for inclusion in Category:Candidates for speedy deletion as abandoned drafts or AfC submissions and the second-most recent edit (i.e. the edit before the nomination) is more than 6 months prior, and if so, provide a deletion link on the page. But I don't know if such assistance is too close to MEATBOT-like automation, especially given the use of admin tools, so I figured I'd ask here first in case people think that would need some kind of approval. I figure G13 is low-impact enough (not article space, free refund) and has a simple enough inclusion criteria that it isn't a big deal. Any thoughts? Writ Keeper ⚇ ♔ 18:05, 27 October 2017 (UTC)
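The eligibility test described above can be sketched as a pure function. Everything here is an illustrative assumption rather than the actual script: the name, the 182-day approximation of "6 months", and the newest-first timestamp ordering (which is how the MediaWiki API returns revisions by default):

```python
from datetime import datetime, timedelta

G13_THRESHOLD = timedelta(days=182)  # "six months", approximated

def g13_eligible(revision_timestamps, now=None):
    """`revision_timestamps` is newest-first; index 0 is the CSD
    nomination edit itself. A draft looks G13-eligible only when the
    edit *before* the nomination is more than six months old."""
    if len(revision_timestamps) < 2:
        return False
    now = now or datetime.utcnow()
    return now - revision_timestamps[1] > G13_THRESHOLD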
I thought I would point everyone to the significance of v1.6 of IABot.
https://github.com/cyberpower678/Cyberbot_II/pull/46 — CYBERPOWER ( Trick or Treat) 21:46, 28 October 2017 (UTC)
A provocative title for a cautionary tale. Please see User_talk:Ladsgroup#Helping the vandals do their work for the details of a minor episode of janitors cleaning up the crime scene too quickly / early / inappropriately. Shenme ( talk) 23:34, 22 October 2017 (UTC)
Is this a legit bot? I don't recall any BRFAs for it... CHRISSYMAD ❯❯❯ ¯\_(ツ)_/¯ 13:48, 9 November 2017 (UTC)
The 2017 Community Wishlist Survey is up for proposals (November 6-19). You can make proposals and comment on stuff to help the technical collaboration review and organize the proposals, but the larger community input will happen from Nov 27–Dec 10. Headbomb { t · c · p · b} 15:12, 8 November 2017 (UTC)
Where's the place to "seek out a community discussion" on "a community bot" (whatever that means) about getting a bot to stop leaving a particular kind of pointless message? The InternetArchiveBot does various things, and even leaves some helpful messages, but when it leaves a note on an article's talk page saying that all it did was provide an archive-url to a cite that didn't have one, this is pointless, annoying bot-spam. We don't need to know that it did something trivial that no one sane would question, and we already know – anyone watching the article already saw the edit, so now they're getting a second watchlist hit for the same thing for no reason.
I went to the bot's talk page, and it isn't editable except by admins. I got to the author/operator's page, which directed me to file a ticket about it at Phabricator. So I did [17]. The response to that was a testy "The bot is currently approved to run with these message.", which is a silly thing to say. All the bots are approved to do what they do or their operator would be in trouble and the bot would be blocked. I was told "The last discussion regarding them had no consensus for change", which means it has been discussed before and other people are tired of these messages, too. "If you feel the bot should stop leaving messages, please seek out a community discussion. This is a community bot". I see a bot requests page, which seems to be only for asking for bots to do stuff, not to stop doing things, and isn't really a discussion page; and the noticeboard, which appears to be for reporting bugs and policy violations.
So, I'm not really sure what the process or venue is. PS: This isn't about ALL InternetArchiveBot notices, just the no-one-will-care pointless ones. — SMcCandlish ☏ ¢ >ʌⱷ҅ᴥⱷʌ< 12:50, 4 October 2017 (UTC)
Simple pseudocode fix: if $CHANGESBOTMADE == ($ARCHIVEURL or ($ARCHIVEURL + $DEADURLYES)) then $POSTABOUTIT = no
– i.e., if it's done anything at all other than that trivia (including that trivia and something non-trivial), then go ahead and post a notice.
— SMcCandlish ☏ ¢ >ʌⱷ҅ᴥⱷʌ< 15:04, 4 October 2017 (UTC), clarified 23:05, 4 October 2017 (UTC)
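The pseudocode above can be made concrete. In this minimal sketch the change-code names are purely hypothetical — IABot's real change tracking is not documented in this thread:

```python
# Hypothetical labels for the kinds of changes a link-rescue edit can make.
TRIVIAL_CHANGES = {"added_archive_url", "marked_dead"}

def should_post_talk_notice(changes_made):
    """Stay silent when the edit consisted only of adding archive URLs
    (and/or dead-link flags); post a notice as soon as anything
    non-trivial is in the mix."""
    return any(change not in TRIVIAL_CHANGES for change in changes_made)
```

The point of the design is the direction of the test: the notice is suppressed only when *every* change is trivial, so an edit mixing trivia with a substantive change still gets announced.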
I am the editor that created the discussion in Wikipedia talk:TPG about bot notices in talk pages. In regards to the InternetArchiveBot and its announcement about modification of external links: what's the point of keeping these notices on an article's talk page after an editor has checked the links and found them okay? Pyxis Solitary talk 05:27, 5 October 2017 (UTC)
I for one find the InternetArchiveBot notices useful. Though the bot is getting better, it doesn't always pick the right archive URL and sometimes misclassifies links as dead; it's also quite possible that a link that it detects needs manual updating. The talk page notices, which as far as I know only show up when the bot adds a (possibly invalid) archive URL, are a useful way of keeping track of what it does, and serve to show that a human has indeed OK'd the bot's changes. The notices also serve as a handy way to get to the bot's interface, which I've used several times. Graham 87 08:05, 5 October 2017 (UTC)
Agree that these posts are a waste of time and bandwidth. I notice ClueBot NG doesn't do the same thing whenever it reverts vandalism. It simply leaves a link in the edit summary asking others to report false positives. I don't see why something similar can't be implemented here - in the example SMC provides, the summary for the edit the bot is referring to simply reads, "Rescuing 1 sources and tagging 0 as dead. #IABot (v1.5.4)". There's plenty of space in there for a link like ClueBot NG leaves. It's one thing to alert users to edits like these, but there's a better way to do it, if it needs to be done at all. Zeke, the Mad Horrorist (Speak quickly) (Follow my trail) 14:13, 7 October 2017 (UTC)
I suggest WP:BOTAPPEAL for communications issues. ... when the bot was doing a straightforward edit, the talk page message seemed completely over the top. That is not to say the bot isn't useful and, by the look of it, working well. But I think there are a number of WP:BOTCOMM issues.
I think, as it is a bot, it would be better if it admitted to being a bot and gave precise instructions. Rather than [this diff] I think I'd prefer to see something along the lines of:
The Internet Archive BOT has made the following changes:
URL1 (dead) -> Archived URL
It would be helpful if the modifications could be manually reviewed, setting checked=true in the sourcecheck template if the edit was successful, or failed if not. For detailed information on InternetArchiveBot see **HOWTO** .. the HOWTO going back to a BOT page or subpage, and ensuring the FAQ/HOWTO covered the case of manually checking BOT work first.
In summary, the BOT looks to be doing some great work, but I think it's really tricky not to fall foul of WP:BOTCOMM, and I think that area needs improvement. I'd prefer it didn't make a talk page entry for simple edits, but understand that *might* be considered necessary. Djm-leighpark ( talk) 23:07, 9 November 2017 (UTC)
Notification for anyone who uses that category in their bot. Jo-Jo Eumerus ( talk, contributions) 11:00, 11 November 2017 (UTC)
I tried to start a discussion regarding Cluebot on the Cluebot talk page and my comments were archived by the bot without response. I'm concerned about Cluebot reverting good-faith edits, and the effect this may have on potential contributors.
Reading through the Cluebot pages and considering the lack of response, and rapid archiving, of my comment -- it is my feeling that discussions of this nature are not welcomed by the bot operator. It seems to me that the wider community ought to have a voice in how Cluebot is operated and should be entitled to review Cluebot's work on an ongoing basis and discuss the bot's settings and edits without having to fill out forms and have the discussion fragmented. I am concerned that the characterization of the 0.1% "false positive rate" used by the bot's proponents, though useful technically, belies the substantial number of good-faith edits this bot is reverting. Since it has been some years since the bot was approved, I think it's appropriate to review the work it is doing in light of the current editing climate and the evolution of the bot itself (and its settings) over the years.
At a minimum, I believe that the bot's operators and proponents have an obligation to take these concerns seriously enough to discuss them.
While mistaken reverts can be undone, the frustration they may cause to a well-meaning, fledgling contributor cannot.
The Uninvited Co., Inc. 19:52, 3 November 2017 (UTC)
( ←) To answer your two specific questions:
How have the decisions been made over what edits the bot will revert?
— The Uninvited Co., Inc.
What is the best way to have an open discussion about the way this automation is being conducted and its effect on new contributors?
— The Uninvited Co., Inc.
-- Cobi( t| c| b) 23:03, 3 November 2017 (UTC)
To reply to your comments here:
I tried to start a discussion regarding Cluebot on the Cluebot talk page and my comments were archived by the bot without response. I'm concerned about Cluebot reverting good-faith edits, and the effect this may have on potential contributors.
— The Uninvited Co., Inc.
False positives are an unfortunate technical inevitability in any system that automatically categorizes user content. Human editors suffer from this failing as well. The only thing that can be done is to figure out where the trade-off should be made. I am certainly open to discussing where that trade-off is, but as you haven't made a proposal yet, I am happy with where it currently is.
Reading through the Cluebot pages and considering the lack of response, and rapid archiving, of my comment
— The Uninvited Co., Inc.
It's the same 7 day archival period you have on your talk page. I was busy and your message at the time didn't appear particularly urgent in nature, and in the 7 days no one else had any thoughts on the matter and so the bot archived it.
it is my feeling that discussions of this nature are not welcomed by the bot operator.
— The Uninvited Co., Inc.
This is a hasty generalization.
It seems to me that the wider community ought to have a voice in how Cluebot is operated and should be entitled to review Cluebot's work on an ongoing basis and discuss the bot's settings and edits without having to fill out forms and have the discussion fragmented.
— The Uninvited Co., Inc.
Free-form discussion is encouraged on the bot's talk page. Or here.
I am concerned that the characterization of the 0.1% "false positive rate" used by the bot's proponents, though useful technically, belies the substantial number of good-faith edits this bot is reverting.
— The Uninvited Co., Inc.
False positive rates are used as standard metrics for any kind of automated classification system. <0.1% means less than one edit is falsely categorized as vandalism out of every thousand edits it examines.
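The arithmetic behind that tension is simple; in this sketch the daily edit volume is a hypothetical number for illustration, not a ClueBot NG statistic:

```python
def expected_false_positives(edits_examined, fp_rate=0.001):
    """A <0.1% false-positive rate bounds mistaken reverts at fewer than
    one per thousand edits examined - small as a rate, but not small in
    absolute terms at a large edit volume."""
    return edits_examined * fp_rate

# Hypothetical scale: if the bot examined 100,000 edits in a day,
# a 0.1% rate bounds the mistaken reverts at roughly 100 that day.
```

This is exactly the gap between the two views in the thread: a rate that is excellent for a classifier can still correspond to a substantial absolute number of reverted good-faith editors.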
Since it has been some years since the bot was approved, I think it's appropriate to review the work it is doing in light of the current editing climate and the evolution of the bot itself (and its settings) over the years.
— The Uninvited Co., Inc.
Review is always welcome so long as it comes with concrete, actionable changes of which the merits can be properly discussed. Pull requests are even better.
At a minimum, I believe that the bot's operators and proponents have an obligation to take these concerns seriously enough to discuss them.
— The Uninvited Co., Inc.
We do.
While mistaken reverts can be undone, the frustration they may cause to a well-meaning, fledgling contributor cannot.
— The Uninvited Co., Inc.
Of course, but that is hard to measure objectively. Do you have any good metrics on the frustration caused to well-meaning, fledgling contributors? I'd love to see that data, and be able to tweak things to help those metrics go in the direction we want. -- Cobi( t| c| b) 23:39, 3 November 2017 (UTC)
See this discussion on Meta. Old/invalid accounts were renamed & given new "enwiki" names by the Maintenance script bot but the original accounts apparently weren't closed & account info wasn't migrated to the new/valid accounts... So. Editors are continuing to edit under the old/invalid accounts. Shearonink ( talk) 16:30, 9 November 2017 (UTC)
The community is invited to comment on the appeal lodged by Δ at Arbitration Requests for Clarification and Amendment.
While the discussion at Wikipedia talk:Double redirects#The bots should operate with a delay has pretty much died down without clear consensus, there's been a suggestion that double-redirect-fixing bots should tag the redirects they fix with {{ R avoided double redirect}}. This will help alert human editors to redirects that are left pointing to the wrong location as a result of disputed moves or mergers being reverted. Can this be implemented? Pinging bot operators R'n'B, Xqt and Avicennasis. -- Paul_012 ( talk) 10:12, 21 November 2017 (UTC)
See Wikipedia talk:Arbitration/Requests#Crosswiki issues: Motion (November 2017). This will be relevant both to WP:BAG members and Wikidata-related bot operators. Headbomb { t · c · p · b} 00:37, 28 November 2017 (UTC)
Since AWB has a bot flag, which turns it into a bot engine, I thought you might want to know about a vote going on that will affect the nature of that program:
The Transhumanist 00:25, 2 December 2017 (UTC)
Is broken.
It was reported a month ago and also (in the wrong place) here. -- Green C 20:28, 24 March 2017 (UTC)
Pinging Dispenser. -- Edgars2007 ( talk/ contribs) 14:39, 31 March 2017 (UTC)
Dispenser has disabled the "save page" portion of Checklinks on enwiki until he can update the code. -- Green C 18:05, 5 April 2017 (UTC)
If you're making a bot-related presentation at Wikimania 2017 in Montreal, advertise it here!
I'll be making at least two, assuming they are accepted, one on Article Alerts, the other on Journals Cited by Wikipedia (and possibly a third one on bots in general). If you are interested in attending, please sign up! Headbomb { t · c · p · b} 12:59, 7 April 2017 (UTC)
Bots Newsletter, April 2017
Greetings! The BAG Newsletter is now the Bots Newsletter, per discussion. As such, we've subscribed all bot operators to the newsletter. You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list. Highlights for this newsletter include:
Magioladitis ARBCOM case has closed. The remedies of the case include:
We currently have 27 open bot requests at Wikipedia:Bots/Requests for approval, and could use your help processing!
There are multiple ongoing discussions surrounding bot-related matters. In particular:
Several new things are around:
Wikimania 2017 is happening in Montreal, during 9–13 August. If you plan to attend, or give a talk, let us know! Thank you! edited by: Headbomb 11:35, 12 April 2017 (UTC) (You can unsubscribe from future newsletters by removing your name from this list.)
Some old Javascript, some of which has been deprecated for more than five years, is being removed later this month. Some old scripts may need to be updated. If you think that this might be relevant to your code, then please see https://lists.wikimedia.org/pipermail/wikitech-ambassadors/2017-April/001574.html for more details (including links that show how to fix most of the code). Whatamidoing (WMF) ( talk) 19:25, 12 April 2017 (UTC)
Can anybody please assist Cobi ( talk · contribs) in fixing ClueBot III ( talk · contribs)? It is taking too much text when it archives threads, see User talk:ClueBot Commons/Archives/2017/April#Improper archival of heading, part 3. Basically, when archiving a level 4 subsection, the bot assumes that the subsection terminates with the next level 4 heading - and if there is an intervening level 3 heading, it is archiving that too, which is an error. -- Redrose64 🌹 ( talk) 20:36, 27 April 2017 (UTC)
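The fix this report implies is the usual section-scope rule: a subsection ends at the next heading of the same *or higher* level, not merely the next heading of the same level. A hedged sketch of that rule (not ClueBot III's actual code):

```python
import re

# A wikitext heading: 2-6 '='s, title, matching '='s.
HEADING = re.compile(r"^(={2,6})\s*.*?\s*\1\s*$")

def section_span(lines, start):
    """Given wikitext `lines` and the index of a heading line, return
    (start, end) for that section. It ends at the next heading whose
    level is the same or *higher* (fewer or equal '='s) - so a level-4
    subsection stops at an intervening level-3 heading."""
    level = len(HEADING.match(lines[start]).group(1))
    for i in range(start + 1, len(lines)):
        nxt = HEADING.match(lines[i])
        if nxt and len(nxt.group(1)) <= level:
            return start, i
    return start, len(lines)
```

With this rule, archiving the level-4 "Sub A" below stops at the level-3 "Part 4" heading instead of running on to the next level-4 heading.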
You can check whether a script is affected by adding ooui=1 to the end of the URL: https://en.wikipedia.org/?title=Marie_Curie&action=edit&ooui=1 I'm told that Twinkle will probably be okay (no problems in limited testing), but a few other scripts and bots may need to be updated. If you think your bot or script will be broken by this, and you can't figure out how to update it, then please post requests for help either here or at WP:VPT. Whatamidoing (WMF) ( talk) 17:18, 9 May 2017 (UTC)
We had a few more scripts break at fawiki than we had hoped for, so we're slowing this down a bit. I've collected some information, including diffs of some repairs, at mw:OOjs UI/Fixing scripts and gadgets. If you maintain any scripts, or if you depend upon any and aren't certain whether the owner is actively maintaining them, please check that page and start testing (instructions on the page).
Please also share this information with people at other wikis. Whatamidoing (WMF) ( talk) 19:23, 17 May 2017 (UTC)
In followup to something mentioned here. Wikipedia:Administrators'_noticeboard/Incidents#Unauthorized bot job, unresponsive operator would BRFA be the place to request a bot flag for a user account so that it CAN be run in an approved manner?
An additional concern expressed was that the BOT components of it were not compliant with the relevant policy on Cosmetic edits. ShakespeareFan00 ( talk) 21:19, 23 May 2017 (UTC)
Please note a nomination for Bot Approvals Group membership is active. Feel free to comment here. ~ Rob13 Talk 22:46, 26 May 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
The following bots, and their respective operator, have each had no contributions in over two years and are scheduled to be deauthorized in one week per the bot policy activity requirements. If your bot is listed and you wish to retain authorization, please add a note to the table and sign below.
bot_name | bot_editcount | bot_lastedit | oper_name | oper_lastedit | notes |
---|---|---|---|---|---|
User:SelketBot | 16870 | 20110624183928 | User:Selket | 20140216162053 | |
User:SkiersBot | 124334 | 20110715052412 | User:Skier_Dude | 20120917042322 | |
User:MartinBotIII | 136346 | 20110731122144 | User:Martinp23 | 20130427212553 | |
User:Kotbot | 157583 | 20110816121147 | User:Kotniski | 20120124000153 | |
User:WalkingSoulBot | 1 | 20110823130647 | User:WalkingSoul | 20110605220714 | |
User:GurchBot | 7421 | 20110919112313 | User:Gurch | 20130804182024 | |
User:MiszaBot | 81480 | 20111013170506 | User:Misza13 | 20150219094323 | |
User:DodoBot | 136137 | 20111126163905 | User:EdoDodo | 20111126164139 | |
User:RaptureBot | 13074 | 20111218221254 | User:FinalRapture | 20111120060515 | |
User:Rfambot | 1774 | 20120213174928 | User:Jennifer Rfm | 20131106230051 | |
User:FlBot | 14324 | 20120217110113 | User:Fl | 20140326014308 | |
User:MessageDeliveryBot | 10187 | 20120605022949 | User:EdoDodo | 20111126164139 | |
User:AlanBOT | 6712 | 20130429203141 | User:ikseevon | 20130429040405 | |
User:MMABot | 5265 | 20130505205805 | User:TreyGeek | 20130628122155 | |
User:LyricsBot | 27368 | 20130921052032 | User:Dcoetzee | 20141003225306 | Operator has been banned; accounts are already globally locked |
User:DyceBot | 45604 | 20140105070113 | User:Dycedarg | 20140315182843 | |
User:HersfoldArbClerkBot | 11398 | 20140110024813 | User:Hersfold | 20140110040539 | |
User:IPLRecordsUpdateBot | 19 | 20140210113220 | User:Jfd34 | 20140420092748 | |
User:Wpp research bot | 3 | 20140328200839 | User:Jantin | 20141222190945 | |
User:AstRoBot | 4229 | 20150125114428 | User:WDGraham | 20150214171645 | |
User:HBC AIV helperbot7 | 253005 | 20150204230319 | User:Wimt | 20150512214048 | |
So I had an idea regarding bot tasks, specifically regarding follow-up. I've often wondered (even with my own tasks) how many edits were actually made during a bot run, and how accurate the initial estimate turned out to be. I've also wondered about minor tweaks that were made to the code to improve it.
Would it be reasonable to ask for bot operators to give some sort of "after action report" for one-off bot runs? Primefac ( talk) 14:14, 8 June 2017 (UTC)
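As a sketch of what such an "after action report" might contain — the field names and the drift metric are my invention, not an agreed format:

```python
def after_action_report(task_name, estimated, edit_timestamps):
    """Summarize a completed one-off run: actual edit count versus the
    BRFA estimate, and the span of the run."""
    actual = len(edit_timestamps)
    drift = (actual - estimated) / estimated * 100 if estimated else float("nan")
    return {
        "task": task_name,
        "estimated": estimated,
        "actual": actual,
        "estimate_drift_pct": round(drift, 1),
        "first_edit": min(edit_timestamps, default=None),
        "last_edit": max(edit_timestamps, default=None),
    }
```

The actual edit count and timestamps could be pulled from the bot's contributions for the run period; the report itself is just a summary an operator could paste back into the BRFA.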
Please comment there. Headbomb { t · c · p · b} 17:59, 22 June 2017 (UTC)
I'm hearing that phab:T53736 is being discussed seriously, and that it may affect bots. I don't understand the project yet, but if you're interested in how bots cope with redirects, then please take a look. Whatamidoing (WMF) ( talk) 16:19, 27 June 2017 (UTC)
There is a discussion happening at the Wikipedia:Village pump (proposals) page on the tone of the wording of the InternetArchiveBot messages that are being left on article talk pages. If you are interested, please see that discussion. Thanks! KDS4444 ( talk) 23:43, 29 June 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
User:InternetArchiveBot is a bot. Responding to one of its edits, I posted this on its talk page. After "saving", my post did not show?! I had to do research to discover that the (regular-looking) page said something like: "Please do not edit this page. Messages left here will likely go unnoticed." In other words: the bot is deaf. (To be clear: talk page instructions are not defining. For example: we have Redirects.) Why is this bot allowed to operate like this? - DePiep ( talk) 20:40, 28 June 2017 (UTC)
I don't see any real problem on that page: then why reply here at all (duh)?
The <div style="display:none"> appears to be why your post doesn't show. Of course, the placement of that may be disputable, as it appears intended to make new entries added at the bottom invisible (except when viewing the talk page source). — Paleo Neonate - 21:34, 28 June 2017 (UTC)
Add __NONEWSECTIONLINK__ when posts are unwanted on the page. Users using the new section link can see their post in preview and then it vanishes on save. The top could also have a source comment saying "DON'T POST TO THIS PAGE. READ THE INSTRUCTIONS." Maybe repeat it in the last section for users who try to edit that and manually add a section heading. Or add __NOEDITSECTION__ to prevent that. The unclosed <div style="display:none"> was apparently added deliberately. [1] That does seem a little extreme without an explanation. PrimeHunter ( talk) 22:32, 28 June 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
While we are at it: how and why can such a talkpage get TE-level of protection? - DePiep ( talk) 22:45, 28 June 2017 (UTC)
The problem was the unclosed <div style="display:none"> at the bottom of the page, meaning that all new sections would immediately disappear. Thank you for reminding me to extend full protection to the IABot's user page, though. Primefac ( talk) 23:37, 28 June 2017 (UTC)
The <div> tag starts a page division. If there is no corresponding </div> tag, the division ends at the bottom of the page. Any HTML element may be given a style= attribute, and the display:none declaration causes the element to not appear in the formatting structure. So if your browser has an "inspect element" feature which permits the toggling of styling, it is possible to make the thread in question display by deselecting the display:none declaration. As noted above, editing the page source also shows that it is present. -- Redrose64 🌹 ( talk) 23:53, 28 June 2017 (UTC)
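A bot or gadget could flag pages in this state before they swallow new posts. Counting openers against closers, as below, is a crude heuristic (it ignores nesting context and commented-out markup), offered only as an assumption-laden sketch:

```python
import re

DIV_OPEN = re.compile(r"<div\b[^>]*>", re.IGNORECASE)
DIV_CLOSE = re.compile(r"</div\s*>", re.IGNORECASE)

def has_unclosed_div(wikitext):
    """True when the page source opens more <div>s than it closes - the
    situation described above, where everything after an unclosed
    <div style="display:none"> is hidden to the bottom of the page."""
    return len(DIV_OPEN.findall(wikitext)) > len(DIV_CLOSE.findall(wikitext))
```

Run against a talk page's wikitext, a True result would be a reason to check whether new sections are silently disappearing.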
You all are promoted to level WP:ANI. I am not; I am but an assistant pig-keeper. — Paleo Neonate - 00:50, 29 June 2017 (UTC)
I believe that AWB bots automatically stop if a message is left on the talk page. This is good practice and it probably wouldn't be a good idea to fully protect these talk pages. I see that User:InternetArchiveBot has a shut-off option so this is not a concern in this case. — Martin ( MSGJ · talk) 11:30, 29 June 2017 (UTC)
The real question, aside from the drama above, seems to be: should bot talk pages be allowed to redirect to other pages, or should bot communication be on the bot talk page only? Personally, I'm strongly opposed to this (as above) but welcome constructive discussion to clear the matter up formally. TheMagikCow ( T) ( C) 12:26, 29 June 2017 (UTC)
Speaking as a bot operator, I tried giving blatant warnings on the bot's talk page that the bot won't respond there and that all queries are routed to the bit recycling bin. Even then, that didn't prevent editors from trying to engage the bot as a user. I filed for full page protection (requiring an admin to be able to edit) and still that doesn't prevent industrious admins from dropping notices where they'll be ignored. There's no good reason (IMO) why IAB uses that unclosed div tag, which has the side effect of potentially gobbling up other data. Highly recommend (with what I see as the application of full protection) removing that unclosed tag. Hasteur ( talk) 13:53, 29 June 2017 (UTC)
did you actually see/read/understand the message that said not to post there .. before saving. This is a good Q, but not enough. My A:
This is so far out of hand. @ Cyberpower678: In order to get this discussion over with and move on, would you be fine with adding a literal soft redirect to the bot user talk page to make it even more obvious that the editor is intended to comment on your talk page? This is not indicative of any wrongdoing on your part, just the path of least resistance (if you're interested in taking it). ~ Rob13 Talk 01:41, 30 June 2017 (UTC)
Check edit history here (2013) Maybe we encourage the bot op to wrap up changes? I am sure there are more recent examples. -- Magioladitis ( talk) 22:17, 3 July 2017 (UTC)
I have proposed to do what Cydebot does with Yobot, at least at the level of closed XfDs. -- Magioladitis ( talk) 23:35, 3 July 2017 (UTC)
Speed is kind of important for this one too. We want to minimize the time when articles are split between two categories when a category is being renamed. I wouldn't want a bot renaming a 500-article category making an edit every 10 seconds. AWB can do things quickly, so that's not a problem. The timing as Primefac states is kind of important to keep CFD running smoothly. Whenever Cydebot has been down for even a couple days, it's a huge pain to everyone trying to manage that process. ~ Rob13 Talk 18:05, 4 July 2017 (UTC)
Krinkle posted at message to several mailing lists that some of you may want to read. It begins like this:
TL;DR: In April 2017, the jQuery library in MediaWiki was upgraded from 1.x to 3.x (the current stable version), with the jQuery Migrate plugin enabled to ease migration. We temporarily still load jQuery 1.x on Wikimedia wikis. Read about the API changes at https://jquery.com/upgrade-guide/3.0/
The full message can be read here: https://lists.wikimedia.org/pipermail/wikitech-ambassadors/2017-June/001617.html
Whatamidoing (WMF) ( talk) 19:53, 5 July 2017 (UTC)
Unproductive complaining about a problem that doesn't exist. Jc86035 ( talk) Use {{ re|Jc86035}} to reply to me 14:44, 6 July 2017 (UTC)
(moved from WT:BAG#Is this the right forum to report bot operators editing unpermissioned?)
Is this the right forum to report bot operators editing unpermissioned? If not, could you redirect me? Thanks. -- Hobbes Goodyear ( talk) 20:59, 2 July 2017 (UTC)
@ JJMC89, Hobbes Goodyear, Magioladitis, Xaosflux, Primefac, BU Rob13, SQL, SkyWarrior, and Cyberpower678: Pinging those involved in previous discussions of which I am aware. ··· 日本穣 · 投稿 · Talk to Nihonjoe · Join WP Japan! 03:37, 3 July 2017 (UTC)
These edits are flagged as minor and bot so as to avoid unnecessary recent changes and watchlist impact. Additionally, no edits unrelated to the specific task are being made without explanation. Finally, the bot's user page clearly defines the task with examples. — xaosflux Talk 17:33, 3 July 2017 (UTC)
Is it OK that the bot edits a page multiple times to change HTTP→HTTPS on some external links one by one? It's page-history cluttering. Your opinions? -- XXN, 17:00, 3 July 2017 (UTC)
User:Kandymotownie recently made sweeping disruptive edits to high-profile pages such as Barack Obama and Donald Trump, via IABot, adding archive links for hundreds of sources which are still live, so this only creates useless bloat in wikitext. I reverted those but there is similar bot-assisted damage to other pages, mainly about Ghana. This user's talk page is full of warnings that s/he never responds to over several years, indicating a WP:NOTHERE attitude. How was s/he ever authorized to run a bot? In view of the recent disruptive actions, bot credentials should be revoked immediately, and perhaps a stern warning or short block is in order. — JFG talk 05:33, 7 July 2017 (UTC)
Editors are encouraged to add an archive link as a part of each citation, or at least submit the referenced URL for archiving,[note 1] at the same time that a citation is created or updated. See also this how-to guide. If you have a problem with these how-to guides, please take up your issue with the guide pages, not with editors who follow the guides in good faith. – Jonesey95 ( talk) 16:48, 7 July 2017 (UTC)
Hello everyone!!! I just nominated myself for BAG membership. Your participation would be appreciated.
Wikipedia:Bot Approvals Group/nominations/Cyberpower678 3— CYBERPOWER ( Message) 23:51, 9 July 2017 (UTC)
Bots Newsletter, July 2017
Greetings! Here is the 4th issue of the Bots Newsletter (formerly the BAG Newsletter). You can subscribe/unsubscribe from future newsletters by adding/removing your name from this list. Highlights for this newsletter include:
BU Rob13 and Cyberpower678 are now members of the BAG (see RfBAG/BU Rob13 and RfBAG/Cyberpower678 3). BU Rob13 and Cyberpower678 are both administrators; the former operates BU RoBOT which does a plethora of tasks, while the latter operates Cyberbot I (which replaces old bots), Cyberbot II (which does many different things), and InternetArchiveBot which combats link rot. Welcome to the BAG!
We currently have 12 open bot requests at Wikipedia:Bots/Requests for approval, and could use your help processing!
Wikimania 2017 is happening in Montreal, during 9–13 August. If you plan to attend, or give a talk, let us know! Thank you! edited by: Headbomb 17:12, 19 July 2017 (UTC) (You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
Hello all, I'd like to get some additional feedback on Wikipedia:Bots/Requests for approval/Yobot 55 - as to if "any any genfixes" is appropriate to bundle in here or not. I'm on the fence - this is a very fine technical task that may already be confusing for some editors to determine what occurred - but I'm also generally in support of not wasting edits and making the page better all at once. Please respond at the BRFA. Thank you, — xaosflux Talk 15:53, 23 July 2017 (UTC)
Right now, while we're in the middle of a huge list of speedy renaming of categories, Cydebot appears to have stopped working. It would be nice if some other bot could help out. עוד מישהו Od Mishehu 02:58, 24 July 2017 (UTC)
As of this particular moment, the category counts for ISBN, PMID, and RFC magic links are 1102, 1189, and 2067, respectively. Obviously RFC has been deemed enough of a problem that manual oversight is required, but I thought I'd mention it. The remainder of the ISBN/PMID pages are either on transcluded pages (i.e. the huge batch of AFD Log pages), in userspace (which I have agreed to avoid), or odd cases where manual editing will be required. I don't know what MW's timeframe for turning off magic links is/was, but I think we're at the point where en-wiki can turn them off with little to no issue. I'm not sure if that's something for us specifically to do, but I figured an update on the situation would be helpful.
There are new cases popping up (mostly in the article space) daily, so the bots will probably keep running, but the bulk of the work (249k out of 250k pages) is complete. Primefac ( talk) 12:39, 23 July 2017 (UTC)
\[?\[?OCLC\]?\]?(:| )?(\d+)
→ {{OCLC|$2}}
has worked well for me.
Headbomb { t · c · p · b} 12:53, 23 July 2017 (UTC)
I think we should exclude from the list all the "Wikipedia:Articles for creation/..." pages. -- Magioladitis ( talk) 13:50, 23 July 2017 (UTC)
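For anyone wanting to try Headbomb's find-and-replace outside AWB, here is a minimal Python sketch; note that Python's re module writes the backreference as \2 where AWB's replace box uses $2.

```python
import re

# Headbomb's pattern: an optionally bracketed "OCLC" magic link followed by
# an optional colon or space and the OCLC number.
OCLC_PATTERN = re.compile(r"\[?\[?OCLC\]?\]?(:| )?(\d+)")

def replace_oclc(wikitext):
    """Convert OCLC magic links into {{OCLC|...}} template calls."""
    return OCLC_PATTERN.sub(r"{{OCLC|\2}}", wikitext)

print(replace_oclc("The record is OCLC 12345."))
# The record is {{OCLC|12345}}.
```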
I've proposed adding a bot section to the dashboard. Comments welcomed. Headbomb { t · c · p · b} 17:26, 25 July 2017 (UTC)
The new IABot interface tool allows editors to archive all links in an article even when not dead, see Al83tito edit history ( example diff w/ 563 links added). Unlike other tools that operate on a single page, this is more like unattended AWB with a queue system giving great powers to editors who enjoy the ability to make massive edit runs with little effort. This feature can be run by any editor on-demand without needing prior consensus or approval.
We should have a discussion because this feature is not being met with complete acceptance. User talk:Al83tito has example complaints. There is an open RfC at Village Pump for doing this for all articles on Wikipedia via bot. This discussion concerns the IABot interface tool, which does the same thing on-demand. In my opinion, this feature is powerful and apparently disruptive enough that it should have more community discussion. Do we want to have this feature (archiving live links on-demand)? If so, do we want to allow it for mass edits with an AWB-like tool? And if so, do we want an approval mechanism such as AWB, or a bot approval like BOTREQ? Or leave things as they are? @ Cyberpower678, Al83tito, JFG, Lordtobi, and Dhtwiki: -- Green C 14:51, 10 July 2017 (UTC)
@ Andy Dingley: Which bots get a free ride against WP:CITEVAR? Headbomb { t · c · p · b} 01:20, 30 July 2017 (UTC)
We need a centralized debate to define a community guideline about archiving of live sources. However, a number of bot-assisted edits may be due to confusion by users clicking the only checkbox on the IABot page, which says "Add archives to all non-dead references". I have requested a UI improvement at User talk:cyberpower678#IABot: suggest UI clarification. — JFG talk 07:37, 30 July 2017 (UTC)
Bad idea. Have any of you "archive all links" enthusiasts considered whether the Wayback Machine would be able to handle the increased traffic if you replaced 20 million links with archived links? It's been timing out for hours now. A much better way to handle a dead link would be to have a bot ping the two most recent editors on the page with a "dead link" notice, then check back a week or two later to see whether the problem has been rectified (I have no idea whether that's technically feasible). Often the reason for the dead link is that the publisher moved the article, and a search for author and title will provide the new active link. Follow-up on Wayback Machine: Just got this from one of those bot-generated archived links. Space4Time3Continuum2x ( talk) 19:54, 30 July 2017 (UTC)
Good Idea. I think that a major argument against is the bloating of the code. I have been editing Wikipedia for about 7 months. When I first looked at the code I couldn't believe how messy it was; in my humble opinion it is horrible. I don't think adding archive links will make it appear any more bloated. To reduce the bloating I have discovered 2 templates that I now use for citations. List Defined References removes all of the citation code to the bottom of the page, so at the citation point all you do is call the citation name using Template:R. The wiki text would be tidy and readable and not susceptible to bloating. All editors need to be educated to use this template. The remaining issues are out of my level of understanding and I'll leave them for others to discuss. Every link I cite is archived. It will be a hell of a job to go back and recover archived urls for each citation I have created once the link has died. 8==8 Boneso ( talk) 04:33, 31 July 2017 (UTC)
References
Please take a look at WP:TH#Dead WSJ links and the VPT thead linked from it. We are getting complaints about bot edits made this past march by Bender the Bot 8. DES (talk) DESiegel Contribs 03:20, 27 July 2017 (UTC)
See WP:TH#Dead WSJ links and this edit where removing the s apparently changed an effectively dead link to a live and correct one. Is this something temporary at wsj.com, or are we going to have to get a bot to undo these changes from March? DES (talk) DESiegel Contribs 02:41, 27 July 2017 (UTC)
@ Bender235: and others: While the debate over which sort of link is a valid one, any discussion of this matter should note Wikipedia:Sockpuppet investigations/Nate Speed. – Train2104 ( t • c) 00:38, 2 August 2017 (UTC)
This has just been created. Feel free to be WP:BOLD and add missing terms which you feel would be useful. Headbomb { t · c · p · b} 14:10, 7 August 2017 (UTC)
Ponyo protected page Kitni Girhain Baaki Hain because of sockpuppetry, but this template was removed by the MusikBot saying that it is an unprotected page. SahabAli wadia 11:12, 19 August 2017 (UTC)
As soon as a page is moved, Xqbot fixes all the resulting double redirects right away, immediately and instantly. Also, the links to the fixed target pages are shown with the prefix "en:" in the edit summaries. I don't like this behavior, because it can lead to serious errors when there is page-move vandalism. The bot should return to its old behavior. GeoffreyT2000 ( talk, contribs) 23:49, 11 August 2017 (UTC)
I am wondering if we have any policy, rules, or consensus on what to do with a bot where a) the bot isn't used by anyone but its operator and b) the operator hasn't edited Wikipedia for anything but the creation of this bot. Basically, a bot which is at best for the convenience of one reader, and at worst not used at all, but still editing every day.
Specifically, I am concerned about the recently approved User:Wiki Feed Bot, operated by User:Fako85. It makes 16 edits a day, to its own space, to subpages of Fako85, and to User:DNNSRNST, which is an editor with one edit (setting up his talk page for this bot, which wasn't even approved at the time). Fako85 has made no edits unrelated to this bot.
The value of having such a bot seems minimal, and I'm not sure that this value is sufficient to outweigh the potential risks (strain on servers? bot account hacking?). Fram ( talk) 07:55, 8 September 2017 (UTC)
how heavy [a read of a large portion of the recent changes log] is? That is actually a point that I did not check when calling this a low-resource-usage bot. Tigraan Click here to contact me 12:54, 8 September 2017 (UTC)
@ Fram: The BOTREQ contains the text: "Currently Wiki Feed does not use the RCStream. We're considering it, but we need some time to implement this as it requires a fair amount of changes to the system.". Maybe it is wise to ask Fako to switch to EventStreams? ((( The Quixotic Potato))) ( talk) 19:40, 8 September 2017 (UTC)
If I understand the BOTREQ correctly (specifically the edit dated 12:42, 22 July 2017) then the bot will have to check if all images it is using are still usable every 24hrs. Imagine if a lot of people use this bot, then that would mean a massive amount of requests, right? ((( The Quixotic Potato))) ( talk) 20:06, 8 September 2017 (UTC)
For those who have missed it this week, something bot-related. Headbomb { t · c · p · b} 02:52, 11 September 2017 (UTC)
I've spotted User:Bender the Bot, User:KolbertBot and maybe others, helpfully converting HTTP links to HTTPS where sites have begun supporting encrypted connections since links were added to articles. It looks as if this is being done a few websites at a time based on prevalence of links to each site and ease of conversion (obviously much easier all round if http://example.com/page corresponds exactly to https://example.com/page without needing to further amend the URL). Has anyone considered using the rulesets established for HTTPS Everywhere to find many, many more sites that can have link conversion applied, including lots of obscure 'long tail' ones that are never going to get noticed by the bot operators? These rulesets are well tested because they are in daily use by HTTPS Everywhere's userbase, so there shouldn't be too many problems encountered where links are broken by the change, even if relatively complex regular expressions have to be applied rather than straightforwardly adding an 's'. See https://www.eff.org/https-everywhere/atlas/ for a list and https://www.eff.org/https-everywhere/rulesets for more info. If this is too complicated, would it be worth instead (or for starters) plundering the resource that is Chrome's HSTS preload list? Each of the sites on it has committed to serving web content through HTTPS only for the long haul, generally redirecting http:// URLs themselves (but thwarted if someone is intercepting traffic on a user's first visit, hence the need for a preload list shipped with the browser), and may have been considered a high-value target for surveillance/man-in-the-middle by the maintainers of the list. Either way, relevant work is being done in this area by outside parties that bot operators here could piggyback on. Beorhtwulf ( talk) 16:27, 18 September 2017 (UTC)
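A rough illustration of what piggybacking on those rulesets might look like. This is a hedged sketch only: the inlined ruleset below is a made-up, simplified example in the general shape of an HTTPS Everywhere ruleset (`<rule from="..." to="..."/>` elements), not an actual EFF file, and real rulesets also carry exclusions and other elements that this ignores.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical, simplified ruleset for illustration (NOT an official EFF file).
RULESET = """
<ruleset name="Example">
  <rule from="^http://(www\\.)?example\\.com/" to="https://example.com/"/>
</ruleset>
"""

def apply_ruleset(url, ruleset_xml):
    """Rewrite url using the first matching <rule>, else return it unchanged."""
    root = ET.fromstring(ruleset_xml)
    for rule in root.iter("rule"):
        # HTTPS Everywhere "to" patterns use $1-style backreferences;
        # Python's re.sub expects \1, so translate the common case.
        to = re.sub(r"\$(\d)", r"\\\1", rule.get("to"))
        new_url, count = re.subn(rule.get("from"), to, url)
        if count:
            return new_url
    return url

print(apply_ruleset("http://www.example.com/page", RULESET))
# https://example.com/page
```

A bot run would then apply this to every external link it extracts from an article, skipping links no ruleset matches.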
This discussion was spread across multiple pages, including some village pumps, but organically ended up at a dedicated page. However, we are now at (or beyond?) the stage where this again needs wide publicity and participation. Please see the long discussion at Wikipedia talk:Wikidata/2017 State of affairs#Strategies for improving the descriptions and especially the "Proposal from WMF" subsection (currently the bottom one), where a new magic word is proposed and specific implementations of it discussed.
This is a discussion which will potentially impact all articles and affect the first thing all users get to see in mobile view and on apps (and elsewhere), so getting enough input on this is important. I have posted it at WP:CENT, WP:VPPR and WP:AN. Feel free to drop notes at other places.
I have included this noticeboard because, if this proposal or something similar gets accepted, there probably will need to be one or two big bot runs (and perhaps some clever bot programming) across many or all articles.
Please keep the actual discussion in one place if possible. Fram ( talk) 07:07, 29 September 2017 (UTC)
Hi, I think this got approved too fast, and under a misconception. The BRFA was open for a grand total of 16 minutes, and was closed by a BAG member whose bot's (for lack of a better word) errors are the subject of the task. It was clarified that these are actually supposed to be medium priority lint errors, not high priority like the BRFA states. Pings: @ Cyberpower678 and Nihlus:. Legoktm ( talk) 07:46, 1 October 2017 (UTC)
bot edits can be hidden from view within the watchlist, and those that do not want to hide bot edits can learn what it is doing with one look and ignore the rest. – Nihlus ( talk) 15:50, 1 October 2017 (UTC)
I knew I had seen a discussion about this before: it is at Wikipedia:Bots/Noticeboard/Archive 11#Archiving links not dead - good idea? Most of the discussants there (obviously mostly fans of bots) seemed to approve of archiving all the reference links in an article, even the live ones. Some of us less technically oriented editors think the practice can be damaging to articles. Recent example, which is the reason I am bringing it up: With this recent edit to the article Barack Obama, the IABot v1.5.2 archived 392 references, adding 74,894 bytes to the article, and increasing its already huge size by 22.6%, from 330,241 to 405,135 bytes. Is that really something that people here think is a good outcome? (The other editor reverted at my request.) Does the bot offer the option of archiving only the dead links, as some of us non-techie people have requested? -- MelanieN ( talk) 18:04, 17 September 2017 (UTC)
@ Dhtwiki and Cyberpower678: I posted a question about this at Wikipedia talk:Link rot#Using a tool to archive live links. (Although given that project's enthusiasm about archiving, I wonder if that was kind of like asking an insurance salesman if I need more insurance!) -- MelanieN ( talk) 15:12, 4 October 2017 (UTC)
There is a discussion at Wikipedia talk:Double redirects#The bots should operate with a delay where the input of bot operators, particularly those who operate bots which fix double redirects, would be useful. Note that the section contains multiple ideas (not just the one in the section title), but not yet any firm proposals. Thryduulf ( talk) 16:44, 9 October 2017 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Please see Wikipedia talk:AutoEd#Two bad edit types: changing spaced en dashes to unspaced em dashes, against MOS:DASH; changing page and other numeric ranges like 1901–1911 to 1901–11, which is against the spirit if not letter of MOS:NUM. The fact that the rule requiring full numbers was only applied to the "Dates" section at MOSNUM is an oversight, which has been fixed (I expect the fix to stick, because all the reasoning about date ranges also applies to other ranges). The rest of what this tool is doing needs closer examination under the style guidelines and WP:COSMETICBOT. — SMcCandlish ☏ ¢ >ʌⱷ҅ᴥⱷʌ< 23:39, 10 October 2017 (UTC)
Wikipedia_talk:Version_1.0_Editorial_Team/Index#Draft_links may be of interest. If no one more experienced with Wikimedia bots can look into it, I could ultimately, but I have not yet looked for its code and have no experience with the APIs involved yet. The previous maintainers are inactive. Thanks, — Paleo Neonate – 01:21, 11 October 2017 (UTC)
This may interest many of you. Please comment. Headbomb { t · c · p · b} 12:29, 11 October 2017 (UTC)
AvicBot used to list pages from Category:Candidates for speedy deletion as abandoned AfC submissions at User:AvicBot/AfCCSD. However, apparently the list stopped being updated because the category was renamed to " Candidates for speedy deletion as abandoned drafts or AfC submissions" per an RfC that expanded the scope of G13. The bot needs to be updated to use the new category name to prevent the list from being permanently empty. GeoffreyT2000 ( talk, contribs) 23:15, 5 October 2017 (UTC)
{{Portal:Current events/{{#time:Y F j|{{{1|{{CURRENTYEAR}}}}}-{{{2|{{CURRENTMONTH}}}}}-{{{3|{{CURRENTDAY}}}}}}}}}
should solve the problem. It's fully protected, otherwise I would do it.
Nihlus 15:48, 13 October 2017 (UTC)
The extra lines have been removed. -- John of Reading ( talk) 06:02, 23 October 2017 (UTC)
Hello there, I was wondering if anyone could direct me as to how to get started using and mining Wikipedia database dumps? I have downloaded the latest pages-articles.XML.bz2 version. The goal is to mine for a particular string in order to figure out the relative need for a bot and to build a list of pages that would need to be edited if the string is present within the namespace. ( Xaosflux sent me here). Thank you for your help. -- TheSandDoctor ( talk) 16:15, 24 October 2017 (UTC)
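One low-memory way to scan a pages-articles dump for a string is to stream it with ElementTree's iterparse rather than loading the whole XML at once. A hedged sketch: the export-schema namespace version varies between dumps (check the root element of your copy), so the 0.10 URI below is an assumption.

```python
import bz2
import xml.etree.ElementTree as ET

# MediaWiki export namespace; the version number is dump-dependent.
NS = "{http://www.mediawiki.org/xml/export-0.10/}"

def pages_containing(dump_path, needle):
    """Yield titles of pages whose wikitext contains `needle`."""
    with bz2.open(dump_path, "rb") as f:
        for event, elem in ET.iterparse(f):
            if elem.tag == NS + "page":
                title = elem.findtext(NS + "title")
                text = elem.findtext(NS + "revision/" + NS + "text") or ""
                if needle in text:
                    yield title
                elem.clear()  # discard the parsed page to keep memory flat
```

The resulting titles could then be fed straight into an AWB page list or a BOTREQ estimate.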
Hey, all, I was thinking of writing a script to automatically assess G13 speedy deletion requests, after an influx of them today. Basically, I'd write a script that would scan Category:Candidates_for_speedy_deletion#Pages_in_category and, for each page there in the Draft namespace, check whether the CSD nomination is G13 (probably by testing for inclusion in Category:Candidates for speedy deletion as abandoned drafts or AfC submissions) and whether the second-most recent edit (i.e. the edit before the nomination) is more than 6 months prior, and if so, provide a deletion link on the page. But I don't know if such assistance is too close to MEATBOT-like automation, especially given the use of admin tools, so I figured I'd ask here first in case people think that would need some kind of approval. I figure G13 is low-impact enough (not article space, free refund) and has simple enough inclusion criteria that it isn't a big deal. Any thoughts? Writ Keeper ⚇ ♔ 18:05, 27 October 2017 (UTC)
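The date arithmetic in that check is easy to keep separate from the API plumbing (action=query&prop=revisions&rvlimit=2 would fetch the two newest revisions). A hedged sketch of just the eligibility test: the 183-day cutoff is an approximation of "six months", and the timestamps are assumed to be in the API's standard ISO form, newest first.

```python
from datetime import datetime, timedelta

def g13_eligible(rev_timestamps, now=None):
    """True if the edit before the CSD nomination (the second-newest
    revision) is more than ~6 months old. `rev_timestamps` holds MediaWiki
    API timestamps such as "2017-01-01T00:00:00Z", newest first."""
    if len(rev_timestamps) < 2:
        return False  # need the nomination edit plus at least one prior edit
    now = now or datetime.utcnow()
    prior = datetime.strptime(rev_timestamps[1], "%Y-%m-%dT%H:%M:%SZ")
    return now - prior > timedelta(days=183)
```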
I thought I would point everyone to the significance of v1.6 of IABot.
https://github.com/cyberpower678/Cyberbot_II/pull/46 — CYBERPOWER ( Trick or Treat) 21:46, 28 October 2017 (UTC)
A provocative title for a cautionary tale. Please see User_talk:Ladsgroup#Helping the vandals do their work for the details of a minor episode of janitors cleaning up the crime scene too quickly / early / inappropriately. Shenme ( talk) 23:34, 22 October 2017 (UTC)
Is this a legit bot? I don't recall any BRFAs for it... CHRISSYMAD ❯❯❯ ¯\_(ツ)_/¯ 13:48, 9 November 2017 (UTC)
The 2017 Community Wishlist Survey is up for proposals (November 6-19). You can make proposals and comment on stuff to help the technical collaboration review and organize the proposals, but the larger community input will happen from Nov 27–Dec 10. Headbomb { t · c · p · b} 15:12, 8 November 2017 (UTC)
Where's the place to "seek out a community discussion" on "a community bot" (whatever that means) about getting a bot to stop leaving a particular kind of pointless message? The InternetArchiveBot does various things, and even leaves some helpful messages, but when it leaves a note on an article's talk page saying that all it did was provide an archive-url for a cite that didn't have one, this is pointless, annoying bot-spam. We don't need to know that it did something trivial that no one sane would question, and we already know – anyone watching the article already saw the edit, so now they're getting a second watchlist hit for the same thing for no reason.
I went to the bot's talk page, and it isn't editable except by admins. I went to the author/operator's page, which directed me to file a ticket about it at Phabricator. So I did [17]. The response to that was a testy "The bot is currently approved to run with these message.", which is a silly thing to say. All the bots are approved to do what they do or their operator would be in trouble and the bot would be blocked. I was told "The last discussion regarding them had no consensus for change", which means it has been discussed before and other people are tired of these messages, too. "If you feel the bot should stop leaving messages, please seek out a community discussion. This is a community bot". I see a bot requests page, which seems to be only for asking for bots to do stuff, not to stop doing them, and isn't really a discussion page; and the noticeboard, which appears to be for reporting bugs and policy violations.
So, I'm not really sure what the process or venue is. PS: This isn't about ALL InternetArchiveBot notices, just the no-one-will-care pointless ones. — SMcCandlish ☏ ¢ >ʌⱷ҅ᴥⱷʌ< 12:50, 4 October 2017 (UTC)
Simple pseudocode fix: if $CHANGESBOTMADE == ($ARCHIVEURL or ($ARCHIVEURL + $DEADURLYES)) then $POSTABOUTIT = no
– i.e., if it's done anything at all other than that trivia (including that trivia and something non-trivial), then go ahead and post a notice. — SMcCandlish ☏ ¢ >ʌⱷ҅ᴥⱷʌ< 15:04, 4 October 2017 (UTC), clarified 23:05, 4 October 2017 (UTC)
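That pseudocode translates almost directly into Python. The change-type labels below are hypothetical placeholders, not actual IABot internals:

```python
# Edits consisting only of these change sets are considered trivial and
# should not generate a talk-page notice. The labels are hypothetical.
TRIVIAL_CHANGE_SETS = {
    frozenset({"archiveurl"}),
    frozenset({"archiveurl", "deadurl=yes"}),
}

def should_post_notice(changes_made):
    """Post a notice only when the edit did more than add an archive URL
    (optionally with a dead-url flag)."""
    return frozenset(changes_made) not in TRIVIAL_CHANGE_SETS
```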
I am the editor that created the discussion in Wikipedia talk:TPG about bot notices in talk pages. In regards to the InternetArchiveBot and its announcement about modification of external links: what's the point of keeping these notices on an article's talk page after an editor has checked the links and found them okay? Pyxis Solitary talk 05:27, 5 October 2017 (UTC)
I for one find the InternetArchiveBot notices useful. Though the bot is getting better, it doesn't always pick the right archive URL and sometimes misclassifies links as dead; it's also quite possible that a link that it detects needs manual updating. The talk page notices, which as far as I know only show up when the bot adds a (possibly invalid) archive URL, are a useful way of keeping track of what it does, and serve to show that a human has indeed OK'd the bot's changes. The notices also serve as a handy way to get to the bot's interface, which I've used several times. Graham 87 08:05, 5 October 2017 (UTC)
Agree that these posts are a waste of time and bandwidth. I notice ClueBot NG doesn't do the same thing whenever it reverts vandalism. It simply leaves a link in the edit summary asking others to report false positives. I don't see why something similar can't be implemented here - in the example SMC provides, the summary for the edit the bot is referring to simply reads, "Rescuing 1 sources and tagging 0 as dead. #IABot (v1.5.4)". There's plenty of space in there for a link like ClueBot NG leaves. It's one thing to alert users to edits like these, but there's a better way to do it, if it needs to be done at all. Zeke, the Mad Horrorist (Speak quickly) (Follow my trail) 14:13, 7 October 2017 (UTC)
suggest WP:BOTAPPEAL for communications issues. ... when the bot was doing a straightforward edit the talk page notice seemed completely over the top. That is not to say the bot isn't useful and, by the look of it, working well. But I think there are a number of WP:BOTCOMM issues.
I think as it is a bot it would be better if it admitted to being a bot and gave precise instructions. Rather than [this diff], I think I'd prefer to see something along the lines of:
The Internet Archive BOT has made the following changes:
URL1 (dead) -> Archived URL
It would be helpful if the modifications can be manually reviewed and set checked=true in the sourcecheck template if the edit was successful or failed if not. For detailed information on InternetArchiveBot see **HOWTO** .. the HOWTO going back to a BOT page or subpage and ensuring the FAQ/HOWTO covered the case of manually checking BOT work first.
In summary the BOT looks to be doing some great work, but I think it's really tricky not to fall foul of WP:BOTCOMM and I think that area needs improvement. I'd prefer it didn't make a talk page entry for simple edits, but understand that *might* be considered necessary. Djm-leighpark ( talk) 23:07, 9 November 2017 (UTC)
Notification for anyone who uses that category in their bot. Jo-Jo Eumerus ( talk, contributions) 11:00, 11 November 2017 (UTC)
I tried to start a discussion regarding Cluebot on the Cluebot talk page and my comments were archived by the bot without response. I'm concerned about Cluebot reverting good-faith edits, and the effect this may have on potential contributors.
Reading through the Cluebot pages and considering the lack of response, and rapid archiving, of my comment -- it is my feeling that discussions of this nature are not welcomed by the bot operator. It seems to me that the wider community ought to have a voice in how Cluebot is operated and should be entitled to review Cluebot's work on an ongoing basis and discuss the bot's settings and edits without having to fill out forms and have the discussion fragmented. I am concerned that the characterization of the 0.1% "false positive rate" used by the bot's proponents, though useful technically, belies the substantial number of good-faith edits this bot is reverting. Since it has been some years since the bot was approved, I think it's appropriate to review the work it is doing in light of the current editing climate and the evolution of the bot itself (and its settings) over the years.
At a minimum, I believe that the bot's operators and proponents have an obligation to take these concerns seriously enough to discuss them.
While mistaken reverts can be undone, the frustration they may cause to a well-meaning, fledgling contributor cannot.
The Uninvited Co., Inc. 19:52, 3 November 2017 (UTC)
( ←) To answer your two specific questions:
How have the decisions been made over what edits the bot will revert?
— The Uninvited Co., Inc.
What is the best way to have an open discussion about the way this automation is being conducted and its effect on new contributors?
— The Uninvited Co., Inc.
-- Cobi( t| c| b) 23:03, 3 November 2017 (UTC)
To reply to your comments here:
I tried to start a discussion regarding Cluebot on the Cluebot talk page and my comments were archived by the bot without response. I'm concerned about Cluebot reverting good-faith edits, and the effect this may have on potential contributors.
— The Uninvited Co., Inc.
False positives are an unfortunate technical inevitability in any system that automatically categorizes user content. Human editors suffer from this failing as well. The only thing that can be done is to figure out where the trade-off should be made. I am certainly open to discussing where that trade-off is, but as you haven't made a proposal yet, I am happy with where it currently is.
Reading through the Cluebot pages and considering the lack of response, and rapid archiving, of my comment
— The Uninvited Co., Inc.
It's the same 7 day archival period you have on your talk page. I was busy and your message at the time didn't appear particularly urgent in nature, and in the 7 days no one else had any thoughts on the matter and so the bot archived it.
it is my feeling that discussions of this nature are not welcomed by the bot operator.
— The Uninvited Co., Inc.
This is a hasty generalization.
It seems to me that the wider community ought to have a voice in how Cluebot is operated and should be entitled to review Cluebot's work on an ongoing basis and discuss the bot's settings and edits without having to fill out forms and have the discussion fragmented.
— The Uninvited Co., Inc.
Free-form discussion is encouraged on the bot's talk page. Or here.
I am concerned that the characterization of the 0.1% "false positive rate" used by the bot's proponents, though useful technically, belies the substantial number of good-faith edits this bot is reverting.
— The Uninvited Co., Inc.
False positive rates are used as standard metrics for any kind of automated classification system. <0.1% means less than one edit is falsely categorized as vandalism out of every thousand edits it examines.
Since it has been some years since the bot was approved, I think it's appropriate to review the work it is doing in light of the current editing climate and the evolution of the bot itself (and its settings) over the years.
— The Uninvited Co., Inc.
Review is always welcome so long as it comes with concrete, actionable changes of which the merits can be properly discussed. Pull requests are even better.
At a minimum, I believe that the bot's operators and proponents have an obligation to take these concerns seriously enough to discuss them.
— The Uninvited Co., Inc.
We do.
While mistaken reverts can be undone, the frustration they may cause to a well-meaning, fledgling contributor cannot.
— The Uninvited Co., Inc.
Of course, but that is hard to measure objectively. Do you have any good metrics on the frustration caused to well-meaning, fledgling contributors? I'd love to see that data, and be able to tweak things to help those metrics go in the direction we want. -- Cobi( t| c| b) 23:39, 3 November 2017 (UTC)
See this discussion on Meta. Old/invalid accounts were renamed & given new "enwiki" names by the Maintenance script bot but the original accounts apparently weren't closed & account info wasn't migrated to the new/valid accounts... So. Editors are continuing to edit under the old/invalid accounts. Shearonink ( talk) 16:30, 9 November 2017 (UTC)
The community is invited to comment on the appeal lodged by Δ at Arbitration Requests for Clarification and Amendment.
While the discussion at Wikipedia talk:Double redirects#The bots should operate with a delay has pretty much died down without clear consensus, there's been a suggestion that double-redirect-fixing bots should tag the redirects they fix with {{ R avoided double redirect}}. This will help alert human editors to redirects that are left pointing to the wrong location as a result of disputed moves or mergers being reverted. Can this be implemented? Pinging bot operators R'n'B, Xqt and Avicennasis. -- Paul_012 ( talk) 10:12, 21 November 2017 (UTC)
See Wikipedia talk:Arbitration/Requests#Crosswiki issues: Motion (November 2017). This will be relevant both to WP:BAG members and Wikidata-related bot operators. Headbomb { t · c · p · b} 00:37, 28 November 2017 (UTC)
Since AWB has a bot flag, that turns it into a bot engine, I thought you might want to know about a vote going on that will affect the nature of that program:
The Transhumanist 00:25, 2 December 2017 (UTC)