This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 15 | ← | Archive 20 | Archive 21 | Archive 22 |
I have recently come across an App Store app for iPhone and iPad named Wikibot. It is just a regular wiki reader, one of many available on the App Store for reading (but not editing, I notice) articles. I'm concerned the name implies something to do with bots; I just wanted to see the community's thoughts on this. The link is at [1]. Perhaps we could convince them to rename it? Rcsprinter (talkin' to me?) @ 23:00, 7 September 2012 (UTC)
I need a bot to create ~400 categories en masse on another wikiproject. What solution can you suggest? XXN ( talk) 13:56, 19 November 2013 (UTC)
I recall seeing a bot making changes such as changing a hyphen to an en dash in a range, e.g. changing 1998-99 to 1998–99. But I can't remember which bot it was. Does anyone know? — Preceding unsigned comment added by Jc3s5h ( talk • contribs) 00:16, 16 December 2013 (UTC)
Since citations are ever-popular items for bots to operate upon, I am providing this link to a discussion about what to do when citation template documentation, or editions of external printed style guides, change: WT:CITE#When citation style guides are updated.
Obviously it is important that bot operators who change citations follow the current rules for the citation style chosen for a particular article, or the rules as they existed at the time the article was written. Jc3s5h ( talk) 15:25, 8 May 2014 (UTC)
As someone who has been doing this manually for years, I hereby dutifully beg of anyone who is technically proficient and knows how to create and run a bot that will:
Please see the centralized discussion at Wikipedia:Bot requests/Archive 61#Create a BOT to alphabetize and organize categories automatically. Thank you, IZAK ( talk) 09:31, 4 August 2014 (UTC)
Please see Wikipedia:Village pump (policy)/Archive 114#Create a BOT to alphabetize and organize categories automatically. Thank you, IZAK ( talk) 22:53, 5 August 2014 (UTC)
Please see Wikipedia:Village pump (policy)#CatVisor and User:Paradoctor/CatVisor#Planned features. If you are willing and able to help this innovative WP project move along, it would be greatly appreciated. Thank you, IZAK ( talk) 23:38, 12 August 2014 (UTC)
This may or may not be a problem with the bot itself; according to something I read, it could be an AWB issue. But the bot made another 1000 edits after this problem, and it is a problem: bad code should be fixed.
I wrote an article with a heading typo: instead of two equals signs on each side, I put one on the left and two on the right. A bot changed it to two on the left and three on the right.
I told the bot operator what happened, and his attitude is that since I made a typo and his bot's edit made a similar typo (it is still unbalanced heading code), there is nothing wrong with the bot. But the bot is supposed to correct heading levels; it identified this as a levels problem and then made a pointless edit. That is a coding deficiency.
Can an experienced coder explain to the bot operator why this is an error and that it should be fixed? He does not understand. (Someone making 1000s of bot edits a day should get this.) Please don't ping me; if you control bots on Wikipedia, you, too, should get this. MicroPaLeo ( talk) 23:20, 26 February 2015 (UTC)
MicroPaLeo, surely the bot's logic could be improved to check for these cases, but in my experience a case like that is not very common. One of Wikipedia's principles is WP:BEBOLD: feel free to fix articles and improve edits. Bots are good, but not good enough; human editors are always better. Most of the backlogs need human attention, and Wikipedia is based on active editors, not bots. -- Magioladitis ( talk) 13:11, 27 February 2015 (UTC)
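A sketch of the kind of balance check being asked for above (illustrative only, not the actual bot's logic):

```python
import re

def heading_fixes(line: str) -> str:
    """Balance mismatched '=' counts in a MediaWiki heading line,
    e.g. '=Heading==' -> '==Heading=='. A bot that cannot determine
    a balanced form should skip the line rather than emit another
    unbalanced heading."""
    m = re.fullmatch(r"(=+)\s*(.+?)\s*(=+)", line.strip())
    if not m:
        return line  # not a heading-shaped line
    left, title, right = m.groups()
    if len(left) == len(right):
        return line  # already balanced; leave alone
    # Which level was intended is a judgment call; here we take the
    # larger count, matching what the editor above said they meant.
    level = max(len(left), len(right))
    return "=" * level + title + "=" * level
```

The key point is the early return: if the result would still be unbalanced, the bot should not edit at all.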
There should be a reliability report generator that estimates a bot's reliability from its history, based on how many errors it has made since it began operating. There should also be a bot error log of all the errors a bot has encountered since the beginning of its operation. Doorknob747 ( talk) 03:51, 24 March 2015 (UTC)
OK, so there's List of bots, which was tantalizing to me but really basically useless. Is there any list anywhere that has all the bots and what they do? I was trying to quickly find the name of the bot that handles tagging of new articles when they are found to be copied from other sites. -- Hammersoft ( talk) 16:49, 17 July 2015 (UTC)
This edit request to Wikipedia:Bots has been answered. Set the |answered= or |ans= parameter to no to reactivate your request. |
Could someone add User:JeffGBot to the list of Wikipedia bots? He checks pages for dead links. Here is an example. Thanks, 73.223.175.207 ( talk) 00:47, 27 November 2015 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Hi there,
What do you think about this edit (and the corresponding edit to the article's talk page)? I find access dates by finding the earliest occurrence of the URL in the article. Is that a good idea? What are the standards for bots that add archives? -- Tim 1357 talk| poke 02:45, 23 September 2016 (UTC)
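The earliest-occurrence heuristic could be sketched like this. It is an illustration of the idea only, not Tim's actual code; the API call named in the docstring is one way to obtain the revision list:

```python
def earliest_use(revisions, url):
    """Given (timestamp, wikitext) pairs in chronological order, return
    the timestamp of the first revision whose text contains the URL, a
    candidate access date for an archived citation. In practice the
    revisions would come from the API, e.g.
    action=query&prop=revisions&rvprop=timestamp|content&rvslots=main&rvdir=newer"""
    for timestamp, text in revisions:
        if url in text:
            return timestamp
    return None
```

Note the heuristic only bounds the access date from above: the editor may have accessed the page some time before adding the link.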
See Wikipedia:Village pump (idea lab)/Archive 21#Bot content with updates. The idea is a category of bot that pulls data from a reliable external source, formats it as text and places it in files that can be transcluded into articles. Obvious types of data for a settlement are terrain, temperatures, rainfall, census data, election results. Abuko is a mock-up of the result, using data generated by User:Lsj's Swedish bot. Edit the location section to see the transclusion. Where this differs from other content generators is that it can be rerun any time to refresh the data, perhaps quite frequently. That may imply a need for rules or processes specific to this type of bot. Any input welcome. Aymatth2 ( talk) 16:56, 24 September 2016 (UTC)
The bot? SummerPhD is confusing me. Sausagea1000 ( talk) 19:44, 30 December 2016 (UTC)
The "bot" flag setting option is not obvious; how is it done if the bureaucrat roles don't exist? - 180.149.192.139 ( talk) 02:41, 13 February 2017 (UTC)
I put a new section on how to hide AWB here, since this is often a WP:MEATBOT related issue, but we could move it to a section of WP:AWB if you feel it's out of place here. Headbomb { talk / contribs / physics / books} 15:18, 24 February 2017 (UTC)
Hi, not sure how to proceed with this situation.
It concerns User:Dispenser's Checklinks tool.
Facts as I know them:
How should we proceed? I suspect Dispenser would be OK with turning it off if requested, but I don't know what kind of community discussion is needed or where. -- Green C 20:29, 6 March 2017 (UTC)
Is there a page where one can raise complaints about robotic editing of article pages? A number of my improvements to the Siri page, contextualizing or removing unreliable sources, have been removed by a username who does not seem to do anything but undo edits. 203.109.212.42 ( talk) 09:47, 24 March 2017 (UTC)
Hi! I'm trying to set up some anti-vandal bots and such over on a wiki that I own. Right now I use AWB for basic fixes, but vandalism was a big problem in the past, before I adopted the wiki and reverted all of it. Can someone help me learn to use a bot for my wiki? I currently run Windows 10. Thanks! FiveCraft ( talk) 23:06, 1 April 2017 (UTC)
I am new to bots and eventually plan to develop a bot for Wikidata which will compare information published by the US Bureau of the Census to information about New England towns and make appropriate edits. It's possible that this might be adapted to edit the corresponding Wikipedia articles later.
For now I'm just following the tutorial at wikidata:Wikidata:Pywikibot - Python 3 Tutorial which will not involve making any edits, except to sandbox items, but it will involve reading data from Wikipedia. I'm wondering when is the best time to create a bot account, now, or when the nature of the edits will require approval. I am aware that separate approvals would be needed for each project where non-sandbox edits are to be made.
I am also wondering about the preferred method for logging in, or equivalent, and whether the same credentials will work on both Wikidata and Wikipedia. Jc3s5h ( talk) 18:01, 30 July 2017 (UTC)
Someone tampered with my web page. Zhanlang1975 ( talk) 23:24, 1 November 2017 (UTC)
Do we have anything on this already? We're contemplating adding a footnote to WP:Manual of Style about not using AWB, etc., to "enforce" the "rules" of MoS across a zillion pages (see WT:Manual of Style#Proposed footnote to discourage mass changes – the actual proposal, not the joke thread under it, though you may find that amusing on the side). While we've identified a relevant ArbCom ruling, it seems like something that should be in the behavioral or editing guidelines somewhere. — SMcCandlish ☏ ¢ 😼 22:08, 4 July 2018 (UTC)
In fact we need rules that encourage editors and bot owners to enforce the MoS. Not enforcing the rules allows custom styles all over Wikipedia. I would be glad to participate in a discussion where bots run daily to enforce the rules. In an ideal situation, editors using VE or any other editor would get a heads-up on how to comply with the MoS. -- Magioladitis ( talk) 16:10, 5 July 2018 (UTC)
I suggest that we leave mass messages for AWB editors to turn general fixes on, to help enforce MoS rules. -- Magioladitis ( talk) 16:13, 5 July 2018 (UTC)
See Wikipedia:Miscellany for deletion/Wikipedia:WikiProject Spam/Report, where it is proposed to delete Wikipedia:WikiProject Spam/Report, an old Betacommand bot report page. — SmokeyJoe ( talk) 00:48, 27 January 2019 (UTC)
Looking at a query by one of my bots, I see the message:
Use Special:ApiFeatureUsage to see usage of deprecated features by your application.
So I go to Special:ApiFeatureUsage and it wants me to fill in a "User agent" field. What is a "user agent" in this context? What do I enter in this field to specify that "my application" (the one whose deprecated-feature usage I want to see) is one of my bots (i.e. RMCD bot or Merge bot)? Thanks, wbm1058 ( talk) 15:30, 13 May 2019 (UTC)
curl_setopt($this->ch,CURLOPT_USERAGENT,'php wikibot classes'); is used by the bot functions get and post in User:RMCD bot/botclasses.php – so can I change that to something unique in my copy of that function library to ensure that only my applications are reported, e.g. 'php wikibot classes wbm1058'?
action=query&prop=revisions&!rvslots: This is about the fact that you're using action=query&prop=revisions without specifying the rvslots parameter. The plan is that MediaWiki will someday allow more than one "slot" of content in a page, so for example it might be possible to have the template and its TemplateStyles stylesheet at the same title instead of having to use a subpage. Right now though, the only slot is rvslots=main.
action=login&!lgtoken: You're using action=login without supplying the lgtoken parameter to get the login token from the NeedToken response. The new way to do it is to use action=query&meta=tokens, supplying type=login.
action=query&prop=info&intoken: You're using action=query&prop=info&intoken=... to fetch a token, probably an edit token with intoken=edit. Again, the new way to do it is to use action=query&meta=tokens, most likely supplying type=csrf. The help for whichever token-needing module you're using will tell you for sure what value of type you need; e.g. at Special:ApiHelp/edit it says token: A "csrf" token retrieved from action=query&meta=tokens.
Examples (which I've edited):
BotPasswords
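The replacements Anomie describes can be sketched as plain parameter sets. This is illustrative Python, not the PHP botclasses.php code, and the User-Agent string is just an example of making it unique per application:

```python
API = "https://en.wikipedia.org/w/api.php"
# A distinctive User-Agent lets Special:ApiFeatureUsage filter to your bot alone
HEADERS = {"User-Agent": "php wikibot classes wbm1058 (example only)"}

def login_token_params():
    # replaces the deprecated action=login NeedToken round trip
    return {"action": "query", "meta": "tokens", "type": "login", "format": "json"}

def csrf_token_params():
    # replaces the deprecated action=query&prop=info&intoken=edit
    return {"action": "query", "meta": "tokens", "type": "csrf", "format": "json"}

def revisions_params(title):
    # rvslots=main silences the action=query&prop=revisions&!rvslots warning
    return {"action": "query", "prop": "revisions", "titles": title,
            "rvprop": "content", "rvslots": "main", "format": "json"}
```

With an HTTP library these would be sent as GET parameters along with HEADERS, e.g. something like requests.get(API, params=revisions_params("Sandbox"), headers=HEADERS).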
@ Anomie: is there a similar MediaWiki:-summary page for Special:ApiFeatureUsage? MediaWiki:ApiFeatureUsage-summary? I don't want to create it before I can confirm the pagename. wbm1058 ( talk) 15:49, 14 May 2019 (UTC)
Append uselang=qqx to the URL, e.g. Special:ApiFeatureUsage?uselang=qqx. It's a little trickier for pages that only show as the result of a form post; for those I normally use the developer console to find the <form> and adjust the action attribute. Anomie ⚔ 23:01, 14 May 2019 (UTC)
OK, I'll pick up this dropped ball and work my way through it, documenting what I do here so that (1) others with similar needs can perhaps follow this for guidance, and (2) anyone watching this page can correct me on anything I get wrong.
The first point I want to make is on the need for application ("front-end" in modern lingo) developers to maintain their own libraries, rather than leaving it to the systems (back-end) people. Neither my copy nor Sam Reed's copy nor Kunal Mehta's copy of botclasses.php checks for [warnings] and, if found, passes them on to the end user. So don't expect your end users to be immediately aware of these warnings passed through the API; they weren't showing up on my bot's console.
The library has a helpful comment with a link to a page explaining the two-step login procedure: /* This is now required - see https://bugzilla.wikimedia.org/show_bug.cgi?id=23076 */
After inserting a couple of print_r($ret); lines in the appropriate places, the library printed the warnings (something it previously did only when there was an error) so I could see them on my console:
This message is returned by step 1 of the login process:
This message is returned by step 2 of the login process:
Relevant advice from above:
action=login&!lgtoken: You're using action=login without supplying the lgtoken parameter to get the login token from the NeedToken response. The new way to do it is to use action=query&meta=tokens, supplying type=login.
So I replaced "action=login" with "action=query&meta=tokens&type=login" for step 1. But it's not that simple. I got a new warning:
So I don't need to send my login name and password in step 1 anymore? The query did return a [logintoken], similar to the [token] returned by the old method. Apparently not: see mw:API:Tokens. So I simply removed the $post array that passed the ID & password into the query in the library. There is no longer a need to check if ($ret['login']['result'] == 'NeedToken'). For step 2, I replaced ['login']['token'] with ['query']['tokens']['logintoken']. Endorphin rush: it worked! I'm still getting the warning about the need to use BotPasswords in step 2, but time to take a break and have a beer to celebrate success in step 1. Meanwhile, you can tell Sam and Kunal to make THIS change in their copy of botclasses.php – wbm1058 ( talk) 20:18, 15 May 2019 (UTC)
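The resulting two-step flow described above, sketched in Python. This is an illustration of the request shapes only; the PHP library changes described in the thread are the real implementation:

```python
def step1_params():
    """Step 1: fetch a login token via meta=tokens (no name/password needed here)."""
    return {"action": "query", "meta": "tokens", "type": "login", "format": "json"}

def extract_login_token(response):
    # the new path, replacing the old ['login']['token']
    return response["query"]["tokens"]["logintoken"]

def step2_data(lgname, lgpassword, logintoken):
    """Step 2: POST action=login with BotPassword credentials plus the token."""
    return {"action": "login", "lgname": lgname,
            "lgpassword": lgpassword, "lgtoken": logintoken, "format": "json"}
```

Both steps must go through the same HTTP session (or cookie jar), since the login token is tied to the session that requested it.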
Success! I implemented BotPasswords for all three of my bot accounts, and updated Help:Creating a bot § Logging in for both the new GET/POST login method and BotPasswords. I created MediaWiki:Botpasswords-text and MediaWiki:Botpasswords-label-appid, which should make the BotPasswords interface much more intuitive. – wbm1058 ( talk) 14:39, 16 May 2019 (UTC)
Would a bot that automatically checks Special:Log/newusers and posts a welcome message overload the project? TheEditster ( talk) 09:52, 11 July 2019 (UTC)
Hi all, I did some searching around and found this page and the list of bots. Of course, the bots are users, but they are special. Which is why this project page exists in the first place. Now, last Monday we were hacking on federated SPARQL queries against Wikidata and a few external end points, and the question came up what data sources are used by Wikidata. So, I explained the general idea, and the bots came up. And I was wondering if we could use the WDQS to list all databases that bots use as input to synchronize with Wikidata. But it seems that the bots, or just the longer running bots, do not have Wikidata items themselves. I could imagine this semantic bot page would list the operating user, which databases it imports CC0 data from, which properties it uses (in the population), maybe relevant ShEx for those imports, etc. Hoping I did not just overlook it, I was wondering if there is something equivalent that I could use instead? If not, what are your thoughts on actually making items in Wikidata for bots with information like given above? -- Egon Willighagen ( talk) 10:38, 23 August 2019 (UTC)
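If such items existed, a federated-query client could look something like this sketch. The class item wd:QXXXX is hypothetical (as the post notes, bots currently lack Wikidata items), and P1343 ("described by source") is only one possible way to model the imported databases:

```python
import urllib.parse

WDQS = "https://query.wikidata.org/sparql"

# Illustrative only: wd:QXXXX stands in for a (currently nonexistent)
# "wiki bot" class item; the OPTIONAL clause is one possible modelling
# of which external databases a bot imports from.
SPARQL = """SELECT ?bot ?botLabel ?source WHERE {
  ?bot wdt:P31 wd:QXXXX .
  OPTIONAL { ?bot wdt:P1343 ?source . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}"""

def wdqs_url(query):
    """Build a WDQS GET URL returning JSON results."""
    return WDQS + "?" + urllib.parse.urlencode({"query": query, "format": "json"})
```

The URL-encoding step matters because SPARQL queries contain characters ({, }, ?) that are not safe in a raw query string.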
Which box do I check in BotPasswords to give my bot permission to merge page histories? I couldn't find anything in the list that looked relevant. @ Anomie:? wbm1058 ( talk) 03:07, 4 July 2020 (UTC)
mergehistory. According to Special:ListGrants, there is no grant that provides that right, so you'd have to request that MediaWiki (or Wikimedia's configuration) be changed to add it to one of the existing grants, or to create a new grant for it. Also of note (for anyone else reading this): normal bots don't have that right anyway, but User:Merge bot is an adminbot. Anomie ⚔ 03:15, 4 July 2020 (UTC)
<!--NOTE: This is NOT THE LocalSettings.php file for en.wikipedia.org's install; the settings file is not editable via the wiki interface-->
You are invited to join the discussion at User talk:MajavahBot/Bot status report § Making this table more useful for identifying bots that have failed. {{u| Sdkb}} talk 02:04, 7 August 2020 (UTC)
Read this message in another language
The Wikimedia Foundation will be testing its secondary data centre. This will make sure that Wikipedia and the other Wikimedia wikis can stay online even after a disaster. To make sure everything is working, the Wikimedia Technology department needs to do a planned test. This test will show if they can reliably switch from one data centre to the other. It requires many teams to prepare for the test and to be available to fix any unexpected problems.
They will switch all traffic to the secondary data centre on Tuesday, September 1st 2020.
Unfortunately, because of some limitations in MediaWiki, all editing must stop while the switch is made. We apologize for this disruption, and we are working to minimize it in the future.
You will be able to read, but not edit, all wikis for a short period of time.
Other effects:
This project may be postponed if necessary. You can read the schedule at wikitech.wikimedia.org. Any changes will be announced in the schedule. There will be more notifications about this. Please share this information with your community.
User:Trizek (WMF) ( talk) 10:30, 31 August 2020 (UTC)
This is a reminder of a message already sent to your wiki.
On Tuesday, October 27 2020, all wikis will be in read-only mode for a short period of time.
You will not be able to edit for up to an hour on Tuesday, October 27. The test will start at 14:00 UTC (14:00 WET, 15:00 CET, 10:00 EDT, 19:30 IST, 07:00 PDT, 23:00 JST, and in New Zealand at 03:00 NZDT on Wednesday October 28).
Background jobs will be slower and some may be dropped. This may have an impact on some bots' work.
Learn more about this operation.
-- User:Trizek (WMF) ( talk) 09:25, 26 October 2020 (UTC)
First, can anyone tell me which bot it is that goes around replacing potentially confusing template shortcuts in mainspace with actual template names (e.g. replacing {{cn|...}} with {{citation needed|...}})?
Second, where is this stuff listed? I.e., how can I answer this question for myself? There are a lot of bot-related talk pages, that all have notices basically saying "this is probably not the talk page you want", but this is also obviously not a question that needs bots-noticeboard attention. — SMcCandlish ☏ ¢ 😼 00:28, 19 February 2021 (UTC)
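For what it's worth, the kind of substitution being asked about could be sketched as below. This is illustrative; a real bot would derive the shortcut mapping from the wiki's template redirects rather than hard-coding it:

```python
import re

# Mapping of shortcut -> canonical name; a real bot would build this
# from template redirects instead of hard-coding one entry.
SHORTCUTS = {"cn": "citation needed"}

def expand_shortcuts(wikitext: str) -> str:
    """Replace template-shortcut invocations with the canonical template
    name, leaving parameters untouched."""
    def repl(m):
        name = m.group(1)
        target = SHORTCUTS.get(name.lower())
        return "{{" + (target if target else name) + m.group(2)
    return re.sub(r"\{\{\s*([^|{}]+?)\s*([|}])", repl, wikitext)
```

A production version would also need to skip shortcuts inside nowiki/pre blocks and comments.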
yea ongoing wouldn't be too hard...post above at 12:22, 21 February 2021 (UTC) by xaosflux
This is only tangentially related, and it might be enough to split into its own discussion, but I think we should mandate links to BRFAs in edit summaries (and possibly on bot user pages as well). I know that historically this was avoided in some instances due to edit summary length limitations, but those effectively don't exist any more (at least as far as adding a link to a BRFA with "Task X" goes). Primefac ( talk) 13:39, 21 February 2021 (UTC)
BRFAs no longer reflect the most up-to-date bot logic/consensus: if a bot task has been updated or changed, it should be reflected somewhere on the BRFA, either as an extra note or on its talk page. Primefac ( talk) 14:23, 21 February 2021 (UTC)
First off, apologies if this is the wrong talk page, feel free to direct me to the correct one. I want to write a simple bot for practice and I want to get feedback on my design decisions.
Workflow:
1. Use api.php?action=query&list=users&usprop=groups to run a query to get all enwiki usernames who possess X permission.
2. Use api.php?action=edit to write the data to a page in the bot's userspace.
Thoughts? Thanks in advance. – Novem Linguae ( talk) 09:09, 11 April 2021 (UTC)
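The two-step workflow above could be sketched as parameter builders. A sketch only: note that list=allusers with augroup is the usual way to enumerate accounts by group, whereas list=users (as in the plan above) takes explicit usernames:

```python
def allusers_params(group, cont=None):
    """One page of list=allusers filtered by group."""
    params = {"action": "query", "list": "allusers", "augroup": group,
              "aulimit": "max", "format": "json"}
    if cont:
        params.update(cont)  # merge the 'continue' object from the previous response
    return params

def edit_params(title, text, csrf_token):
    """Write the gathered data to a page in the bot's userspace."""
    return {"action": "edit", "title": title, "text": text,
            "token": csrf_token, "format": "json"}
```

The cont argument handles API continuation: each response may include a continue object whose keys get merged into the next request until no more pages remain.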
Alright, I finished my bot's code tonight. It generates output that looks like this. The page size is about 1,000,000 bytes. I'm running into some kind of perm issue... when I use the credentials for my main account, I can post the data just fine, but when I try to post it with User:NovemBot, who has no perms, the edit quietly fails. Two questions:
There are other bots that post similar data of similar size. For example, User:Bellezzasolo Bot/userhighlighter.js/excon.js. So there is precedent for doing this. Thanks for your thoughts. – Novem Linguae ( talk) 11:49, 22 April 2021 (UTC)
"the edit quietly fails": The API will always give an error response with a reason why it failed, though it might be the case that the bot library you're using doesn't expose this error.
The response headers include cache-control: private, s-maxage=0, max-age=0, must-revalidate, which clearly means there's no caching taking place. You should look into ways of getting the client to cache the result (I guess using the API to fetch the page with uselang=content and appropriate maxage & smaxage values should do the trick). – SD0001 ( talk) 13:27, 22 April 2021 (UTC)
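SD0001's suggestion could look like the parameter set below. A sketch only; as reported elsewhere in this thread, authenticated responses may still come back with Cache-Control: private:

```python
def cached_fetch_params(title, maxage=86400, smaxage=86400):
    """Fetch page content while asking api.php for cacheable headers:
    maxage/smaxage feed the Cache-Control header, and uselang=content
    avoids per-user language variants that would defeat shared caches."""
    return {"action": "query", "prop": "revisions", "titles": title,
            "rvprop": "content", "rvslots": "main",
            "maxage": maxage, "smaxage": smaxage,
            "uselang": "content", "format": "json"}
```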
I tried /index.php?action=raw&ctype=application/json&maxage=86400&smaxage=86400&uselang=content&title=User:Novem_Linguae/User_lists/Staff_and_sysadmins.js. Response headers: cache-control: private, s-maxage=0, max-age=0, must-revalidate – Novem Linguae ( talk) 04:01, 26 April 2021 (UTC)
Use apiResponse.query.pages[0].revisions[0].content and apply JSON.parse on that. – SD0001 ( talk) 06:31, 26 April 2021 (UTC)
Archive 15 | ← | Archive 20 | Archive 21 | Archive 22 |
I have recently come across an apple store app for iPhone and iPad named wikibot. It is just a regular wiki reader, one of many available on the app store for reading (but not editing I notice) articles. Concerned the name implies something to do with bots - just wanted to see the community's thoughts on this. The link is at [1]. Perhaps we could convince them to rename it? Rcsprinter (talkin' to me?) @ 23:00, 7 September 2012 (UTC)
I need a bot to create mass categories (~400), on other wikiproject. What solution can you suggest? XXN ( talk) 13:56, 19 November 2013 (UTC)
I recall seeing a bot making changes, such as changing a hyphen to an n-dash in a range, such as changing 1998-99 to 1998–99. But I can't remember which bot it was. Does anyone know? — Preceding unsigned comment added by Jc3s5h ( talk • contribs) 00:16, 16 December 2013 (UTC)
Since citations are ever-popular items for bots to operate upon, I am providing this link to a discussion about what to do when citation template documentation, or editions of external printed style guides, change: WT:CITE#When citation style guides are updated.
Obviously it is important for bot operators who change citations should be following the current rules for the citation stye chosen for a particular article, or the rules as they existed at the time the article was written. Jc3s5h ( talk) 15:25, 8 May 2014 (UTC)
As someone who has been doing this manually for years, I hereby dutifully beg of anyone who is technically proficient and knows how to create and run a bot that will:
Please see the centralized discussion at Wikipedia:Bot requests/Archive 61#Create a BOT to alphabetize and organize categories automatically. Thank you, IZAK ( talk) 09:31, 4 August 2014 (UTC)
Please see Wikipedia:Village pump (policy)/Archive 114#Create a BOT to alphabetize and organize categories automatically. Thank you, IZAK ( talk) 22:53, 5 August 2014 (UTC)
Please see Wikipedia:Village pump (policy)#CatVisor and User:Paradoctor/CatVisor#Planned features if you are willing and able to assist this innovative WP project move along it would be greatly appreciated. Thank you, IZAK ( talk) 23:38, 12 August 2014 (UTC)
This may or may not be a problem with a bot, it could be an AWB issue according to something I read. But this bot made another 1000 edits after this problem, and it is a problem, bad code should be fixed.
I wrote an article with a heading typo: instead of equal signs two left and two right, I put one left and two right. A bot changed it to two left and three right.
I told the bot operator what happened, and his attitude is that since I made a typo and his bot's edit made a similar typo--it is still an unbalanced headline code--there is nothing wrong with the bot. But the bot should only correct headline levels, and it identified this as a levels problem and made a pointless edit, a coding deficiency.
Can an experienced coder explain this error to the bot operator and that it should be fixed? He does not understand. (Someone making 1000s of bot edits a day should get this.) Please don't ping me, if you control bots on Wikipedia, you, too, should get this. MicroPaLeo ( talk) 23:20, 26 February 2015 (UTC)
MicroPaLeo surely the bot's logic could be better to check for these cases. But my experience show that a case like that is not very common. One of Wikipedia's principles is WP:BEBOLD. Feel free to fix articles and improve edits. Bots are good but not good enough. Human editors are always better. Most of the backlogs need human attention and Wikipedia is bases on active editors not bots. -- Magioladitis ( talk) 13:11, 27 February 2015 (UTC)
There should be a reliability history report generator, that generates the reliability of the bot based of its history based on how many errors have it gone thru since the beginning of the bot. There also should be a bot error, log off all the error the bots have gone thru since the beginning of the bots operation. Doorknob747 ( talk) 03:51, 24 March 2015 (UTC)
Ok so there's List of bots which was tantalizing to me, but really basically useless. Is there any list anywhere that has all the bots and what they do? I was trying to quickly find the name of the bot that handles tagging of new articles when they are found to be copied from other sites. -- Hammersoft ( talk) 16:49, 17 July 2015 (UTC)
This
edit request to
Wikipedia:Bots has been answered. Set the |answered= or |ans= parameter to no to reactivate your request. |
Could someone add User:JeffGBot to the list of Wikipedia bots? He checks pages for dead links. Here is an example. Thanks, 73.223.175.207 ( talk) 00:47, 27 November 2015 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Hi there,
What do you think about this edit (and the corresponding to the article's talk page). I find access dates by finding the earliest occurrenceof the url in the article. Is that a good idea? What are the standards for bots that add archives? -- Tim 1357 talk| poke 02:45, 23 September 2016 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Hi there,
What do you think about this edit (and the corresponding to the article's talk page). I find access dates by finding the earliest occurrenceof the url in the article. Is that a good idea? What are the standards for bots that add archives? -- Tim 1357 talk| poke 02:45, 23 September 2016 (UTC)
See Wikipedia:Village pump (idea lab)/Archive 21#Bot content with updates. The idea is a category of bot that pulls data from a reliable external source, formats it as text and places it in files that can be transcluded into articles. Obvious types of data for a settlement are terrain, temperatures, rainfall, census data, election results. Abuko is a mock-up of the result, using data generated by User:Lsj's Swedish bot. Edit the location section to see the transclusion. Where this differs from other content generators is that it can be rerun any time to refresh the data, perhaps quite frequently. That may imply a need for rules or processes specific to this type of bot. Any input welcome. Aymatth2 ( talk) 16:56, 24 September 2016 (UTC)
The bot? SummerPhD is confusing me. Sausagea1000 ( talk) 19:44, 30 December 2016 (UTC)
The "bot" flag setting option is not obvious, how is it done if the bureaucrat roles don't exist. - 180.149.192.139 ( talk) 02:41, 13 February 2017 (UTC)
I put a new section on how to hide AWB here, since this is often a WP:MEATBOT related issue, but we could move it to a section of WP:AWB if you feel it's out of place here. Headbomb { talk / contribs / physics / books} 15:18, 24 February 2017 (UTC)
Hi, not sure how to proceed with this situation.
It concerns User:Dispenser's Checklinks tool.
Facts as I know them:
How should we proceed? I suspect Dispenser would be OK with turning it off if requested, but I don't know what kind of community discussion is needed or where. -- Green C 20:29, 6 March 2017 (UTC)
Is there a page where one can raise complaints as to robotic editing of article pages? I have had a number of improvements to the Siri page, contextualizing or removing unreliable sources removed by a username who does not seem to do anything but undo edits. 203.109.212.42 ( talk) 09:47, 24 March 2017 (UTC)
Hi! I'm trying to set up some anti-vandal bots and such over on my wiki I am an owner of the wiki, and right now I use AWB for basic fixes, but vandalism has been a big problem in the past before I adopted the wiki and reverted all of it. Can someone help me learn to use a Bot for my Wiki? I currently run on Windows 10. Thanks! FiveCraft ( talk) 23:06, 1 April 2017 (UTC)
I am new to bots and eventually plan to develop a bot for Wikidata which will compare information published by the US Bureau of the Census to information about New England towns and make appropriate edits. It's possible that this might be adopted to edit the corresponding Wikipedia articles later.
For now I'm just following the tutorial at wikidata:Wikidata:Pywikibot - Python 3 Tutorial which will not involve making any edits, except to sandbox items, but it will involve reading data from Wikipedia. I'm wondering when is the best time to create a bot account, now, or when the nature of the edits will require approval. I am aware that separate approvals would be needed for each project where non-sandbox edits are to be made.
I am also wondering about the preferred method for logging in, or equivalent, and whether the same credentials will work on both Wikidata and Wikipedia. Jc3s5h ( talk) 18:01, 30 July 2017 (UTC)
把我的网页串改 Zhanlang1975 ( talk) 23:24, 1 November 2017 (UTC)
Do we have anything on this already? We're contemplating adding a footnote to WP:Manual of Style about not using AWB, etc., to "enforce" the "rules" of MoS across a zillion pages (see WT:Manual of Style#Proposed footnote to discourage mass changes – the actual proposal, not the joke thread under it, though you may find that amusing on the side). While we've identified a relevant ArbCom ruling, it seems like something that should be in the behavioral or editing guidelines somewhere. — SMcCandlish ☏ ¢ 😼 22:08, 4 July 2018 (UTC)
In fact we need rules that encourage editors and bot owners to enforce the rules of the MoS. Not enforcing the rules allows custom styles across Wikipedia. I would be glad to participate in a discussion where bots would run daily to enforce the rules. In an ideal situation, editors using VE or any other editor get a heads-up on how to comply with the MoS. -- Magioladitis ( talk) 16:10, 5 July 2018 (UTC)
I suggest that we leave mass messages for AWB editors to turn general fixes on to help enforce MoS rules. -- Magioladitis ( talk) 16:13, 5 July 2018 (UTC)
See Wikipedia:Miscellany for deletion/Wikipedia:WikiProject Spam/Report, where it is proposed to delete Wikipedia:WikiProject Spam/Report, an old Betacommand bot report page. — SmokeyJoe ( talk) 00:48, 27 January 2019 (UTC)
Looking at a query by one of my bots, I see the message:
Use Special:ApiFeatureUsage to see usage of deprecated features by your application.
So I go to Special:ApiFeatureUsage and it wants me to fill in a "User agent" field. What is a "user agent" in this context? What do I enter in this field to specify that "my application" (the one whose use of deprecated features I want to see) is one of my bots (i.e. RMCD bot or Merge bot)? Thanks, wbm1058 ( talk) 15:30, 13 May 2019 (UTC)
curl_setopt($this->ch, CURLOPT_USERAGENT, 'php wikibot classes'); is used by bot function get and function post in User:RMCD bot/botclasses.php – so can I change that to something unique in my copy of that function library to ensure that only my applications are reported, e.g. 'php wikibot classes wbm1058'?
action=query&prop=revisions&!rvslots: This is about the fact that you're using action=query&prop=revisions without specifying the rvslots parameter. The plan is that MediaWiki will someday allow more than one "slot" of content in a page, so for example it might be possible to have the template and its TemplateStyles stylesheet at the same title instead of having to use a subpage. Right now though, the only slot is rvslots=main.
action=login&!lgtoken: You're using action=login without supplying the lgtoken parameter to get the login token from the NeedToken response. The new way to do it is to use action=query&meta=tokens, supplying type=login.
action=query&prop=info&intoken: You're using action=query&prop=info&intoken=... to fetch a token, probably an edit token with intoken=edit. Again, the new way to do it is to use action=query&meta=tokens, most likely supplying type=csrf. The help for whichever token-needing module you're using will tell you for sure what value of type you need, e.g. at Special:ApiHelp/edit it says token: A "csrf" token retrieved from action=query&meta=tokens.
Examples (which I've edited):
BotPasswords
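The replacement flow described above can be sketched as follows; the API parameter names are the real ones, while the two helper functions are hypothetical:

```python
def token_request(token_type: str) -> dict:
    """Parameters for the new-style token fetch via action=query&meta=tokens."""
    return {
        "action": "query",
        "meta": "tokens",
        "type": token_type,  # "login" before logging in, "csrf" for editing
        "format": "json",
    }

def extract_token(response: dict, token_type: str) -> str:
    """Pull the token out of the JSON response to the request above."""
    return response["query"]["tokens"][token_type + "token"]

# A response shaped like the API's JSON output:
sample = {"query": {"tokens": {"csrftoken": "abc123+\\"}}}
assert extract_token(sample, "csrf") == "abc123+\\"
```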
@ Anomie: is there a similar MediaWiki:-summary page for Special:ApiFeatureUsage? MediaWiki:ApiFeatureUsage-summary? I don't want to create it before I can confirm the pagename. wbm1058 ( talk) 15:49, 14 May 2019 (UTC)
Append uselang=qqx to the URL, e.g. /info/en/?search=Special:ApiFeatureUsage?uselang=qqx. It's a little trickier for pages that only show as the result of a form post; for those I normally use the developer console to find the <form> and adjust the action attribute. Anomie ⚔ 23:01, 14 May 2019 (UTC)
OK, I'll pick up this dropped ball and work my way through it, documenting what I do here so that (1) others with similar needs can perhaps follow this for guidance, and (2) anyone watching this page can correct me on anything I get wrong.
The first point I want to make is on the need for application ("front-end" in modern lingo) developers to maintain their own libraries, rather than leaving that to the systems (back-end) people. Neither my copy nor Sam Reed's copy nor Kunal Mehta's copy of botclasses.php checks for [warnings] and, if found, passes them on to the end user. So don't expect your end users to be immediately aware of these warnings passed through the API; they weren't showing up on my bot's console.
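To illustrate the point, here is roughly what "check for [warnings] and pass them on" looks like. This is a hypothetical Python helper, not part of botclasses.php:

```python
def collect_warnings(api_response: dict) -> list:
    """Return human-readable warning strings from a parsed API JSON response."""
    warnings = api_response.get("warnings", {})
    return ["%s: %s" % (module, body.get("*", "")) for module, body in warnings.items()]

# A response shaped like the deprecation warnings discussed here:
resp = {
    "warnings": {"main": {"*": "Subscribe to the mediawiki-api-announce mailing list."}},
    "login": {"result": "Success"},
}
for line in collect_warnings(resp):
    print(line)  # → main: Subscribe to the mediawiki-api-announce mailing list.
```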
The library has a helpful comment with a link to a page explaining the two-step login procedure: /* This is now required - see https://bugzilla.wikimedia.org/show_bug.cgi?id=23076 */
After inserting a couple of print_r($ret); lines in the appropriate places, the library printed the warnings (something it previously did only when there was an error) so I could see them on my console:
This message is returned by step 1 of the login process:
This message is returned by step 2 of the login process:
Relevant advice from above:
action=login&!lgtoken: You're using action=login without supplying the lgtoken parameter to get the login token from the NeedToken response. The new way to do it is to use action=query&meta=tokens, supplying type=login.
So I replaced "action=login" with "action=query&meta=tokens&type=login" for step 1. But it's not that simple. I got a new warning:
So I don't need to send my login name and password in step 1 anymore? The query did return a [logintoken], similar to the [token] returned by the old method. Apparently not (see mw:API:Tokens). So, I simply removed the $post array that passed in the ID & pw from the query in the library. There is no longer a need to check if ($ret['login']['result'] == 'NeedToken'). For step 2, I replaced ['login']['token'] with ['query']['tokens']['logintoken'] – (endorphin rush) It worked! I'm still getting the warning about the need to use BotPasswords in step 2, but time to take a break to have a beer to celebrate success in step 1. Meanwhile, you can tell Sam and Kunal to make THIS change in their copy of botclasses.php – wbm1058 ( talk) 20:18, 15 May 2019 (UTC)
Success! I implemented BotPasswords for all three of my bot accounts, and updated Help:Creating a bot § Logging in for both the new GET/POST login method and BotPasswords. I created MediaWiki:Botpasswords-text and MediaWiki:Botpasswords-label-appid, which should make the BotPasswords interface much more intuitive. – wbm1058 ( talk) 14:39, 16 May 2019 (UTC)
Would a bot that automatically checks Special:Log/newusers and posts a welcome message overload the project? TheEditster ( talk) 09:52, 11 July 2019 (UTC)
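For what it's worth, such a bot would poll the new-users log via the API: list=logevents with letype=newusers returns recent account creations (real parameter names; whether the resulting posting rate would overload anything is the open question above):

```python
def newusers_params(limit: int = 50) -> dict:
    """Query parameters for fetching recent account creations from the log."""
    return {
        "action": "query",
        "list": "logevents",
        "letype": "newusers",
        "lelimit": str(limit),
        "format": "json",
    }
```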
Hi all, I did some searching around and found this page and the list of bots. Of course, the bots are users, but they are special. Which is why this project page exists in the first place. Now, last Monday we were hacking on federated SPARQL queries against Wikidata and a few external end points, and the question came up what data sources are used by Wikidata. So, I explained the general idea, and the bots came up. And I was wondering if we could use the WDQS to list all databases that bots use as input to synchronize with Wikidata. But it seems that the bots, or just the longer running bots, do not have Wikidata items themselves. I could imagine this semantic bot page would list the operating user, which databases it imports CC0 data from, which properties it uses (in the population), maybe relevant ShEx for those imports, etc. Hoping I did not just overlook it, I was wondering if there is something equivalent that I could use instead? If not, what are your thoughts on actually making items in Wikidata for bots with information like given above? -- Egon Willighagen ( talk) 10:38, 23 August 2019 (UTC)
Which box do I check in BotPasswords to give my bot permission to merge page histories? I couldn't find anything in the list that looked relevant. @ Anomie:? wbm1058 ( talk) 03:07, 4 July 2020 (UTC)
Merging page histories requires the mergehistory right. According to Special:ListGrants, there is no grant that provides that right, so you'd have to request that MediaWiki (or Wikimedia's configuration) be changed to add that to one of the existing grants or to create a new grant for it. Also of note (for anyone else reading this) is that normal bots don't have that right anyway, but User:Merge bot is an adminbot. Anomie ⚔ 03:15, 4 July 2020 (UTC)
<!--NOTE: This is NOT THE LocalSettings.php file for en.wikipedia.org's install; the settings file is not editable via the wiki interface-->
You are invited to join the discussion at User talk:MajavahBot/Bot status report § Making this table more useful for identifying bots that have failed. {{u| Sdkb}} talk 02:04, 7 August 2020 (UTC)
The Wikimedia Foundation will be testing its secondary data centre. This will make sure that Wikipedia and the other Wikimedia wikis can stay online even after a disaster. To make sure everything is working, the Wikimedia Technology department needs to do a planned test. This test will show if they can reliably switch from one data centre to the other. It requires many teams to prepare for the test and to be available to fix any unexpected problems.
They will switch all traffic to the secondary data centre on Tuesday, September 1st 2020.
Unfortunately, because of some limitations in MediaWiki, all editing must stop while the switch is made. We apologize for this disruption, and we are working to minimize it in the future.
You will be able to read, but not edit, all wikis for a short period of time.
Other effects:
This project may be postponed if necessary. You can read the schedule at wikitech.wikimedia.org. Any changes will be announced in the schedule. There will be more notifications about this. Please share this information with your community.
User:Trizek (WMF) ( talk) 10:30, 31 August 2020 (UTC)
This is a reminder of a message already sent to your wiki.
On Tuesday, October 27 2020, all wikis will be in read-only mode for a short period of time.
You will not be able to edit for up to an hour on Tuesday, October 27. The test will start at 14:00 UTC (14:00 WET, 15:00 CET, 10:00 EDT, 19:30 IST, 07:00 PDT, 23:00 JST, and in New Zealand at 03:00 NZDT on Wednesday October 28).
Background jobs will be slower and some may be dropped. This may have an impact on some bots work.
Know more about this operation.
-- User:Trizek (WMF) ( talk) 09:25, 26 October 2020 (UTC)
First, can anyone tell me which bot it is that goes around replacing potentially confusing template shortcuts in mainspace with actual template names (e.g. replaces {{cn|...}} with {{citation needed|...}})?
Second, where is this stuff listed? I.e., how can I answer this question for myself? There are a lot of bot-related talk pages, that all have notices basically saying "this is probably not the talk page you want", but this is also obviously not a question that needs bots-noticeboard attention. — SMcCandlish ☏ ¢ 😼 00:28, 19 February 2021 (UTC)
Yea, ongoing wouldn't be too hard... (posted above at 12:22, 21 February 2021 (UTC) by xaosflux)
This is only tangentially related, and it might be enough to split into its own discussion, but I think we should mandate links to BRFAs in edit summaries (and possibly on bot user pages as well). I know, historically, this was avoided in some instances due to the edit summary length limitations, but those effectively don't exist any more (at least as far as adding a link to a BRFA with "Task X" goes). Primefac ( talk) 13:39, 21 February 2021 (UTC)
BRFAs no longer reflect the most up-to-date bot logic/consensus: if a bot task has been updated or changed, it should be reflected somewhere on the BRFA, either as an extra note or on its talk page. Primefac ( talk) 14:23, 21 February 2021 (UTC)
First off, apologies if this is the wrong talk page, feel free to direct me to the correct one. I want to write a simple bot for practice and I want to get feedback on my design decisions.
Workflow:
Use api.php?action=query&list=users&usprop=groups to get all enwiki usernames who possess X permission.
Use api.php?action=edit to write the data to a page in the bot's userspace.
Thoughts? Thanks in advance. – Novem Linguae ( talk) 09:09, 11 April 2021 (UTC)
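A note on step 1: the API can also filter users by group directly, which avoids per-user lookups; list=allusers with augroup is the usual route. The parameter names below are real, but the pagination here runs over canned responses rather than the live API:

```python
def allusers_params(group: str, aufrom: str = "") -> dict:
    """Query parameters for listing all users in a given group."""
    params = {
        "action": "query",
        "list": "allusers",
        "augroup": group,   # e.g. "sysop"
        "aulimit": "500",
        "format": "json",
    }
    if aufrom:
        params["aufrom"] = aufrom  # continuation point for the next page
    return params

def collect_names(pages: list) -> list:
    """Flatten the user names out of a sequence of API response pages."""
    return [u["name"] for page in pages for u in page["query"]["allusers"]]

canned = [
    {"query": {"allusers": [{"name": "Alice"}, {"name": "Bob"}]}},
    {"query": {"allusers": [{"name": "Carol"}]}},
]
print(collect_names(canned))  # → ['Alice', 'Bob', 'Carol']
```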
Alright, I finished my bot's code tonight. It generates output that looks like this. The page size is about 1,000,000 bytes. I'm running into some kind of perm issue... when I use the credentials for my main account, I can post the data just fine, but when I try to post it with User:NovemBot, who has no perms, the edit quietly fails. Two questions:
There are other bots that post similar data of similar size. For example, User:Bellezzasolo Bot/userhighlighter.js/excon.js. So there is precedent for doing this. Thanks for your thoughts. – Novem Linguae ( talk) 11:49, 22 April 2021 (UTC)
"the edit quietly fails." – The API will always give an error response with the reason why it failed, though it might be the case that the bot library you're using doesn't expose this error.
The response headers include cache-control: private, s-maxage=0, max-age=0, must-revalidate, which clearly means there's no caching taking place. You should look into ways of getting the client to cache the result (I guess using the API to fetch the page with uselang=content and appropriate maxage & smaxage values should do the trick). – SD0001 ( talk) 13:27, 22 April 2021 (UTC)
I tried /index.php?action=raw&ctype=application/json&maxage=86400&smaxage=86400&uselang=content&title=User:Novem_Linguae/User_lists/Staff_and_sysadmins.js. Response headers: cache-control: private, s-maxage=0, max-age=0, must-revalidate – Novem Linguae ( talk) 04:01, 26 April 2021 (UTC)
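A quick way to inspect headers like the one quoted above is to parse the Cache-Control value into its directives (hypothetical helper; the private and s-maxage=0 directives are what rule out shared caching):

```python
def parse_cache_control(value: str) -> dict:
    """Split a Cache-Control header value into a directive dict."""
    directives = {}
    for part in value.split(","):
        part = part.strip()
        if "=" in part:
            key, val = part.split("=", 1)
            directives[key] = val
        else:
            directives[part] = True
    return directives

cc = parse_cache_control("private, s-maxage=0, max-age=0, must-revalidate")
print(cc)  # → {'private': True, 's-maxage': '0', 'max-age': '0', 'must-revalidate': True}
```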
Use apiResponse.query.pages[0].revisions[0].content and apply JSON.parse on that. – SD0001 ( talk) 06:31, 26 April 2021 (UTC)
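The Python equivalent of SD0001's suggestion, assuming a response shaped the way that property path implies (the sample response body is illustrative):

```python
import json

# A response shaped like action=query&prop=revisions&rvprop=content:
api_response = {
    "query": {"pages": [{"revisions": [{"content": '{"staff": ["Alice"]}'}]}]}
}
content = api_response["query"]["pages"][0]["revisions"][0]["content"]
data = json.loads(content)
print(data["staff"])  # → ['Alice']
```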