This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
This bot, which maintains the list of requested moves, suddenly stopped working after 17:30, 18 July 2012. I'm asking this group if anybody knows how to kick start the bot. I recall the bot has been stopped before: Wikipedia:Bot owners' noticeboard/Archive 7#RM bot inactive – Wbm1058 ( talk) 00:16, 20 July 2012 (UTC)
Discussion at Wikipedia:Administrators' noticeboard/IncidentArchive761#RM bot inactive. Wbm1058 ( talk) 13:35, 20 July 2012 (UTC)
...and at Wikipedia:Village pump (technical)#RM bot inactive – Wbm1058 ( talk) 12:44, 23 July 2012 (UTC)
Howdy all! Very long time no-really-do-anything-with-Wikipedia! For those (everyone? :)) who have no idea who I am, I "operate" HBC Archive Indexerbot, and ages ago helped out with some of the development of that bot and the HBC AIV Helperbots. Well, I've been horrifically inactive and unhelpful here on Wikipedia for ages now, and have decided it's finally time to see if someone else wants to take over the operation and maintenance of HBCAI, rather than it continuing to languish and get repeatedly ignored by me. This seemed the most logical place to suggest that, but please let me know if anyone thinks there's a better place to move the discussion.
HBCAI is written in Perl and designed to run on a UNIX/Linux-like system. I've been running him on FreeBSD, but he should work just as well on any *NIX, really. I'd strongly recommend that whoever takes over this bot be reasonably well versed in both Perl and *nix administration, as getting it up and running on a new system may be a bit of an adventure, and I'm afraid I probably won't be much help due to time constraints. The bot uses considerable CPU and RAM (between 1 and 2 GB of RAM active during a run). I've been running it 2x/day, and each run lasts close to an hour, if memory serves.
The source is available on the wiki via the bot's user page, but I'll be happy to provide a bundle with the exact sources, including the Mediawiki.pm that I'm using, as it's somewhat finicky about that sort of thing. To be honest I'm not even 100% sure it's working at all at the moment; it seems to break periodically when things change in the MediaWiki software. It's really not a bad bot to run, but I'm just so out of the loop on all things Wikipedia that it's too much effort to try to figure out what's up every time something breaks, and I'm terribly non-responsive on my talk page, and it makes me feel like a jerk. Plus the whole thing really could use a total rewrite, or at least some serious TLC, because it hasn't had much in the last five years or so.
I'll try to check in on the discussion here, if you want to volunteer specifically, though, please also drop me an e-mail via my user page so I know to check in and I'll try to be reasonably responsive! Thanks! — Krellis ( Talk) 00:21, 23 July 2012 (UTC)
lowercase, but my personal preference is mixedCase. LegoKontribsTalkM 08:28, 24 July 2012 (UTC)
Hi - could anyone here tell me what effect adding this protection tool had on our vandal bots/edit filter? Is it possible for bots/the edit filter to search the pending-edit queue and reject a "desired addition" that is not yet reviewed and not yet added to an article? Youreally can 18:24, 26 July 2012 (UTC)
I would expect bot operation to be unaffected by pending changes. For me, the issue is that sometimes we would want the bot operation to be tweaked slightly. For example, if an anti-vandal bot reverts an edit and the version before the reverted edit was approved, then we want the bot to approve its new version. If the version before the vandal edit was not approved, then we would want the bot to leave its new version unapproved also. Yaris678 ( talk) 11:57, 30 July 2012 (UTC)
Hi all,
I've been playing with pywikipedia a little bit, and so far using it only to read and parse pages (which has been really quite useful for a number of things). I'd like to move towards using pywikipedia to make changes in an 'approved by a human' way.
Now, it's trivial for me to, say, print out the original wikitext of a page/section, then print out the proposed new text, and ask the user at the command line if they approve the change; but it would be much more useful/fancy if, when the pywikipedia script had an edit that it wanted to make, it opened up a browser window and gave a preview page that the editor could view. My question is: is that sort of functionality buried anywhere in the pywikipedia libraries? And if not, are there any approximations I could use? Fayedizard ( talk) 07:55, 12 August 2012 (UTC)
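I don't recall pywikipedia shipping a browser-preview workflow, but a terminal diff already gets most of the benefit over printing the two texts whole. A standalone sketch of that middle ground, not using pywikipedia at all (function names are mine, purely illustrative):

```python
import difflib

def render_diff(old_text, new_text, page_title):
    """Return a unified diff of a proposed edit as one string."""
    lines = difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(),
        fromfile=page_title + " (current)",
        tofile=page_title + " (proposed)",
        lineterm="")
    return "\n".join(lines)

def confirm_edit(old_text, new_text, page_title, ask=input):
    """Show the diff, then ask the operator on stdin whether to save.
    Returns True only on an explicit 'y'."""
    print(render_diff(old_text, new_text, page_title))
    return ask("Apply this edit? [y/N] ").strip().lower() == "y"
```

For a real rendered preview, the MediaWiki API's action=parse can turn the proposed wikitext into HTML, which a script could write to a temporary file and open with Python's webbrowser module; as far as I know you'd have to wire that up yourself.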
Is VIAFbot ( talk · contribs) an approved bot or in the process of getting approval? I see a lot of test edits from it today, and it looks like it may be a port from or otherwise related to de.wikipedia. I'm not an expert on bots; that's why I'm asking here before acting. — C.Fred ( talk) 21:28, 16 August 2012 (UTC)
Hi all,
I'm considering the idea of a bot that looks at 'See also' sections of articles and does things like remove entries if they are already linked in the main article (per the "As a general rule the "See also" section should not repeat links which appear in the article's body or its navigation boxes" part of Wikipedia:Manual_of_Style/Layout#See_also_section). I enjoy writing code, and it would be quite nice to write a bit of Python that works its way through a wikiproject and presents some edits to the bot runner for approval. I'm interested to know a) whether this is reasonable functionality for a bot, and b) whether other bots already have this capability. Fayedizard ( talk) 11:48, 21 August 2012 (UTC)
I have blocked the seemingly unapproved Commons fair use upload bot ( talk · contribs), and opened a discussion at the incidents board. Input from those familiar with bots/bot policy would be appreciated. J Milburn ( talk) 15:31, 24 August 2012 (UTC)
Is there something wrong with the bot used for giving updates with assessments in WikiProjects? WP:LT/A has not been updated for over a month. Also, manually accessing the bot through the Toolserver is apparently forbidden, according to this. Simply south.... .. flapping wings into buildings for just 6 years 15:40, 26 August 2012 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
I just received a newsletter that I never asked for, The Olive Branch: A Dispute Resolution Newsletter (Issue #1), and whoever is behind it is forcing editors to add their names to Wikipedia:Dispute Resolution Improvement Project/NewsletterOptOut to stop being spammed. I am not involved in this process, I do not want to be involved, and I should not have to be put on some list of shame for something that some other editor thought was important. I can't believe that the process actually approved a newsletter that was opt-out instead of opt-in; that's crazy. ▫ JohnnyMrNinja 19:22, 4 September 2012 (UTC)
Seriously. Participating in a WP board is not an opt-in to be spammed. Not okay, and I'm surprised that there isn't anything in guidelines forbidding users from signing up others to receive spam without their permission. – Roscelese ( talk ⋅ contribs) 19:44, 4 September 2012 (UTC)
Seriously, Ocaasi, if you write an apology at the top of the newsletter and add an opt-in list there may still be time to save it. At this point most of the people haven't logged in yet. This is going to create a negative impression of the newsletter that will take a while to wear off, if it survives. People resent things forced on them, even if they would normally like it. Accept that there will not be another opt-out mailing and move on. ▫ JohnnyMrNinja 21:02, 4 September 2012 (UTC)
As I just posted at EdwardsBot, I'd really like to see an RFC or some other similar process implemented to gauge consensus about when it's appropriate (or inappropriate) to send messages like this. And when messages should be opt-in, opt-out, or otherwise. And how often it's appropriate to send out messages.
These are important questions and I'm perfectly happy to see the bot blocked or its access list wiped clean if these issues can't be resolved reasonably. I think there is some utility in having such a message delivery mechanism, but it shouldn't be causing so many editors to be annoyed. -- MZMcBride ( talk) 01:31, 5 September 2012 (UTC)
P.S. And, of course, the name of the bot's configuration page (" User:EdwardsBot/Spam") is completely tongue-in-cheek.
I posted a legitimate question about a bot. [3] My question was ignored. This bot does not need to create stubs any faster. It creates 100 stubs at a time, and each stub is supposed to be verified by a member of the project; unless the project suddenly gained a dozen new really fast snail editors, there is no reason for this bot to increase its stub-creation output.
Or if there was a reason, it was sure not readily available in answer to a question by a community member.
What is going on? Why did this bot have to be approved to create stubs at 5X its current rate? How is WikiProject Gastropods handling the approval of these bot-created stubs? The community has spoken a number of times about bot-created stubs, and not usually favorably. This bot operator has "misinterpreted" prior approvals to mean something entirely different from what was intended. This is not a bot and operator to be speedily approved when a community member has asked a question.
I would like the discussion re-opened, and the question answered. I don't care how old the operator is.
68.107.140.60 ( talk) 03:14, 9 September 2012 (UTC)
As usual, a Wikipedian quotes an essay he has not read and that does not apply to the situation. Thanks. Much appreciated. My focus is on the task, as done by this bot operator, who does not seem to think the rules apply to him. So, fresh start: let's get an explanation of how I can tell whether the bot owner is in compliance, when it appears that he thinks 100 means some other number, 141; or I may be wrong, but I was not allowed to discuss the situation because you heedlessly speedied the closure of the BRFA. Let's also get an exact explanation of what the bottleneck is, because if it is the slow speed of the bot, something is not right; although with your 11 years of experience, I wonder that you think one can cure a badly written slow program by giving it more to do. Maybe it has something to do with the count variable, and that would explain the 141/100 computer approximation.... 68.107.140.60 ( talk) 07:02, 9 September 2012 (UTC)
Note: There is a lot of history being brought up here. On one side, Ganeshbot was approved in Ganeshbot 4 to create about 580 articles for the genus Conus, limited to 100 per month. In Wikipedia talk:Bot Approvals Group/Archive 7#Wrong way of the close a BRFA, the 100-per-month limit was lifted. Somehow or other, the members of WikiProject Gastropods thought they were allowed to have the bot create 15000+ articles for other gastropods without further approval or any rate limiting. This, understandably, caused much consternation. Ganeshbot 5, asking permission to finish creating these tens of thousands of articles, eventually was denied for no consensus; Ganeshbot 10 was eventually approved with the rate of creation limited to the rate of review by the project.
On the other side, the IP user 68.107.140.60 seems to be the same user who has been around for the anybot mess (see pretty much all of Wikipedia talk:Bots/Requests for approval/Archive 4) and other related discussions. If it is the same user, he/she serves a valuable function in watching bot activity and approvals generally related to "species" articles with a critical eye, but this is counterbalanced by the user being extremely sensitive to perceived slights against IP editors and being generally quick (to the point of disruption) to throw around accusations of being ignored or suppressed, of being on the receiving end of incivility, and of editors in "power" being biased against their viewpoint. Anomie ⚔ 08:38, 9 September 2012 (UTC)
This is just about an understanding of this bot operator's request. Let's not make it about me or anything else. - 68.107.140.60 ( talk) 17:42, 9 September 2012 (UTC)
I don't have enough coding experience to make a bot, but I was wondering if someone could make the code for a bot that automatically posts a message on a person's talk page after they make one edit. This is for a different wiki where I have been asked to make a bot that does that. ad Intellige ad nuntius 02:26, 16 September 2012 (UTC)
Hello everyone-
I wanted to let you know that my dissertation, "Network of Knowledge: Wikipedia as a Sociotechnical System of Intelligence" is now available on my website with a CC BY-NC-SA 3.0 license. Over a year ago I began this project with the WMF Research Committee and the University of Oregon IRB's approval. Nearly 50 bot operators, WP contributors, and WMF administrators were kind enough to participate in the study, offering their time, opinions, and expertise on issues around bots and bot creation. Feel free to download the document or peruse it online, and I look forward to your comments either on the site or via email.
The manuscript is a bit long (~320 pages) and includes some standard dissertation sections (literature review, methods chapter, etc.). Interviewee contributions are featured most in Chapters 5 and 6 (if you want to skip to the good stuff).
I am at a new institution now and will be going through a new IRB approval process to continue this research, but I do indeed want to continue chatting with the bot and semi-automated tool community. Please let me know if you're interested in connecting this fall, and thank you so much to those who have already participated!
Randall Livingstone UOJComm ( talk) 23:55, 20 September 2012 (UTC)
Why is there no "this article created by a bot tag" on the article? And, no, I don't give a dang about the edit history. Was this a decision made, or has it never been discussed? 68.107.140.60 ( talk) 01:56, 16 September 2012 (UTC)
Well, you've caught my curiosity. To my knowledge, I've never seen an article that was created by a bot, though it's easy to find bot edits to articles created by humans. Please post some links to some typical bot-created articles so I can see what the discussion is all about. And I mean actual articles, not pages like Wikipedia:Requested moves/Current discussions that the bot I operate writes. Thanks, Wbm1058 ( talk) 18:28, 19 September 2012 (UTC)
I have temporarily shut down my bot due to internal malfunctions. I was worried the malfunctions were beginning to manifest themselves on Wikipedia.— cyberpower ChatOffline(Now using HTML5) 12:39, 23 September 2012 (UTC)
Amidst all the Toolserver/Labs turmoil, I acquired a 2008 Mac mini, which I then installed Linux upon. It now runs fairly speedily 24/7 (assuming there isn't a power outage), and, as I have no real use for it, I am offering it up for grabs to any disgruntled bot operator(s) who'd like me to run their programs on it; just send me scripts. Just an FYI - please drop me a message if you're interested and I can send you system details/whatever else you'd like. Just trying to do my part! Theo polisme 07:40, 30 September 2012 (UTC)
I noticed that some closed threads at NFCR haven't been archived yet. Looking at Special:Contributions/ClueBot_III shows that it has made no edits since 7 October. Will the bot be revived? -- Toshio Yamaguchi ( tlk− ctb) 07:14, 13 October 2012 (UTC)
I have an issue with Yobot. In particular, the bot is making too many changes at once, often with edit summaries that are not specific enough or are simply tangential to the actual edits made. My initial comment on the matter started with this conversation, which continued here. I think User:Magioladitis has put little weight on my comments, has not tried very hard to understand my point, and has been slow to respond, so I bring the matter here. After our exchange, my main contention is that Yobot's edit summaries are not specific enough. This makes checking its changes difficult and annoying for watchers. In particular, Yobot can sort ref tags while still making other minor changes. This can change large blocks of text in a single article while the edit summary might only refer to some minor edit sprinkled in among those changes. Jason Quinn ( talk) 16:43, 15 October 2012 (UTC)
Comment I would additionally add that it's good advice that bots should do one thing and do it well and that Yobot seems to tackle too many tasks at a time. Jason Quinn ( talk) 16:52, 15 October 2012 (UTC)
My current edit summary is of the following form: "[[WP:CHECKWIKI]] error #xx fix and [[WP:GENFIXES|general fixes]]", followed by a short text that explains which rule I apply for the specific CHECKWIKI fix. The remaining edits are explained in WP:GENFIXES. Last week I worked mainly in the direction of creating skip conditions for each CHECKWIKI error fix. Instead of loading all lists together, I now work on each list separately. I am open to ideas for making my edit summaries smarter without having to generate a very lengthy edit summary. -- Magioladitis ( talk) 18:41, 15 October 2012 (UTC)
Sorry if I'm about to defame a hardworking user, and an administrator at that, but I noticed something a bit strange on my watchlist today: two identical edits by User:Redrose64 on completely unrelated pages on a fairly esoteric technical point - here and here. Looking at the user's edit history reveals large numbers of edits to various pages identical to those I encountered. Looking further back, there are similar instances of these editing patterns on various different technical points, sometimes at a rate of several per minute and accounting for hundreds of identically described edits. I can make some fast edits sometimes, and I'm not averse to hard work, but nothing approaching this...
For all I know this may be normal AWB activity; I really am no expert. I'm sure this is a hard-working administrator doing a good job, but it just seemed a little odd, so I thought I'd flag it up to some people who are more knowledgeable than I am! MatthewHaywood ( talk) 00:14, 17 October 2012 (UTC)
|day=, |accessed= and |access-date=. -- Redrose64 ( talk) 10:43, 17 October 2012 (UTC)
I am going to step down as the maintainer of the WP 1.0 bot at the end of November. A new maintainer for the bot is needed. More information can be found here. — Carl ( CBM · talk) 17:10, 28 October 2012 (UTC)
Edits by User:Hammocks and Honey that say db-a9|bot=ExampleBot are incorrectly putting the A9 CSD on articles where the artist is bluelinked. I'm reverting from the earliest on the list, but I can't see how to stop it. I've messaged Hammocks and Honey. Peridon ( talk) 17:41, 8 November 2012 (UTC)
The RM bot, User:RMCD bot aka User:RM bot has stopped running. Can it be restarted or is yet another fork needed? Apteva ( talk) 16:12, 25 November 2012 (UTC)
Hi all,
A few months ago SuggestBot got approval to update the Community Portal's list of open tasks ( Wikipedia:Community portal/Opentask, BRFA Wikipedia:Bots/Requests for approval/SuggestBot 7). I'm now interested in having it update a smaller list of tasks ( Template:Opentask-short) for experiments done in the Onboarding new Wikipedians project.
Would it be necessary to do a separate BRFA for this? Maybe it would instead be possible to refer to the previous BRFA and that the bot serves the same purpose? Would appreciate some input on this. Cheers, Nettrom ( talk) 18:38, 20 November 2012 (UTC)
Automating tasks on Wikipedia
Uploading hundreds of files or changing thousands of pages can be tedious. We allow limited automation unless it interferes with normal systems operations. You always can grab your favorite scripting language and write a bot, but there's no need to reinvent the wheel: take a look at PyWikipediaBot, a quite complex automation framework for Wikipedia. If you are more into Perl, libwww-perl is a very useful library for automating web tasks. If you have tested your bot and intend to run it over a longer period of time, please get in touch with the developers first (preferably using the wikitech-l mailing list) or by requesting a flag here. We then can register your bot, so it can be hidden from the list of recent changes.
---cut here---
The above is being put out as the "tip of the day"; it looks like it was written in 2005. Somebotty might like to find and update the tip.
Rich Farmbrough, 02:59, 16 December 2012 (UTC).
There are some threads that may be of interest to bot operators on the mediawiki-api mailing list. In summary:
Feel free to join the mailing list and the discussions (you can sign up for gmail or another free email service if you don't want to reveal your personal address), or I'll try to summarize replies posted here at some point. Anomie ⚔ 14:25, 22 December 2012 (UTC)
Personally, while I think the current system could be cleaned up somewhat, I don't much care for removing all option from the client in how continuation is processed. Anomie ⚔ 14:25, 22 December 2012 (UTC)
Personally, I'm particularly interested in pros and cons for introducing versioning at all, which the original proposal seems to have assumed as a given. Anomie ⚔ 14:25, 22 December 2012 (UTC)
Hi; could somebody who understands these things please check which bots are actually fixing double redirects at the moment? The only one I have personally recently witnessed working is AvocatoBot ( task list · contribs) – but, in any case, could someone do a more thorough/scientific check and update the relevant pages as appropriate? Thanks It Is Me Here t / c 14:23, 14 December 2012 (UTC)
When I discovered the problem, I inserted dates myself. Someone else is doing it as well. I can't be here every day.— Vchimpanzee · talk · contributions · 15:39, 26 December 2012 (UTC)
Is it possible to make a list of the Incubator's test Wikipedias that is renewed by a bot? There is an attempt on the Russian Wikipedia: ru:Википедия:Список Википедий в инкубаторе. Its talk pages: ru:Обсуждение Википедии:Список Википедий в инкубаторе, ru:Википедия:Форум ботоводов#Википедия:Список Википедий в инкубаторе -- Kaiyr ( talk) 14:53, 27 December 2012 (UTC)
I am currently applying to be a member of BAG, and input is greatly appreciated.— cyberpower Offline Happy 2013 13:31, 2 January 2013 (UTC)
Hi folks, I'd like to expand how I'm contributing to WP by trying to write some bots. The first idea is simple:
I'm sure an experienced bot-writer could knock this out in an hour, but I want to do it myself. Can anybody point me to the source code of a simple Python-based bot that might be a useful shell for me to build such a bot? Appreciate it...
Zad68 03:50, 3 January 2013 (UTC)
Any comments would be appreciated ·Add§hore· Talk To Me! 16:16, 10 January 2013 (UTC)
Is there a way to get the edit count of an IP account? Although the mw:API:Users example includes an IP, it doesn't seem to work. NE Ent 22:27, 10 January 2013 (UTC)
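If I remember right, edit counts are only maintained for registered accounts, so the users list won't return one for an IP; the usual workaround is to walk list=usercontribs for that IP and count the revisions yourself, following the continuation parameter until it runs out. A sketch of just the request-building and counting halves (hedged: not run against the live API here, and the helper names are mine):

```python
def contribs_params(ip, uclimit=500, uccontinue=None):
    """Build the query parameters for one list=usercontribs request.
    uclimit is capped at 500 for most clients, so counting a busy
    IP's edits means issuing several requests, passing back the
    continuation value the previous response returned."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": ip,
        "ucprop": "ids",
        "uclimit": uclimit,
        "format": "json",
    }
    if uccontinue:
        params["uccontinue"] = uccontinue
    return params

def count_contribs(pages):
    """Total the revisions across a sequence of decoded API responses."""
    return sum(len(p["query"]["usercontribs"]) for p in pages)
```

Each `pages` element is one JSON-decoded API response; the loop that actually fetches them (urllib, requests, or a bot framework) is omitted.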
I am currently (self) nominated to become a member of BAG (Bot Approvals Group). Any questions and input you may have are invited with open arms here. ·Add§hore· Talk To Me! 21:39, 16 January 2013 (UTC)
There is a discussion on the administrators' noticeboard regarding a new template designed for alerting bot operators that their bot has been blocked: Wikipedia:Administrators' noticeboard/Archive244#Blocking misbehaving bots. Feedback from bot operators is welcome. 28bytes ( talk) 20:31, 18 January 2013 (UTC)
I'd like to introduce LinqToWiki: a new library for accessing the MediaWiki API from .Net languages (e.g. C#). Its main advantage is that it knows the API and is strongly-typed, which means autocompletion works on API modules, module parameters and result properties and correctness is checked at compile time. Any comments are welcome. User<Svick>. Talk() ; 17:56, 17 February 2013 (UTC)
I wrote a script to search for templates with many parser functions that are worth converting to Lua now: mw:Special:Code/pywikipedia/11099. Bináris ( talk) 06:17, 21 February 2013 (UTC)
Due to a bug, I have temporarily shut down adminstats.— cyberpower ChatOnline 13:38, 21 February 2013 (UTC)
Does anybody have Python code to parse an article and extract all the refs, maybe even parse through a few of the more popular template:cites (cite book, cite journal, etc.)? I'd like to develop a number of little utilities that manipulate refs and cite info, but wanted to see if someone has already laid this sort of groundwork. Ideally it'd be a library function that took a page as input and returned a list of data structures containing the ref and cite info, bonus points if the library had an API to manipulate the data and apply it back to the page for writing. I scanned through a bunch of the existing code I was able to find but didn't come across anything. Any help appreciated, cheers!
Zad68 15:07, 27 February 2013 (UTC)
As some of you may know, Wikidata interwiki links went live on the Hungarian Wikipedia today. Many editors started removing interwiki links en masse. However, it was soon realized that the interwiki bots were still running, and they started readding links. I assume that we want to prevent this from happening when Wikidata is turned on for the English Wikipedia... -- Rs chen 7754 21:30, 14 January 2013 (UTC)
[5] goes into more detail. Speaking of which, if you have an interwiki bot that runs on the Hungarian Wikipedia, we would appreciate it if you changed your code... -- Rs chen 7754 22:45, 14 January 2013 (UTC)
(edit conflict) Hi everybody, I wrote to the pywiki developers to update the interwiki bots. (I am a developer myself, but I have never worked with interwiki.py.) I don't think 126 bot owners should be notified; if a developer updates the code, they will be responsible for updating their bots in a reasonable time. Let's see what happens. There are more phases: the Hungarian Wikipedia will be followed by Hebrew and Italian in the second step, then English in the next phase, and finally the rest. Link FA templates will still be handled by interwiki bots for a while until Wikidata integrates them. It's easier to follow these happenings in the pywiki code, I suppose. Cheers, Bináris ( talk) 22:50, 14 January 2013 (UTC)
A temporary solution might be to use the edit filter to block (disallow) either bot changes to interwikis or (and I would think this is more efficient) block bot edits with an interwiki.py edit summary. Of course, 1% of those edits are still probably going to be good ones; but in any case it would generate a handy list of current interwiki bots and give owners some breathing space if they forget to convert. - Jarry1250 Deliberation needed 09:55, 16 January 2013 (UTC)
Just another FYI since this wasn't brought up - the bots will still need to add the FA/GA stars when the article is FA/GA on other Wikipedias. Unfortunately that's not in Wikidata yet. -- Rs chen 7754 10:03, 16 January 2013 (UTC)
Wikipedia:Wikidata interwiki RFC has been started. -- Rs chen 7754 09:33, 17 January 2013 (UTC)
The date is February 11th: http://blog.wikimedia.de/2013/01/30/wikidata-coming-to-the-next-two-wikipedias/ -- Rs chen 7754 20:11, 30 January 2013 (UTC)
So, Wikidata goes live on the English Wikipedia in two days. How can we contact all the bot owners to let them know that their bots will no longer be needed to update the interwiki links in the same way as they have done? It would be great to try for a nice clean transition where interwiki code can begin to slowly be removed from en.wiki. Bot owners should try to direct their bots towards organising interwikis on Wikidata instead. Del♉sion23 (talk) 01:02, 9 February 2013 (UTC)
And we're live now! -- Rs chen 7754 21:15, 13 February 2013 (UTC)
I notice that Rubinbot is still editing, that its bot approval request lists it as being based on pywikipedia, and that Legoktm has argued that pywikipedia-based bots which don't update themselves and continue re-adding interwiki links should be blocked under the listed policy. This is not an area I'm experienced in. Is my understanding correct here, and if so, is blocking here premature? No criticism intended of the author; I just suspect that at some point people will want to stop seeing back-and-forth interwiki adding/removing. (Don't panic, I have absolutely no intention of taking action without some discussion.) -- j⚛e decker talk 18:42, 15 February 2013 (UTC)
Can someone remind me where the test wiki is? I'm updating my bot framework and would like to try out a new function I just created.— cyberpower ChatOnline 03:29, 6 March 2013 (UTC)
The simple testcase at User:Chartbot/simplified dies with an HTTP 417 error as it is, but works fine if I try to send 634 bytes instead of the 635 that the testcase uses. It seems that PHP is sending an Expect: 100-continue header when the request gets too long, and that's getting refused. Does anyone know either how to get PHP to stop sending the header or how to get the Wikimedia server to not get cranky about it? Without me having to learn yet another PHP library? All the parts of the bot that I thought would be hard actually work, and it's frustrating to be stumped with my toe on the finish line.— Kww( talk) 06:30, 6 March 2013 (UTC)
I'm getting started with a little bot coding. My question isn't necessarily even Wikipedia-specific, but here it is:
Given a particular identifier, a "PMID" (looks like "9736873") which identifies a journal article, I'm trying to go to the NIH's PubMed site and pull an XML file full of metadata about the article. A typical lookup URL looks like http://www.ncbi.nlm.nih.gov/pubmed/9736873?report=xml&format=text. In my browser I get back something that looks like:
    <PubmedArticle>
      <MedlineCitation Owner="NLM" Status="MEDLINE">
        <PMID Version="1">9736873</PMID>
        <DateCreated>
          <Year>1998</Year>
          <Month>10</Month>
          <Day>01</Day>
        </DateCreated>
        <DateCompleted>
          <Year>1998</Year>
..etc. I want to use BeautifulSoup to parse the resulting XML and pull out particular fields. My code looks like:
    if param.name == 'pmid' and param.value:
        pmid = str(param.value).strip()
        url = "http://www.ncbi.nlm.nih.gov/pubmed/" + pmid + \
              "?report=xml&format=text"
        f = urllib.urlopen(url)
        xml = f.read()
        f.close()
        soup = BeautifulSoup(xml)
My problem is that the BeautifulSoup parse doesn't work because what I'm actually getting back from PubMed looks like:
    <?xml version="1.0" encoding="utf-8"?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <pre>
    &lt;PubmedArticle&gt;
      &lt;MedlineCitation Owner="NLM" Status="MEDLINE"&gt;
        &lt;PMID Version="1"&gt;9736873&lt;/PMID&gt;
        &lt;DateCreated&gt;
          &lt;Year&gt;1998&lt;/Year&gt;
so instead of < and > characters I'm getting &lt; and &gt;, and BeautifulSoup isn't parsing it. If I copy the text out of the browser and paste it into a file, and read from the file, it works.
I tried using the requests library instead of urllib and got the same result. What am I doing wrong? Cheers...
Zad68 19:10, 7 March 2013 (UTC)
    soup = BeautifulSoup(xml.replace('&lt;', '<').replace('&gt;', '>'))
Zad68 19:41, 7 March 2013 (UTC)
    xml = html_parser.unescape(xml_in_html)

and that worked for a few but then bombed on something. Gee, what a pain... I'm just going to go with my manual decoding! Thanks for the note that PubMed is "lying" about delivering XML, appreciate it.
Zad68 19:56, 7 March 2013 (UTC)
      File "/usr/lib/python2.7/HTMLParser.py", line 472, in unescape
        return re.sub(r"&(#?[xX]?(?:[0-9a-fA-F]+|\w{1,8}));", replaceEntities, s)
      File "/usr/lib/python2.7/re.py", line 151, in sub
        return _compile(pattern, flags).sub(repl, string, count)
    UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 1: ordinal not in range(128)
The offending byte string is M\xc3\xb6ls\xc3\xa4. If I pre-process what PubMed gives me and remove that last name with html_parser.unescape(xml_in_html.replace('M\xc3\xb6ls\xc3\xa4', ' ')) it works. Any insight appreciated!
Zad68 14:15, 8 March 2013 (UTC)
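In hindsight, both failures in this thread (the entity-escaped brackets and the later UnicodeDecodeError) come down to order of operations: decode the bytes to unicode first, then unescape, then parse. A sketch in Python 3 terms — html.unescape is the modern equivalent of HTMLParser.unescape, and the payload below is a made-up miniature of the PubMed response, not the real thing:

```python
import html
import re
import xml.etree.ElementTree as ET

# Made-up miniature of what PubMed's ?report=xml&format=text returns: an HTML
# page whose <pre> block holds the record with entity-escaped brackets,
# delivered as UTF-8 bytes (note the byte sequence for "Mölsä").
raw = (b'<pre>&lt;PubmedArticle&gt;&lt;MedlineCitation&gt;'
       b'&lt;PMID Version="1"&gt;9736873&lt;/PMID&gt;'
       b'&lt;LastName&gt;M\xc3\xb6ls\xc3\xa4&lt;/LastName&gt;'
       b'&lt;/MedlineCitation&gt;&lt;/PubmedArticle&gt;</pre>')

# 1. Decode bytes to text BEFORE unescaping; unescaping raw UTF-8 bytes is
#    what triggered the UnicodeDecodeError under Python 2.
text = raw.decode('utf-8')

# 2. Turn &lt;/&gt; back into real angle brackets.
unescaped = html.unescape(text)

# 3. Pull the record out of the surrounding HTML and parse it as XML.
record = re.search(r'<PubmedArticle>.*</PubmedArticle>', unescaped, re.S).group(0)
root = ET.fromstring(record)
print(root.findtext('.//PMID'))      # 9736873
print(root.findtext('.//LastName'))  # Mölsä
```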
All botops should be aware that Wikipedia just experienced a server failure that affected all bots. If your bot just edited in a strange and unusual manner, it may not have been the bot's fault.— cyberpower ChatOnline 00:49, 13 March 2013 (UTC)
Hey folks, I'd like to modify my bot (coded in Python) to pull the wikicode for an article, extract all the <ref>...</ref> definitions, put them in a list, and do whatever I want with them. First use will be to create a table of refs for me to review when doing GA or FA reviews. The bot already uses the mwparserfromhell library, and with it I can parse out templates nicely, but now I'm trying to use it to parse out ref tags and having trouble; they appear to be treated as text by the library:
>>> import mwparserfromhell
>>> text = "I has a template!<ref>{{foo|bar|baz|eggs=spam}}</ref> See it?"
>>> wikicode = mwparserfromhell.parse(text)
>>> wikicode.filter_templates()
[u'{{foo|bar|baz|eggs=spam}}']
>>> wikicode.filter_tags()
[]
>>> wikicode.filter_text()
[u'I has a template!<ref>', u'</ref> See it?']
Any idea what I'm doing wrong? Cheers...
Zad68 15:33, 13 March 2013 (UTC)
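For the record, I believe newer mwparserfromhell releases do parse tags (filter_tags() then returns the refs), so upgrading may be the real fix; until then a stdlib regex fallback is enough for collecting ref definitions. A rough sketch — the pattern is my own approximation and will miss pathological cases such as attribute values containing > or nested tags:

```python
import re

# Hypothetical sample wikitext with a named ref, a bare ref, and a reuse
wikitext = ('I has a template!<ref name="a">{{foo|bar|baz|eggs=spam}}</ref>'
            ' See it?<ref>bare citation</ref> and a reuse<ref name="a" />')

# Paired <ref ...>...</ref> or self-closing <ref ... />; non-greedy body,
# DOTALL so refs spanning multiple lines are still caught.
REF_RE = re.compile(r'<ref[^>/]*>.*?</ref>|<ref[^>]*/>', re.S | re.I)

refs = [m.group(0) for m in REF_RE.finditer(wikitext)]
for r in refs:
    print(r)
```

From the resulting list you can build a review table however you like (the templates inside each ref can still be handed to mwparserfromhell individually).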
It is imperative that any out-of-the-norm edits from now on be reported to me. I am about to load an update to the edit function of the framework into my bot which will affect all tasks. I will of course be watching myself, but I may miss something as I recode Peachy.— cyberpower ChatOnline 23:22, 14 March 2013 (UTC)
Can we routinely ask users to post the function overview even if it is a request to take over another bot's task? There is no standard way that bot tasks are linked, as far as I can tell, and if it is another bot's task it takes me forever to search for it, sometimes with no result.
I think it is simpler to skip the trail of breadcrumbs/links and just include a sentence in the function overview about what the bot does.
Thanks, - 68.99.89.234 ( talk) 11:43, 10 March 2013 (UTC)
Howdy. After a discussion at ANI, I'm hoping that somebody here might be able to help. Over the last few weeks, HBC AIV helperbot5 ( talk · contribs) and HBC AIV helperbot7 ( talk · contribs) have been intermittently disappearing from WP:AIV for anywhere from a few hours to nearly a day. The bots appear to be functioning normally at WP:UAA - or at least more normally - so it's not as if the bots are just dead. Because AIV maintenance is a fairly important function and I'm not sure if the bots' operators are around at the moment, I was wondering if anybody here had any idea what the problem might be? Thanks. -- Bongwarrior ( talk) 01:33, 22 March 2013 (UTC)
FYI, DASHBot has been blocked as it is malfunctioning; Tim is aware. As soon as the code is fixed I'll unblock. Regards, Giant Snowman 20:05, 23 March 2013 (UTC)
From now on you may use two new exceptions in Pywikipedia (r11300). When replacing texts with replace.py/fixes, you may add to 'inside-tags' section of exceptions:
Cheers, Bináris ( talk) 09:38, 30 March 2013 (UTC)
See discussion at User talk:cyberpower678#Buggy bot, and feel free to jump in and help if you're familiar with the framework that cyberpower is using. ‑Scottywong | speak _ 22:29, 3 April 2013 (UTC)
As described here, Citation bot ( BRFA · contribs · actions log · block log · flag log · user rights) needs an update to be fully compatible with the new Lua-based citations when there are more than 9 authors or more than 4 editors. However, the bot's original author doesn't seem to have much time for it anymore [6]. He invites others to improve the source, though, and I was hoping that someone else might feel inclined to work on the needed improvements. Dragons flight ( talk) 22:52, 3 April 2013 (UTC)
In addition to the core framework being updated and bug fixes being implemented on the currently shut-down tasks, Cyberbots I and II will be slowly migrating to Labs. Services such as RfX reporter, tally, cratstats, and adminstats that many of you use may get cut for a bit. Also, because I am unfamiliar with Labs at the moment, I have absolutely no idea how adminstats will run there. If there are any errors or out-of-the-norm edits in the coming future, please let me know.— cyberpower ChatOffline 15:17, 10 April 2013 (UTC)
Could someone please check Special:Contributions/EdinBot - I think it may be operating on the wrong wiki. Thanks! GoingBatty ( talk) 01:58, 15 April 2013 (UTC)
See Special:Contributions/185.15.59.211. Of course, I have no idea which one it is. What can/should be done when a bot is editing logged out? Thanks. Someguy1221 ( talk) 03:26, 20 April 2013 (UTC)
We have an existing consensus to softblock the toolserver IP addresses, for this very reason. I guess from the above that they have new IP addresses now; that would be 185.15.59.192/27, judging by whois data? Which is part of 185.15.56.0/22 that is assigned to WMF. And then there's 2620:0:860::/46 in IPv6 assigned to WMF too, of which we currently have 2620:0:862:101::2:0/124 blocked as toolserver.
We should probably also do the same for Labs projects; it seems edits from Tool Labs will currently come from 10.64.0.126, and I see past edits from 10.64.0.123, 10.64.0.169, and 10.64.0.170, but the full range of possible IPs is not obvious (random guess: 10.64.0.123–170, plus a few others in 10.64.0.0/12, plus some in 2620:0:861:101:10::/80 if IPv6 happens to be used). OTOH, should logged-out edits be coming from anything in 10.0.0.0/8 at all? Anomie ⚔ 03:35, 21 April 2013 (UTC)
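For anyone scripting a check against these ranges, Python's stdlib ipaddress module handles the membership tests directly. The list below is just the ranges quoted in this thread — illustrative, not an authoritative WMF list:

```python
import ipaddress

# Ranges mentioned above (illustrative only, not an authoritative WMF list)
WMF_RANGES = [ipaddress.ip_network(n) for n in (
    '185.15.56.0/22',   # WMF allocation containing the new toolserver IPs
    '10.64.0.0/12',     # internal range the Tool Labs edits came from
    '2620:0:860::/46',  # WMF IPv6 allocation
)]

def looks_like_wmf(addr):
    """True if addr falls inside any of the listed ranges."""
    ip = ipaddress.ip_address(addr)
    # guard keeps the intent explicit, though mixed-version membership
    # tests already return False rather than raising
    return any(ip in net for net in WMF_RANGES if net.version == ip.version)

print(looks_like_wmf('185.15.59.211'))  # True: inside 185.15.56.0/22
print(looks_like_wmf('10.64.0.126'))    # True: inside 10.64.0.0/12
print(looks_like_wmf('8.8.8.8'))        # False
```

A softblock bot or report script could run each logged-out "(BOT)" edit's IP through a check like this.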
Cyberbot II is now running on labs. I am now migrating Cyberbot I.— cyberpower ChatOffline 18:47, 24 April 2013 (UTC)
Hello Bot owners. It seems that User:Tedder has recently had difficulty with TedderBot. The bot has been out of commission for nearly 2 months. This lack of service is a great loss for the staff of many of the various WikiProjects, who usually rely on that bot's listings of new articles in their subject areas. Is it possible for other people to help Tedder in some way in order to get TedderBot up and running properly? Perhaps I should apologize for going behind Tedder's back in this way, but I believe my motivation is good. Many thanks, Invertzoo ( talk) 00:57, 2 May 2013 (UTC)
I left a discussion about this on the bot owner's talk page, as well as the MfD talk page. It seems that since January 2013, One bot has been doing a great job removing closed discussions from Wikipedia:Miscellany for deletion, but has not been archiving any of the discussions. This can be seen in the bot's edit log. Steel1943 ( talk) 07:13, 23 April 2013 (UTC)
Does anything need to be done about User:Riley Huntley's approved bots? He retired under a cloud due to sharing his password, a single incident, but I think this should raise concerns about flagged bots and approved tasks. - 68.107.137.178 ( talk) 15:19, 1 May 2013 (UTC)
Actually, I also noticed there were a lot of such unsubstituted templates, plus as I mentioned in my BRFA, I was just extending a task my bot already does on Commons and Wikidata. Hazard-SJ ✈ 02:42, 3 May 2013 (UTC)
@ Addshore: I talked to MBisanz back when Riley first announced his retirement (Apr 29th) and the idea is just that, if his bot dies, I'll file a takeover request (Riley emailed me all of his code). Like others said above, though, until it dies, we don't have an issue. Theopolisme ( talk) 11:07, 14 May 2013 (UTC)
Went ahead and filed a BRFA here. There's no real hurry, but, per the above arguments, "better safe than sorry." Theopolisme ( talk) 14:05, 18 May 2013 (UTC)
Please go to WP:VPR and offer your opinions in the "Remove bot flag from inactive bots" thread. Nyttend ( talk) 00:07, 23 May 2013 (UTC)
Just to let everyone know that I have blocked User:EmausBot for incorrectly removing interwiki links from articles. This was first reported here over a week ago and nothing has been done. After giving Emaus an extra day to respond I have blocked the bot.
The way I see it, the bot removed a link from Stockert Radio Telescope to de:Astropeiler Stockert after looking at d:Q2350652, which at the time contained only two links, to de:Stockert (Berg) and Stockert Radio Telescope. On further inspection, de:Astropeiler Stockert is a section redirect to de:Astropeiler_Stockert#Radioteleskop_Astropeiler_Stockert, which means the link should have been left alone and not removed.
·Add§hore· Talk To Me! 11:07, 20 May 2013 (UTC)
Please see User talk:Citation bot/Archive1#Update required to avoid deleterious impact on new Lua-based citations. This issue is not being addressed, and there are various other complaints on the bot talk page which are similarly being ignored. Should I block the bot? -- Redrose64 ( talk) 21:34, 18 May 2013 (UTC)
I haven't been around on-wiki much lately, and I don't see that situation changing anytime in the near future. My bot has still been running on autopilot, but I haven't been monitoring it at all. The code is fairly stable, so maybe it's not a big deal. But, if anyone is interested in taking over the tasks that Snotbot still runs, let me know. I believe the main tasks that it runs are task 10 (cleaning up various things with AfD's), task 12 (archiving requests at RFPP), and another task that doesn't have a BRFA because it only edits userspace (updating the summary table at CAT:RFU). The code is Python using the pywikipedia library. If someone does take over the tasks, I'd prefer it to be someone who has some familiarity with Python and pywikipedia. ‑Scottywong | converse _ 13:43, 23 May 2013 (UTC)
If they're truly "communal" bots then we can just make LabsBot1, fill 'er up with 10 tasks, then create LabsBot2, fill 'er up with 10 more tasks, etc...what would be really awesome is if someone created a UI for managing them, though...i.e., JIRA on steroids? So, you can "submit new task" and it gets automatically assigned to the next available bot, you just have to upload source code and it can be automatically scheduled into the crontab?...I smell unicorns.... Theopolisme ( talk) 04:14, 29 May 2013 (UTC)
Hi! There is currently a request for global editinterface rights for Addbot open on meta wiki here to allow the bot to edit protected pages to remove interwiki links that are already on wikidata. It has been proposed that a second global bot group be created that includes the following flags (edit, editprotected, autoconfirmed). This is not something stewards want to rush into as the flag would allow the bot to operate on protected pages and would prefer to have a wider participation in the request for approval before any action is taken. All comments should be posted at (meta:Steward_requests/Global_permissions#New_global_permissions_group) ·addshore· talk to me! 14:54, 1 June 2013 (UTC)
A request for comment has been started at Wikipedia:Requests for comment/The bot flag regarding removing the bot flag from inactive bots and potentially modifying the bot flag itself. The RFC started after the discussion on VP/Prop. ·addshore· talk to me! 10:28, 6 June 2013 (UTC)
Hello to everyone, I'm a user from the Greek (el) Wikipedia. Could you please check the bot contributions of CarsracBot? He added interwikis on the Greek (el) Wikipedia. -- Vagrand ( talk) 19:54, 12 June 2013 (UTC)
In recent weeks some bot has been using Perl literals to access articles. When an article name has diacritics or an en-dash/em-dash, it fails: for example, using Gal\xC3\xA1pagos Islands rather than Galápagos Islands, and Karush\xE2\x80\x93Kuhn\xE2\x80\x93Tucker conditions rather than Karush–Kuhn–Tucker conditions. This is causing Perl literal names to appear on WP:TOPRED and also distorting the Wikipedia article traffic statistics. Of course it might be automation beyond the scope of WP:BOT, but perhaps posting here will give someone a realisation. From the number of redlink hits it appears the page visits EXCEED those of user article views, so we are looking at some major bot doing a lot of page visits! Something like over a quarter of a million hits are not going directly to the articles weekly. Regards, Sun Creator( talk) 23:47, 24 June 2013 (UTC)
I created a little PHP script/"bot" to upload pictures from my web site to Commons, just to avoid having to go through the file upload wizard and manually enter all the information that is already in our database anyway and should thus be transferred automatically. I only used it occasionally, and all it did was upload one picture at a time. Now it appears that the server's IP address got blocked (the response contains ":"Unknown error: \"globalblocking-ipblocked\""), so I guess somebody does not like what I'm doing. Thus:
Thanks! -- Kabelleger ( talk) 19:20, 25 June 2013 (UTC)
There appears to be a problem with the Wikipedia back-link information returned by "What links here". See Wikipedia:Administrators'_noticeboard/Incidents#Hazard-Bot_false_positives_flood. This caused Hazard-Bot to start flagging thousands of images as orphaned. This may be due to a corrupted database index. Please watch your 'bot behavior closely until this problem is cleared up. Any 'bots that rely on "what links here" data should be temporarily suspended. -- John Nagle ( talk) 20:26, 22 July 2013 (UTC)
(Posted only at WP:VPT; thought it would be worthwhile to repost here. -- John Broughton (♫♫) 03:18, 24 July 2013 (UTC) )
Hello, sorry for my English, but this is very important for bot operators, so I hope someone translates this. Pywikipedia is migrating to Git, so after July 26 SVN checkouts won't be updated. If you're using Pywikipedia you have to switch to git, otherwise you will be using an outdated framework and your bot might not work properly. There is a manual for doing that and a blog post explaining this change in non-technical language. If you have questions feel free to ask in mw:Manual talk:Pywikipediabot/Gerrit, the mailing list, or in the IRC channel. Best, Amir (via Global message delivery). 13:06, 23 July 2013 (UTC)
NoomBot was shut down by its creator (see User talk:NoomBot#Bot shutoff), and that user went inactive the next day and hasn't made a single edit since April 22, 2013. I thought I should report this and that maybe someone would step up and takeover. The operator states in the talk page post I linked that if anyone wants the source code to email him.-- Fuhghettaboutit ( talk) 12:40, 26 June 2013 (UTC)
Please see this thread about a weird occurrence related to MadmanBot. De728631 ( talk) 13:15, 7 August 2013 (UTC)
I can't seem to find a record anywhere that User:RotlinkBot is an approved bot. It doesn't have the bot bit set. Am I just missing something? ElKevbo ( talk) 18:00, 18 August 2013 (UTC)
I need someone to take over my bots for me. This has been a long time coming and I simply don't have the time to maintain them anymore. Furthermore my response time to issues/bugs has started to become unacceptable for a bot op. They're all written in PHP, have some quirks, and could use a bit of work, but they are pretty easy to run as long as you keep on top of things. So if anyone is interested, please drop me a line and I'll send you the latest source code, and details of how to set the bot up.
-- Chris 03:18, 22 August 2013 (UTC)
Hi all. User:28bot is going to be offline for a while, so I thought it would be best to solicit other bot ops to take over the tasks it currently performs. Those are:
Thanks, 28bytes ( talk) 07:04, 11 August 2013 (UTC)
Starting Aug 29, my pywikipedia-based bot can't edit anymore - it returns an error message saying "Token not found on wikipedia:en. You will not be able to edit any page". Anyone else seeing this? -- Rick Block ( talk) 19:47, 31 August 2013 (UTC)
use_api = True
Theopolisme ( talk) 20:01, 31 August 2013 (UTC)
Updating to the latest git clone fixed it. Thanks! -- Rick Block ( talk) 23:33, 31 August 2013 (UTC)
FYI: The Tools cluster is in a period of instability ( http://ganglia.wmflabs.org/latest/?r=hour&cs=&ce=&m=load_one&s=by+name&c=tools&h=&host_regex=&max_graphs=0&tab=m&vn=&sh=1&z=small&hc=4). If your bot runs on the Tools cluster, be aware that the Labs administrators know about the problem and are in the process of cleaning it up. Hasteur ( talk) 11:00, 10 September 2013 (UTC)
I am getting significant push-back from a single editor (who happens to be an admin) who wants to significantly change the way that approved tasks function. I fundamentally disagree with their complaint and want to know how far I am required to bend to accommodate individual editors' complaints about the bot's approved action. I am intentionally not making this about the individual editor or the specific complaint (but am willing to accept advice within the current context). I think I've already been very accommodating with respect to the approval and have taken onboard reasonable requests, but I think that fundamentally changing the bot's activities would make it effectively toothless and a disservice to Wikipedia (why even bother nominating G13s if we're never making headway on the backlog?). Hasteur ( talk) 19:13, 13 September 2013 (UTC)
MiszaBot III ( talk · contribs) has not edited for a week, and Misza13 ( talk · contribs) has not edited here since May. Did something happen to the API/framework/etc on September 11 that would explain the failure? Can anyone suggest how to get it going again? -- John of Reading ( talk) 16:00, 18 September 2013 (UTC)
I'll check the logs when I get back from work and have SSH access. On a different note, if anyone wants to take over this mess from me, they're more than welcome. The bot is mostly running correctly on pages where configuration is okay, but the amount of pages where it errors out due to misconfiguration, blacklisted links etc. is simply staggering. And I have neither time nor interest to clean up people's mess anymore. — Миша 13 07:42, 20 September 2013 (UTC)
Please remember to stop the weblink checking bots for now! Library of Congress is already down. [8] -- Hedwig in Washington (TALK) 02:00, 2 October 2013 (UTC)
I thought it worth pointing out to the botops about this RfC.— cyberpower ChatOnline 23:52, 2 October 2013 (UTC)
Some input would be appreciated here about the possibility of a bot with admin rights. Giant Snowman 11:49, 11 October 2013 (UTC)
See m:October 2013 private data security issue. If your bot tries logging in through the API, it will fail unless you manually login and reset the password. Legoktm ( talk) 06:05, 3 October 2013 (UTC)
MiszaBot I ( talk · contribs) has not edited since 04:16, 2 October 2013; MiszaBot II ( talk · contribs) not since 22:35, 2 October 2013; and MiszaBot III ( talk · contribs) not since 00:18, 3 October 2013. Is this an effect of the API password issue described above? -- Redrose64 ( talk) 16:50, 4 October 2013 (UTC)
I switched from MiszaBot to ClueBot for my user talk archival when the former went down. Now it looks like User:ClueBot III is down as well -- it hasn't edited since 10/20 (PDT). Any news on that front? — Darkwind ( talk) 18:57, 25 October 2013 (UTC)
See this [9] ANI discussion re SporkBot "fixing" the use of now-deleted templates in archived comments from article talk pages. As I explain there, I see unquantifiable (but nonzero) risk, and zero value, in tampering with already-archived talk-page comments, such as here [10]. I'd appreciate others' thoughts on this. EEng ( talk) 13:21, 27 October 2013 (UTC)
You don't understand how the reports are created. Let's say we have template X, used on a total of say 10,000 pages, and a deletion discussion has been closed as delete. We have two options: substitute and then delete, or change links and then delete. Otherwise you will end up with thousands of broken template transclusions. This causes several issues: it adds pointless entries to Special:WantedTemplates, and makes general housekeeping of template issues messier. Outright deletion of templates would modify the posts in exactly the way that you want to avoid. Let's say {{ Keep}} was deleted/merged/moved/whatever to mean {{ Delete}}; anywhere that template was used, the meaning of the original post would get changed. Housekeeping bots would then subst the old Keep template, making it an orphan, maintaining the meaning of the post while also allowing the template to be changed. Also, you were talking about filtering out archive pages from the report: how do you define an archive? I bet I can find 100 archive pages that fail to meet your definition of an archive but are still archives. These reports are often created by processing template links in the database; filtering out archives would require the bot to access the live wiki and download and process the text of each page. If template X is used on 10,000 pages, there might be 2,000 where it's on an archive; WantedTemplates cannot take that into consideration, and the bot would need to process all 10,000 pages on every run to find which are archives and which are not. That just ends up being a megalithic process that would collapse under its own weight in a short amount of time, and fail. The best solution is for a bot to just clean up those templates, converting the transclusions into either substitutions or links. Werieth ( talk) 23:54, 28 October 2013 (UTC)
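To make the "how do you define an archive?" point concrete: the usual heuristic is a title pattern like the sketch below, and counterexamples are easy to produce (non-standard subpage names, archives kept on the base talk page, translated titles), which is exactly the objection above:

```python
import re

# A naive title-based archive detector (illustrative only; no pattern like
# this will ever be a complete definition of "archive")
ARCHIVE_RE = re.compile(r'/Archives?(?:[ _/]|$)', re.I)

def looks_like_archive(title):
    return bool(ARCHIVE_RE.search(title))

print(looks_like_archive('Talk:Example/Archive 5'))          # True
print(looks_like_archive('Wikipedia talk:X/Archives/2013'))  # True
print(looks_like_archive('Talk:Example/Old discussions'))    # False: an archive it misses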
There seems to be a problem with the above - it is adding Today's Featured Article to some IP talk pages instead of a warning. Regards Denisarona ( talk) 15:54, 6 November 2013 (UTC)
Being lazy, I always used to code {{clarify}}, letting some bot come in to add the date. But recently I discovered I could code {{subst:dated|clarify}}. It's a minor difference but it does eliminate the need for an extra history entry (and possible edit conflict) where the bot does the dating. The funny thing is that other editors can't learn from my example, because once the subst is done the code left in the source looks the same as if I had added the date by hand.
Not to take away work from your lovely bots, but wouldn't it be good to publicize this? It occurs to me that one way to do that would be in the bot's edit summary as it adds dates e.g. instead of just saying
say something like
Just a thought. EEng ( talk) 03:10, 29 October 2013 (UTC)
The problem with {{subst:dated}} is that it doesn't pass any parameters through: it assumes that the only parameter used by the wrapped template is |date=, which is often not the case. To take your example, {{clarify}} recognises four parameters: |reason= |date= |pre-text= |post-text= - but {{subst:dated|clarify}} would only fill in |date=. If you were to use {{subst:dated|clarify|reason=does this description refer to the whole station, or just the ticket hall?}}, what you would get back is {{clarify|date=October 2013}} - the |reason= parameter has been lost. However, if you subst: {{clarify}} in the manner described by its documentation, i.e. {{subst:clarify}} - that is, without using dated| - it yields {{Clarify|reason=does this description refer to the whole station, or just the ticket hall?|date=October 2013}} and the |reason= parameter is preserved. -- Redrose64 ( talk) 08:33, 29 October 2013 (UTC)
Dead end caused by EEng not reading what people said carefully
Well, gee mister, why dintcha say so in the first place? That's the very thing needed! Works beautifully. The need for all caps DATE is unfortunate, as is the need to remember to subst, and not to code date=date=October 2013, which would be a natural mistake to make. Please, no lectures on car, cdr, nil and so on, but isn't there some way to make a template self-substing, so we could code something like
(where SDATE means self-substing DATE, or something). Anyway, assuming no such improvements, what would y'all think of bot edit summaries like:
Along another line of thought, isn't there some way to make a template add the date on its own, by default? EEng ( talk) 00:07, 30 October 2013 (UTC)
To be blunt my cognitive faculties were neutralized by your run-on paragraph of examples. Why didn't you just say:
Being by then in an impatient frame of mind, when I saw "AWB" in Anomie's post I figured, "Oh, the technogeeks are having a coding fest again" and kind of tuned out. (Please understand when I say this that I myself am part technogeek on my father's side.) I am taking the liberty of collapsing this side discussion, which is my fault. |
Anomie's idea sounds great, though I lack the knowledge to comment on potential technical problems, and I take it this is not the forum to gain agreement on this. I'd be happy to join the discussion there. BTW, I have the recollection that at least some of the usual templates (clarify, cn, ...) don't seem to understand reason=, in the sense that they don't show the reason text when one hovers. May I suggest that all of these improvement-needed templates consistently take such a parm and show it on hovering. Synonyms for reason (such as concern=, explanation= ...) might be good too. EEng ( talk) 16:02, 30 October 2013 (UTC)
I'm not sure if this is the right forum for this question, but I'm having trouble with PyWikiBot. When running any program from the Command Prompt, I get the message 'git' is not recognized as an internal or external command, operable program or batch file.
Running login.py lets me log in (after displaying that message), but nothing else works.
Any help would be appreciated. My computer uses Windows 7. – Ypnypn ( talk) 17:39, 19 November 2013 (UTC)
I think I figured out the problem: the current version of PWB uses the basic import pywikibot, but the instructions on MediaWiki said to use import wikipedia. Thanks for your help, Werieth! - Ypnypn ( talk) 23:05, 19 November 2013 (UTC)
People are encouraged to place {{ Experimental archiving}} on talk pages if they used MiszaBot for archival. Wikipedia:Bots/Requests for approval/Lowercase sigmabot III 2 is growing stagnant, and such a task should be moving faster than it is. → Σ σ ς. ( Sigma) 23:31, 19 November 2013 (UTC)
Wanted to let this board know that User:DumbBOT has not performed any edits since November 23. Since this bot creates and transcludes WP:RFD daily subpages, I would hope that this can be resolved ASAP. I went ahead and posted this issue on the bot's owner Tizio's talk page. Hopefully, this gets resolved soon. Steel1943 ( talk) 08:11, 25 November 2013 (UTC)
As with all of Misza13's bots, User:Wikinews Importer Bot hasn't run since October 26th 2013. Is there a replacement in the works or another bot that could run this task? Nanonic ( talk) 01:40, 9 December 2013 (UTC)
I have been nominated for BAG membership. Input is invited. The request can be found at Wikipedia:Bot Approvals Group/nominations/Cyberpower678 2.— cyberpower Online Merry Christmas 14:24, 22 December 2013 (UTC)
A new "Draft" namespace has been configured on enwiki for suitable AfDs and new articles (voluntary except for IPs). Ids: Draft - 118, Draft talk - 119. See Wikipedia:Drafts for more details. -- Bamyers99 ( talk) 19:31, 24 December 2013 (UTC)
Just for the record: Pywikibot owners and developers may contribute their patches to the pywikibot framework without having git installed on their local computer, by using the Gerrit Patch Uploader tool. Have fun! @ xqt 15:46, 30 December 2013 (UTC)
The proposal was closed as having consensus to move all orphan tags to the talk namespace, including with a bot. Any bots or scripts that currently add {{ orphan}} to articles should be modified accordingly. Ramaksoud2000 ( Talk to me) 20:51, 19 December 2013 (UTC)
AWB is almost ready to deactivate orphan tagging for the English Wikipedia, and can also guarantee that AWB bots add orphan tags in the correct place on talk pages. Only some final minor changes remain. Moreover, AWB will discontinue automated orphan tagging/untagging in article space and won't auto-tag on the talk pages. -- Magioladitis ( talk) 23:46, 21 December 2013 (UTC)
Is there a list of all currently approved adminbots? WJBscribe (talk) 12:16, 6 January 2014 (UTC)
Hi, can someone recommend a bot (software) for mass creation of articles/categories? (It's not a task for the English Wikipedia, so I can't post to requests.) Thanks in advance. -- XXN ( talk) 22:32, 26 December 2013 (UTC)
«Contribs» 16:17, 3 January 2014 (UTC)
I've brought this to the attention of DeltaQuad. MM (Report findings) (Past espionage) 02:30, 18 January 2014 (UTC)
Issue is now resolved. DQ didn't say that he'd sorted it but his last edit was on the 19th to DQBot and it's now working properly. MM (Report findings) (Past espionage) 00:49, 21 January 2014 (UTC)
The new Flow extension is being deployed to enwiki today. It is being deployed to only two wikiproject pages as a test run to get real users trying out the new interface constructs so they can be tweaked or completely changed based on real world usage until we arrive at a discussion system that can serve the needs of wikipedians.
Because Flow is in such an early stage, with many things uncertain, the API modules it enables are a shim exposing the internals which is sufficient only for the existing ajax calls. These will change without notice, and I encourage you to not yet build out integrations with these APIs.
We have a regular integrated MediaWiki API in the works ( T59659 and others) which bots will be able to integrate with, we expect to have this merged and deployed well before expanding from our initial test runs in the wiki project space. Flow integrates with a number of MediaWiki constructs such as recent changes, watchlists, contributions, etc. Feel free to file bugs for anything those integrations might break that previously worked.
EBernhardson (WMF) ( talk) 17:41, 3 February 2014 (UTC)
Flow takes over the normal actions (?action=foo) on the page and implements its own; at the API level it implements its own actions, but it seems existing API actions succeed ( T62808). -- S Page (WMF) ( talk) 04:17, 4 February 2014 (UTC)
Using PyWikiBot, my program keeps on failing whenever a page contains non-ASCII characters. (Actually, it only fails when regex-searching the text or when outputting it to Command Prompt.) -- Ypnypn ( talk) 19:58, 6 February 2014 (UTC)
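The usual cure for this class of crash is the decode-early/encode-late pattern: keep text as unicode internally, and only encode, with a lossy fallback, at the console boundary. A Python 3 sketch of the output half — under Python 2 the same idea applies with u'' literals and sys.stdout.encoding; 'ascii' below stands in for a limited console codepage such as cp437:

```python
# Decode early, encode late: convert back to bytes only at the console
# boundary, and degrade gracefully when the console can't show a character.
def console_safe(text, encoding='ascii'):
    # 'ascii' is a stand-in for whatever limited codepage the console uses
    return text.encode(encoding, errors='replace').decode(encoding)

print(console_safe('Gal\u00e1pagos Islands'))  # Gal?pagos Islands
```

For the regex half of the problem: make sure both the pattern and the text being searched are unicode strings, not bytes, before calling re functions.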
Hi. Could someone please look at this? Things are quite dead, and this need to be done soon. A number of articles are partially updated, and so is the main {{ Infobox dam}}. It's a mess. There is consensus, and no objections. Can we get this going right away please? Reh man 10:21, 9 February 2014 (UTC)
When running a custom script with pywikibot, what's the best way to catch that the bot has been blocked, if you want it to do certain things only in that case (e.g. save a log to your hard drive instead of posting it to a user subpage on-wiki)? This is in regard to the compat release, as I had trouble getting the core release to work on my computer. I tried using a try/except block to catch pywikibot.UserBlocked (based on the help text at the top of wikipedia.py), but it didn't catch it when I blocked my bot, and I'm not sure what else to try. Any help here would be appreciated. (If anyone cares, this bot is for an external wiki, but this page seems to be the place where I'm most likely to get a reasonably quick reply. Hope you don't mind the quick question; I don't plan to make asking questions here a habit.) jcgoble3 ( talk) 06:53, 27 February 2014 (UTC)
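Not a pywikibot answer as such, but here is a sketch of the fallback pattern, with a stub exception standing in for pywikibot.UserBlocked so it runs without the framework. One thing worth checking in compat is that the block may only be detected at the page-write call, so the try/except needs to wrap the put()/save() call itself rather than the setup code:

```python
import os
import tempfile

class UserBlocked(Exception):
    """Stub for pywikibot's UserBlocked, so this sketch is self-contained."""

def put_page(text, bot_is_blocked):
    # Stand-in for page.put()/page.save(): in the real framework this call
    # is where UserBlocked gets raised, so this is what the try must wrap.
    if bot_is_blocked:
        raise UserBlocked('bot account is blocked')
    return 'saved on-wiki'

def save_or_log(text, bot_is_blocked, logfile):
    try:
        return put_page(text, bot_is_blocked)
    except UserBlocked:
        # Blocked: append the report to a local file instead of posting on-wiki.
        with open(logfile, 'a', encoding='utf-8') as f:
            f.write(text + '\n')
        return 'logged locally'

log = os.path.join(tempfile.gettempdir(), 'bot-fallback.log')
print(save_or_log('run report', False, log))  # saved on-wiki
print(save_or_log('run report', True, log))   # logged locally
```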
There isn't a problem for MiszaBot I or other bots to do archiving on template talk pages, is there? Harold O'Brian ( talk) 03:05, 4 March 2014 (UTC)
Cyberbot I and II are being migrated to new data centers in labs, and will be down for a bit. Bot ops using tool labs are encouraged to migrate at their earliest convenience.— cyberpower ChatAbsent 15:29, 5 March 2014 (UTC)
Is there a bot which finds non-free files (logos; covers for movies and singles) that have a specified license via template but lack information/source or a non-free use rationale, and automatically adds the missing rationale template? // XXN ( talk) 16:57, 6 March 2014 (UTC)
This notice is to inform the people who monitor this page that a topic they may be interested in has been brought up at Wikipedia:Administrators' noticeboard#User talk:Hasteur#HasteurBot being naughty?. — {{U| Technical 13}} ( t • e • c) 18:36, 18 March 2014 (UTC)
How can I create a new entry with Wikipedia? — Preceding unsigned comment added by Garrynewyork ( talk • contribs) 13:35, 4 April 2014 (UTC)
CBM implemented and ran VeblenBot and PeerReviewBot, but is retiring from Wikipedia. I am in occasional email contact with CBM who wrote:
"It would be a good idea to find a different person to run the bot jobs. With the WMF Tools setup, I can actually just hand them the entire bot as a turnkey, they would not need to re-implement it. If you can find someone, please ask them to email me (and you email me) and I will be able to communicate with them that way."
If you are interested in taking over these bots please reply here. They are usually pretty trouble free. My email and CBM's email are both enabled.
I do the monthly PR bot maintenance (making the files and categories) and that includes adding the new PR category each month on the VeblenBot account - I would be glad to keep doing that (and give details on email).
Thanks, Ruhrfisch ><>°° 13:50, 5 April 2014 (UTC)
contributions JV Smithy ( talk) 06:10, 12 April 2014 (UTC)
Can anything be done to restore the automatic updating of the Defcon score and level, which VoxelBot until recently was doing every half-hour? Lots of counter-vandalism workers will be looking at out-of-date information on the many displays based on {{Vandalism information}} that are fed by this process : Noyster (talk), 16:11, 3 April 2014 (UTC)
A request for comment on MediaWiki.org is seeking feedback on the possible deprecation of pywikibot/compat. If you're running that framework, you may be interested in the discussion. -- Ricordi samoa 23:52, 3 May 2014 (UTC)
Ever since late yesterday Wikipedia time, User:10.68.16.31 has been archiving discussions on assorted pages and labeling them as (BOT). I'm not sure, but I don't think bots should be doing work logged out, so I brought it here. I've got no idea which bot it is. Supernerd11 :D Firemind ^_^ Pokedex 03:27, 12 April 2014 (UTC) Sorry about being such a noob!
As of June 1st 2014, Cobi's BOT is still causing issues. He is not an active moderator.
User:Citation bot ( | talk | history | links | watch | logs) is a much-loved fixer of citation templates, but its creator/operator is busy IRL, and has been so for quite a while. His appeal for assistance or relief in maintenance and operation of that bot has gone largely unanswered.
After the switch to Lua for the CS1 templates, a substantial rewrite was done, but some nasty bugs are not yet dealt with. The op has barely been onwiki and hasn't touched the code in eight weeks. Even code reviews would be a big help, and if someone can find corrections, so much the better. I've tried, but my brain won't wrap itself around the language used (PHP).
The code is open, available at http://code.google.com/p/citation-bot/ for anyone considering helping out. LeadSongDog come howl! 15:31, 7 May 2014 (UTC)
Howdy all! Very long time no-really-do-anything-with-Wikipedia! For those (everyone? :)) who have no idea who I am, I "operate" HBC Archive Indexerbot, and ages ago helped out with some of the development of that bot and the HBC AIV Helperbots. Well, I've been horrifically inactive and unhelpful here on Wikipedia for ages now, and have decided it's finally time to see if someone else wants to take over the operation and maintenance of HBCAI, rather than it continuing to languish and get repeatedly ignored by me. This seemed the most logical place to suggest that, but please let me know if anyone thinks there's a better place to move the discussion.
HBCAI is written in Perl and designed to run on a UNIX/Linux-like system. I've been running him on FreeBSD, but he should work just as well on any *NIX really. I'd strongly recommend that whoever takes over this bot be reasonably well versed in both Perl and *nix administration, as getting it up and running on a new system may be a bit of an adventure, and I'm afraid I probably won't be much help due to time constraints. The bot uses considerable CPU and RAM (between 1 and 2 GB of RAM active during a run). I've been running it 2x/day, and each run lasts close to an hour, if memory serves.
The source is available on the wiki via the bot's user page, but I'll be happy to provide a bundle with the exact sources, including the Mediawiki.pm I'm using, as it's somewhat finicky about that sort of thing. To be honest I'm not even 100% sure it's working at all at the moment; it seems to break periodically when things change in the MediaWiki software. It's really not a bad bot to run, but I'm just so out of the loop on all things Wikipedia that it's too much effort to try to figure out what's up every time something breaks, and I'm terribly non-responsive on my talk page, and it makes me feel like a jerk. Plus the whole thing really could use a total rewrite, or at least some serious TLC, because it hasn't had much in the last five years or so.
I'll try to check in on the discussion here, if you want to volunteer specifically, though, please also drop me an e-mail via my user page so I know to check in and I'll try to be reasonably responsive! Thanks! — Krellis ( Talk) 00:21, 23 July 2012 (UTC)
lowercase, but my personal preference is mixedCase. Lego Kontribs TalkM 08:28, 24 July 2012 (UTC)
Hi - Could anyone here tell me what effect adding this protection tool had on our vandal bots/edit filter? Is it possible for bots/the edit filter to search the pending edit queue and reject a "desired addition" that is not yet reviewed and not yet added to an article? Youreally can 18:24, 26 July 2012 (UTC)
I would expect bot operation to be unaffected by pending changes. For me, the issue is that sometimes we would want the bot operation to be tweaked slightly. For example, if an anti-vandal bot reverts an edit and the version before the reverted edit was approved, then we want the bot to approve its new version. If the version before the vandal edit was not approved, then we would want the bot to leave its new version unapproved also. Yaris678 ( talk) 11:57, 30 July 2012 (UTC)
Hi all,
I've been playing with pywikipedia a little bit, and so far using it only to read and parse pages (which has been really quite useful for a number of things). I'd like to move towards using pywikipedia to make changes in an 'approved by human' way.
Now, it's trivial for me to, say, print out the original wikitext of a page/section and then print out the proposed new text and ask the user at the command line if they approve the change - but it would be much more useful/fancy if, when the pywikipedia script had an edit that it wanted to make, it opened up a browser window and gave a preview page that the editor could view. My question is: is that sort of functionality buried anywhere in the pywikipedia libraries? And if not, are there any approximations I could use? Fayedizard ( talk) 07:55, 12 August 2012 (UTC)
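Pywikipedia's compat codebase has a terminal diff (wikipedia.showDiff), but as far as I know nothing that opens a browser preview, so the standard library is a reasonable stopgap. The following is a sketch under that assumption: difflib renders a side-by-side HTML diff of the proposed edit and webbrowser opens it; preview_edit and approve are made-up helper names.

```python
import difflib
import tempfile
import webbrowser

def preview_edit(old_text, new_text):
    """Write a side-by-side HTML diff of the proposed edit; return its path."""
    html = difflib.HtmlDiff(wrapcolumn=80).make_file(
        old_text.splitlines(), new_text.splitlines(),
        fromdesc="current wikitext", todesc="proposed wikitext")
    with tempfile.NamedTemporaryFile("w", suffix=".html",
                                     delete=False) as f:
        f.write(html)
    return f.name

def approve(old_text, new_text):
    """Show the diff in the default browser, then ask on the command line."""
    webbrowser.open("file://" + preview_edit(old_text, new_text))
    return input("Save this edit? [y/N] ").lower().startswith("y")
```

A script would call approve(page.get(), new_text) and only save when it returns True.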
Is VIAFbot ( talk · contribs) an approved bot or in the process of getting approval? I see a lot of test edits from it today, and it looks like it may be a port from or otherwise related to de.wikipedia. I'm not an expert on bots; that's why I'm asking here before acting. — C.Fred ( talk) 21:28, 16 August 2012 (UTC)
Hi all,
I'm considering the idea of a bot that looks at 'See Also' sections of articles, and does things like remove elements if they are already in the main article (per the "As a general rule the "See also" section should not repeat links which appear in the article's body or its navigation boxes" part of Wikipedia:Manual_of_Style/Layout#See_also_section). I enjoy writing code and it would be quite nice to write a bit of python that works its way through a wikiproject and presents some edits to the bot runner for approval. I'm interested to know if a) this is reasonable functionality for a bot and b) if other bots already have this capability. Fayedizard ( talk) 11:48, 21 August 2012 (UTC)
I have blocked the seemingly unapproved Commons fair use upload bot ( talk · contribs), and opened a discussion at the incidents board. Input from those familiar with bots/bot policy would be appreciated. J Milburn ( talk) 15:31, 24 August 2012 (UTC)
Is there something wrong with the bot used for giving assessment updates in WikiProjects? WP:LT/A has not been updated for over a month. Also, manually accessing the bot through toolserver is apparently forbidden, according to this. Simply south.... .. flapping wings into buildings for just 6 years 15:40, 26 August 2012 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
I just received a newsletter that I never asked for The Olive Branch: A Dispute Resolution Newsletter (Issue #1), and whoever is behind it is forcing editors to add their names to Wikipedia:Dispute Resolution Improvement Project/NewsletterOptOut to stop being spammed. I am not involved in this process, I do not want to be involved, and I should not have to be put on some list of shame for something that some other editor thought was important. I can't believe that the process actually approved a newsletter that was opt-out instead of opt-in, that's crazy. ▫ JohnnyMrNinja 19:22, 4 September 2012 (UTC)
Seriously. Participating in a WP board is not an opt-in to be spammed. Not okay, and I'm surprised that there isn't anything in guidelines forbidding users from signing up others to receive spam without their permission. – Roscelese ( talk ⋅ contribs) 19:44, 4 September 2012 (UTC)
Seriously, Ocaasi, if you write an apology at the top of the newsletter and add an opt-in list there may still be time to save it. At this point most of the people haven't logged in yet. This is going to create a negative impression of the newsletter that will take a while to wear off, if it survives. People resent things forced on them, even if they would normally like it. Accept that there will not be another opt-out mailing and move on. ▫ JohnnyMrNinja 21:02, 4 September 2012 (UTC)
As I just posted at EdwardsBot, I'd really like to see an RFC or some other similar process implemented to gauge consensus about when it's appropriate (or inappropriate) to send messages like this. And when messages should be opt-in, opt-out, or otherwise. And how often it's appropriate to send out messages.
These are important questions and I'm perfectly happy to see the bot blocked or its access list wiped clean if these issues can't be resolved reasonably. I think there is some utility in having such a message delivery mechanism, but it shouldn't be causing so many editors to be annoyed. -- MZMcBride ( talk) 01:31, 5 September 2012 (UTC)
P.S. And, of course, the name of the bot's configuration page (" User:EdwardsBot/Spam") is completely tongue-in-cheek.
I posted a legitimate question about a bot. [3] My question was ignored. This bot does not need to create stubs any faster. It creates 100 stubs at a time, and each stub is supposed to be verified by a member of the project; unless the project suddenly gained a dozen new really fast snail editors, there is no reason for this bot to increase its stub-creation output.
Or if there was a reason, it was sure not readily available in answer to a question by a community member.
What is going on? Why did this bot have to be approved to create stubs at 5X its current rate? How is Wikiproject:Gastropod handling the approval of these bot-created stubs? The community has spoken a number of times about bot-created stubs, and not usually favorably. This bot operator has "misinterpreted" prior approvals to mean something entirely different from what was intended. This is not a bot and operator to be speedy approved when a community member has asked a question.
I would like the discussion re-opened, and the question answered. I don't care how old the operator is.
68.107.140.60 ( talk) 03:14, 9 September 2012 (UTC)
As usual, a Wikipedian quotes an essay he has not read and that does not apply to the situation. Thanks. Much appreciated. My focus is on the task, as done by this bot operator, who does not seem to think the rules apply to him. So, fresh start: let's get an explanation of how I can tell whether the bot owner is in compliance, when it appears that he thinks 100 means some other number, 141; or I may be wrong, but I was not allowed to discuss the situation because you heedlessly speedied the closure of the BRFA. Let's also get an exact explanation of what the bottleneck is, because if it is the slow speed of the bot, something is not right, although with your 11 years of experience, I wonder that you think one can cure a badly written slow program by giving it more to do. Maybe it has something to do with the count variable, and that would explain the 141/100 computer approximation.... 68.107.140.60 ( talk) 07:02, 9 September 2012 (UTC)
Note: There is a lot of history being brought up here. On one side, Ganeshbot was approved in Ganeshbot 4 to create about 580 articles for the genus Conus, limited to 100 per month. In Wikipedia talk:Bot Approvals Group/Archive 7#Wrong way of the close a BRFA, the 100-per-month limit was lifted. Somehow or other, the members of WikiProject Gastropods thought they were allowed to have the bot create 15000+ articles for other gastropods without further approval or any rate limiting. This, understandably, caused much consternation. Ganeshbot 5, asking permission to finish creating these tens of thousands of articles, eventually was denied for no consensus; Ganeshbot 10 was eventually approved with the rate of creation limited to the rate of review by the project.
On the other side, the IP user 68.107.140.60 seems to be the same user who has been around for the anybot mess (see pretty much all of Wikipedia talk:Bots/Requests for approval/Archive 4) and other related discussions. If it is the same user, he/she serves a valuable function in watching bot activity and approvals generally related to "species" articles with a critical eye, but this is counterbalanced by the user being extremely sensitive to perceived slights against IP editors and being generally quick (to the point of disruption) to throw around accusations of being ignored or suppressed, of being on the receiving end of incivility, and of editors in "power" being biased against their viewpoint. Anomie ⚔ 08:38, 9 September 2012 (UTC)
This is just about an understanding of this bot operator's request. Let's not make it about me or anything else. - 68.107.140.60 ( talk) 17:42, 9 September 2012 (UTC)
I don't have enough coding experience to make a bot, but I was wondering if someone could make the code for a bot that automatically posts a message on a person's talk page after they make one edit. This is for a different wiki where I have been asked to make a bot that does that. ad Intellige ad nuntius 02:26, 16 September 2012 (UTC)
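For anyone picking this request up: the core of such a bot is a poll of the wiki's recent changes plus a one-time talk-page post per new user. The sketch below is a hedged illustration with the HTTP layer injected as api(params) — in real use that callable would GET the target wiki's /w/api.php (via pywikibot, urllib, etc.) — and every function name here is made up.

```python
def greet_first_edits(api, seen, make_message):
    """Post make_message(user) on the talk page of each first-time editor.

    api(params) -> decoded JSON dict from the wiki's API (injected so the
    logic can run without a live wiki); seen is the persistent set of
    users already greeted.
    """
    changes = api({"action": "query", "list": "recentchanges",
                   "rcprop": "user", "format": "json"})
    posted = []
    for rc in changes["query"]["recentchanges"]:
        user = rc["user"]
        if user in seen:
            continue  # already greeted (or edited before)
        seen.add(user)
        api({"action": "edit", "title": "User talk:" + user,
             "appendtext": make_message(user), "format": "json"})
        posted.append(user)
    return posted
```

A real deployment would persist the seen set between runs and run the function on a timer or from a recent-changes feed.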
Hello everyone-
I wanted to let you know that my dissertation, "Network of Knowledge: Wikipedia as a Sociotechnical System of Intelligence" is now available on my website with a CC BY-NC-SA 3.0 license. Over a year ago I began this project with the WMF Research Committee and the University of Oregon IRB's approval. Nearly 50 bot operators, WP contributors, and WMF administrators were kind enough to participate in the study, offering their time, opinions, and expertise on issues around bots and bot creation. Feel free to download the document or peruse it online, and I look forward to your comments either on the site or via email.
The manuscript is a bit long (~320 pages) and includes some standard dissertation sections (literature review, methods chapter, etc.). Interviewee contributions are featured most in Chapters 5 and 6 (if you want to skip to the good stuff).
I am at a new institution now and will be going through a new IRB approval process to continue this research, but I do indeed want to continue chatting with the bot and semi-automated tool community. Please let me know if you're interested in connecting this fall, and thank you so much to those who have already participated!
Randall Livingstone UOJComm ( talk) 23:55, 20 September 2012 (UTC)
Why is there no "this article was created by a bot" tag on the article? And, no, I don't give a dang about the edit history. Was this a decision made, or has it never been discussed? 68.107.140.60 ( talk) 01:56, 16 September 2012 (UTC)
Well, you've caught my curiosity. To my knowledge, I've never seen an article that was created by a bot, though it's easy to find bot edits to articles created by humans. Please post some links to some typical bot-created articles so I can see what the discussion is all about. And I mean actual articles, not pages like Wikipedia:Requested moves/Current discussions that the bot I operate writes. Thanks, Wbm1058 ( talk) 18:28, 19 September 2012 (UTC)
I have temporarily shut down my bot due to internal malfunctions. I was worried the malfunctions could begin manifesting themselves on Wikipedia.— cyberpower ChatOffline(Now using HTML5) 12:39, 23 September 2012 (UTC)
Amidst all the Toolserver/Labs turmoil, I acquired a 2008 Mac mini, which I then installed Linux upon. It now runs fairly speedily 24/7 (assuming there isn't a power outage), and, as I have no real use for it, I am offering it up for grabs (just send me scripts) to any disgruntled bot operator(s) who'd like me to run their programs on it. Just an FYI - please drop me a message if you're interested - I can send you system details/whatever else you'd like. Just trying to do my part! Theo polisme 07:40, 30 September 2012 (UTC)
I noticed that some closed threads at NFCR haven't been archived yet. Looking at Special:Contributions/ClueBot_III shows that it has made no edits since 7 October. Will the bot be revived? -- Toshio Yamaguchi ( tlk− ctb) 07:14, 13 October 2012 (UTC)
I have an issue with Yobot. In particular, the bot is making too many changes at once and often with edit summaries that are not specific enough or simply tangential to the actual edits made. My initial comment on the matter started with this conversation which continued here. I think User:Magioladitis has put little weight on my comments, has not tried very hard to understand my point, and has been slow to respond, so I bring the matter here. After our exchange, my main contention is that Yobot's edit summaries are not specific enough. This makes checking its changes difficult and annoying for watchers. In particular, Yobot can sort ref tags while still making other minor changes. This can change large blocks of text in a single article while the edit summary might only refer to some minor edit sprinkled in among those changes. Jason Quinn ( talk) 16:43, 15 October 2012 (UTC)
Comment I would additionally add that it's good advice that bots should do one thing and do it well and that Yobot seems to tackle too many tasks at a time. Jason Quinn ( talk) 16:52, 15 October 2012 (UTC)
My current edit summary is of the following form: "[[WP:CHECKWIKI]] error #xx fix and [[WP:GENFIXES|general fixes]]", followed by a short text that explains which rule I apply for the specific CHECKWIKI fix. The remaining edits are explained in WP:GENFIXES. Last week I worked mainly in the direction of creating skip conditions for each CHECKWIKI error fix. Instead of loading all lists together, I now work on each list separately. I am open to ideas for making my edit summaries smarter without having to generate a very lengthy edit summary. -- Magioladitis ( talk) 18:41, 15 October 2012 (UTC)
Sorry if I'm about to defame a hardworking user, and an administrator at that, but I noticed something a bit strange on my watchlist today - two identical edits by User:Redrose64 on completely unrelated pages on a fairly esoteric technical point - here and here. Looking at the user's edit history reveals large numbers of edits to various pages identical to those I encountered. Looking further back there are similar instances of these editing patterns on various different technical points, sometimes at a rate of several per minute and accounting for hundreds of identically described edits. I can make some fast edits sometimes, and I'm not averse to hard work, but nothing approaching this...
For all I know this may be normal AWB activity, I really am no expert. I'm sure this is a hard working administrator doing a good job - but it just seemed a little odd so I thought I'd flag this up to some people who are more knowledgeable than I am! MatthewHaywood ( talk) 00:14, 17 October 2012 (UTC)
|day=, |accessed= and |access-date=. -- Redrose64 ( talk) 10:43, 17 October 2012 (UTC)
I am going to step down as the maintainer of the WP 1.0 bot at the end of November. A new maintainer for the bot is needed. More information can be found here. — Carl ( CBM · talk) 17:10, 28 October 2012 (UTC)
Edits by User:Hammocks and Honey that say db-a9|bot=ExampleBot and put the A9 csd incorrectly on articles where the artist is bluelinked. I'm reverting from the earliest on the list, but I can't see how to stop it. I've messaged Hammocks and Honey. Peridon ( talk) 17:41, 8 November 2012 (UTC)
The RM bot, User:RMCD bot aka User:RM bot has stopped running. Can it be restarted or is yet another fork needed? Apteva ( talk) 16:12, 25 November 2012 (UTC)
Hi all,
A few months ago SuggestBot got approval to update the Community Portal's list of open tasks ( Wikipedia:Community portal/Opentask, BRFA Wikipedia:Bots/Requests for approval/SuggestBot 7)). I'm now interested in having it update a smaller list of tasks ( Template:Opentask-short) for experiments done in the Onboarding new Wikipedians project.
Would it be necessary to do a separate BRFA for this? Maybe it would instead be possible to refer to the previous BRFA and that the bot serves the same purpose? Would appreciate some input on this. Cheers, Nettrom ( talk) 18:38, 20 November 2012 (UTC)
Automating tasks on Wikipedia
Uploading hundreds of files or changing thousands of pages can be tedious. We allow limited automation unless it interferes with normal systems operations. You always can grab your favorite scripting language and write a bot, but there's no need to reinvent the wheel: take a look at PyWikipediaBot, a quite complex automation framework for Wikipedia. If you are more into Perl, libwww-perl is a very useful library for automating web tasks. If you have tested your bot and intend to run it over a longer period of time, please get in touch with the developers first (preferably using the wikitech-l mailing list) or by requesting a flag here. We then can register your bot, so it can be hidden from the list of recent changes.
---cut here---
The above is being put out as the "tip of the day"; it looks like it was written in 2005. Somebotty might like to find and update the tip.
Rich Farmbrough, 02:59, 16 December 2012 (UTC).
There are some threads that may be of interest to bot operators on the mediawiki-api mailing list. In summary:
Feel free to join the mailing list and the discussions (you can sign up for gmail or another free email service if you don't want to reveal your personal address), or I'll try to summarize replies posted here at some point. Anomie ⚔ 14:25, 22 December 2012 (UTC)
Personally, while I think the current system could be cleaned up somewhat, I don't much care for removing all option from the client in how continuation is processed. Anomie ⚔ 14:25, 22 December 2012 (UTC)
Personally, I'm particularly interested in pros and cons for introducing versioning at all, which the original proposal seems to have assumed as a given. Anomie ⚔ 14:25, 22 December 2012 (UTC)
Hi; could somebody who understands these things please check which bots are actually fixing double redirects at the moment? The only one I have personally recently witnessed working is AvocatoBot ( task list · contribs) – but, in any case, could someone do a more thorough/scientific check and update the relevant pages as appropriate? Thanks It Is Me Here t / c 14:23, 14 December 2012 (UTC)
When I discovered the problem, I inserted dates myself. Someone else is doing it as well. I can't be here every day.— Vchimpanzee · talk · contributions · 15:39, 26 December 2012 (UTC)
Is it possible to make a list of the test Wikipedias in the Incubator that is renewed by a bot? An attempt on the Russian Wiki: ru:Википедия:Список Википедий в инкубаторе. Related discussions: ru:Обсуждение Википедии:Список Википедий в инкубаторе, ru:Википедия:Форум ботоводов#Википедия:Список Википедий в инкубаторе -- Kaiyr ( talk) 14:53, 27 December 2012 (UTC)
I am currently applying to be a member of BAG and input is greatly appreciated.— cyberpower Offline Happy 2013 13:31, 2 January 2013 (UTC)
Hi folks, I'd like to expand how I'm contributing to WP by trying to write some bots. The first idea is simple:
I'm sure an experienced bot-writer could knock this out in an hour, but I want to do it myself. Can anybody point me to the source code of a simple Python-based bot that might be a useful shell for me to build such a bot? Appreciate it...
Zad 68 03:50, 3 January 2013 (UTC)
Any comments would be appreciated ·Add§hore· Talk To Me! 16:16, 10 January 2013 (UTC)
Is there a way to get the edit count of an IP account? Although the mw:API:Users example includes an IP, it doesn't seem to work. NE Ent 22:27, 10 January 2013 (UTC)
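One workaround, since the users list really does omit editcount for IP addresses: page through list=usercontribs for the address and count the rows. In this sketch the HTTP layer is injected as a fetch(params) callable (in practice it would GET /w/api.php with those parameters) so the pagination logic stands on its own; it uses the old query-continue continuation style that was current at the time.

```python
def count_ip_edits(ip, fetch):
    """Count an IP's edits by paging through list=usercontribs.

    fetch(params) -> decoded JSON dict from api.php; injected here so
    the counting/continuation logic can be shown without network access.
    """
    params = {"action": "query", "list": "usercontribs",
              "ucuser": ip, "uclimit": "500", "format": "json"}
    total = 0
    while True:
        data = fetch(dict(params))
        total += len(data["query"]["usercontribs"])
        # Old-style continuation: keep going until query-continue is absent.
        cont = data.get("query-continue", {}).get("usercontribs")
        if cont is None:
            return total
        params.update(cont)
```

Note this counts live (undeleted) contributions only, which is also what Special:Contributions shows.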
I am currently (self) nominated to become a member of BAG (Bot Approvals Group). Any questions and input you may have are invited with open arms here. ·Add§hore· Talk To Me! 21:39, 16 January 2013 (UTC)
There is a discussion on the administrators' noticeboard regarding a new template designed for alerting bot operators that their bot has been blocked: Wikipedia:Administrators' noticeboard/Archive244#Blocking misbehaving bots. Feedback from bot operators is welcome. 28bytes ( talk) 20:31, 18 January 2013 (UTC)
I'd like to introduce LinqToWiki: a new library for accessing the MediaWiki API from .Net languages (e.g. C#). Its main advantage is that it knows the API and is strongly-typed, which means autocompletion works on API modules, module parameters and result properties and correctness is checked at compile time. Any comments are welcome. User<Svick>. Talk() ; 17:56, 17 February 2013 (UTC)
I wrote a script to search templates with many parser functions that are worth to convert to Lua now: mw:Special:Code/pywikipedia/11099. Bináris ( talk) 06:17, 21 February 2013 (UTC)
Due to a bug, I have temporarily shut down adminstats.— cyberpower ChatOnline 13:38, 21 February 2013 (UTC)
Does anybody have Python code to parse an article and extract all the refs, maybe even parse through a few of the more popular template:cites (cite book, cite journal, etc.)? I'd like to develop a number of little utilities that manipulate refs and cite info, but wanted to see if someone has already laid this sort of groundwork. Ideally it'd be a library function that took a page as input and returned a list of data structures containing the ref and cite info, bonus points if the library had an API to manipulate the data and apply it back to the page for writing. I scanned through a bunch of the existing code I was able to find but didn't come across anything. Any help appreciated, cheers!
Zad 68 15:07, 27 February 2013 (UTC)
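In case it helps anyone arriving here later: mwparserfromhell is the purpose-built library for this kind of template parsing and is worth a look first. Failing that, here is a rough stdlib-only sketch of the extraction; the function name and regexes are my own, and they will miss edge cases such as self-closing or nested refs.

```python
import re

# Inner text of paired <ref ...>...</ref> tags (self-closing refs skipped).
REF_RE = re.compile(r"<ref[^>/]*>(.*?)</ref>", re.DOTALL | re.IGNORECASE)
# |name=value template parameters.
PARAM_RE = re.compile(r"\|\s*([\w-]+)\s*=\s*([^|}]*)")

def extract_cites(wikitext):
    """Return a dict of template parameters for each cite-template <ref>."""
    cites = []
    for ref in REF_RE.findall(wikitext):
        m = re.match(r"\s*\{\{\s*(cite \w+)", ref, re.IGNORECASE)
        if not m:
            continue  # a ref that isn't a {{cite ...}} template
        params = {k.strip(): v.strip() for k, v in PARAM_RE.findall(ref)}
        cites.append({"template": m.group(1).lower(), "params": params})
    return cites
```

Going the other way (manipulating the data and writing it back) is exactly where a real parser library earns its keep, since regex round-tripping tends to mangle whitespace and nested templates.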
As some of you may know, Wikidata interwiki links went live on the Hungarian Wikipedia today. Many editors started removing interwiki links en masse. However, it was soon realized that the interwiki bots were still running, and they started readding links. I assume that we want to prevent this from happening when Wikidata is turned on for the English Wikipedia... -- Rs chen 7754 21:30, 14 January 2013 (UTC)
[5] goes into more detail. Speaking of which, if you have an interwiki bot that runs on the Hungarian Wikipedia, we would appreciate it if you changed your code... -- Rs chen 7754 22:45, 14 January 2013 (UTC)
(edit conflict) Hi everybody, I wrote to pywiki developers to update interwiki bots. (I am a developer myself, but I have never worked with interwiki.py.) I don't think 126 bot owners should be notified; if a developer updates the code, they will be responsible for updating their bots in a reasonable time. Let's see what happens. There are more phases: Hungarian Wikipedia will be followed by Hebrew and Italian in the second step, and English in the next phase, and finally the remaining wikis. Link FA templates will still be handled by interwiki bots for a while until Wikidata integrates them. It's easier to follow these happenings in the pywiki code, I suppose. Cheers, Bináris ( talk) 22:50, 14 January 2013 (UTC)
A temporary solution might be to use the edit filter to block (disallow) either bot changes to interwikis or (and I would think this is more efficient) block bot edits with an interwiki.py edit summary. Of course, 1% of those edits are still probably going to be good ones; but in any case it would generate a handy list of current interwiki bots and give owners some breathing space if they forget to convert. - Jarry1250 Deliberation needed 09:55, 16 January 2013 (UTC)
Just another FYI since this wasn't brought up - the bots will still need to add the FA/GA stars when the article is FA/GA on other Wikipedias. Unfortunately that's not in Wikidata yet. -- Rs chen 7754 10:03, 16 January 2013 (UTC)
Wikipedia:Wikidata interwiki RFC has been started. -- Rs chen 7754 09:33, 17 January 2013 (UTC)
The date is February 11th: http://blog.wikimedia.de/2013/01/30/wikidata-coming-to-the-next-two-wikipedias/ -- Rs chen 7754 20:11, 30 January 2013 (UTC)
So, Wikidata goes live on English Wikipedia in two days. How can we contact all the bot owners to let them know that their bots will no longer be needed to update the interwiki links in the same way as they have done? It would be great to try for a nice clean transition where interwiki code can begin to slowly be removed from en.wiki. Bot owners should try and direct their bots towards organising interwikis on Wikidata instead. Del♉sion23 (talk) 01:02, 9 February 2013 (UTC)
And we're live now! -- Rs chen 7754 21:15, 13 February 2013 (UTC)
I notice that Rubinbot is still editing, that its bot approval request lists it as being based on pywikipedia, and Legoktm's argument that pywikipedia-based bots that don't update themselves and continue re-adding interwiki links should be blocked because of the listed policy. This is not an area I'm experienced in. Is my understanding correct here? And if so, is blocking premature? No criticism intended of the author, I just suspect that at some point people will want to stop seeing back-and-forth interwiki adding/removing. (Don't panic, I have absolutely no intention of taking action without some discussion.) -- j⚛e decker talk 18:42, 15 February 2013 (UTC)
Can someone remind me where the test wiki is? I'm updating my bot framework and would like to try out a new function I just created.— cyberpower ChatOnline 03:29, 6 March 2013 (UTC)
The simple testcase at User:Chartbot/simplified dies with an HTTP 517 error as it is, but works fine if I try to send 634 bytes instead of the 635 that the testcase uses. It seems that PHP is sending an Expect: 100-continue header when the request gets too long, and that's getting refused. Anyone know either how to get PHP to stop sending the header, or how to get the Wikimedia server to not get cranky about it, without me having to learn yet another PHP library? All the parts of the bot that I thought would be hard now work, and it's frustrating to be stumped with my toe on the finish line.— Kww( talk) 06:30, 6 March 2013 (UTC)
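For what it's worth, the usual workaround for this class of problem is to send an explicit empty Expect header so the client never advertises 100-continue for large request bodies (in PHP/cURL that means adding an empty "Expect:" entry to the header list). A minimal sketch of the idea in Python — the build_headers helper is hypothetical, not Chartbot's actual code:

```python
# Sketch, not Chartbot's actual code: an explicitly empty "Expect" header
# overrides the automatic "Expect: 100-continue" that some HTTP clients
# add once the request body crosses a size threshold.
def build_headers(body):
    """Build request headers that suppress 100-continue negotiation."""
    headers = {"Content-Length": str(len(body))}
    headers["Expect"] = ""  # empty value prevents the automatic header
    return headers

headers = build_headers(b"x" * 635)  # the 635-byte request that failed
```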
I'm getting started with a little bot coding. My question isn't necessarily even Wikipedia-specific, but here it is:
Given a particular identifier, a "PMID" (looks like "9736873") which identifies a journal article, I'm trying to go to the NIH's PubMed site and pull an XML file full of metadata about the article. A typical lookup URL looks like: http://www.ncbi.nlm.nih.gov/pubmed/9736873?report=xml&format=text
. In my browser I get back something that looks like:
<PubmedArticle> <MedlineCitation Owner="NLM" Status="MEDLINE"> <PMID Version="1">9736873</PMID> <DateCreated> <Year>1998</Year> <Month>10</Month> <Day>01</Day> </DateCreated> <DateCompleted> <Year>1998</Year>
..etc. I want to use BeautifulSoup to parse the resulting XML and pull out particular fields. My code looks like:
    if param.name == 'pmid' and param.value:
        pmid = str(param.value).strip()
        url = "http://www.ncbi.nlm.nih.gov/pubmed/" + pmid + \
              "?report=xml&format=text"
        f = urllib.urlopen(url)
        xml = f.read()
        f.close()
        soup = BeautifulSoup(xml)
My problem is that the BeautifulSoup parse doesn't work because what I'm actually getting back from PubMed looks like:
<?xml version="1.0" encoding="utf-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <pre> <PubmedArticle> <MedlineCitation Owner="NLM" Status="MEDLINE"> <PMID Version="1">9736873</PMID> <DateCreated> <Year>1998</Year>
so instead of < and > characters I'm getting the entities &lt; and &gt;, and BeautifulSoup isn't parsing it. If I copy the text out of the browser and paste it into a file, and read from the file, it works.
I tried using the requests
library instead of urllib and got the same result. What am I doing wrong? Cheers...
Zad68 19:10, 7 March 2013 (UTC)
    soup = BeautifulSoup(xml.replace('&lt;', '<').replace('&gt;', '>'))
Zad68 19:41, 7 March 2013 (UTC)
    xml = html_parser.unescape(xml_in_html)
and that worked for a few but then bombed on something. Gee what a pain... I'm just going to go with my manual decoding! Thanks for the note that PubMed is "lying" about delivering XML, appreciate it.
Zad68 19:56, 7 March 2013 (UTC)
    File "/usr/lib/python2.7/HTMLParser.py", line 472, in unescape
        return re.sub(r"&(#?[xX]?(?:[0-9a-fA-F]+|\w{1,8}));", replaceEntities, s)
    File "/usr/lib/python2.7/re.py", line 151, in sub
        return _compile(pattern, flags).sub(repl, string, count)
    UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 1: ordinal not in range(128)
The text it bombs on is M\xc3\xb6ls\xc3\xa4. If I pre-process what PubMed gives me and remove that last name with html_parser.unescape(xml_in_html.replace('M\xc3\xb6ls\xc3\xa4', ' ')) it works. Any insight appreciated!
Zad68 14:15, 8 March 2013 (UTC)
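The traceback above is the classic Python 2 str/unicode trap: M\xc3\xb6ls\xc3\xa4 is the UTF-8 encoding of Mölsä, and unescape() internally substitutes unicode entities into what is still a byte string, triggering an implicit ASCII decode. Decoding the response to unicode before unescaping should fix it without stripping names — a sketch (written so it runs on both Python 2 and 3):

```python
# Sketch: decode bytes to text *before* HTML-unescaping, so the parser
# never mixes byte strings and unicode (the Python 2 pitfall above).
try:  # Python 2
    from HTMLParser import HTMLParser
    unescape = HTMLParser().unescape
except ImportError:  # Python 3
    from html import unescape

raw = b'M\xc3\xb6ls\xc3\xa4 &lt;PMID&gt;9736873&lt;/PMID&gt;'  # bytes as PubMed sends them
text = raw.decode('utf-8')   # now proper text, including the ö and ä
unescaped = unescape(text)   # entities become real < and > characters
```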
All botops should be aware that Wikipedia just experienced a server failure that affected all bots. If your bot just edited in a strange and unusual manner, it may not have been the bot's fault.— cyberpower ChatOnline 00:49, 13 March 2013 (UTC)
Hey folks, I'd like to modify my bot (coded in Python) to pull the wikicode for an article, extract all the <ref>...</ref> definitions, put them in a list, and do whatever I want with them. First use will be to create a table of refs for me to review in doing GA or FA reviews. The bot already uses the mwparserfromhell library, and with it I can parse out templates nicely, but now I'm trying to use it to parse out ref tags and having trouble; they appear to be treated as text by the library:
    >>> import mwparserfromhell
    >>> text = "I has a template!<ref>{{foo|bar|baz|eggs=spam}}</ref> See it?"
    >>> wikicode = mwparserfromhell.parse(text)
    >>> wikicode.filter_templates()
    [u'{{foo|bar|baz|eggs=spam}}']
    >>> wikicode.filter_tags()
    []
    >>> wikicode.filter_text()
    [u'I has a template!<ref>', u'</ref> See it?']
Any idea what I'm doing wrong? Cheers...
Zad68 15:33, 13 March 2013 (UTC)
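Tag parsing only landed in a later mwparserfromhell release, so upgrading the library may make filter_tags() work; in the meantime a plain regex is enough to pull out ref bodies. A sketch, assuming refs are well-formed and not nested (which is true for ordinary citations):

```python
import re

# Workaround sketch while the installed mwparserfromhell treats tags as
# plain text: extract <ref>...</ref> bodies with a regex. Self-closing
# refs like <ref name="x"/> have no body and are deliberately skipped.
REF_RE = re.compile(r'<ref[^>/]*>(.*?)</ref>', re.DOTALL | re.IGNORECASE)

text = 'I has a template!<ref>{{foo|bar|baz|eggs=spam}}</ref> See it?'
refs = REF_RE.findall(text)
```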
It is imperative that any out-of-the-norm edits from now on be reported to me. I am about to load an update to the edit function of the framework into my bot, which will affect all tasks. I will of course be watching myself, but I may miss something as I recode Peachy.— cyberpower ChatOnline 23:22, 14 March 2013 (UTC)
Can we routinely ask users to post the function overview even when the request is to take over another bot's task? There is no standard way that bot tasks are linked, as far as I can tell, and when it is another bot's task it takes me forever to search for it, sometimes with no result.
Rather than leaving a trail of breadcrumbs/links, it would be simpler to just include a sentence in the function overview about what the bot does.
Thanks, - 68.99.89.234 ( talk) 11:43, 10 March 2013 (UTC)
Howdy. After a discussion at ANI, I'm hoping that somebody here might be able to help. Over the last few weeks, HBC AIV helperbot5 ( talk · contribs) and HBC AIV helperbot7 ( talk · contribs) have been intermittently disappearing from WP:AIV for anywhere from a few hours to nearly a day. The bots appear to be functioning normally at WP:UAA - or at least more normally - so it's not as if the bots are just dead. Because AIV maintenance is a fairly important function and I'm not sure if the bots' operators are around at the moment, I was wondering if anybody here had any idea what the problem might be? Thanks. -- Bongwarrior ( talk) 01:33, 22 March 2013 (UTC)
FYI, DASHBot has been blocked as it is malfunctioning; Tim is aware. As soon as the code is fixed I'll unblock. Regards, Giant Snowman 20:05, 23 March 2013 (UTC)
From now on you may use two new exceptions in Pywikipedia (r11300). When replacing texts with replace.py/fixes, you may add to 'inside-tags' section of exceptions:
Cheers, Bináris ( talk) 09:38, 30 March 2013 (UTC)
See discussion at User talk:cyberpower678#Buggy bot, and feel free to jump in and help if you're familiar with the framework that cyberpower is using. ‑Scottywong | speak _ 22:29, 3 April 2013 (UTC)
As described here, Citation bot ( BRFA · contribs · actions log · block log · flag log · user rights) needs an update to be fully compatible with the new Lua-based citations when there are more than 9 authors or more than 4 editors. However, the bot's original author doesn't seem to have much time for it anymore [6]. He invites others to improve the source though, and I was hoping that someone else might feel inclined to work on the needed improvements. Dragons flight ( talk) 22:52, 3 April 2013 (UTC)
In addition to the core Framework being updated, and bug fixes being implemented on the currently shutdown tasks. Cyberbots I and II will be slowly migrating to labs. Services such as RfX reporter, tally, cratstats, and adminstats that many of you use may get cut for a bit. Also because I am unfamiliar with labs at the moment, I have absolutely no idea how adminstats will run on labs. If there are any errors or out of the norm edits in the coming future, please let me know.— cyberpower ChatOffline 15:17, 10 April 2013 (UTC)
Could someone please check Special:Contributions/EdinBot - I think it may be operating on the wrong wiki. Thanks! GoingBatty ( talk) 01:58, 15 April 2013 (UTC)
See Special:Contributions/185.15.59.211. Of course, I have no idea which one it is. What can/should be done when a bot is editing logged out? Thanks. Someguy1221 ( talk) 03:26, 20 April 2013 (UTC)
We have an existing consensus to softblock the toolserver IP addresses, for this very reason. I guess from the above that they have new IP addresses now; that would be 185.15.59.192/27, judging by whois data? Which is part of 185.15.56.0/22 that is assigned to WMF. And then there's 2620:0:860::/46 in IPv6 assigned to WMF too, of which we currently have 2620:0:862:101::2:0/124 blocked as toolserver.
We should probably also do the same for Labs projects; it seems edits from Tool Labs will currently come from 10.64.0.126, and I see past edits from 10.64.0.123, 10.64.0.169, and 10.64.0.170, but the full range of possible IPs is not obvious (random guess: 10.64.0.123–170, plus a few others in 10.64.0.0/12, plus some in 2620:0:861:101:10::/80 if IPv6 happens to be used). OTOH, should logged-out edits be coming from anything in 10.0.0.0/8 at all? Anomie ⚔ 03:35, 21 April 2013 (UTC)
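Range arithmetic like the above is easy to get wrong by hand; Python's ipaddress module (standard library since 3.3, available as a backport for 2.x) can verify which addresses fall inside a candidate block. A quick sketch against the addresses mentioned above:

```python
import ipaddress

# Sketch: confirm the observed logged-out addresses fall inside the
# ranges discussed above before soft-blocking anything.
labs_v4 = ipaddress.ip_network(u"185.15.59.192/27")   # WMF allocation, per whois
internal = ipaddress.ip_network(u"10.64.0.0/12")      # Tool Labs internal guess

in_labs = ipaddress.ip_address(u"185.15.59.211") in labs_v4
in_internal = ipaddress.ip_address(u"10.64.0.126") in internal
```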
Cyberbot II is now running on labs. I am now migrating Cyberbot I.— cyberpower ChatOffline 18:47, 24 April 2013 (UTC)
Hello Bot owners. It seems that User:Tedder has recently had difficulty with TedderBot. The bot has been out of commission for nearly 2 months. This lack of service is a great loss for the staff of many WikiProjects, who usually rely on that bot's listings of new articles in their subject areas. Is it possible for other people to help Tedder in some way in order to get TedderBot up and running properly? Perhaps I should apologize for going behind Tedder's back in this way, but I believe my motivation is good. Many thanks, Invertzoo ( talk) 00:57, 2 May 2013 (UTC)
I left a message about this on the bot owner's talk page, as well as the MfD talk page. It seems that since January 2013, One bot has been doing a great job removing closed discussions from Wikipedia:Miscellany for deletion, but has not been archiving any of the discussions, as can be seen in the bot's edit log. Steel1943 ( talk) 07:13, 23 April 2013 (UTC)
Does anything need to be done about User:Riley Huntley's approved bots? He retired under a cloud due to sharing his password, a single incident, but I think this should raise concerns about flagged bots and approved tasks. - 68.107.137.178 ( talk) 15:19, 1 May 2013 (UTC)
Actually, I also noticed there were a lot of such unsubstituted templates, plus as I mentioned in my BRFA, I was just extending a task my bot already does on Commons and Wikidata. Hazard-SJ ✈ 02:42, 3 May 2013 (UTC)
@ Addshore: I talked to MBisanz back when Riley first announced his retirement (Apr 29th) and the idea is just that, if his bot dies, I'll file a takeover request (Riley emailed me all of his code). Like others said above, though, until it dies, we don't have an issue. Theopolisme ( talk) 11:07, 14 May 2013 (UTC)
Went ahead and filed a BRFA here. There's no real hurry, but, per the above arguments, "better safe than sorry." Theopolisme ( talk) 14:05, 18 May 2013 (UTC)
Please go to WP:VPR and offer your opinions in the "Remove bot flag from inactive bots" thread. Nyttend ( talk) 00:07, 23 May 2013 (UTC)
Just to let everyone know that I have blocked User:EmausBot for incorrectly removing interwiki links from articles. This was first reported here over a week ago and nothing has been done. After giving Emaus an extra day to respond, I have blocked the bot.
The way I see it, the bot removed a link from Stockert Radio Telescope to de:Astropeiler Stockert after looking at d:Q2350652, which at the time contained only two links: de:Stockert (Berg) and Stockert Radio Telescope. On further inspection, de:Astropeiler Stockert is a section redirect to de:Astropeiler_Stockert#Radioteleskop_Astropeiler_Stockert, which means the link should have been left in place and not removed.
·Add§hore· Talk To Me! 11:07, 20 May 2013 (UTC)
Please see User talk:Citation bot/Archive1#Update required to avoid deleterious impact on new Lua-based citations. This issue is not being addressed, and there are various other complaints on the bot talk page which are similarly being ignored. Should I block the bot? -- Redrose64 ( talk) 21:34, 18 May 2013 (UTC)
I haven't been around on-wiki much lately, and I don't see that situation changing anytime in the near future. My bot has still been running on autopilot, but I haven't been monitoring it at all. The code is fairly stable, so maybe it's not a big deal. But, if anyone is interested in taking over the tasks that Snotbot still runs, let me know. I believe the main tasks that it runs are task 10 (cleaning up various things with AfD's), task 12 (archiving requests at RFPP), and another task that doesn't have a BRFA because it only edits userspace (updating the summary table at CAT:RFU). The code is Python using the pywikipedia library. If someone does take over the tasks, I'd prefer it to be someone who has some familiarity with Python and pywikipedia. ‑Scottywong | converse _ 13:43, 23 May 2013 (UTC)
If they're truly "communal" bots then we can just make LabsBot1, fill 'er up with 10 tasks, then create LabsBot2, fill 'er up with 10 more tasks, etc...what would be really awesome is if someone created a UI for managing them, though...i.e., JIRA on steroids? So, you can "submit new task" and it gets automatically assigned to the next available bot, you just have to upload source code and it can be automatically scheduled into the crontab?...I smell unicorns.... Theopolisme ( talk) 04:14, 29 May 2013 (UTC)
Hi! There is currently a request for global editinterface rights for Addbot open on meta wiki here to allow the bot to edit protected pages to remove interwiki links that are already on wikidata. It has been proposed that a second global bot group be created that includes the following flags (edit, editprotected, autoconfirmed). This is not something stewards want to rush into as the flag would allow the bot to operate on protected pages and would prefer to have a wider participation in the request for approval before any action is taken. All comments should be posted at (meta:Steward_requests/Global_permissions#New_global_permissions_group) ·addshore· talk to me! 14:54, 1 June 2013 (UTC)
A request for comment has been started at Wikipedia:Requests for comment/The bot flag regarding removing the bot flag from inactive bots and potentially modifying the bot flag itself. The RFC started after the discussion on VP/Prop. ·addshore· talk to me! 10:28, 6 June 2013 (UTC)
Hello to everyone. I'm a user from the Greek (el) Wikipedia. Could you please check the bot contributions of CarsracBot? It added interwikis on the Greek (el) Wikipedia. -- Vagrand ( talk) 19:54, 12 June 2013 (UTC)
In recent weeks some bot has been using Perl byte-escape literals to access articles, which fails whenever an article name contains diacritics or an en/em-dash: for example, requesting Gal\xC3\xA1pagos Islands rather than Galápagos Islands, and Karush\xE2\x80\x93Kuhn\xE2\x80\x93Tucker conditions rather than Karush–Kuhn–Tucker conditions. This is causing Perl-literal names to appear on WP:TOPRED and also distorts the Wikipedia article traffic statistics. Of course it might be automation beyond the scope of WP:BOT, but perhaps posting here will prompt a realisation for someone. From the number of redlink hits it appears these page visits EXCEED user article views, so we are looking at some major bot doing a lot of page visits! Something like over a quarter of a million hits a week are not going directly to the articles. Regards, Sun Creator( talk) 23:47, 24 June 2013 (UTC)
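For anyone trying to trace the offender: the red-linked titles are the UTF-8 bytes of the real title rendered as literal \xNN escapes, so they can be mapped back to the intended article mechanically. A sketch in Python (the fix_title helper is illustrative, not part of any bot):

```python
import re

def fix_title(s):
    """Turn literal \\xNN escape sequences back into the real title.

    Illustrative helper: maps each textual "\\xNN" to its byte value,
    then decodes the resulting byte string as UTF-8.
    """
    raw = re.sub(r'\\x([0-9A-Fa-f]{2})',
                 lambda m: chr(int(m.group(1), 16)), s)
    return raw.encode('latin-1').decode('utf-8')

title = fix_title('Gal\\xC3\\xA1pagos Islands')
```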
I created a little PHP script/"bot" to upload pictures from my web site to Commons, just to avoid having to go through the file upload wizard and enter manually all the information that is already in our database anyway and should thus be transferred automatically. I only used it occasionally, and all it did was upload one picture at a time. Now it appears that the server's IP address got blocked (the response contains "Unknown error: \"globalblocking-ipblocked\""), so I guess somebody does not like what I'm doing. Thus:
Thanks! -- Kabelleger ( talk) 19:20, 25 June 2013 (UTC)
There appears to be a problem with the Wikipedia back-link information returned by "What links here". See Wikipedia:Administrators'_noticeboard/Incidents#Hazard-Bot_false_positives_flood. This caused Hazard-Bot to start flagging thousands of images as orphaned. This may be due to a corrupted database index. Please watch your 'bot behavior closely until this problem is cleared up. Any 'bots that rely on "what links here" data should be temporarily suspended. -- John Nagle ( talk) 20:26, 22 July 2013 (UTC)
(Posted only at WP:VPT; thought it would be worthwhile to repost here. -- John Broughton (♫♫) 03:18, 24 July 2013 (UTC) )
Hello. Sorry for the English, but this is very important for bot operators, so I hope someone translates it. Pywikipedia is migrating to Git, so after July 26 the SVN checkouts won't be updated. If you're using Pywikipedia you have to switch to Git; otherwise you will be using an outdated framework and your bot might not work properly. There is a manual for doing that and a blog post explaining this change in non-technical language. If you have questions, feel free to ask in mw:Manual talk:Pywikipediabot/Gerrit, the mailing list, or the IRC channel. Best, Amir (via Global message delivery). 13:06, 23 July 2013 (UTC)
NoomBot was shut down by its creator (see User talk:NoomBot#Bot shutoff), and that user went inactive the next day and hasn't made a single edit since April 22, 2013. I thought I should report this and that maybe someone would step up and take over. The operator states in the talk page post I linked that anyone who wants the source code should email him.-- Fuhghettaboutit ( talk) 12:40, 26 June 2013 (UTC)
Please see this thread about a weird occurrence related to MadmanBot. De728631 ( talk) 13:15, 7 August 2013 (UTC)
I can't seem to find a record anywhere that User:RotlinkBot is an approved bot. It doesn't have the bot bit set. Am I just missing something? ElKevbo ( talk) 18:00, 18 August 2013 (UTC)
I need someone to take over my bots for me. This has been a long time coming and I simply don't have the time to maintain them anymore. Furthermore my response time to issues/bugs has started to become unacceptable for a bot op. They're all written in PHP, have some quirks, and could use a bit of work, but they are pretty easy to run as long as you keep on top of things. So if anyone is interested, please drop me a line and I'll send you the latest source code, and details of how to set the bot up.
-- Chris 03:18, 22 August 2013 (UTC)
Hi all. User:28bot is going to be offline for a while, so I thought it would be best to solicit other bot ops to take over the tasks it currently performs. Those are:
Thanks, 28bytes ( talk) 07:04, 11 August 2013 (UTC)
Starting Aug 29, my pywikipedia-based bot can't edit anymore - it returns an error message saying "Token not found on wikipedia:en. You will not be able to edit any page". Anyone else seeing this? -- Rick Block ( talk) 19:47, 31 August 2013 (UTC)
(use_api = True) Theopolisme ( talk) 20:01, 31 August 2013 (UTC)
Updating to the latest git clone fixed it. Thanks! -- Rick Block ( talk) 23:33, 31 August 2013 (UTC)
FYI: The Tools cluster is in a period of instability ( http://ganglia.wmflabs.org/latest/?r=hour&cs=&ce=&m=load_one&s=by+name&c=tools&h=&host_regex=&max_graphs=0&tab=m&vn=&sh=1&z=small&hc=4). If your bot runs on the tools cluster let them know that the labs administrators are made aware of the problem and are in the process of cleaning up the problem. Hasteur ( talk) 11:00, 10 September 2013 (UTC)
I am getting significant push-back from a single editor (who happens to be an admin) who wants to significantly change the way an approved task functions. I fundamentally disagree with their complaint, and want to know how far I am required to bend to accommodate individual editors' complaints about the bot's approved action. I am intentionally not making this about the individual editor or the specific complaint (but am willing to accept advice within the current context). I think I've already been very accommodating with respect to the approval, taking on board reasonable requests, but fundamentally changing the bot's activities would make it effectively toothless and a disservice to Wikipedia (why even bother nominating G13s if we're never making headway on the backlog?). Hasteur ( talk) 19:13, 13 September 2013 (UTC)
MiszaBot III ( talk · contribs) has not edited for a week, and Misza13 ( talk · contribs) has not edited here since May. Did something happen to the API/framework/etc on September 11 that would explain the failure? Can anyone suggest how to get it going again? -- John of Reading ( talk) 16:00, 18 September 2013 (UTC)
I'll check the logs when I get back from work and have SSH access. On a different note, if anyone wants to take over this mess from me, they're more than welcome. The bot is mostly running correctly on pages where configuration is okay, but the amount of pages where it errors out due to misconfiguration, blacklisted links etc. is simply staggering. And I have neither time nor interest to clean up people's mess anymore. — Миша 13 07:42, 20 September 2013 (UTC)
Please remember to stop the weblink checking bots for now! Library of Congress is already down. [8] -- Hedwig in Washington (TALK) 02:00, 2 October 2013 (UTC)
I thought it worth pointing out to the botops about this RfC.— cyberpower ChatOnline 23:52, 2 October 2013 (UTC)
Some input would be appreciated here about the possibility of a bot with admin rights. Giant Snowman 11:49, 11 October 2013 (UTC)
See m:October 2013 private data security issue. If your bot tries logging in through the API, it will fail unless you manually login and reset the password. Legoktm ( talk) 06:05, 3 October 2013 (UTC)
MiszaBot I ( talk · contribs) has not edited since 04:16, 2 October 2013; MiszaBot II ( talk · contribs) not since 22:35, 2 October 2013; and MiszaBot III ( talk · contribs) not since 00:18, 3 October 2013. Is this an effect of the API password issue described above? -- Redrose64 ( talk) 16:50, 4 October 2013 (UTC)
I switched from MiszaBot to ClueBot for my user talk archival when the former went down. Now it looks like User:ClueBot III is down as well -- it hasn't edited since 10/20 (PDT). Any news on that front? — Darkwind ( talk) 18:57, 25 October 2013 (UTC)
See this [9] ANI discussion re SporkBot "fixing" the use of now-deleted templates on archived comments from article talk pages. As I explain there I see unquantifiable (but nonzero) risk, and zero value, of tampering with already-archived talkpage comments, such as here [10]. I'd appreciate others' thoughts on this. EEng ( talk) 13:21, 27 October 2013 (UTC)
You don't understand how the reports are created. Let's say we have template X, used on a total of, say, 10,000 pages, and a deletion discussion has been closed as delete. We have two options: substitute and then delete, or change links and then delete. Otherwise you end up with thousands of broken template transclusions, which causes several issues: it adds pointless entries to Special:WantedTemplates and makes general housekeeping of template issues messier. Outright deletion of templates would modify the posts in the exact same way that you want to avoid. Let's say {{ Keep}} was deleted/merged/moved/whatever to mean {{ Delete}}: anywhere that template was used, the meaning of the original post would get changed. Housekeeping bots would then subst the old Keep template, making it an orphan, maintaining the meaning of the post while also allowing the template to be changed. Also, you were talking about filtering archive pages out of the report: how do you define an archive? I bet I can find 100 archive pages that fail to meet your definition of an archive but are still archives. These reports are often created by processing template links in the database; filtering out archives would require the bots to access the live wiki and download and process the text of each page. If template X is used on 10,000 pages, there might be 2,000 where it's on an archive; WantedTemplates cannot take that into consideration, and the bot would need to process all 10,000 pages on every run to find which are archives and which are not. That just ends up being a megalithic process that would succumb to its own weight in a short amount of time, and fail. The best solution is for a bot to just clean up those templates, converting the transclusions into either substitutions or links. Werieth ( talk) 23:54, 28 October 2013 (UTC)
There seems to be a problem with the above - it is adding Today's Featured Article to some IP talk pages instead of a warning. Regards Denisarona ( talk) 15:54, 6 November 2013 (UTC)
Being lazy, I always used to code {{clarify}}, letting some bot come in to add the date. But recently I discovered I could code {{subst:dated|clarify}}. It's a minor difference but it eliminates the need for an extra history entry (and possible edit conflict) where the bot does the dating. The funny thing is that other editors can't learn from my example, because once the subst is done the code left in the source looks the same as if I had added the date by hand.
Not to take away work from your lovely bots, but wouldn't it be good to publicize this? It occurs to me that one way to do that would be in the bot's edit summary as it adds dates e.g. instead of just saying
say something like
Just a thought. EEng ( talk) 03:10, 29 October 2013 (UTC)
The problem with {{subst:dated}} is that it doesn't pass any parameters through: it assumes that the only parameter used by the wrapped template is |date=, which is often not the case. To take your example, {{clarify}} recognises four parameters: |reason= |date= |pre-text= |post-text= - but {{subst:dated|clarify}} would only fill in |date=. If you were to use {{subst:dated|clarify|reason=does this description refer to the whole station, or just the ticket hall?}}, what you would get back is {{clarify|date=October 2013}} - the |reason= parameter has been lost. However, if you subst {{clarify}} in the manner described by its documentation, i.e. {{subst:clarify|reason=does this description refer to the whole station, or just the ticket hall?}} - that is, without using dated| - it yields {{Clarify|reason=does this description refer to the whole station, or just the ticket hall?|date=October 2013}} and the |reason= parameter is preserved. -- Redrose64 ( talk) 08:33, 29 October 2013 (UTC)
Dead end caused by EEng not reading what people said carefully
Well, gee mister, why dintcha say so in the first place? That's the very thing needed! Works beautifully - the substed code becomes wikisource which renders on the page as intended. The need for all-caps DATE is unfortunate, as is the need to remember to subst and not to code date=October 2013 by hand, which would be a natural mistake to make. Please, no lectures on car, cdr, nil and so on, but isn't there some way to make a template self-substing, so we could code something like
(where SDATE means self-substing DATE, or something). Anyway, assuming no such improvements, what would y'all think of bot edit summaries like:
Along another line of thought, isn't there some way to make a template add the date on its own, by default? EEng ( talk) 00:07, 30 October 2013 (UTC)
To be blunt my cognitive faculties were neutralized by your run-on paragraph of examples. Why didn't you just say:
Being by then in an impatient frame of mind, when I saw "AWB" in Anomie's post I figured, "Oh, the technogeeks are having a coding fest again" and kind of tuned out. (Please understand when I say this that I myself am part technogeek on my father's side.) I am taking the liberty of collapsing this side discussion, which is my fault.
Anomie's idea sounds great, though I lack the knowledge to comment on potential technical problems, and I take it this is not the forum to gain agreement on this. I'd be happy to join the discussion there. BTW, I have the recollection that at least some of the usual templates (clarify, cn, ...) don't seem to understand reason=, in the sense that they don't show the reason text when one hovers. May I suggest that all of these improvement-needed templates consistently take such a param and show it on hovering. Synonyms for reason (such as concern=, explanation=, ...) might be good too. EEng ( talk) 16:02, 30 October 2013 (UTC)
I'm not sure if this is the right forum for this question, but I'm having trouble with PyWikiBot. When running any program from the Command Prompt, I get the message 'git' is not recognized as an internal or external command, operable program or batch file.
Running login.py lets me log in (after displaying that message), but nothing else works.
Any help would be appreciated. My computer uses Windows 7. – Ypnypn ( talk) 17:39, 19 November 2013 (UTC)
I think I figured out the problem: the current version of PWB uses the basic import pywikibot, but the instructions on MediaWiki said to use import wikipedia. Thanks for your help, Werieth! - Ypnypn ( talk) 23:05, 19 November 2013 (UTC)
People are encouraged to place {{ Experimental archiving}} on talk pages if they used MiszaBot for archival. Wikipedia:Bots/Requests for approval/Lowercase sigmabot III 2 is growing stagnant, and such a task should be moving faster than it is. → Σ σ ς. ( Sigma) 23:31, 19 November 2013 (UTC)
Wanted to let this board know that User:DumbBOT has not performed any edits since November 23. Since this bot creates and transcludes WP:RFD daily subpages, I would hope that this can be resolved ASAP. I went ahead and posted this issue on the bot's owner Tizio's talk page. Hopefully, this gets resolved soon. Steel1943 ( talk) 08:11, 25 November 2013 (UTC)
As with all of Misza13's bots, User:Wikinews Importer Bot hasn't run since October 26th 2013. Is there a replacement in the works or another bot that could run this task? Nanonic ( talk) 01:40, 9 December 2013 (UTC)
I have been nominated for BAG membership. Input is invited. The request can be found at Wikipedia:Bot Approvals Group/nominations/Cyberpower678 2.— cyberpower Online Merry Christmas 14:24, 22 December 2013 (UTC)
A new "Draft" namespace has been configured on enwiki for suitable AfDs and new articles (voluntary except for IPs). Ids: Draft - 118, Draft talk - 119. See Wikipedia:Drafts for more details. -- Bamyers99 ( talk) 19:31, 24 December 2013 (UTC)
Just for the record: Pywikibot owners and developers may contribute their patches to the Pywikibot framework without having Git installed on their local computer, by using the Gerrit Patch Uploader tool. Have fun! @ xqt 15:46, 30 December 2013 (UTC)
The proposal was closed as having consensus to move all orphan tags to the talk namespace, including with a bot. Any bots or scripts that currently add {{ orphan}} to articles should be modified accordingly. Ramaksoud2000 ( Talk to me) 20:51, 19 December 2013 (UTC)
AWB is almost ready to deactivate orphan tagging for the English Wikipedia, and can also guarantee that AWB bots add orphan tags in the correct place on talk pages. Only some final minor changes remain. Moreover, AWB will discontinue automated orphan tagging/untagging in article space and won't auto-tag on talk pages. -- Magioladitis ( talk) 23:46, 21 December 2013 (UTC)
Is there a list of all currently approved adminbots? WJBscribe (talk) 12:16, 6 January 2014 (UTC)
Hi, can someone recommend a bot (software) for mass creation of articles/categories? (It's not a task for the English Wikipedia, so I can't post to requests.) Thanks in advance. -- XXN ( talk) 22:32, 26 December 2013 (UTC)
«Contribs» 16:17, 3 January 2014 (UTC)
I've brought this to the attention of DeltaQuad. MM (Report findings) (Past espionage) 02:30, 18 January 2014 (UTC)
Issue is now resolved. DQ didn't say that he'd sorted it but his last edit was on the 19th to DQBot and it's now working properly. MM (Report findings) (Past espionage) 00:49, 21 January 2014 (UTC)
The new Flow extension is being deployed to enwiki today. It is being deployed to only two WikiProject pages as a test run, to get real users trying out the new interface constructs so they can be tweaked or completely changed based on real-world usage, until we arrive at a discussion system that can serve the needs of Wikipedians.
Because Flow is in such an early stage, with many things uncertain, the API modules it enables are a shim exposing the internals, sufficient only for the existing ajax calls. These will change without notice, and I encourage you not to build integrations with these APIs yet.
We have a regular integrated MediaWiki API in the works ( T59659 and others) which bots will be able to integrate with; we expect to have this merged and deployed well before expanding beyond our initial test runs in the WikiProject space. Flow integrates with a number of MediaWiki constructs such as recent changes, watchlists, contributions, etc. Feel free to file bugs for anything those integrations might break that previously worked.
EBernhardson (WMF) ( talk) 17:41, 3 February 2014 (UTC)
(?action=foo) on the page and implements its own; at the API level it implements its own actions, but it seems existing API actions succeed ( T62808). -- S Page (WMF) ( talk) 04:17, 4 February 2014 (UTC)
Using PyWikiBot, my program keeps failing whenever a page contains non-ASCII characters. (Actually, it only fails when regex-searching the text or when outputting it to Command Prompt.) -- Ypnypn ( talk) 19:58, 6 February 2014 (UTC)
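Failures like the one Ypnypn describes usually come from printing Unicode to a console whose codepage can't represent the characters (a common cmd.exe problem in the Python 2 era). A hedged workaround, sketched here in modern Python rather than the era's Python 2, is to re-encode with replacement characters before printing:

```python
import sys

def console_safe(text, encoding=None):
    """Return text with characters the console can't display replaced by '?'.

    Avoids UnicodeEncodeError when printing, e.g., article wikitext to a
    console with a limited codepage.
    """
    enc = encoding or (sys.stdout.encoding or "utf-8")
    # encode with errors='replace' substitutes unencodable characters,
    # then decode back so the result is a printable str
    return text.encode(enc, errors="replace").decode(enc)
```

For example, `console_safe("café", "ascii")` yields `"caf?"`, which prints without raising. For the regex half of the problem, making sure both pattern and subject are Unicode strings (not bytes) is usually enough.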
Hi. Could someone please look at this? Things are quite dead, and this need to be done soon. A number of articles are partially updated, and so is the main {{ Infobox dam}}. It's a mess. There is consensus, and no objections. Can we get this going right away please? Reh man 10:21, 9 February 2014 (UTC)
When running a custom script with pywikibot, what's the best way to catch that the bot has been blocked, if you want it to do certain things only in that case (e.g. save a log to your hard drive instead of posting it to a user subpage on-wiki)? This is in regard to the compat release, as I had trouble getting the core release to work on my computer. I tried using a try/except block to catch pywikibot.UserBlocked (based on the help text at the top of wikipedia.py), but it didn't catch it when I blocked my bot, and I'm not sure what else to try. Any help here would be appreciated. (If anyone cares, this bot is for an external wiki, but this page seems to be the place where I'm most likely to get a reasonably quick reply. Hope you don't mind the quick question; I don't plan to make asking questions here a habit.) jcgoble3 ( talk) 06:53, 27 February 2014 (UTC)
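The fallback pattern being asked about can be sketched in plain Python. The exception class and save callable below are stand-ins so the sketch is self-contained (in the compat release the real exception would be `wikipedia.UserBlocked`; check your framework version for the exact name and where it is raised):

```python
# Stand-in for the framework's "user blocked" exception; in the compat
# release this would be wikipedia.UserBlocked (raised on write attempts).
class UserBlockedError(Exception):
    pass

def post_log(save_on_wiki, log_text, local_path):
    """Try to save the log on-wiki; fall back to a local file if blocked.

    save_on_wiki -- callable that saves text to the user subpage and
                    raises UserBlockedError if the account is blocked
    Returns "on-wiki" or "local" depending on where the log ended up.
    """
    try:
        save_on_wiki(log_text)
        return "on-wiki"
    except UserBlockedError:
        # blocked: keep the log locally instead
        with open(local_path, "w", encoding="utf-8") as f:
            f.write(log_text)
        return "local"
```

One possible reason an except block "didn't catch it": some frameworks only raise the blocked exception on an actual write attempt, not on login, so the try must wrap the save call itself.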
There isn't a problem for MiszaBot I or other bots to do archiving on template talk pages, is there? Harold O'Brian ( talk) 03:05, 4 March 2014 (UTC)
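There is no problem in principle: archiving on a template talk page uses the same per-page config as anywhere else. A typical setup, reconstructed from memory of the bot's documented parameters (the page name is an example; verify the parameter names against the MiszaBot archive how-to before relying on this), looks like:

```
{{User:MiszaBot/config
| algo           = old(30d)
| archive        = Template talk:Example/Archive %(counter)d
| counter        = 1
| maxarchivesize = 150K
| minthreadsleft = 4
}}
```

The `%(counter)d` placeholder is expanded to the current archive number, so archives land at subpages of the template talk page.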
Cyberbot I and II are being migrated to new data centers in labs, and will be down for a bit. Bot ops using tool labs are encouraged to migrate at their earliest convenience.— cyberpower ChatAbsent 15:29, 5 March 2014 (UTC)
Is there a bot which finds non-free files (logos; covers for movies and singles) that carry a specified license template but lack information/source or a non-free use rationale, and automatically adds the missing rationale template? // XXN ( talk) 16:57, 6 March 2014 (UTC)
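The detection half of that request is straightforward to sketch. This hedged example flags file-page wikitext that has a non-free license tag but no rationale template; the template-name fragments are illustrative only, and a real bot would need the full lists of license and rationale templates (writing the rationale itself is the hard, human part):

```python
import re

# Illustrative template-name fragments -- not a complete survey of the
# real non-free license templates on enwiki.
LICENSE_RE = re.compile(r"\{\{\s*Non-free (logo|album cover|film poster)", re.I)
RATIONALE_RE = re.compile(r"\{\{\s*Non-free use rationale", re.I)

def missing_rationale(wikitext):
    """True if the page carries a non-free license tag but no use rationale."""
    return bool(LICENSE_RE.search(wikitext)) and not RATIONALE_RE.search(wikitext)
```

A bot would run this over pages in the relevant license categories and tag, rather than auto-fill, the ones that come back True.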
This notice is to inform the people who monitor this page that a topic has been brought up at Wikipedia:Administrators' noticeboard#User talk:Hasteur#HasteurBot being naughty? that you may be interested in. — {{U| Technical 13}} ( t • e • c) 18:36, 18 March 2014 (UTC)
How can I create a new entry with Wikipedia? — Preceding unsigned comment added by Garrynewyork ( talk • contribs) 13:35, 4 April 2014 (UTC)
CBM implemented and ran VeblenBot and PeerReviewBot, but is retiring from Wikipedia. I am in occasional email contact with CBM who wrote:
"It would be a good idea to find a different person to run the bot jobs. With the WMF Tools setup, I can actually just hand them the entire bot as a turnkey, they would not need to re-implement it. If you can find someone, please ask them to email me (and you email me) and I will be able to communicate with them that way."
If you are interested in taking over these bots please reply here. They are usually pretty trouble free. My email and CBM's email are both enabled.
I do the monthly PR bot maintenance (making the files and categories) and that includes adding the new PR category each month on the VeblenBot account - I would be glad to keep doing that (and give details on email).
Thanks, Ruhrfisch ><>°° 13:50, 5 April 2014 (UTC)
Can anything be done to restore the automatic updating of the Defcon score and level, which VoxelBot until recently was doing every half-hour? Lots of counter-vandalism workers will be looking at out-of-date information on the many displays based on {{Vandalism information}} that are fed by this process : Noyster (talk), 16:11, 3 April 2014 (UTC)
A request for comment on MediaWiki.org is seeking feedback on the possible deprecation of pywikibot/compat. If you're running that framework, you may be interested in the discussion. -- Ricordi samoa 23:52, 3 May 2014 (UTC)
Ever since late yesterday Wikipedia time, User:10.68.16.31 has been archiving discussions on assorted pages and labeling them as (BOT). I'm not sure, but I don't think bots should be doing work logged out, so I brought it here. I've got no idea which bot it is. Supernerd11 :D Firemind ^_^ Pokedex 03:27, 12 April 2014 (UTC) Sorry about being such a noob!
As of 1 June 2014, Cobi's bot is still causing issues. He is not an active moderator.
User:Citation bot ( | talk | history | links | watch | logs) is a much-loved fixer of citation templates, but its creator/operator is busy IRL, and has been so for quite a while. His appeal for assistance or relief in maintenance and operation of that bot has gone largely unanswered.
After the switch to Lua for the CS1 templates, a substantial rewrite was done, but some nasty bugs have not yet been dealt with. The op has barely been on-wiki and hasn't touched the code in eight weeks. Even code reviews would be a big help, and if someone can find corrections, so much the better. I've tried, but my brain won't wrap itself around the language used (PHP).
The code is open, available at http://code.google.com/p/citation-bot/ for anyone considering helping out. LeadSongDog come howl! 15:31, 7 May 2014 (UTC)