This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
-BOT Process
So from time to time we have an issue with a bot running out of control, unapproved bots being run, etc. In a recent matter (actually it's still going on, but that's beside the point), members of the community discussed a proposal that would have (IMHO) compelled a bot owner to change the operation of his bot. But the bot owner was already on record as saying he would not pay attention to that proposal. I asked the crats what sort of consensus they would look for, and WJBscribe indicated they'd look towards the BAG [1] and that it might be nice if the BAG had some formal process "where someone can raise problems with bots and BAG can evaluate whether to require changes to the bot's operation be made in order for approval not to be withdrawn." I'm thinking a possible extension might be an RFC-bot, modeled on the RFC-user conduct and RFC-policy systems. Or something akin to Admins Recall, if it could be applied to all bots equally (not 500 different processes). Other ideas? MBisanz talk 07:56, 21 February 2008 (UTC)
(copied from BN) IMHO WP:BRFA isn't enough in this respect. Consensus (and the bots themselves) can and do change. There needs to be a process for governing bot (and bot owner) activity, including withdrawing approval if necessary. Sure, bots can be blocked, but that tends to be reactionary and only takes one admin. I had a bot blocked a few days ago (see bot out of control from above) too, and it just seems that, for lack of sufficient process, the block (which was not given a time limit) was just forgotten. We, as a community, need the ability to govern bots because, when it comes down to it, they are just too efficient. This bitterness and resentment seems to stem mostly from the lack of binding recourse, either for the sake of justifying a bot or for governing one. But as I said, it's just my opinion. Adam McCormick ( talk) 07:46, 24 February 2008 (UTC)
{{t1|nobots}} proposal
Just wanted to drop a note here: there is presently a proposal underway at WT:BOTS to require that all bots be {{ nobots}}-compliant. SQL Query me! 03:28, 9 March 2008 (UTC)
nobots needs to be redesigned
I just got around to looking at the nobots system, and realized how far from best practice it is. The system is premised on the historical practice of downloading the entire content of a page before making any edit, even if the edit only appends a new section to the bottom. Once API editing is implemented, we probably won't need to download any page text at all to get an edit token and commit the new section. At that point, the nobots system will be completely broken.
It seems to me that we should discuss a nobots system that doesn't require bots to perform lots of needless downloads. Perhaps a database of per-bot exclusion lists, like Wikipedia:Nobots/BOTNAME or something like that, which would only require one fetch to get the full list. — Carl ( CBM · talk) 12:59, 9 March 2008 (UTC)
{{bots|allow={{MyBotAllowList}}}}
{{nobots|BOT1|BOT2|BOT3}}
puts the page into the appropriate categories. — Carl ( CBM · talk) 18:39, 9 March 2008 (UTC)

exclude [[User:SineBot]] from [[User talk:Happy-melon]]
exclude [[User:MelonBot]] from [[User:Happy-melon]] [[User:Happy-melon/About]] [[User:Happy-melon/Boxes]]
exclude [[User:ClueBot]] from all
exclude all from [[User:Happy-melon/Articles]]
Happy-melon: what do you mean the bot would hold the entire category tree in memory? There would be at most three categories to read: the list of pages forbidding all bots, the list permitting that particular bot, and the list forbidding that particular bot. This would mean (unless any of the lists is over 5000 entries long) only three HTTP queries, one time, to load the exclusions list. That's reasonable.
On the other hand, any system that requires an extra HTTP query for every edit that must be made is unreasonable because it is vastly inefficient. It would be possible to reduce the number of extra queries if you were just looking for page existence, but still every single bot.css file or whatever would have to be loaded, every time the bot wants to edit the corresponding page. That's far from ideal design. — Carl ( CBM · talk) 22:59, 9 March 2008 (UTC)
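For concreteness, a minimal sketch (in Python 2, the lingua franca of the bots discussed here) of what this three-list design might look like from the bot's side. The category names are hypothetical, since the proposal never fixed any, and continuation for lists over 500 entries is omitted:

    import urllib, urllib2, json  # json is stdlib from Python 2.6; earlier, use simplejson

    API = 'https://en.wikipedia.org/w/api.php'

    def category_members(cat):
        # One HTTP query per list (continuation for >500 members omitted).
        q = urllib.urlencode({'action': 'query', 'list': 'categorymembers',
                              'cmtitle': cat, 'cmlimit': '500', 'format': 'json'})
        data = json.load(urllib2.urlopen(API, q))
        return set(m['title'] for m in data['query']['categorymembers'])

    # Hypothetical category names:
    deny_all   = category_members('Category:Pages excluding all bots')
    deny_mine  = category_members('Category:Pages excluding MyBot')
    allow_mine = category_members('Category:Pages allowing MyBot')

    def may_edit(title):
        # Three queries at startup replace one page download per edit.
        return title in allow_mine or (title not in deny_all and title not in deny_mine)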
BJBot
I would like to urge those who approve bots to ensure that bots like BJBot — which left an unwanted long notice on my talk page because I made a single edit to Adam Powell, telling me that it was listed on AfD — honour {{ nobots}}.
As a side note, this response is rather uncalled-for behaviour for a bot operator. I'm glad he struck that later, but it's still disappointing. Requests by users not to notify them should only be ignored if there is a good reason to do so. -- Ligulem ( talk) 19:00, 9 March 2008 (UTC)
I would like to see the approval for this task looked into further by BAG. The notifying of people with very few edits to articles seems rather an annoyance and the bot seems to be notifying a lot of people (IPs included) - I count about 50 notifications about the proposed deletion of Prussian Blue (duo) alone. This was probably a request that should have been scrutinised a little longer... WjB scribe 18:22, 10 March 2008 (UTC)
There were in fact two different bugs that allowed Ligulem to get a notice. The first was me playing around with nobots early in the morning, and it had been fixed for hours (this is what he requested fixed on my talk); the second I didn't notice until he posted his rant here ("only one edit" got my interest), and I fixed that too. If anybody sees unwarranted notices, leave a message on the bot's talk with a diff. I also plan to re-disable IP notices per a message on my talk, which should further reduce notices. BJ Talk 01:48, 11 March 2008 (UTC)
I've added a new section on the approval discussion page at Wikipedia:Bots/Requests for approval/BJBot 4, proposing to use an opt-in procedure for task 4 (delete notifications). I suggest following up at Wikipedia:Bots/Requests for approval/BJBot 4#Opt-in instead of opt-out. -- Ligulem ( talk) 10:40, 12 March 2008 (UTC)
Bot owner's essay
Is there an essay or guideline for how to deal with bot owners? I have, in the course of the past year, gotten comments and requests about my bot's behavior that range from polite through negative to downright abusive. I'm sure I've read something somewhere, but can someone point me to it? -- SatyrTN ( talk / contribs) 21:14, 9 March 2008 (UTC)
Pywikipedia getVersionHistory
Something changed in the format of history pages which broke pywikipedia's getVersionHistory. I've fixed it for my own needs, but heads up in case any other bots use this function. Gimmetrow 23:03, 10 March 2008 (UTC)
Bot roles for nobots
I propose to extend the {{ bots}} specification to allow easier restriction of particular bot types. This involves creating pseudo-usernames to be used in allow and deny parameters, for example, username "AWB" relates to all AWB-based bots (already supported), other bot framework names could include "pywikipedia", "perlwikipedia", "WikiAccess", etc. Additionally, we could classify bots by roles they perform: "interwiki", "recat", "fairuse", "antivandal", "notifier", "RETF", "AWB general fixes" and so on. For convenience, these roles should be case-insensitive. MaxSem( Han shot first!) 10:23, 12 March 2008 (UTC)
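For illustration, a sketch of how a compliant bot might test itself against such entries; the name, framework and roles below are invented examples, not part of the proposal:

    BOT_NAME  = 'ExampleBot'                 # hypothetical bot
    FRAMEWORK = 'pywikipedia'                # its framework pseudo-username
    ROLES     = set(['interwiki', 'notifier'])

    def entry_matches(entry):
        # Case-insensitive, per the proposal: an entry may be the bot's own
        # name, its framework's pseudo-username, or one of its role names.
        entry = entry.strip().lower()
        return (entry == 'all' or entry == BOT_NAME.lower()
                or entry == FRAMEWORK.lower() or entry in ROLES)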
I need someone who operates bots on OS X
I use OS X Tiger and I'm trying to run bots for the Telugu Wikipedia. I downloaded the Python framework from this page, and I created the user-config.py file, which reads:
mylang='te'
family='wikipedia'
usernames['wikipedia']['te']=u'Sai2020'
Sai2020 is my username. When I open Terminal and type in python login.py, I get the error:
python: can't open file 'login.py'
Can someone help me please? Σαι ( Talk) 12:19, 12 March 2008 (UTC)
That was the problem. Once I cd'd into the directory it worked, but I get a different error this time:
Sais-MacBook:~/Desktop/pywikipedia Sai$ python login.py
Traceback (most recent call last):
  File "login.py", line 49, in <module>
    import wikipedia, config
  File "/Users/Sai/Desktop/pywikipedia/wikipedia.py", line 127, in <module>
    import config, login
  File "/Users/Sai/Desktop/pywikipedia/config.py", line 364, in <module>
    execfile(_filename)
  File "./user-config.py", line 1
    {\rtf1\mac\ansicpg10000\cocoartf824\cocoasubrtf440
    ^
SyntaxError: unexpected character after line continuation character
What's going on? I'm not very good at this kind of stuff... Σαι ( Talk) 01:27, 13 March 2008 (UTC)
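(For the record: the {\rtf1... header in the traceback shows that user-config.py had been saved in RTF, TextEdit's default format, rather than as plain text.) Presumably the fix arrived at below was to re-save the file as plain text, so that it contains only the Python lines:

    mylang='te'
    family='wikipedia'
    usernames['wikipedia']['te']=u'Sai2020'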
Thank you very much people. I can now login :) Σαι ( Talk) 08:57, 13 March 2008 (UTC)
New Bot?
Any chance of a bot that automatically reverts any blanked page? One may already exist but, if so, I'm not familiar with it. I've been chasing a lot of blankings lately in my anti-vandalism crusade. Thanks either way. Jasynnash2 ( talk) 17:09, 14 March 2008 (UTC)
Proposal on WT:BRFA
Please offer input there if you have any :). Mart inp23 19:33, 17 March 2008 (UTC)
Extended help wanted
I'm interested in developing my bot skills, particularly to running bots which operate on a continuous basis, rather than the more script-oriented bots I'm already operating. I'm looking for a more experienced bot coder/operator who can help me get to grips with the extra knowledge and tools required to operate continuously-running bots. Kind of an adopt-a-bot-owner system :D. I can work in C++ and VB, but all of my previous bot-coding experience has been in Python. Anyone interested and willing to give me a hand? Happy‑melon 10:40, 18 March 2008 (UTC)
Problem with dotnetwikibot
Is anyone having a problem with dotnetwikibot today? As of this morning, any attempt to FillAllFromCategory is not working. I changed nothing in my code, which was working fine yesterday.
I placed a query about this at the sourceforge.net dotnetwikibot framework forum, but it doesn't appear to get a lot of traffic.
Any help would be appreciated. -- Kbdank71 15:17, 20 March 2008 (UTC)
This page now under bot care
As a trial for CorenANIBot, this page is now automatically archived into subpages when new sections are created. There is an automatically generated link to the right of each title to edit or watch the subpage, allowing you to watch individual threads.
Watching this page itself will allow you to see new threads.
Warn me if it breaks! — Coren (talk) 20:31, 21 March 2008 (UTC)
That's certainly interesting. I'm not sure whether I like it or not, but this is a good noticeboard to try it on. What are the perceived benefits? I can see 1) being able to watch individual threads and 2) a sort of 'instant archive', since they're already sorted by date. But I'm not sure how it would fit in with the archiving schemes currently in place at WP:AN, WP:ANI, etc, or what a newbie making their first post to WP:AN would make of a long list of page transclusions. Perhaps this system is best placed at boards which are frequented by regulars, like WP:ANI or WP:AN3RR. Happy‑ melon 10:46, 22 March 2008 (UTC)
I think this is a great idea for AN and ANI. Trying to watch for changes to any given thread there at the moment is rather impossible, especially on ANI. To address HappyMelon's concern, we'd just have to make it clear via some notices at the top not to try and edit the page itself and to use the edit and add section links. This could also be extremely useful for addressing vandalism attempts on those pages; the main pages themselves could be semiprotected or protected if necessary without shutting down discussions, the same could be done to the transcluded pages without disrupting other discussions.-- Dycedarg ж 22:24, 22 March 2008 (UTC)
Adding a thread
...just to see what happens to it under bot care... Franamax ( talk) 12:56, 22 March 2008 (UTC)
That was a little weird. First it didn't show up at all, then it showed as a redlink. I did a server purge and it showed up fine. Seems a little confusing, maybe I missed something? Franamax ( talk) 13:01, 22 March 2008 (UTC)
Clarify: first it was on the page normally (I could see it in edit page), then it vanished and redlinked. Perhaps the bot could leave a "reformatting" message? Also, is this a failsafe method? What if there's an edit conflict along the way? Keeping in mind that the only thing important to me is my post and I want to make sure it's there because to me it's the most important thing in the world. :) Franamax ( talk) 13:13, 22 March 2008 (UTC)
Resolving conflicts with article maintainers
Do you have any good ideas on how to build consensus in discussions like this? Whenever an autonomous interwiki bot links the article Monoicous, the bot owner gets angry comments from the article maintainers. I have tried to explain how interwiki bots work and how we can solve the problem by correcting all the links manually, but the discussion always seems to drift toward “just fix the bots”... -- Silvonen ( talk) 04:19, 11 April 2008 (UTC)
WP:BOT has been completely rewritten
...just in case anybody didn't notice. It would be much easier to pretend that the rewrite has consensus, and attempt to gain consensus for more radical kinds of change, if we could get more people commenting there.-- Dycedarg ж 20:31, 12 April 2008 (UTC)
{{tl|bots}} change
Are we going with this? This is directed at ST47 and Carnildo mainly, as I don't think Beta would follow it without force. I'm sure this has already been talked about, but I stopped reading that debate a while ago. BJ Talk 15:25, 16 April 2008 (UTC)
Per WP:BOT -- Chris 12:45, 7 September 2008 (UTC)
Automated tools, such as Twinkle and CSDWarnBot, are leaving messages for users whose talk pages redirect to their user pages (usually banned users and sockpuppeteers). See this diff on User talk:Antidote for an example. This is pointless because the only way to read those messages is to go into edit mode. A more serious consequence of leaving messages on redirect pages is shown in the history of the same page. User:Yamanbaiia notified Antidote about the deletion of Image:EKusturica.jpg with Twinkle. After the image was deleted, User:Mushroom deleted User talk:Antidote, apparently because Twinkle thought that the page was a redirect to Image:EKusturica.jpg, which it absolutely was not.
Automated tools should therefore take redirected user talk pages into account. They should either remove the redirect while posting the message or avoid posting to the talk page altogether. Another solution to this problem is to protect redirected user talk pages so non-admins can't add to them, but the policy on protecting user talk pages currently doesn't allow that.
I would also be interested if any other user talk pages have been deleted in this way. I had encountered User:Antidote before and wanted to read through his user talk page to understand why he was banned. If I were a non-admin, this would have been impossible for me without assistance. Graham 87 12:58, 9 September 2008 (UTC)
There is also the case where a user has been renamed and the user_talk page of the old one redirects to the new name. This is also commonly done for publicly declared socks or bot accounts where the owner wants all inquiries to go to the "main" talk page.
If the target of the redirect is another user_talk page, an intelligent bot/script/tool would assume "this is probably another name of the same user" and follow the redirect before posting the message. Otherwise it should abort the message and alert the operator. In no case should it overwrite or post below the redirect. — CharlotteWebb 16:28, 9 September 2008 (UTC)
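A sketch of the behaviour CharlotteWebb describes, assuming a hypothetical get_wikitext fetcher (an illustration, not any particular tool's code):

    import re

    def talk_post_target(title, get_wikitext):
        text = get_wikitext(title)
        m = re.match(r'#redirect\s*\[\[(.*?)\]\]', text, re.IGNORECASE)
        if m is None:
            return title                       # not a redirect: post normally
        target = m.group(1).split('|')[0].strip()
        if target.lower().startswith('user talk:'):
            return target                      # probably the same user, renamed
        raise ValueError('%s redirects to non-user-talk page %s; '
                         'alert the operator instead of posting' % (title, target))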
Actually, another part of the problem is that, according to whatlinkshere for the image, User_talk:Antidote did contain a link to the deleted page and was listed as a redirect. However, 2 + 2 did not equal 4 in this case, as the page did not "redirect to a deleted page". This is probably a bug in itself. If a page consists of:
#redirect [[Foo]]
== XYZ ==
[[Bar]]
...it should be listed as a "(redirect)" only in Special:Whatlinkshere/Foo, not in Special:Whatlinkshere/Bar. — CharlotteWebb 16:41, 9 September 2008 (UTC)
A change in r40621 changed the behavior of Special:UserLogin for successful logins. The 'login successful' message is no longer displayed. Instead, on successful login, you are redirected to the page you came from (Main Page by default). This change may break bots that rely on the "login successful" text to detect successful logins. My initial reading and testing says you will get a HTTP 302 code on successful login and a HTTP 200 code on login failure, when you load the wpCookieCheck page. — Carl ( CBM · talk) 13:45, 21 September 2008 (UTC)
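For illustration, one way a Python 2 bot might detect this, using a handler that surfaces the redirect instead of following it. The form fields are the standard Special:UserLogin ones of the time; treat this as a sketch, not a drop-in login client:

    import urllib, urllib2, cookielib

    class NoRedirect(urllib2.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None   # don't follow: let the 302 surface as an HTTPError

    jar = cookielib.CookieJar()
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(jar), NoRedirect())
    data = urllib.urlencode({'wpName': 'ExampleBot', 'wpPassword': 'secret',
                             'wpLoginattempt': 'Log in'})
    url = ('https://en.wikipedia.org/w/index.php'
           '?title=Special:UserLogin&action=submitlogin&type=login')
    try:
        resp = opener.open(url, data)
        print 'login failed (HTTP %s: form redisplayed)' % resp.code
    except urllib2.HTTPError, e:
        if e.code == 302:
            print 'login successful'
        else:
            raise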
Brion has just announced on the wikitech-l list that he's planning to (finally) change the canonical name of the "Image:" namespace to "File:". This is likely to affect a lot of bots that deal with images, as well as other bots that just want to tell whether a title is an image or not. Please note that, if everything goes as planned, we'll only have about one week to fix our bots before the change goes live. More details in Brion's post and at bugzilla:44. — Ilmari Karonen ( talk) 02:35, 7 October 2008 (UTC)
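For bots that only need to recognise file titles, a cheap way to survive the switch is to accept both prefixes; a trivial sketch (deliberately ignoring localised aliases):

    FILE_NS_PREFIXES = ('file:', 'image:')   # new and old canonical names

    def is_file_title(title):
        return title.strip().lower().startswith(FILE_NS_PREFIXES)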
Just looking through Wikipedia:Bots/Requests_for_approval/Approved, which took me to a request for Kaspobot. This request was withdrawn. It seems to not only be categorized incorrectly in the approved section, but the user also appears to have been flagged as a bot? Maybe I am missing something? Matthew Yeager 06:50, 27 October 2008 (UTC)
See bugzilla:4253: to reduce the likelihood of edits to pages with long titles exceeding the 512 byte limit of IRC messages, rev:42695 modifies the message format so that the diff URL no longer repeats the page title. This may affect some bots that follow the IRC feed: if your code treats the URL as an opaque string or only uses it to extract the revision IDs, everything should be fine, but any code that expects to find the page title in the URL should be changed before the new revision goes live (which, as usual, will take an unpredictable amount of time from minutes to months). — Ilmari Karonen ( talk) 23:46, 27 October 2008 (UTC)
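A sketch of the "treat the URL as opaque" approach recommended above, extracting the revision IDs in a way that works with both the old and new formats:

    import re

    def diff_rev_ids(url):
        # Matches whether or not the page title appears in the URL.
        m = re.search(r'[?&]diff=(\d+)&oldid=(\d+)', url)
        if m is None:
            return None
        return int(m.group(1)), int(m.group(2))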
Hello,
I have a question regarding ArticleAlertbot, for which I write the code. The bot reads articles from certain categories, e.g. Category:Articles for deletion, and notifies the corresponding WikiProjects, identified by their project tags on the talk pages. (See User:B. Wolterding/Article alerts for details.)
Several users have asked me whether I could also include WP:DRV in this alert system. There is an apparent problem, of course: for most articles on DRV, the talk page has been deleted, so the bot can't find out which projects they correspond to. This information is in principle contained in the wiki, since deleted pages are not physically removed from the database, but de facto the bot has no access to it.
If, however, the bot had admin rights, it should be feasible to retrieve the deleted talk pages and scan them for WikiProject banners. (I haven't checked the API details.) On the other hand, it seems a bit exaggerated to flag the bot as an adminbot only for this little feature. Do you think that such functionality could pass WP:BRFA via the new "adminbot approval" process? (Of course, one would need an admin willing to run the bot.) Or is it possible to grant the bot restricted admin rights, in some sense, so that it could read deleted revisions? Do you have any other suggestions for this problem? -- B. Wolterding ( talk) 17:59, 2 November 2008 (UTC)
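For reference, the API side appears feasible: list=deletedrevs can return deleted-revision content, but only to an account with the relevant rights, which is exactly what forces the adminbot question. A hedged sketch, assuming a logged-in opener has already been installed via urllib2.install_opener:

    import urllib, urllib2, json

    q = urllib.urlencode({'action': 'query', 'list': 'deletedrevs',
                          'titles': 'Talk:Some deleted article',   # hypothetical title
                          'drprop': 'content', 'drlimit': '1', 'format': 'json'})
    data = json.load(urllib2.urlopen('https://en.wikipedia.org/w/api.php', q))
    # ...then scan the returned wikitext for {{WikiProject ...}} banners.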
Is there somewhere other than the BRFA where I can find out what this bot is doing?
I'm sure the intentions are good, but on balance the results are not. What's happening here? Franamax ( talk) 11:44, 13 November 2008 (UTC)
I agree with Franamax: the idea to subst existing {{ unsigned}} has been rejected many times in the past. Thousands of extra edits put much more stress on the servers than the transclusion of templates that are almost never changed. Besides, linking to WP:SUBST is simply misleading because the page does not say that {{ unsigned}} should be used substed, let alone replaced after it's been used. — AlexSm 19:35, 13 November 2008 (UTC)
If you're going to do anything, you should do a replacement based on Special:Expandtemplates rather than a regular subst, in order to strip out any HTML comments and needless branching of parser functions. Try it with a complex template and you'll see what I mean. Eventually you would want to ask Brion to enable the SubstAll extension. — CharlotteWebb 21:04, 13 November 2008 (UTC)
I'm interested in creating my own bot, and having it run continuously, maybe based off of the recent changes so that it detects any edits made. Can anyone explain to me how continuous running bots work, or point me to some technical documentation or examples of bots that run continuously? Thanks much! Redalert200 ( talk) 17:36, 13 November 2008 (UTC)
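The usual pattern, as far as I know, is either a connection to the irc.wikimedia.org recent-changes feed or a polling loop over the API's recentchanges list. A minimal sketch of the polling variant in Python 2, with no deduplication or error handling:

    import time, urllib, urllib2, json

    API = 'https://en.wikipedia.org/w/api.php'

    def recent_changes(since):
        q = {'action': 'query', 'list': 'recentchanges', 'rclimit': '50',
             'rcprop': 'title|user|timestamp', 'format': 'json'}
        if since:
            q['rcstart'], q['rcdir'] = since, 'newer'
        data = json.load(urllib2.urlopen(API, urllib.urlencode(q)))
        return data['query']['recentchanges']

    last = None
    while True:
        for rc in recent_changes(last):
            last = rc['timestamp']
            print rc['timestamp'], rc['title'], rc['user']   # real handling goes here
        time.sleep(10)   # be polite; rcstart is inclusive, so expect one repeat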
Looks like User:ImageRemovalBot is buggy. While the images are available (see this version), User:ImageRemovalBot removed the links here and here. The images are still available. I have left a note for the operator, User:Carnildo. Any idea what's wrong? -- Tinu Cherian - 11:35, 17 November 2008 (UTC)
" :The images are lacking an image description page, and the Wikipedia API says the images are missing. It appears to be related to Wikipedia
bug #15430. Since Wikipedia insists that the images are missing, there's not really anything I can do about it. You could try editing the image description page to add proper license and source information, and see if that fixes things. --
Carnildo (
talk) 20:47, 17 November 2008 (UTC) " ( copied the explaination of the bot operator from his talk page.)
Done: The issue was also that the image somehow lost its description when it was automatically moved to Commons from en.wiki. Issue solved after adding it back. Thanks -- Tinu Cherian - 12:50, 4 December 2008 (UTC)
Not sure if this is the right place. How do you get a bot stopped? See action of User:OKBot and comments at User talk:OsamaK Traveler100 ( talk) 19:57, 25 November 2008 (UTC)
Since archive 497 was started, we seem to be having only one thread per archive. I assume that this is a bot problem. Is this where I should raise it? -- Peter cohen ( talk) 21:15, 4 December 2008 (UTC)
I have been nominated for BAG, so per instructions I am posting this to invite comments at Wikipedia:Bot Approvals Group/nominations/Anomie. Thanks. Anomie ⚔ 03:13, 6 December 2008 (UTC)
I've built a test bot. It logs in, grabs my user page and tries to append some text to it, but it can't: it gets an "IP address blocked" type message (open proxy). This is, I think, because the IP address of the server it's on is similar to the one quoted at me by the block message. Still, I shouldn't get this message: I'm logged in (I can edit my user talk page just fine!), so what's up?
Any help appreciated. Jarry1250 ( talk) 21:37, 5 January 2009 (UTC)
This is a notification to all interested parties that I have accepted a nomination to join the Bot Approvals Group - the above link should take you to the discussion. Apologies for the delay in getting this notice out, but I've been busy over the holidays etc. Best wishes, Fritzpoll ( talk) 10:26, 8 January 2009 (UTC)
There are issues with the ClueBot false positives URL ( http://24.40.131.153/cluebot.php) today. Anyone else experiencing issues connecting to the site? Thanks, Willking1979 ( talk) 15:44, 11 January 2009 (UTC)
Hi, there's currently a discussion at User talk:B. Wolterding/Article alerts#Subscription parameter request about whether bot edits to project banners can cause performance issues. More precisely, User:ArticleAlertbot produces report pages that it updates on a daily basis, and some users wish to transclude these into project banners. The question is whether this may overload the job queue, and thus whether these transclusions should be allowed, avoided, or maybe even prevented on the technical side. Input from some uninvolved experts would be appreciated. -- B. Wolterding ( talk) 00:37, 12 January 2009 (UTC)
This is a notification to all interested parties that I have accepted a nomination to join the Bot Approvals Group - the above link should take you to the discussion. Foxy Loxy Pounce! 03:30, 25 January 2009 (UTC)
A dab page keeps getting hit by bots adding interwiki links. [5] I think that once a particular edit or style of edit, such as adding all of the interwiki links to a page, has been made, future bots ought to just leave the page alone without the need to add a nobots template. -- KP Botany ( talk) 10:07, 25 January 2009 (UTC)
I have created a {{ nobots}} implementation function in 2 languages, to make it easy for non-compliant bots to comply with nobots. You can see the 2 functions (in PHP and Perl) at Template:Nobots#Example implementations. Hope this helps! X clamation point 21:23, 29 January 2009 (UTC)
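For operators working in other languages, a rough Python equivalent; a hedged sketch that only handles bare {{nobots}} and the plain allow=/deny= forms of {{bots}} (no nested templates):

    import re

    def bot_may_edit(page_text, bot_name):
        if re.search(r'\{\{\s*nobots\s*\}\}', page_text):
            return False
        m = re.search(r'\{\{\s*bots\s*\|\s*allow\s*=\s*([^}]*)\}\}', page_text)
        if m:
            allowed = [s.strip() for s in m.group(1).split(',')]
            return 'all' in allowed or bot_name in allowed
        m = re.search(r'\{\{\s*bots\s*\|\s*deny\s*=\s*([^}]*)\}\}', page_text)
        if m:
            denied = [s.strip() for s in m.group(1).split(',')]
            return not ('all' in denied or bot_name in denied)
        return True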
I've released the first "official version" of my Python bot framework if anyone's interested. It uses the API exclusively for getting data and editing; I currently use it for all my bots and scripts. Downloads are available in .tar.gz and .zip archives with a setup.py script as well as a Windows installer. Mr. Z-man 06:33, 31 January 2009 (UTC)
It appears that SoxBot X is not working correctly. It also appears that the operator does not respond to messages posted on the bot's user page. Please see the following post and do something to fix the problem - User_talk:SoxBot_X#Not_archiving_properly - Thank you.
PS: the comment about not responding to posts does not relate to this post (which is very recent) but to previous posts over the last several months, to which there have been no responses. Dbiel ( Talk) 03:35, 5 February 2009 (UTC)
As east718 has stated that he is no longer working on this, would any Python-competent admin like to take up the reins and continue with what seems to be a very useful bot task? (ask east for the code if you do) If not, the task can be marked as withdrawn/expired. Richard 0612 18:05, 6 February 2009 (UTC)
User:LilHelpa claims not to use a bot for reverting, but this mistake seems so blatant that it appears likely to have been made in an automated manner. I think this should be investigated. TubeToner ( talk) 23:23, 7 February 2009 (UTC)
There's a big backlog at WP:BLP/N, and the bot is archiving four-day-old threads that haven't been resolved. THF ( talk) 13:55, 14 February 2009 (UTC)
May I tentatively suggest that some consultation is required about the variety of PHP frameworks currently available. I know of many that exist; they range from the bad (e.g. my own) to the very, very good, and yet we do not have even a half-complete list of frameworks! This contrasts with other languages, such as Perl, which have a handful of well-documented, well-supported solutions.
Whilst I admit that it is naive to assume that all the existing PHP bots could be made to work from the same framework, I think a possible starting point for helping newcomers develop bots in PHP might be to draw up a comprehensive list of solutions that worked with the present API. Then they could at least choose the one that best suited their needs.
The ones I know of:
I know the above list is pretty shoddy, so please correct the glaring mistakes with it. - Jarry1250 ( t, c) 16:15, 14 February 2009 (UTC)
The instructions and documentation that currently exist at Requests for bot approval are very out of date and, in some cases, confusing. The 'Organisation' section was virtually identical to its current state way back in 2006; practices have moved on by light years since then. I propose that something like this be implemented to make the process more accessible and the approvals page less cluttered. I have also added brief guidance for BAG members on how to close BRFAs (based on SQL's How to close a BRFA). All the essential information is still there, just in a different, more user-friendly format.
Comments, suggestions, ideas?
P.S. I know that the colour scheme isn't particularly brilliant, I chose colours that wouldn't clash with/obscure the links. If anyone with more... aesthetic abilities than me has a better colour scheme: by all means implement it! Richard 0612 17:51, 14 February 2009 (UTC)
Actually done, it does look very good! Nice job Richard. IMHO the TOC should go left of the table, but I'm not really bothered so I'll leave it. - Jarry1250 ( t, c) 09:24, 16 February 2009 (UTC)
I have accepted a nomination to join the Bot Approvals Group - relevant discussion is just a click on the link above away. - Jarry1250 ( t, c) 20:56, 16 February 2009 (UTC)
Is there a(n easier) way to get the number of transclusions a template has?
My PHP-based bot could load a webpage, so that's not out of the question if a suitable web page does exist. (I did ask on the WP help desk, but I think the people watching this page will be better/quicker at making suggestions.) If worst comes to worst, it could include links to the template as well; redirects should be included (if this is possible). At the moment, you see, I'm having to resort to loading a list of transclusions and then counting them, which is pretty resource-intensive. Cheers! - Jarry1250 ( t, c) 17:13, 31 January 2009 (UTC)
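As far as I know, the API of this era has no direct transclusion-count call, but paging list=embeddedin at 500 titles per request is far lighter than loading rendered list pages; a sketch:

    import urllib, urllib2, json

    API = 'https://en.wikipedia.org/w/api.php'

    def transclusion_count(template):   # e.g. 'Template:Fact'
        total, cont = 0, None
        while True:
            q = {'action': 'query', 'list': 'embeddedin', 'eititle': template,
                 'eilimit': '500', 'format': 'json'}
            if cont:
                q['eicontinue'] = cont
            data = json.load(urllib2.urlopen(API, urllib.urlencode(q)))
            total += len(data['query']['embeddedin'])
            if 'query-continue' not in data:
                return total
            cont = data['query-continue']['embeddedin']['eicontinue']

Note this counts direct transclusions only; redirects to the template would need their own queries.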
SmackBot ( BRFA · contribs · actions log · block log · flag log · user rights) is an ambitious effort to make detailed corrections to tags and text of articles. It performs useful tasks well. However, in its current state, SmackBot fails the do-no-harm test. It needs testing on degenerate cases, and a more complete understanding of the grammar rules of the tags it edits.
The critical failure with SmackBot is that it doesn't fail-safe when it encounters an unexpected condition. Before SmackBot is reactivated, it must demonstrate that it does no harm in degenerate test cases. In the software development sense, SmackBot needs Verification and Validation quality control.
I've communicated my concern to SmackBot's developer/owner. He's working on a problem with damage to tags. The larger problem is fail-safe and do no harm.
-- Mtd2006 ( talk) 06:03, 18 February 2009 (UTC)
This overview explains how SmackBot works. The bot leverages AWB to edit articles. It combines WP templates with exception rules in an automated process to generate its ruleset, a list of edit rules for AWB. Rich has said that since the ruleset is generated from "over a thousand templates, each of which can be formatted in hundreds of ways, it is necessary to canonicalise templates to make the problem tractable." This is another way of saying that SmackBot is complex and makes arbitrary assumptions to simplify its ruleset. Arbitrary choices in software design are the opposite of a deterministic algorithm. Arbitrary decision-making implies unpredictable failure modes. SmackBot requires Verification and Validation to prove it does no harm; see WP:BOTPOL#Bot_requirements.
Because of SmackBot's complexity and the potential for damage to live articles caused by unpredictable failure modes, I propose some guidelines for its use.
A test standard should apply to all bots.
This warning is found on complex templates: {{ intricate template}}. Templates are, in effect, complex software. As the template warning says, a flaw in a template can appear on a large number of articles. The nice thing about templates is that if a flaw is repaired, the errors are automatically reverted. This is not the case with SmackBot: flaws in SmackBot are permanently applied until reverted. Because SmackBot's ruleset is generated from many templates, it is vulnerable to errors and conflicts in them. These problems are "fixed" in SmackBot with exception rules. However, when a problem is fixed in SmackBot, its incorrect edits are not automatically reverted; they remain until a human editor or an automated process repairs them.
While there are applications for non-deterministic algorithms, a bot should be strictly deterministic. That is, a bot must do what it's designed to do without side effects. The tasks that a bot performs are its design specification. SmackBot's design specification is User:SmackBot#Tasks. One of SmackBot's design specs (or tasks) is to add missing dates or repair incorrect dates in various tags, e.g., {{fact}} to {{fact|date=February 2009}}. However, when SmackBot fixes a date in an article, it also makes trivial changes. AWB users are cautioned not to make trivial edits.
SmackBot makes trivial edits because it does not conform to its own design specification. There is no SmackBot task that says "change the case of every tag to upper case". Human editors are not allowed to hide substantive changes in a long set of trivial edits (changing case, removing blanks or new lines, etc.) Humans are cautioned for not using a sandbox for experiments and tests. A bot must follow the same rules because it may perform these disruptive edits to thousands of articles. A bot can waste resources, be disruptive and cause damage much faster than the fastest human editor.
SmackBot problems:
SmackBot's bugs can be fixed, up to a point. SmackBot's owner is responsive to trouble reports. But until a bug is reported, SmackBot has made changes that are difficult to find (because of trivial edits) and hard to revert (again, because it makes too many trivial edits). Beyond this, the most critical flaw of SmackBot is its failure to be harmless.
Mtd2006 ( talk) 03:34, 19 February 2009 (UTC)
Just an FYI for bot owners/writers: the recent Squid rollout changed the behavior when encountering Expect headers. You either have to be sure not to send them, or correctly handle now receiving an 'HTTP/1.0 417 Expectation failed' response instead of the 100 Continue. Q T C 22:22, 18 February 2009 (UTC)
Hi all,
I have put in a request at Bot Requests for Approval for my bot, Thehelpfulbot, to be able to use pywikipedia's redirect.py script to delete broken redirects. pywikipedia has been extensively tested, and the bot has already been speedily approved for using the same script to fix double redirects. As far as I can tell, no other bot is running this task, as User:RedirectCleanupBot is no longer in use since WJBscribe left Wikipedia. This bot will require the admin flag to run this task, which is why I am posting on this board - to let you know about the bot.
If you wish to comment, please do so at Wikipedia:Bots/Requests for approval/Thehelpfulbot 5.
Thanks,
The Helpful One 20:45, 19 February 2009 (UTC)
Hi again all,
Thehelpfulbot now has another request, this time using pywikipedia's nowcommons.py script to delete local images that are also on Commons. You can have a look at the code if you wish, in the pywikipedia library here.
This task will also require the admin flag to run, which is why I am posting on this board again, to let you know about the second admin bot task.
If you wish to comment, please do so at Wikipedia:Bots/Requests for approval/Thehelpfulbot 6.
Thanks,
The Helpful One 20:45, 19 February 2009 (UTC)
Just a quick notice to inform you that I have posted a proposal at the Proposals Village Pump regarding giving BAG the bot-flagging right. Comments, questions, trouts, etc welcome there! Richard 0612 11:11, 20 February 2009 (UTC)
Could another bot operator voice an opinion here please?-- Rockfang ( talk) 16:41, 23 February 2009 (UTC)
Just to let everyone know that my BAG nomination was a successful one - so yes, I am now a member of the BAG! Thanks to everyone who showed their support; I hope now to show that your trust was correctly placed. However, in the unlikely event that I do get something wrong - however small - I hope you all will put me right ASAP ;) - Jarry1250 ( t, c) 20:31, 23 February 2009 (UTC)
A template on this page (the header I think) is putting this page into the Intricate Templates category. Is there a way to fix this seeing as how this page itself isn't a template? :) Rockfang ( talk) 20:41, 23 February 2009 (UTC)
In my opinion, Category:Wikipedia bots could do with some substantial reorganisation. Rather than doing it all bit by bit, I think it might be best to start with an adventurous design, which then gets modified until consensus is reached. To reiterate, feasibility was not considered when drawing up this design. At the moment, we have this:
The category system, as it is. "An effing mess."
I would propose a system more like this:
My grand plan
These may just have to be left as-is/renamed:
I know there are bots which rely on these categories, so it would be good to get everyone's view on this. Another obstacle may lie with the {{ Bot}} template, which is compulsory for all bots, because it adds all bots to "Wikipedia bots". With some tweaking, however, it too could become a useful tool in the categorisation process - simply asking for a status and a purpose would help enormously. Also, that reminds me - if this were to be implemented, we'd need to work out what to do with bots with many different tasks (when it came down to "purpose") - multiple progress categories per bot, perhaps? Anyhow, let's see how far we can get.
- Jarry1250 ( t, c) 20:18, 18 February 2009 (UTC)
If anyone has any worries/criticisms - however minor - please shout; I'm about to contact some experts on category naming to check the exact wording of the categories and to see how we can move this along. (I'm sorry, that's just my way - I hate doing nothing when something can be done.) - Jarry1250 ( t, c) 14:40, 20 February 2009 (UTC)
Revised grand plan
These may just have to be left as-is/renamed:
Something that does need doing is generating a list of the templates that categorize pages into these categories, so that if this ever passes they can be updated. Q T C 05:08, 23 February 2009 (UTC)
I'll check out the rest when I get the chance. - Jarry1250 ( t, c) 07:50, 23 February 2009 (UTC)
brfa parameter? ;) Other than that, it looks ok to me. Anomie ⚔ 02:34, 25 February 2009 (UTC)
Recently, a change to the Wikimedia servers has caused many bots to break. To fix it, you have to tell your bot not to expect a '100 Continue' code.
In PHP: curl_setopt($this->ch, CURLOPT_HTTPHEADER, array( 'Expect:' ) );
In VB.NET: ServicePointManager.Expect100Continue = False
X clamation point 17:00, 25 February 2009 (UTC)
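(For Python bots using httplib/urllib2: to my knowledge no change is needed, since the standard library never sends an Expect: 100-continue header on its own.)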
I've started doing the most obvious changes to the category system. Apologies for any short-term inconvenience caused. Willing helpers (especially admins for the templates) are of course welcome to help. - Jarry1250 ( t, c) 17:43, 26 February 2009 (UTC)
I'm sure I remember reading that commented-out interwikis aren't re-added. Is this correct? Rich Farmbrough, 20:20 7 September 2008 (GMT).
Archive bot doesn't like your stupid timestamp ST47 ( talk) 02:12, 4 March 2009 (UTC)
If someone has an off-wiki contact with Cobi, can you inform him that ClueBot's false-positive reporting page is offline (and seems to have been offline, at least periodically, since last November)? Not urgent, but worth noting. I'll leave a note on his talk page as well, but I assume that if he'd been on-wiki he'd have noticed this already. -- Ludwigs2 06:43, 28 February 2009 (UTC)
While I have been trying to make broad use of the comments interested parties have left over the past fortnight and to be as bold as I can, there are a couple of outstanding proposals that will need some thought.
That's quite a long list, I know, but it demonstrates the possibilities that are starting to open up.
- Jarry1250 ( t, c) 11:02, 28 February 2009 (UTC)
One particular user, ST47, has a bot that has received several complaints about how it conducts speedy-delete warnings to page authors. For example, I placed a speedy-delete tag on an article that had a clearly misspelled title: the newly created Zhao'an Country, versus Zhao'an County, the correctly spelled name, where an article already existed.
After placing a speedy delete tag on the article, I went to the author's talk page ( User talk:Isatcn) to notify him of the discrepancy and to request that he place a {{db-author}} tag in the article to avoid any confusion about why the article was tagged for speedy. Before I could finish typing my message, CSDWarnBot placed one of its own on his talk page. The user continued to edit the existing article and attempted to place a {{hangon}} tag. This was likely because he failed to see my short, succinct message due to the larger message by CSDWarnBot, with all its bells and whistles (image graphic, bolding, 2 paragraphs, etc.). See here.
One administrator failed to see what the problem with the article was (not realizing the typo problem with the title) and left the article intact, even fixing the malformed {{hangon}}.
I eventually left another message on the talk page of the article. The author finally realized what he had done and placed the appropriate {{db-author}} tag. However, a process that should have taken me less than 5 minutes ended up taking me 4-5 times that long.
I also filed a complaint on the ST47's talk page here. It was then that I learned there have been several other complaints about the very same behavior over the last few months, as well as other disruptive/problematic actions by the bot (see User_talk:CSDWarnBot). While I understand that bots can serve a beneficial purpose, the usefulness of a bot is negated by the extra effort and confusing communication to authors of pages who are tagged for speedy delete, many of whom are new users. ++ Arx Fortis ( talk) 18:18, 1 March 2009 (UTC)
ST47, can you stop being so awkward? This is the place to resolve this, not your talk page. As for your refusing to fix your bot: in light of that, I have to agree with others that the bot should just be blocked until such a time as you (or someone else) are ready to fix it. To be honest, I really couldn't care less if you end up not seeing this because you couldn't be bothered to watchlist this page. Thanks, Spitfire Tally-ho! 19:22, 4 March 2009 (UTC)
So yes, shut down the bot until ST47 agrees to fix it.
Incidentally, while I lack the coding ability to implement such an idea, I suspect that the simplest solution would be for the bot to run once every ten minutes and post talk page messages pertaining to deletion warnings discovered during the previous run. That would ensure a delay of 10–20 minutes (instead of the current range of 0–15 minutes). — David Levy 21:33, 4 March 2009 (UTC)
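A sketch of the queue-one-cycle-behind idea; the helper functions are hypothetical stand-ins, not CSDWarnBot's actual code:

    import time

    def find_new_csd_taggings():       # hypothetical scanner: [(author, page), ...]
        return []

    def post_warning(author, page):    # hypothetical poster
        print 'warning %s about %s' % (author, page)

    pending = []                       # notifications found on the previous pass
    while True:
        to_post, pending = pending, find_new_csd_taggings()
        for author, page in to_post:
            post_warning(author, page)
        time.sleep(600)                # 10-minute cycle: every warning waits 10-20 minutes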
Being bold and blocking the bot until this is sorted out. "I really don't feel like bothering to go figure out how to do it again" is not a very helpful attitude. If ST47 can't be bothered fixing it now, we'll wait until he can (or until one of you computer buffs does). yandman 08:33, 5 March 2009 (UTC)
When I submitted a BRFA last week, Anomie looked over my code and made a good number of helpful suggestions that I implemented before the code went live. It then set me off on a code-writing spree and I have written a new PHP bot framework from scratch. I think the code is clearer and easier to use than most of the others that are currently available, and I have made an effort to document the code in phpdoc format.
Obviously it is not finished yet and doubtless there is significant functionality that could be added. If you want to contribute to it or give me ideas that I could implement, I'd be delighted.
The Google code site is http://pillar.googlecode.com/
Generated code documentation (along with highlighted and cross-referenced code) is available at http://toolserver.org/~samkorn/pillar/doc/index.html
I have converted my cricket bot ( BRFA) to this code: the converted version (20% smaller than the original!) is available at http://toolserver.org/~samkorn/pillar/example/
Comments/flames/trouts welcomed!
[[Sam Korn]] (smoddy) 22:17, 4 March 2009 (UTC)
I have raised the possibility of a bot which scans database dumps looking for blue links to absent sections in actual articles, e.g. George W. Bush#Olympic medals. I was advised to advertise it here and in WP:Bot requests#Broken section links, but please add any comments to the main discussion at WP:Village pump (proposals)#Broken section links. Certes ( talk) 20:11, 8 March 2009 (UTC)
Xqbot removes links when xqbot shouldn't... http://en.wikipedia.org/?title=City-Bahn&diff=275931839&oldid=266407217
It looks like I'm not the only one:
http://de.wikipedia.org/wiki/Benutzer_Diskussion:Xqt#xqbot
Please stop xqbot FengRail ( talk) 00:55, 9 March 2009 (UTC)
OK, this needs to be shut down. First, there is no consensus on what constitutes an orphan at Wikipedia talk:WikiProject Orphanage. Secondly, Wikipedia:Orphan is neither a policy nor a guideline. Thus it seems to fail the requirements for a bot. Along those lines, since there is no policy/guideline, it should be up to human editors to decide if one, two, three, or four incoming links are enough. But no: if you remove the tag from an article with only two, the bot re-adds it. Aboutmovies ( talk) 20:22, 3 March 2009 (UTC)
Per the bullet points above, where was consensus for Addbot's addition of 114k orphan tags established? Because everywhere I've looked, I've found no consensus about the posting of orphan tags. -- Tagishsimon (talk) 20:16, 4 March 2009 (UTC)
Please do not insert gibberish like {{Sam1649}} into the deletion log (or any log), especially if there's a high likelihood that the general public will read the log summaries (like the deletion summaries of articles). "Robot:" is sufficiently clear for bot trials. Thanks! -- MZMcBride ( talk) 04:15, 11 March 2009 (UTC)
The {{ Sam1649}} template exists so that a database query can be run letting people see which deletions on my admin account were made by my bot; it allows me to differentiate my edits from the bot's. Note: this is only for the trial, not for the real bot edits.
FWIW, should probably leave AntiAbuseBot running until all the filters can be worked out. Q T C 23:37, 17 March 2009 (UTC)
Please run m:reflinks.py on eBay with pywikipediabot. Thanks. Amir ( talk) 16:59, 18 March 2009 (UTC)
This looks very suspicious, and many of the edits appear to be corrupted, adding the surrounding boilerplate text but without the actual data. Is there an approval for this (in which case which BAG member do we need to trout?) or do we need some corrective action...? Happy‑ melon 21:43, 23 March 2009 (UTC)
If you look, for example, in the article Chiton, you'll find, in the wikitext of the "General anatomy" section, a footnote that reads, in its entirety:
That footnote uses the source information found in Template:Cite doi/10.1002.2Fhlca.200390096, a page created by User:Citation bot. That source information (to continue the example) is this:
Now the reader of the article will see a footnote with the expected information.
What we have here is a system where (a) the footnote in the wikitext has different information than what shows in the "References" section; (b) someone wanting to improve the footnote needs to understand that he/she has to edit the template; and (c) template space gets populated with a page whose sole purpose is to insert information into (usually, as in this case) a single article.
Here are the cite doi templates that exist so far. I'm guessing that all this is so that editors can just let User:Citation bot create a footnote by providing its DOI. My first question is why Citation bot needs to do this via a template, rather than (say) simply overwriting a redlink in the article itself, or something else that doesn't add yet another layer of complexity to Wikipedia articles?
Also, I don't see (perhaps I missed it) approval for this system, in any of the following three pages. (I do see a mention of DOI bot editing cite doi pages, but not creating them.) Thus my second question: If this was approved, where was it discussed?
-- John Broughton (♫♫) 18:49, 25 March 2009 (UTC)
Yuck, Get It Out Of Here. This brings back nasty echoes from the past. Happy‑ melon 21:29, 26 March 2009 (UTC)
Reading back through the approvals, they don't match my memory of them, so I'm not going to adamantly say that it does. Consensus should be quickly reached here, and I've suspended this bot function until it is. (Users will still be able to manually request the bot act.) Martin ( Smith609 – Talk)
Templates such as {{ Bruscabrusca}}, {{ WonderfulLife}}, and {{ PalAss2008}} suggest 'yes' - see for example Category:Biology source templates. Martin ( Smith609 – Talk)
This probably isn't the place to establish consensus, but this question is important. Martin ( Smith609 – Talk) 12:27, 28 March 2009 (UTC)
Here is some data (my attempt at a random sample: a dozen cite doi templates, the only criterion for inclusion being that a template must have been created by Citation bot). The count refers to the number of articles that link to a specific template:
One:
Two:
I wasn't surprised that there were more cases of only a single article linking to a doi cite template than there were cases where there were two or more; I was surprised at the number of doi cite templates with no articles linking to them. -- John Broughton (♫♫) 00:02, 29 March 2009 (UTC)
Assuming that single source templates are sometimes a good idea (which is discussed in the section above), there doesn't seem to be a good argument for making editors create single-source templates by hand when the process can be automated - have I missed one? Martin ( Smith609 – Talk)
DSisyphBot ( talk · contribs) added over 20 interwiki links to Wikipedia:Bot requests, but its operator indicates he only understands three languages. My understanding of bot policy is that ops should be able to minimally verify that language links are correct. Do we waive this requirement if the bot is running standard software like pywikipedia? Wronkiew ( talk) 16:35, 30 March 2009 (UTC)
I have proposed a policy change here to resolve this. Wronkiew ( talk) 00:55, 1 April 2009 (UTC)
(crossposted from WP:AN)
DediBox is a cheap dedicated hosting solution operated by Proxad (France). I have hardblocked it since there are apparently some abused open proxies there. However, some bots operate from these servers and I expect some collateral damage. I have given IPBE to WP 1.0 bot ( talk · contribs) and MystBot ( talk · contribs). If another operator complains, please give them the bit (don't forget to log it) and poke me so I can double-check. -- lucasbfr talk 09:00, 31 March 2009 (UTC)
← The bot has recently conducted a number of very minor (seemingly insignificant edits) that I believe fall afoul of current bot guidelines (concern was raised at WP:AN#User:D6). I dropped a note for Docu to comment here or there. – xeno ( talk) 18:40, 1 April 2009 (UTC)
Thanks for removing the claim. Can you also remove the claim that the bot reverts others? The diff you provided seems to show that someone else used Twinkle in a way he shouldn't have. -- User:Docu
Xenocidic ( talk · contribs · deleted · filter log · SUL · Google) • ( block · soft · promo · cause · bot · hard · spam · vandal)
Xeno ( talk · contribs · deleted · filter log · SUL · Google) • ( block · soft · promo · cause · bot · hard · spam · vandal)
processed a large number of edits to remove attribution notices from a large number of articles ( contribution details). Is there consensus for such a change, or was there a task approval? -- User:Docu —Preceding undated comment added 15:35, 17 April 2009 (UTC).
This bot task was explicitly approved at Wikipedia:Bots/Requests for approval/Xenobot 6. – Quadell ( talk) 17:42, 17 April 2009 (UTC)
modified a series of templates on IP talk pages (contribution details). Has this been reviewed or specifically approved? -- User:Docu —Preceding undated comment added 09:46, 18 April 2009 (UTC).
Is there a bot that removes {{ ifdc}} from captions? Which ones, and on what conditions? Thanks, – Quadell ( talk) 14:33, 17 April 2009 (UTC)
Hello folks!
We have updated our mailing lists! :)
Until now, pywikipedia-l was automatically spammed on each bug update and on each svn commit, which made subscribing to it painful for users not interested in pywikipedia development.
To solve this issue, two lists have been created: pywikipedia-bugs, for automated bug updates, and pywikipedia-svn, for automated svn commit messages. This way, the traffic on pywikipedia-l should be greatly reduced: only human discussions should take place there. We would like to encourage advanced pywikipedia users to subscribe to this list: it should have moderate traffic, and it would allow us to get feedback on our development.
More importantly, we have created an announce mailing list, pywikipedia-announce. This mailing list will be used for important announcements, such as breaking changes. The aim of this list is to have minimal traffic: only a couple of folks can post on it, and we should not have to use it more than once a month. (Any mail sent to pywikipedia-announce is also sent to pywikipedia-l, so there is no need to subscribe to both.)
We would like our users to subscribe to either pywikipedia-announce or pywikipedia-l to be sure to receive those important announcements.
We hope that in this way, we'll be able to significantly improve pywikipediabot quality, and response time to urgent matters: no more foundation-wide running around for developers if something is utterly broken ;)
Thank you,
NicDumZ ~ 11:12, 18 April 2009 (UTC)
I'm making a list of image/media-related bots at Wikipedia:WikiProject Images and Media/Bots. Obviously the formatting leaves something to be desired, but if anyone here operates such a bot (or knows of one) feel free to add it to the list. I'd add them myself, but I don't know that many active bots off the top of my head, which is why we need the list in the first place. :) Thanks! ▫ JohnnyMrNinja 00:37, 25 April 2009 (UTC)
How can I get a list of all bot-flagged accounts? How can I get a list of all bot-flagged accounts that have edited in the last 30 days? Or that haven't edited in the last year? – Quadell ( talk) 03:00, 25 April 2009 (UTC)
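One approach: Special:ListUsers with the 'bot' group shows the flagged accounts in a browser; programmatically, the API's list=allusers should give the same, assuming its augroup parameter is available on this MediaWiki version. Last-edit dates would then take one usercontribs query per account. A sketch of the first step:

    import urllib, urllib2, json

    API = 'https://en.wikipedia.org/w/api.php'

    def bot_accounts():
        bots, cont = [], None
        while True:
            q = {'action': 'query', 'list': 'allusers', 'augroup': 'bot',
                 'aulimit': '500', 'format': 'json'}
            if cont:
                q['aufrom'] = cont
            data = json.load(urllib2.urlopen(API, urllib.urlencode(q)))
            bots += [u['name'] for u in data['query']['allusers']]
            if 'query-continue' not in data:
                return bots
            cont = data['query-continue']['allusers']['aufrom']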
This is a ' mandatory' notification to all interested parties that I have accepted a nomination to join the Bot Approvals Group - the above link should take you to the discussion. Best wishes,-- Tinu Cherian - 10:48, 1 May 2009 (UTC)
Hi all,
Another user brought to my attention that my bot's edits are showing up on his watchlist, even with the "hide bots" option enabled. I took a look at Special:RecentChanges and noticed that they are showing up there as well. The account has the bot flag, and I double-checked my code to verify that it is actually flagging the edits as bot edits. In looking at RecentChanges, I noticed a couple of other bots on the list (SPCUClerkBot, XLinkBot), so I'm guessing the problem isn't isolated to just my account. The bot is making all of its changes through the MediaWiki API. Any suggestions? Matt ( talk) 23:31, 1 May 2009 (UTC)
Just FYI, I made a template that might be useful to you.
{{botlinks3|Polbot}}
The "task list" link is a list of all pages starting with "Bots/Requests for approval/Polbot".
{{botlinks3|Polbot|11}}
The "task" link points directly to task 11.
{{botlinks3|Polbot|-}}
The "task" link points to the RfBA without a numerical suffix. (Polbot never had one, which is why it's a redlink.) – Quadell ( talk) 00:46, 28 April 2009 (UTC)
This escalates a clearly erroneous page-move, as only an admin can scrape the crud away in order to revert this. — CharlotteWebb 13:32, 3 May 2009 (UTC)
The design is still flawed as redirects "from other capitalisation" are among the most likely to need reversing. I've noticed other cases in the past where a user has created a redirect from a more correct or equally plausible title, and this (again, worse than useless) bot comes along to add road-block edits preventing the page from being moved to that title. One might as well write a bot to move-protect every bloody article as that would (from my perspective) have the same practical effect.
Let's step back and ask if/why this bot was approved and whether it serves any meaningful purpose. — CharlotteWebb 17:57, 3 May 2009 (UTC)
Following problems with some bots operating in template namespace, the bot policy now mentions that interwikis should appear on all articles using a template. ( Wikipedia:Bot_policy#Restrictions_on_specific_tasks ) -- User:Docu
Hi all,
I had an idea for a bot I could write, but I don't know if there's already a bot that does it, or if the idea would be very well received, so I'm asking for opinions.
Would it be a good idea to write a bot that replaces links to redirects with a link to the redirect's target (assuming that the target is not a disambiguation page)?
Thanks in advance, Matt ( talk) 23:04, 10 May 2009 (UTC)
As another editor has expressed concern over ListasBot 3's approved functions (in short, whether or not talk pages of redirects should be replaced with a redirect to the new talk page), I've set up a discussion on how to proceed with this bot. Input would be appreciated. The discussion is at User:Mikaey/Request for Input/ListasBot 3.
Thanks, Matt ( talk) 02:44, 12 May 2009 (UTC)
It appears to me that Matt's talkpage notes were good-faith attempts to gain wider exposure. They don't look like intentionally leading questions or ballot-stuffing to me. I'm glad Matt is trying to gauge community consensus, and I don't think the rudeness is called for. – Quadell ( talk) 13:16, 12 May 2009 (UTC)
I have written a bot to get the current Quote of the day from Wikiquote and put it on a page so it can then be used as a template on user pages etc. I haven't requested approval yet because, although it works fine from my computer, I need to run it from somewhere else. I've uploaded it to a web server with Dreamhost, yet when I try to run it, the following error comes up (sorry, I don't know how to make it smaller!). It's a Python script using pywikipedia.
/home/tris1601/thewikipediaforum.com/pywikipedia/wikitest.py
 35 site = wikipedia.getSite()
 36 newpage = wikipedia.Page(site, u"User:Dottydotdot/test")
 37 newpage.put(text + "<br><br>'''Imported from [http://en.wikiquote.org '''Wikiquote'''] by [[User:DottyQuoteBot|'''DottyQuoteBot''']]", u"Testing")
 39 wikipedia.stopme()

/home/tris1601/thewikipediaforum.com/pywikipedia/wikipedia.py in put(self=Page{[[User:Dottydotdot/test]]}, newtext=..., comment=u'Testing', botflag=True)
 1381 # Check blocks
 1382 self.site().checkBlocks(sysop = sysop)

/home/tris1601/thewikipediaforum.com/pywikipedia/wikipedia.py in checkBlocks(self=wikipedia:en, sysop=False)
 4457 if self._isBlocked[index]:
 4458 # User blocked
 4459 raise UserBlocked('User is blocked in site %s' % self)

UserBlocked: User is blocked in site wikipedia:en
I don't know why it's saying I'm blocked (I'm clearly not), and I've checked the IP address of the server it's on (69.163.128.253), which doesn't seem to be blocked either, so now I can't work out what's wrong! Any help would be greatly appreciated; as you can guess, I'm pretty new to Python and coding in general!
Thanks!
dottydotdot ( talk) 14:43, 26 May 2009 (UTC)
In searching for bot approvals for ArthurBot ( talk · contribs), I've only located one approval, from November 2008, for "adding/modifying interwiki links and Link_FA templates". Earlier today, ArthurBot (seemingly counter to the guidelines regarding valid redirects) changed links from MAN AG (the former article name, now a redirect) to MAN SE (the new article name). While I'm not sure of the reasoning, I'd like to ask: is this out of scope for the bot's approval, and/or is there an approval that I am not finding regarding this activity? — Bellhalla ( talk) 14:41, 27 May 2009 (UTC)
This search shows that "ArthurBot" is only mentioned on a "Wikipedia:Bots/" subpage in two places: here and here. Neither of these approves the task you mention. – Quadell ( talk) 15:10, 27 May 2009 (UTC)
Hi. I blocked Jigbot ( talk · contribs · logs) yesterday for its username. Now the owner Jigesh ( talk · contribs) requests the account be unblocked, as he wants to use it as an interwiki bot. What is my best course of action? Should interwiki bots be approved? — Edokter • Talk • 18:39, 29 May 2009 (UTC)
Recently I have been trying to find a version of AWB to use for PascalBot. My search has led me to the conclusion that there is no current version of AWB that is safe for use as a bot, with general fixes enabled. I am thinking it may be useful to have a centralized location to discuss which version(s) of AWB should be used as a bot, perhaps to include "safe" and "unsafe" lists of AWB versions.
Versions of AWB before rev 4382 corrupt {{ Article issues}}. More recent versions add incorrect DEFAULTSORTs, remove valid orphan tags, and add commented out categories. -- Pascal 666 20:52, 31 May 2009 (UTC)
Version | Revision | {{ Article issues}} | DEFAULTSORT | orphan tags | commented out categories
Disable gen fix: | | | SetDefaultSort | |
4.5.0.0 | rev 3834 | | | |
4.5.1.0 | rev 3906 | | | |
4.5.2.0 | rev 4100 | | | |
4.5.3.2 | rev 4312 | | | |
4.5.3.3 | rev 4382 | | | |
| rev 4395 | | | |
| rev 4400 | | | |
| rev 4419 | | | |
Anyone know the names of the other gen fixes to disable? -- Pascal 666 00:22, 1 June 2009 (UTC)
The above table is skewed towards issues present in recent versions. Does anyone know of any reason 4.5.0.0 should not be used? -- Pascal 666 03:21, 1 June 2009 (UTC)
rev 4426 can now be downloaded here. -- Pascal 666 17:55, 2 June 2009 (UTC)
I would like to urge those who approve bots that bots like BJBot — which left an unwanted long notice on my talk page because I made a single edit to Adam Powell, telling me that it was listed on AfD — should honour {{ nobots}}.
As a side note, this response is rather uncalled-for behaviour from a bot operator. I'm glad he struck that later, but it's still disappointing. Requests by users not to notify them should only be ignored if there is a good reason to do so. -- Ligulem ( talk) 19:00, 9 March 2008 (UTC)
I would like to see the approval for this task looked into further by BAG. The notifying of people with very few edits to articles seems rather an annoyance and the bot seems to be notifying a lot of people (IPs included) - I count about 50 notifications about the proposed deletion of Prussian Blue (duo) alone. This was probably a request that should have been scrutinised a little longer... WjB scribe 18:22, 10 March 2008 (UTC)
There were in fact two different bugs that allowed Ligulem to get a notice. The first was me playing around with nobots early in the morning, and it had been fixed for hours (what he requested fixed on my talk); the second I didn't notice until he posted his rant here ("only one edit" got my interest), and I also fixed that. If anybody sees unwarranted notices, leave a message on the bot's talk with a diff. I also plan to re-disable IP notices per a message on my talk, which should further reduce notices. BJ Talk 01:48, 11 March 2008 (UTC)
I've added a new section on the approval discussion page at Wikipedia:Bots/Requests for approval/BJBot 4, proposing to use an opt-in procedure for task 4 (delete notifications). I suggest to follow-up at Wikipedia:Bots/Requests for approval/BJBot 4#Opt-in instead of opt-out. -- Ligulem ( talk) 10:40, 12 March 2008 (UTC)
Is there an essay or guideline for how to deal with bot owners? I have, in the course of the past year, gotten comments and requests about my bot's behavior that range from polite through negative to downright abusive. I'm sure I've read something somewhere, but can someone point me to it? -- SatyrTN ( talk / contribs) 21:14, 9 March 2008 (UTC)
I propose to extend the {{ bots}} specification to allow easier restriction of particular bot types. This involves creating pseudo-usernames to be used in allow and deny parameters, for example, username "AWB" relates to all AWB-based bots (already supported), other bot framework names could include "pywikipedia", "perlwikipedia", "WikiAccess", etc. Additionally, we could classify bots by roles they perform: "interwiki", "recat", "fairuse", "antivandal", "notifier", "RETF", "AWB general fixes" and so on. For convenience, these roles should be case-insensitive. MaxSem( Han shot first!) 10:23, 12 March 2008 (UTC)
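To illustrate how a bot might honour such role pseudo-usernames, here is a rough Python sketch: the bot treats its username, framework name, and task roles as a set of identities and tests each against the allow/deny lists, case-insensitively. The parsing is deliberately simplified and is not the canonical {{bots}} grammar:

import re

def may_edit(page_text, identities):
    # identities might be e.g. {"MyBot", "pywikipedia", "interwiki"}
    ids = {i.lower() for i in identities}
    match = re.search(r"\{\{\s*bots\s*\|\s*(allow|deny)\s*=\s*([^}]*)\}\}",
                      page_text, re.IGNORECASE)
    if match is None:
        return True                      # no restriction template on the page
    kind = match.group(1).lower()
    listed = {name.strip().lower() for name in match.group(2).split(",")}
    hit = "all" in listed or bool(ids & listed)
    return hit if kind == "allow" else not hit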
I'm interested in developing my bot skills, particularly to running bots which operate on a continuous basis, rather than the more script-oriented bots I'm already operating. I'm looking for a more experienced bot coder/operator who can help me get to grips with the extra knowledge and tools required to operate continuously-running bots. Kind of an adopt-a-bot-owner system :D. I can work in C++ and VB, but all of my previous bot-coding experience has been in python. Anyone interested and willing to give me a hand? Happy‑melon 10:40, 18 March 2008 (UTC)
I use OS X Tiger and I'm trying to run bots for the Telugu Wikipedia. I downloaded the python framework from this page, and I created the user-config.py file which reads
mylang='te'
family='wikipedia'
usernames['wikipedia']['te']=u'Sai2020'
Sai2020 is my username. I open Terminal and type in python login.py
I get the error python: can't open file 'login.py'
Can someone help me please Σαι ( Talk) 12:19, 12 March 2008 (UTC)
That was the problem. Once I cd'd into the pywikipedia directory it worked, but I get a different error this time:
Sais-MacBook:~/Desktop/pywikipedia Sai$ python login.py
Traceback (most recent call last):
 File "login.py", line 49, in <module>
 import wikipedia, config
 File "/Users/Sai/Desktop/pywikipedia/wikipedia.py", line 127, in <module>
 import config, login
 File "/Users/Sai/Desktop/pywikipedia/config.py", line 364, in <module>
 execfile(_filename)
 File "./user-config.py", line 1
 {\rtf1\mac\ansicpg10000\cocoartf824\cocoasubrtf440
 ^
SyntaxError: unexpected character after line continuation character
What's going on? I'm not very good at this kind of stuff... Σαι ( Talk) 01:27, 13 March 2008 (UTC)
Thank you very much people. I can now login :) Σαι ( Talk) 08:57, 13 March 2008 (UTC)
Any chance of a bot that automatically reverts any blanked page? One may already exist, but if so I'm not familiar with it. I've been chasing a lot of blankings lately in my anti-vandalism crusade. Thanks either way. Jasynnash2 ( talk) 17:09, 14 March 2008 (UTC)
Is anyone having a problem with dotnetwikibot today? As of this morning, any attempt to FillAllFromCategory is not working. I changed nothing in my code, which was working fine yesterday.
I placed a query about this at the sourceforge.net dotnetwikibot framework forum, but it doesn't appear to get a lot of traffic.
Any help would be appreciated. -- Kbdank71 15:17, 20 March 2008 (UTC)
Please offer input there if you have any :). Mart inp23 19:33, 17 March 2008 (UTC)
Something changed in the format of history pages which broke pywikipedia's getVersionHistory. I've fixed it for my own needs, but heads up in case any other bots use this function. Gimmetrow 23:03, 10 March 2008 (UTC)
As a trial for the CorenANIBot, this page is now automatically archived into subpages when new sections are created. There is an automatically generated link to the right of the titles to edit or watch the subpages, allowing you to watch the individual threads.
Watching this page itself will allow you to see new threads.
Warn me if it breaks! — Coren (talk) 20:31, 21 March 2008 (UTC)
That's certainly interesting. I'm not sure whether I like it or not, but this is a good noticeboard to try it on. What are the perceived benefits? I can see 1) being able to watch individual threads and 2) a sort of 'instant archive', since they're already sorted by date. But I'm not sure how it would fit in with the archiving schemes currently in place at WP:AN, WP:ANI, etc, or what a newbie making their first post to WP:AN would make of a long list of page transclusions. Perhaps this system is best placed at boards which are frequented by regulars, like WP:ANI or WP:AN3RR. Happy‑ melon 10:46, 22 March 2008 (UTC)
I think this is a great idea for AN and ANI. Trying to watch for changes to any given thread there at the moment is rather impossible, especially on ANI. To address HappyMelon's concern, we'd just have to make it clear via some notices at the top not to try and edit the page itself and to use the edit and add section links. This could also be extremely useful for addressing vandalism attempts on those pages; the main pages themselves could be semiprotected or protected if necessary without shutting down discussions, the same could be done to the transcluded pages without disrupting other discussions.-- Dycedarg ж 22:24, 22 March 2008 (UTC)
Happy-melon: what do you mean the bot would hold the entire category tree in memory? There would be at most three categories to read: the list of pages forbidding all bots, the list permitting that particular bot, and the list forbidding that particular bot. This would mean (unless any of the lists is over 5000 entries long) only three HTTP queries, one time, to load the exclusions list. That's reasonable.
On the other hand, any system that requires an extra HTTP query for every edit that must be made is unreasonable because it is vastly inefficient. It would be possible to reduce the number of extra queries if you were just looking for page existence, but still every single bot.css file or whatever would have to be loaded, every time the bot wants to edit the corresponding page. That's far from ideal design. — Carl ( CBM · talk) 22:59, 9 March 2008 (UTC)
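As a concrete sketch of that three-fetch scheme, the exclusion lists could be loaded once at startup with list=categorymembers and consulted from memory thereafter. The category names below are made up for illustration, and the continuation plumbing follows the API's current format:

import requests

API = "https://en.wikipedia.org/w/api.php"

def category_members(category):
    # One paged query per category; each page of results is a single HTTP call.
    titles, params = [], {"action": "query", "list": "categorymembers",
                          "cmtitle": category, "cmlimit": "500",
                          "format": "json"}
    while True:
        data = requests.get(API, params=params).json()
        titles += [p["title"] for p in data["query"]["categorymembers"]]
        if "continue" not in data:
            return titles
        params.update(data["continue"])

# Hypothetical category names, loaded once when the bot starts:
no_bots_at_all = set(category_members("Category:Pages excluding all bots"))
deny_this_bot = set(category_members("Category:Pages excluding ExampleBot"))
allow_this_bot = set(category_members("Category:Pages allowing ExampleBot"))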
Just a note, {{ bots}} is only used on a handful of pages, so the point is moot. Rich Farmbrough, 20:19 7 September 2008 (GMT).
...just to see what happens to it under bot care... Franamax ( talk) 12:56, 22 March 2008 (UTC)
That was a little weird. First it didn't show up at all, then it showed as a redlink. I did a server purge and it showed up fine. Seems a little confusing, maybe I missed something? Franamax ( talk) 13:01, 22 March 2008 (UTC)
Clarify: first it was on the page normally (I could see it in edit page), then it vanished and redlinked. Perhaps the bot could leave a "reformatting" message? Also, is this a failsafe method? What if there's an edit conflict along the way? Keeping in mind that the only thing important to me is my post and I want to make sure it's there because to me it's the most important thing in the world. :) Franamax ( talk) 13:13, 22 March 2008 (UTC)
Do you have any good ideas on how to build consensus in discussions like this? Whenever an autonomous interwiki bot links the article Monoicous, the bot owner gets angry comments from the article maintainers. I have tried to explain how interwiki bots work and how we can solve the problem by correcting all the links manually, but the discussion always seems to drift toward “just fix the bots”... -- Silvonen ( talk) 04:19, 11 April 2008 (UTC)
...just in case anybody didn't notice. It would be much easier to pretend that the rewrite has consensus, and attempt to gain consensus for more radical kinds of change, if we could get more people commenting there.-- Dycedarg ж 20:31, 12 April 2008 (UTC)
Are we going with this? This is directed at ST47 and Carnildo mainly, as I don't think Beta would follow it without force. I'm sure this has already been talked about, but I stopped reading that debate a while ago. BJ Talk 15:25, 16 April 2008 (UTC)
What's it do, faced with a random 4th level header? SQL Query me! 10:55, 8 April 2008 (UTC)
Per WP:BOT -- Chris 12:45, 7 September 2008 (UTC)
Automated tools, such as Twinkle and CSDWarnBot, are leaving messages for users whose talk pages redirect to their user pages; these are usually banned users and sockpuppeteers. See this diff on User talk:Antidote for an example. This is pointless because the only way to read those messages is to go into edit mode. A more serious consequence of leaving messages on redirect pages is shown in the history of the same page. User:Yamanbaiia notified Antidote about the deletion of Image:EKusturica.jpg with Twinkle. After the image was deleted, User:Mushroom deleted User talk:Antidote, apparently because Twinkle thought that the page was a redirect to Image:EKusturica.jpg, which it absolutely was not.
Automated tools should therefore take redirected user talk pages into account. They should either remove the redirect while posting the message or avoid posting to the talk page altogether. Another solution to this problem is to protect redirected user talk pages so non-admins can't add to them, but the policy on protecting user talk pages currently doesn't allow that.
I would also be interested if any other user talk pages have been deleted in this way. I had encountered User:Antidote before and wanted to read through his user talk page to understand why he was banned. If I was a non-admin, this would have been impossible for me without assistance. Graham 87 12:58, 9 September 2008 (UTC)
There is also the case where a user has been renamed and the user_talk page of the old one redirects to the new name. This is also commonly done for publicly declared socks or bot accounts where the owner wants all inquiries to go to the "main" talk page.
If the target of the redirect is another user_talk page, an intelligent bot/script/tool would assume "this is probably another name of the same user" and follow the redirect before posting the message. Otherwise it should abort the message and alert the operator. In no case should it overwrite the redirect or post below it. — CharlotteWebb 16:28, 9 September 2008 (UTC)
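A sketch of that rule in Python, using the API's redirect resolution (the &redirects parameter is real API behaviour; the rest, including the hard-coded "User talk:" prefix, is an assumption for illustration):

import requests

API = "https://en.wikipedia.org/w/api.php"

def talk_page_to_post_to(title):
    # Returns the page the message should go to, or None to abort and
    # alert the operator, per the rule described above.
    data = requests.get(API, params={
        "action": "query", "titles": title,
        "redirects": "1", "format": "json"}).json()
    redirects = data["query"].get("redirects", [])
    if not redirects:
        return title                     # not a redirect: post normally
    target = redirects[0]["to"]
    if target.startswith("User talk:"):
        return target                    # probably the same user, renamed
    return None                          # redirect elsewhere: abort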
Actually another part of the problem is that according to whatlinkshere for the image, User_talk:Antidote did contain a link to the deleted page, and was listed as a redirect. However 2 + 2 did not equal 4 in this case as it did not "redirect to a deleted page". This is probably a bug in itself. If a page consists of:
#redirect [[Foo]]

== XYZ ==
[[Bar]]
...it should be listed as a "(redirect)" only in Special:Whatlinkshere/Foo, not in Special:Whatlinkshere/Bar. — CharlotteWebb 16:41, 9 September 2008 (UTC)
A change in r40621 changed the behavior of Special:UserLogin for successful logins. The 'login successful' message is no longer displayed. Instead, on successful login, you are redirected to the page you came from (Main Page by default). This change may break bots that rely on the "login successful" text to detect successful logins. My initial reading and testing says you will get an HTTP 302 code on successful login and an HTTP 200 code on login failure, when you load the wpCookieCheck page. — Carl ( CBM · talk) 13:45, 21 September 2008 (UTC)
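In practical terms, a bot can branch on the status code instead of scraping the message. A sketch with Python's requests library; the exact URL and parameter layout of the cookie-check step is illustrative:

import requests

def login_succeeded(session):
    # 'session' has just submitted the login form and holds its cookies.
    response = session.get("https://en.wikipedia.org/w/index.php",
                           params={"title": "Special:UserLogin",
                                   "wpCookieCheck": "login"},
                           allow_redirects=False)
    return response.status_code == 302   # 302 = success, 200 = failure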
Brion has just announced on the wikitech-l list that he's planning to (finally) change the canonical name of the "Image:" namespace to "File:". This is likely to affect a lot of bots that deal with images, as well as other bots that just want to tell if a title is an image or not. Please note that, if everything goes as planned, we'll only have about one week to fix our bots before the change goes live. More details in Brion's post and at bugzilla:44. — Ilmari Karonen ( talk) 02:35, 7 October 2008 (UTC)
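One robust way to survive the rename is to stop hard-coding "Image:" and ask the wiki for its namespace names and aliases via meta=siteinfo (a real API module; the sketch below assumes the old JSON layout where the name sits under the "*" key):

import requests

API = "https://en.wikipedia.org/w/api.php"
NS_FILE = 6

def file_namespace_names():
    data = requests.get(API, params={
        "action": "query", "meta": "siteinfo",
        "siprop": "namespaces|namespacealiases", "format": "json"}).json()
    names = set()
    for ns in data["query"]["namespaces"].values():
        if ns.get("id") == NS_FILE:
            names.add(ns["*"])           # local name, e.g. "File"
    for alias in data["query"].get("namespacealiases", []):
        if alias["id"] == NS_FILE:
            names.add(alias["*"])        # aliases, e.g. "Image" after the change
    return names

def is_file_title(title):
    namespace, _, _ = title.partition(":")
    return namespace in file_namespace_names()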
Just looking through Wikipedia:Bots/Requests_for_approval/Approved, I came across a request for Kaspobot. This request was withdrawn. It seems not only to be categorized incorrectly in the approved section, but the account also appears to have been flagged as a bot? Maybe I am missing something? Matthew Yeager 06:50, 27 October 2008 (UTC)
See bugzilla:4253: to reduce the likelihood of edits to pages with long titles exceeding the 512 byte limit of IRC messages, rev:42695 modifies the message format so that the diff URL no longer repeats the page title. This may affect some bots that follow the IRC feed: if your code treats the URL as an opaque string or only uses it to extract the revision IDs, everything should be fine, but any code that expects to find the page title in the URL should be changed before the new revision goes live (which, as usual, will take an unpredictable amount of time from minutes to months). — Ilmari Karonen ( talk) 23:46, 27 October 2008 (UTC)
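For code that follows the IRC feed, the safe pattern is to pull the revision IDs out of the URL and treat everything else as opaque. A small sketch that should work for both the old and new message formats:

import re

def revision_ids(diff_url):
    # Returns (diff, oldid) as ints, or None where a parameter is absent.
    diff = re.search(r"[?&]diff=(\d+)", diff_url)
    oldid = re.search(r"[?&]oldid=(\d+)", diff_url)
    return (int(diff.group(1)) if diff else None,
            int(oldid.group(1)) if oldid else None)

# e.g. revision_ids("http://en.wikipedia.org/w/index.php?diff=12345&oldid=12340")
# returns (12345, 12340)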
Hello,
I have a question regarding ArticleAlertbot, for which I write the code. The bot reads articles from certain categories, e.g. Category:Articles for deletion, and notifies the corresponding WikiProjects, identified by their project tags on the talk pages. (See User:B. Wolterding/Article alerts for details.)
Several users have asked me whether I could also include WP:DRV in this alert system. Of course there is an apparent problem: for most articles on DRV, the talk page is deleted, so the bot can't find out which projects they correspond to. Of course, this information is in principle contained in the wiki, since deleted pages are not physically removed from the database, but de facto the bot has no access to it.
If, however, the bot had admin rights, it should be feasible to retrieve the deleted talk pages and scan them for WikiProject banners. (I haven't checked the API details.) On the other hand, it seems a bit excessive to flag the bot as an adminbot only for this little detail feature. Would you think that such functionality could pass WP:BRFA via the new "adminbot approval" process? (Of course one would need an admin willing to run the bot.) Or is it possible to grant the bot restricted admin rights, in some sense, so that it could read deleted revisions? Do you have any other suggestions for this problem? -- B. Wolterding ( talk) 17:59, 2 November 2008 (UTC)
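For reference, the API of that era exposed deleted page text through list=deletedrevs, gated on the deletedhistory right. A hedged sketch of the lookup the bot would need; parameter names and the response shape should be checked against the live API docs:

import requests

API = "https://en.wikipedia.org/w/api.php"

def deleted_talk_wikitext(session, article_title):
    # 'session' must be logged in to an account with the deletedhistory right.
    data = session.get(API, params={
        "action": "query", "list": "deletedrevs",
        "titles": "Talk:" + article_title,
        "drprop": "content", "drlimit": "1", "format": "json"}).json()
    pages = data["query"].get("deletedrevs", [])
    if not pages or not pages[0].get("revisions"):
        return None
    return pages[0]["revisions"][0].get("*")   # wikitext of the latest deleted revision

The returned wikitext could then be scanned for WikiProject banners exactly as the bot already does for live talk pages.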
Is there somewhere other than the BRFA where I can find out what this bot is doing?
I'm sure the intentions are good, but on balance the results are not. What's happening here? Franamax ( talk) 11:44, 13 November 2008 (UTC)
I agree with Franamax: the idea to subst existing {{ unsigned}} has been rejected many times in the past. Thousands of extra edits put much more stress on the servers than the transclusion of templates that are almost never changed. Besides, linking to WP:SUBST is simply misleading because the page does not say that {{ unsigned}} should be used substed, let alone replaced after it's been used. — AlexSm 19:35, 13 November 2008 (UTC)
If you're going to do anything, you should do a replacement based on Special:ExpandTemplates rather than a regular subst, in order to strip out any html comments and needless branching of parser functions. Try it with a complex template and you'll see what I mean. Eventually you would want to ask Brion to enable the SubstAll extension. — CharlotteWebb 21:04, 13 November 2008 (UTC)
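The same expansion is available to bots through the API's expandtemplates module, so the replacement text can be fetched without scraping the special page. A sketch; prop=wikitext is the current API's name for the result field, so older installations may differ:

import requests

API = "https://en.wikipedia.org/w/api.php"

def expanded(wikitext, page_title=None):
    params = {"action": "expandtemplates", "text": wikitext,
              "prop": "wikitext", "format": "json"}
    if page_title:
        params["title"] = page_title     # so {{PAGENAME}} etc. expand correctly
    data = requests.get(API, params=params).json()
    return data["expandtemplates"]["wikitext"]

# e.g. expanded("{{unsigned|Example|12:00, 1 January 2008 (UTC)}}")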
I'm interested in creating my own bot, and having it run continuously, maybe based on the recent changes feed so that it detects any edits made. Can anyone explain to me how continuously-running bots work, or point me to some technical documentation or examples of bots that run continuously? Thanks much! Redalert200 ( talk) 17:36, 13 November 2008 (UTC)
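One common shape for such a bot (a sketch only, not a recommendation of any particular framework): poll the API's list=recentchanges on a timer and hand anything new to a callback, remembering the last timestamp seen.

import time
import requests

API = "https://en.wikipedia.org/w/api.php"

def watch_recent_changes(handle, interval=30):
    last_seen = None
    while True:
        params = {"action": "query", "list": "recentchanges",
                  "rcprop": "title|ids|timestamp", "rclimit": "50",
                  "rcdir": "newer", "format": "json"}
        if last_seen:
            params["rcstart"] = last_seen   # resume; may re-deliver the boundary change
        changes = requests.get(API, params=params).json()["query"]["recentchanges"]
        for change in changes:
            handle(change)
            last_seen = change["timestamp"]
        time.sleep(interval)

A real bot would add deduplication at the timestamp boundary, error handling, and a polite back-off when the API is slow.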
Looks like User:ImageRemovalBot is buggy. While the images are available (see this version), User:ImageRemovalBot removed the links here and here. The images are still available. I have left a note for the operator, User:Carnildo. Any idea what's wrong? -- Tinu Cherian - 11:35, 17 November 2008 (UTC)
" :The images are lacking an image description page, and the Wikipedia API says the images are missing. It appears to be related to Wikipedia
bug #15430. Since Wikipedia insists that the images are missing, there's not really anything I can do about it. You could try editing the image description page to add proper license and source information, and see if that fixes things. --
Carnildo (
talk) 20:47, 17 November 2008 (UTC) " ( copied the explaination of the bot operator from his talk page.)
Done: The issue was also that somehow the image lost its description when it was automatically moved to Commons from en.wiki. Issue solved after adding the description back. Thanks -- Tinu Cherian - 12:50, 4 December 2008 (UTC)
Not sure if this is the right place. How do you get a bot stopped? See the actions of User:OKBot and comments at User talk:OsamaK. Traveler100 ( talk) 19:57, 25 November 2008 (UTC)
Since archive 497 was started, we seem to be having only one thread per archive. I assume that this is a bot problem. Is this where I should raise it? -- Peter cohen ( talk) 21:15, 4 December 2008 (UTC)
I have been nominated for BAG, so per instructions I am posting this to invite comments at Wikipedia:Bot Approvals Group/nominations/Anomie. Thanks. Anomie ⚔ 03:13, 6 December 2008 (UTC)
I've built a test bot. It logs in, grabs my user page and tries to append some text to it, but it can't; it gets an "IP address blocked" type message (open proxy). This is, I think, because the IP address of the server it's on is similar to the one quoted at me by the block message. Still, I shouldn't get this message; I'm logged in (I can edit my user talk page just fine!), so what's up?
Any help appreciated. Jarry1250 ( talk) 21:37, 5 January 2009 (UTC)
This is a notification to all interested parties that I have accepted a nomination to join the Bot Approvals Group - the above link should take you to the discussion. Apologies for the delay getting this notice out, but I've been busy over the holidays etc. Best wishes, Fritzpoll ( talk) 10:26, 8 January 2009 (UTC)
There are issues with the ClueBot false positives URL ( http://24.40.131.153/cluebot.php) today. Anyone else experiencing issues connecting to the site? Thanks, Willking1979 ( talk) 15:44, 11 January 2009 (UTC)
Hi,
there's currently a discussion at User talk:B. Wolterding/Article alerts#Subscription parameter request about whether bot edits to project banners can cause performance issues. More precisely, User:ArticleAlertbot produces report pages that it updates on a daily basis, and some users wish to transclude these into project banners. The question is whether this may overload the job queue, and so whether these transclusions should be allowed or avoided, or maybe even prevented on the technical side. Input by some uninvolved experts would be appreciated. -- B. Wolterding ( talk) 00:37, 12 January 2009 (UTC)
This is a notification to all interested parties that I have accepted a nomination to join the Bot Approvals Group - the above link should take you to the discussion. Foxy Loxy Pounce! 03:30, 25 January 2009 (UTC)
A dab page keeps getting hit by bots adding interwiki links. [5] I think that once a particular edit or style of edit, such as adding all of the interwiki links to a page, has been made, future bots ought to just leave the page alone without the need to add a nobots template. -- KP Botany ( talk) 10:07, 25 January 2009 (UTC)
I have created a {{ nobots}} implementation function in 2 languages, to make it easy for non-compliant bots to comply with nobots. You can see the 2 functions (in PHP and Perl) at Template:Nobots#Example implementations. Hope this helps! X clamation point 21:23, 29 January 2009 (UTC)
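For completeness, a rough Python equivalent of the same exclusion check, to sit alongside the PHP and Perl versions. This is a simplified sketch; the canonical patterns live at Template:Nobots and handle more cases than shown here:

import re

def bot_may_edit(page_text, bot_name):
    if re.search(r"\{\{\s*nobots\s*\}\}", page_text):
        return False                     # {{nobots}} bans all bots
    match = re.search(r"\{\{\s*bots\s*\|\s*(allow|deny)\s*=\s*([^}]*)\}\}",
                      page_text)
    if match is None:
        return True                      # plain {{bots}} or no template: allowed
    kind = match.group(1)
    names = {n.strip() for n in match.group(2).split(",")}
    allowed = "all" in names or bot_name in names
    return allowed if kind == "allow" else not allowed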
I've released the first "official version" of my Python bot framework if anyone's interested. It uses the API exclusively for getting data and editing; I currently use it for all my bots and scripts. Downloads are available in .tar.gz and .zip archives with a setup.py script as well as a Windows installer. Mr. Z-man 06:33, 31 January 2009 (UTC)
It appears that SoxBot X is not working correctly. It also appears that the operator does not respond to messages posted on the bot's user page. Please see the following post and do something to fix the problem - User_talk:SoxBot_X#Not_archiving_properly - Thank you
P.S. The "not responding to posts" comment does not relate to this post (as it is a very recent post) but to the previous posts over the last several months, to which there have been no responses. Dbiel ( Talk) 03:35, 5 February 2009 (UTC)
As east718 has stated that he is no longer working on this, would any Python-competent admin like to take up the reins and continue with what seems to be a very useful bot task? (ask east for the code if you do) If not, the task can be marked as withdrawn/expired. Richard 0612 18:05, 6 February 2009 (UTC)
User:LilHelpa claims not to use a bot for reverting, but this mistake seems so blatant that it appears likely to have been made in an automated manner. I think this should be investigated. TubeToner ( talk) 23:23, 7 February 2009 (UTC)
There's a big backlog at WP:BLP/N, and the bot is archiving four-day-old threads that haven't been resolved. THF ( talk) 13:55, 14 February 2009 (UTC)
May I tentatively suggest that some consultation is required about the variety of PHP frameworks currently available. I know of many that exist; they range from the bad (e.g. my own) to the very, very good, and yet we do not have even a half-complete list of frameworks! This contrasts with other languages, such as perl, which have a handful of well-documented, well-supported solutions.
Whilst I admit that it is naive to assume that all the existing PHP bots could be made to work from the same framework, I think a possible starting point for helping newcomers develop bots in PHP might be to draw up a comprehensive list of solutions that worked with the present API. Then they could at least choose the one that best suited their needs.
The ones I know of:
I know the above list is pretty shoddy, so please correct the glaring mistakes with it. - Jarry1250 ( t, c) 16:15, 14 February 2009 (UTC)
The instructions and documentation that currently exist at Requests for bot approval are very out-of-date and, in some cases, confusing. The 'Organisation' section was virtually identical to its current state way back in 2006; practices have moved on by light years since then. I propose that something like this be implemented to make the process more accessible and the approvals page less cluttered. I have also added brief guidance for BAG members on how to close BRFAs (based on SQL's How to close a BRFA). All the essential information is still there, just in a different, more user-friendly format.
Comments, suggestions, ideas?
P.S. I know that the colour scheme isn't particularly brilliant, I chose colours that wouldn't clash with/obscure the links. If anyone with more... aesthetic abilities than me has a better colour scheme: by all means implement it! Richard 0612 17:51, 14 February 2009 (UTC)
Actually done, it does look very good! Nice job Richard. IMHO the TOC should go left of the table, but I'm not really bothered so I'll leave it. - Jarry1250 ( t, c) 09:24, 16 February 2009 (UTC)
I have accepted a nomination to join the Bot Approvals Group - relevant discussion is just a click on the link above away. - Jarry1250 ( t, c) 20:56, 16 February 2009 (UTC)
Is there a(n easier) way to get the number of transclusions a template has?
My php-based bot could load a webpage, so that's not out of the question if a suitable web page does exist. (I did ask on the WP help desk, but I think the people watching this page will be better/quicker at making suggestions.) If worst comes to worst, it could include links to the template as well; redirects should be included (if this is possible). At the moment, you see, I'm having to resort to loading a list of transclusions, and then counting them, which is pretty resource intensive. Cheers! - Jarry1250 ( t, c) 17:13, 31 January 2009 (UTC)
SmackBot ( BRFA · contribs · actions log · block log · flag log · user rights) is an ambitious effort to make detailed corrections to tags and text of articles. It performs useful tasks well. However, in its current state, SmackBot fails the do-no-harm test. It needs testing on degenerate cases, and a more complete understanding of the grammar rules of the tags it edits.
The critical failure with SmackBot is that it doesn't fail-safe when it encounters an unexpected condition. Before SmackBot is reactivated, it must demonstrate that it does no harm in degenerate test cases. In the software development sense, SmackBot needs Verification and Validation quality control.
I've communicated my concern to SmackBot's developer/owner. He's working on a problem with damage to tags. The larger problem is fail-safe and do no harm.
-- Mtd2006 ( talk) 06:03, 18 February 2009 (UTC)
This overview explains how SmackBot works. The bot leverages AWB to edit articles. It combines WP templates with exception rules in an automated process to generate its ruleset, a list of edit rules for AWB. Rich has said the ruleset is generated from "over a thousand templates, each of which can be formatted in hundreds of ways, it is necessary to canonicalise templates to make the problem tractable." This is another way of saying that SmackBot is complex and that it makes arbitrary assumptions to simplify its ruleset. Arbitrary choices in software design are the opposite of a deterministic algorithm. Arbitrary decision making implies unpredictable failure modes. SmackBot requires Verification and Validation to prove it does no harm; see WP:BOTPOL#Bot_requirements.
Because of SmackBot's complexity and the potential for damage to live articles caused by unpredictable failure modes, I propose some guidelines for its use.
A test standard should apply to all bots.
This warning, {{ intricate template}}, is found on complex templates; templates are, in effect, complex software. As the template warning says, a flaw in the template can appear on a large number of articles. The nice thing about templates is that if a flaw is repaired, the errors are automatically reverted. This is not the case with SmackBot. Flaws in SmackBot are permanently applied until reverted. Because SmackBot's ruleset is generated from many templates, it's vulnerable to errors and conflicts in them. These problems are "fixed" in SmackBot with exception rules. However, when a problem is fixed in SmackBot, its incorrect edits are not automatically reverted. They remain until a human editor or an automated process repairs them.
While there are applications for non-deterministic algorithms, a bot should be strictly deterministic. That is, a bot must do what it's designed to do without side effects. The tasks that a bot performs are its design specification. SmackBot's design specification is User:SmackBot#Tasks. One of SmackBot's design specs (or tasks) is to add missing dates or repair incorrect dates in various tags, e.g., {{fact}} to {{fact|date=February 2009}}. However, when SmackBot fixes a date in an article, it also makes trivial changes. AWB users are cautioned not to make trivial edits.
SmackBot makes trivial edits because it does not conform to its own design specification. There is no SmackBot task that says "change the case of every tag to upper case". Human editors are not allowed to hide substantive changes in a long set of trivial edits (changing case, removing blanks or new lines, etc.) Humans are cautioned for not using a sandbox for experiments and tests. A bot must follow the same rules because it may perform these disruptive edits to thousands of articles. A bot can waste resources, be disruptive and cause damage much faster than the fastest human editor.
SmackBot problems:
SmackBot's bugs can be fixed, up to a point. SmackBot's owner is responsive to trouble reports. But by the time a bug is reported, SmackBot has already made changes that are difficult to find and hard to revert, in both cases because it makes so many trivial edits. Beyond this, the most critical flaw of SmackBot is its failure to be harmless.
Mtd2006 ( talk) 03:34, 19 February 2009 (UTC)
Just an FYI for bot owners/writers. The recent squid rollout changed the behavior when encountering Expect headers. So you either have to be sure not to send them, or correctly handle receiving an 'HTTP/1.0 417 Expectation failed' response instead of the 100 Continue. Q T C 22:22, 18 February 2009 (UTC)
Hi all,
I have put in a request at Bot Requests for Approval for my bot, Thehelpfulbot, to be able to use pywikipedia's redirect.py script to delete broken redirects. pywikipedia has been extensively tested, and the bot has already been speedily approved for using the same script to fix double redirects. As far as I can tell, no other bot is running this task, as User:RedirectCleanupBot has not been in use since WJBscribe left Wikipedia. This bot will require the admin flag to run this task, which is why I am posting on this board - to let you know about the bot.
If you wish to comment, please do so at Wikipedia:Bots/Requests for approval/Thehelpfulbot 5.
Thanks,
The Helpful One 20:45, 19 February 2009 (UTC)
Hi again all,
Thehelpfulbot now has another request, this time using pywikipedia's nowcommons.py script to delete local images that are also on Commons. You can have a look at the code if you wish, in the pywikipedia library here.
This task will also require the admin flag to run, which is why I am posting on this board again, to let you know about the second admin bot task.
If you wish to comment, please do so at Wikipedia:Bots/Requests for approval/Thehelpfulbot 6.
Thanks,
The Helpful One 20:45, 19 February 2009 (UTC)
Just a quick notice to inform you that I have posted a proposal at the Proposals Village Pump regarding giving BAG the bot-flagging right. Comments, questions, trouts, etc welcome there! Richard 0612 11:11, 20 February 2009 (UTC)
Could another bot operator voice an opinion here please?-- Rockfang ( talk) 16:41, 23 February 2009 (UTC)
Just to let everyone know that my BAG nomination was successful one - so yes, I am now a member of the BAG! Thanks to everyone who showed their support, I hope to now show that your trust was correctly placed. However, in the unlikely event that I do get something wrong - however small - I hope you all will put me right ASAP ;) - Jarry1250 ( t, c) 20:31, 23 February 2009 (UTC)
A template on this page (the header I think) is putting this page into the Intricate Templates category. Is there a way to fix this seeing as how this page itself isn't a template? :) Rockfang ( talk) 20:41, 23 February 2009 (UTC)
In my opinion, Category: Wikipedia bots could do with some substantial re-organisation. Rather than doing it all bit-by-bit, I think it might be best to start with an adventurous design, which then gets modified until consensus is reached. To reiterate, feasibility was not considered when drawing up this design. At the moment, we have this:
The category system, as it is. "An effing mess."
I would propose a system more like this:
My grand plan
These may just have to be left as-is/renamed:
I know there are bots which rely on these categories, so it would be good to get everyone's view on this. Another obstacle may lie with the {{ Bot}} template, which is compulsory for all bots, because it adds all bots to "Wikipedia bots". With some tweaking, however, it too could become a useful tool in the categorisation process - simply asking for a status and a purpose would help enormously. Also, that reminds me - if this were to be implemented, we'd need to work out what to do with bots with many different tasks (when it came down to "purpose") - multiple progress categories per bot, perhaps? Anyhow, let's see how far we can get.
- Jarry1250 ( t, c) 20:18, 18 February 2009 (UTC)
If anyone has any worries/criticisms - however minor - please shout; I'm about to contact some experts on category naming to check the exact wording of the categories and to see how we can move this along. (I'm sorry, that's just my way - I hate doing nothing when something can be done.) - Jarry1250 ( t, c) 14:40, 20 February 2009 (UTC)
[Diagram: revised grand plan.]
These may just have to be left as-is or renamed. [List not preserved.]
Something that does need doing is generating a list of templates that categorise into these categories, so that if this ever passes they can be updated. Q T C 05:08, 23 February 2009 (UTC)
I'll check out the rest when I get the chance. - Jarry1250 ( t, c) 07:50, 23 February 2009 (UTC)
brfa parameter? ;) Other than that, it looks ok to me. Anomie⚔ 02:34, 25 February 2009 (UTC)
Recently, a change to the Wikimedia servers has caused many bots to break. To fix it, you have to tell your bot not to expect an HTTP "100 Continue" response.
In PHP: curl_setopt($this->ch, CURLOPT_HTTPHEADER, array( 'Expect:' ) );
In VB.NET: ServicePointManager.Expect100Continue = False
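In Python, a quick note: plain urllib/httplib doesn't send an Expect header in the first place, so pure-Python frameworks should be unaffected; but if your bot drives libcurl through pycurl, the equivalent of the PHP fix above should look something like this (a sketch, untested against any particular framework):

 import pycurl

 c = pycurl.Curl()
 # Sending an empty Expect header stops libcurl from adding
 # "Expect: 100-continue" to POST requests.
 c.setopt(pycurl.HTTPHEADER, ['Expect:'])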
X clamation point 17:00, 25 February 2009 (UTC)
I've started doing the most obvious changes to the category system. Apologies for any short-term inconvenience caused. Willing helpers (especially admins for the templates) are of course welcome to help. - Jarry1250 ( t, c) 17:43, 26 February 2009 (UTC)
I'm sure I remember reading that commented-out interwikis aren't re-added. Is this correct? Rich Farmbrough, 20:20 7 September 2008 (GMT).
Archive bot doesn't like your stupid timestamp ST47 ( talk) 02:12, 4 March 2009 (UTC)
If someone has off-wiki contact with Cobi, can you inform him that ClueBot's false-positive reporting page is offline (and seems to have been offline, at least periodically, since last November)? Not urgent, but worth noting. I'll leave a note on his talk page as well, but I assume that if he'd been on-wiki he'd have noticed this already. -- Ludwigs2 06:43, 28 February 2009 (UTC)
While I have been trying to apply the comments interested parties have left over the past fortnight broadly, and to be as bold as I can, there are a couple of outstanding proposals that will need some thought.
That's quite a long list, I know, but it demonstrates the possibilities that are starting to open up.
- Jarry1250 ( t, c) 11:02, 28 February 2009 (UTC)
One particular user, ST47, has a bot that has received several complaints about how it delivers speedy-deletion warnings to page authors. For example, I placed a speedy-deletion tag on an article with a clearly misspelled title: the newly created Zhao'an Country, versus Zhao'an County, the correctly spelled name, where an article already existed.
After placing a speedy-deletion tag on the article, I went to the author's talk page ( User talk:Isatcn) to notify him of the discrepancy and to request that he place a {{db-author}} tag in the article, to avoid any confusion about why it was tagged for speedy deletion. Before I could finish typing my message, CSDWarnBot placed one of its own on his talk page. The user continued to edit the existing article, and attempted to place a {{hangon}} tag. This was likely because he failed to see my short, succinct message due to the larger message by CSDWarnBot, with all its bells and whistles (image graphic, bolding, two paragraphs, etc.). See here.
One administrator failed to see what the problem with the article was (not realizing the typo in the title) and left the article intact, even fixing the malformed {{hangon}}.
I eventually left another message on the talk page of the article. The author finally realized what he had done and placed the appropriate {{db-author}} tag. However, a process that should have taken me less than 5 minutes ended up taking me 4-5 times that long.
I also filed a complaint on ST47's talk page here. It was then that I learned there have been several other complaints about the very same behavior over the last few months, as well as about other disruptive/problematic actions by the bot (see User_talk:CSDWarnBot). While I understand that bots can serve a beneficial purpose, a bot's usefulness is negated when it creates extra effort and confusing communication for the authors of pages tagged for speedy deletion, many of whom are new users. ++ Arx Fortis ( talk) 18:18, 1 March 2009 (UTC)
ST47, can you stop being so awkward? This is the place to resolve this, not your talk page. As for your refusing to fix your bot: in light of that, I have to agree with others that the bot should just be blocked until such time as you (or someone else) are ready to fix it. To be honest, I really couldn't care less if you end up not seeing this because you couldn't be bothered to watchlist this page. Thanks, Spitfire Tally-ho! 19:22, 4 March 2009 (UTC)
So yes, shut down the bot until ST47 agrees to fix it.
Incidentally, while I lack the coding ability to implement such an idea, I suspect that the simplest solution would be for the bot to run once every ten minutes and post talk page messages pertaining to deletion warnings discovered during the previous run. That would ensure a delay of 10–20 minutes (instead of the current range of 0–15 minutes). — David Levy 21:33, 4 March 2009 (UTC)
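For illustration, David Levy's two-pass scheme is only a few lines around whatever the bot already does. Here is a rough Python sketch, where find_new_csd_taggings() and leave_warning() are hypothetical stand-ins for the bot's own code:

 import time

 def find_new_csd_taggings():
     # Hypothetical: scan CAT:CSD (or recent changes) and return
     # (page, author) pairs tagged since the last pass.
     return []

 def leave_warning(page, author):
     # Hypothetical: post the warning to the author's talk page.
     pass

 pending = []  # taggings found on the previous pass
 while True:
     for page, author in pending:
         leave_warning(page, author)    # delivered one pass late
     pending = find_new_csd_taggings()  # queue this pass's discoveries
     time.sleep(600)                    # one pass every ten minutes

Because a tagging is only ever delivered on the pass after it is found, the author always gets between ten and twenty minutes before the bot posts.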
Being bold and blocking the bot until this is sorted out. "I really don't feel like bothering to go figure out how to do it again" is not a very helpful attitude. If ST47 can't be bothered fixing it now, we'll wait until he can (or until one of you computer buffs does). yandman 08:33, 5 March 2009 (UTC)
When I submitted a BRFA last week, Anomie looked over my code and made a good number of helpful suggestions that I implemented before the code went live. It then set me off on a code-writing spree and I have written a new PHP bot framework from scratch. I think the code is clearer and easier to use than most of the others that are currently available, and I have made an effort to document the code in phpdoc format.
Obviously it is not finished yet and doubtless there is significant functionality that could be added. If you want to contribute to it or give me ideas that I could implement, I'd be delighted.
The Google code site is http://pillar.googlecode.com/
Generated code documentation (along with highlighted and cross-referenced code) is available at http://toolserver.org/~samkorn/pillar/doc/index.html
I have converted my cricket bot ( BRFA) to this code: the converted version (20% smaller than the original!) is available at http://toolserver.org/~samkorn/pillar/example/
Comments/flames/trouts welcomed!
[[Sam Korn]] (smoddy) 22:17, 4 March 2009 (UTC)
I have raised the possibility of a bot which scans database dumps looking for blue links to absent sections in actual articles, e.g. George W. Bush#Olympic medals. I was advised to advertise it here and in WP:Bot requests#Broken section links, but please add any comments to the main discussion at WP:Village pump (proposals)#Broken section links. Certes ( talk) 20:11, 8 March 2009 (UTC)
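For anyone weighing feasibility: the scan is essentially a two-pass job over the dump. A rough Python sketch, assuming (hypothetically) that the dump has already been extracted to one plain-wikitext file per article in a directory, and ignoring anchor normalisation (underscores, {{anchor}} templates, HTML ids):

 import os
 import re

 DUMP_DIR = 'dump_text'  # hypothetical extraction directory
 HEADING = re.compile(r'^=+\s*(.*?)\s*=+\s*$', re.M)
 SECTION_LINK = re.compile(r'\[\[([^|\]#]+)#([^|\]]+)')

 # Pass 1: collect the section headings of every article.
 headings = {}
 for name in os.listdir(DUMP_DIR):
     text = open(os.path.join(DUMP_DIR, name)).read()
     headings[name] = set(m.group(1) for m in HEADING.finditer(text))

 # Pass 2: flag section links whose anchor matches no heading.
 for name in os.listdir(DUMP_DIR):
     text = open(os.path.join(DUMP_DIR, name)).read()
     for target, anchor in SECTION_LINK.findall(text):
         if target in headings and anchor not in headings[target]:
             print '%s links to missing section %s#%s' % (name, target, anchor)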
Xqbot removes links when it shouldn't: http://en.wikipedia.org/?title=City-Bahn&diff=275931839&oldid=266407217
It looks like I'm not the only one:
http://de.wikipedia.org/wiki/Benutzer_Diskussion:Xqt#xqbot
Please stop xqbot FengRail ( talk) 00:55, 9 March 2009 (UTC)
OK, this needs to be shut down. First, there is no consensus on what constitutes an orphan at Wikipedia talk:WikiProject Orphanage. Secondly, Wikipedia:Orphan is neither a policy nor a guideline; thus the task seems to fail the requirements for a bot. Along those lines, since there is no policy or guideline, it should be up to human editors to decide whether one, two, three, or four incoming links are enough. But no: if you remove the tag with only two, the bot re-adds it. Aboutmovies ( talk) 20:22, 3 March 2009 (UTC)
Per the bullet points above, where was consensus for Addbot's addition of 114k orphan tags established? Everywhere I've looked, I've found no consensus about the posting of orphan tags. -- Tagishsimon (talk) 20:16, 4 March 2009 (UTC)
Please do not insert gibberish like {{Sam1649}} into the deletion log (or any log), especially if there's a high likelihood that the general public will read the log summaries (like the deletion summaries of articles). "Robot:" is sufficiently clear for bot trials. Thanks! -- MZMcBride ( talk) 04:15, 11 March 2009 (UTC)
The {{ Sam1649}} template is there so that a database query can be run to show which deletions on my admin account were made by my bot. It lets me differentiate my own deletions from the bot's. Note: this is only for the trial, not for the real bot edits.
FWIW, should probably leave AntiAbuseBot running until all the filters can be worked out. Q T C 23:37, 17 March 2009 (UTC)
Please run m:reflinks.py on eBay with pywikipediabot. Thanks. Amir ( talk) 16:59, 18 March 2009 (UTC)
This looks very suspicious, and many of the edits appear to be corrupted, adding the surrounding boilerplate text but without the actual data. Is there an approval for this (in which case which BAG member do we need to trout?) or do we need some corrective action...? Happy‑ melon 21:43, 23 March 2009 (UTC)
If you look, for example, in the article Chiton, you'll find, in the wikitext of the "General anatomy" section, a footnote that reads, in its entirety:
That footnote uses the source information found in Template:Cite doi/10.1002.2Fhlca.200390096, a page created by User:Citation bot. That source information (to continue the example) is this:
Now the reader of the article will see a footnote with the expected information.
What we have here is a system where (a) the footnote in the wikitext has different information than what shows in the "References" section; (b) someone wanting to improve the footnote needs to understand that he/she has to edit the template; and (c) template space gets populated with a page whose sole purpose is to insert information into (usually, as in this case) a single article.
Here's the cite doi templates that exist so far. I'm guessing that all this is so that editors can just let User:Citation bot create a footnote by providing its doi. My first question is why Citation bot needs to do this via a template, rather than (say) simply overwriting a redlink in the article itself, or something else that doesn't add yet another layer of complexity to Wikipedia articles?
Also, I don't see (perhaps I missed it) approval for this system, in any of the following three pages. (I do see a mention of DOI bot editing cite doi pages, but not creating them.) Thus my second question: If this was approved, where was it discussed?
-- John Broughton (♫♫) 18:49, 25 March 2009 (UTC)
Yuck, Get It Out Of Here. This brings back nasty echoes from the past. Happy‑ melon 21:29, 26 March 2009 (UTC)
Reading back through the approvals, they don't match my memory of them, so I'm not going to adamantly say that the task was approved. Consensus should be reached here quickly, and I've suspended this bot function until it is. (Users will still be able to request manually that the bot act.) Martin ( Smith609 – Talk)
Templates such as {{ Bruscabrusca}}, {{ WonderfulLife}}, and {{ PalAss2008}} suggest 'yes' - see for example Category:Biology source templates. Martin ( Smith609 – Talk)
This probably isn't the place to establish consensus, but this question is important. Martin ( Smith609 – Talk) 12:27, 28 March 2009 (UTC)
Here is some data (my attempt at a random sample: a dozen doi cite templates, the only criterion for inclusion being that the template was created by Citation bot). The count refers to the number of articles that link to a given template:
One:
Two:
I wasn't surprised that single-article links to a doi cite template were more common than cases with two or more; I was surprised at the number of doi cite templates with no articles linking to them at all. -- John Broughton (♫♫) 00:02, 29 March 2009 (UTC)
Assuming that single source templates are sometimes a good idea (which is discussed in the section above), there doesn't seem to be a good argument for making editors create single-source templates by hand when the process can be automated - have I missed one? Martin ( Smith609 – Talk)
DSisyphBot ( talk · contribs) added over 20 interwiki links to Wikipedia:Bot requests, but its operator indicates he only understands three languages. My understanding of bot policy is that ops should be able to minimally verify that language links are correct. Do we waive this requirement if the bot is running standard software like pywikipedia? Wronkiew ( talk) 16:35, 30 March 2009 (UTC)
I have proposed a policy change here to resolve this. Wronkiew ( talk) 00:55, 1 April 2009 (UTC)
(crossposted from WP:AN)
DediBox is a cheap dedicated hosting solution operated by Proxad (France). I have hardblocked it, since there are apparently some abused open proxies there. However, some bots operate from these servers and I expect some collateral. I have given IPBE to WP 1.0 bot ( talk · contribs) and MystBot ( talk · contribs). If another operator complains, please give them the bit (don't forget to log it) and poke me so I can double-check. -- lucasbfr talk 09:00, 31 March 2009 (UTC)
← The bot has recently made a number of very minor (seemingly insignificant) edits that I believe fall afoul of current bot guidelines (concern was raised at WP:AN#User:D6). I dropped a note for Docu to comment here or there. – xeno ( talk) 18:40, 1 April 2009 (UTC)
Thanks for removing the claim. Can you also remove the claim that the bot reverts others? The diff you provided seems to show that someone else used Twinkle in a way he shouldn't have. -- User:Docu
Xenocidic ( talk · contribs · deleted · filter log · SUL · Google) • ( block · soft · promo · cause · bot · hard · spam · vandal)
Xeno ( talk · contribs · deleted · filter log · SUL · Google) • ( block · soft · promo · cause · bot · hard · spam · vandal)
processed a large number of edits to remove attribution notices from a large number of articles ( contribution details). Is there a consensus for such a change, or was there a task approval? -- User:Docu —Preceding undated comment added 15:35, 17 April 2009 (UTC).
This bot task was explicitly approved at Wikipedia:Bots/Requests for approval/Xenobot 6. – Quadell ( talk) 17:42, 17 April 2009 (UTC)
modified a series of templates on IP talk pages (contribution details). Has this been reviewed or specifically approved? -- User:Docu —Preceding undated comment added 09:46, 18 April 2009 (UTC).
Is there a bot that removes {{ ifdc}} from captions? Which ones, and under what conditions? Thanks, – Quadell ( talk) 14:33, 17 April 2009 (UTC)
Hello folks!
We have updated our mailing lists! :)
Until now, pywikipedia-l was automatically spammed on each bug update and on each svn commit, which made subscribing to it painful for users not interested in pywikipedia development.
To solve this, two lists have been created: pywikipedia-bugs, for automated bug updates, and pywikipedia-svn, for automated svn commit messages. This way, traffic on pywikipedia-l should be greatly reduced: only human discussion should take place there. We would like to encourage advanced pywikipedia users to subscribe to that list: it should have moderate traffic, and it would allow us to get feedback on our development.
More importantly, we have created an announce mailing list, pywikipedia-announce. This list will be used for important announcements, such as breaking changes. The aim is minimal traffic: only a couple of folks can post on it, and we should not have to use it more than once a month. (Any mail sent to pywikipedia-announce is also sent to pywikipedia-l, so there is no need to subscribe to both.)
We would like our users to subscribe to either pywikipedia-announce or pywikipedia-l, to be sure of receiving those important announcements.
We hope that in this way we'll be able to significantly improve pywikipediabot quality and response time on urgent matters: no more foundation-wide running around for developers if something is utterly broken ;)
Thank you,
NicDumZ ~ 11:12, 18 April 2009 (UTC)
I'm making a list of image/media-related bots at Wikipedia:WikiProject Images and Media/Bots. Obviously the formatting leaves something to be desired, but if anyone here operates such a bot (or knows of one) feel free to add it to the list. I'd add them myself, but I don't know that many active bots off the top of my head, which is why we need the list in the first place. :) Thanks! ▫ JohnnyMrNinja 00:37, 25 April 2009 (UTC)
How can I get a list of all bot-flagged accounts? How can I get a list of all bot-flagged accounts that have edited in the last 30 days? Or that haven't edited in the last year? – Quadell ( talk) 03:00, 25 April 2009 (UTC)
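One way to get the first list, sketched against the MediaWiki API (the allusers list can be filtered by group; continuation with aufrom is omitted, so this only fetches the first 500):

 import urllib
 import json

 params = urllib.urlencode({
     'action': 'query', 'list': 'allusers', 'augroup': 'bot',
     'aulimit': '500', 'format': 'json',
 })
 data = json.load(urllib.urlopen('http://en.wikipedia.org/w/api.php?' + params))
 bots = [u['name'] for u in data['query']['allusers']]
 print len(bots), 'bot-flagged accounts in the first batch'

For the activity questions, each account's most recent edit can then be fetched with list=usercontribs (ucuser=NAME, uclimit=1) and compared against the cutoff date.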
This is a ' mandatory' notification to all interested parties that I have accepted a nomination to join the Bot Approvals Group - the above link should take you to the discussion. Best wishes,-- Tinu Cherian - 10:48, 1 May 2009 (UTC)
Hi all,
Another user pointed out to me that my bot's edits are showing up on his watchlist, even with the "hide bots" option enabled. I took a look at Special:RecentChanges and noticed that they are showing up there as well. The account has the bot flag, and I double-checked my code to verify that it is actually flagging the edits as bot edits. In looking at the RecentChanges, I noticed a couple of other bots on the list (SPCUClerkBot, XLinkBot), so I'm guessing the problem isn't isolated to just my account. The bot is making all of its changes through the MediaWiki API. Any suggestions? Matt ( talk) 23:31, 1 May 2009 (UTC)
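One thing worth double-checking: the recent-changes bot marker is only set when the edit request itself carries bot=1 and the account holds the bot right at the moment the edit is saved. As a sketch, the relevant fields of an action=edit POST would be something like this (edit_token is assumed to have been fetched beforehand via action=query&prop=info&intoken=edit):

 params = {
     'action': 'edit',
     'title': 'Wikipedia:Sandbox',
     'appendtext': 'test',
     'bot': '1',          # without this, rc_bot stays unset and
                          # "hide bots" will not hide the edit
     'token': edit_token,
     'format': 'json',
 }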
Just FYI, I made a template that might be useful to you.
{{botlinks3|Polbot}}
The "task list" link is a list of all pages starting with "Bots/Requests for approval/Polbot".
{{botlinks3|Polbot|11}}
The "task" link points directly to task 11.
{{botlinks3|Polbot|-}}
The "task" link points to the RfBA without a numerical suffix. (Polbot never had one, which is why it's a redlink.) – Quadell ( talk) 00:46, 28 April 2009 (UTC)
This escalates a clearly erroneous page-move, as only an admin can scrape the crud away in order to revert this. — CharlotteWebb 13:32, 3 May 2009 (UTC)
The design is still flawed as redirects "from other capitalisation" are among the most likely to need reversing. I've noticed other cases in the past where a user has created a redirect from a more correct or equally plausible title, and this (again, worse than useless) bot comes along to add road-block edits preventing the page from being moved to that title. One might as well write a bot to move-protect every bloody article as that would (from my perspective) have the same practical effect.
Let's step back and ask if/why this bot was approved and whether it serves any meaningful purpose. — CharlotteWebb 17:57, 3 May 2009 (UTC)
Following problems with some bots operating in template namespace, the bot policy now mentions that interwikis should appear on all articles using a template. ( Wikipedia:Bot_policy#Restrictions_on_specific_tasks ) -- User:Docu
Hi all,
I had an idea for a bot I could write, but I don't know if there's already a bot that does it, or if the idea would be very well received, so I'm asking for opinions.
Would it be a good idea to write a bot that replaces links to redirects with a link to the redirect's target (assuming that the target is not a disambiguation page)?
Thanks in advance, Matt ( talk) 23:04, 10 May 2009 (UTC)
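For what it's worth, the basic mechanics are straightforward with pywikipedia. Here is a rough sketch (method names as in the pywikipedia framework, with a hypothetical example page; it takes no account of piped links, section links, or transclusions, which is where the real work would lie):

 import wikipedia

 site = wikipedia.getSite()
 redirect = wikipedia.Page(site, u'Some redirect')  # hypothetical page
 if redirect.isRedirectPage():
     target = redirect.getRedirectTarget()
     if not target.isDisambig():
         for ref in redirect.getReferences():
             text = ref.get()
             old = u'[[%s]]' % redirect.title()
             new = u'[[%s|%s]]' % (target.title(), redirect.title())
             if old in text:
                 ref.put(text.replace(old, new),
                         u'Bypassing redirect to [[%s]]' % target.title())

Whether such edits are wanted at all is the bigger question, of course.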
As another editor has expressed concern over ListasBot 3's approved functions (in short, whether or not talk pages of redirects should be replaced with a redirect to the new talk page), I've set up a discussion on how to proceed with this bot. Input would be appreciated. The discussion is at User:Mikaey/Request for Input/ListasBot 3.
Thanks, Matt ( talk) 02:44, 12 May 2009 (UTC)
It appears to me that Matt's talkpage notes were good-faith attempts to gain wider exposure. They don't look like intentionally leading questions or ballot-stuffing to me. I'm glad Matt is trying to gauge community consensus, and I don't think the rudeness is called for. – Quadell ( talk) 13:16, 12 May 2009 (UTC)
I have written a bot to get the current Quote of the day from Wikiquote and put it on a page, so it can then be used as a template on user pages etc. I haven't requested approval yet because, although it works fine from my computer, I need to run it from somewhere else. I've uploaded it to a web server with Dreamhost, yet when I try to run it the following error comes up (sorry, I don't know how to make it smaller!). It's a Python script using pywikipedia.
 /home/tris1601/thewikipediaforum.com/pywikipedia/wikitest.py
  35 site = wikipedia.getSite()
  36 newpage = wikipedia.Page(site, u"User:Dottydotdot/test")
  37 newpage.put(text + "<br><br>'''Imported from [http://en.wikiquote.org '''Wikiquote'''] by [[User:DottyQuoteBot|'''DottyQuoteBot''']]", u"Testing")
  38
  39 wikipedia.stopme()

 /home/tris1601/thewikipediaforum.com/pywikipedia/wikipedia.py in put(self=Page{[[User:Dottydotdot/test]]}, newtext=u"You have so many things in the background that y...''] by [[User:DottyQuoteBot|'''DottyQuoteBot''']]", comment=u'Testing', watchArticle=None, minorEdit=True, force=False, sysop=False, botflag=True)
  1380
  1381 # Check blocks
  1382 self.site().checkBlocks(sysop = sysop)
  1383
  1384 # Determine if we are allowed to edit

 /home/tris1601/thewikipediaforum.com/pywikipedia/wikipedia.py in checkBlocks(self=wikipedia:en, sysop=False)
  4457 if self._isBlocked[index]:
  4458     # User blocked
  4459     raise UserBlocked('User is blocked in site %s' % self)
  4460
  4461 def isBlocked(self, sysop = False):

 UserBlocked: User is blocked in site wikipedia:en
I don't know why it's saying I'm blocked - I'm clearly not - and I've checked the IP address of the server it's on (69.163.128.253), which doesn't seem to be blocked either, so now I can't work out what's wrong! Any help would be greatly appreciated - as you can guess, I'm pretty new to Python and coding in general!
Thanks!
dottydotdot ( talk) 14:43, 26 May 2009 (UTC)
In searching for bot approvals for ArthurBot ( talk · contribs), I've only located an approval from November 2008, which covers "adding/modifying interwiki links and Link_FA templates". Earlier today, ArthurBot (seemingly counter to the guidelines regarding valid redirects) changed links from MAN AG (the former article name, now a redirect) to MAN SE (the new article name). While I'm not sure of the reasoning, I'd like to ask whether this is out of scope for the bot's approval, and/or whether there is an approval that I am not finding for this activity? — Bellhalla ( talk) 14:41, 27 May 2009 (UTC)
This search shows that "ArthurBot" is only mentioned on a "Wikipedia:Bots/" subpage in two places: here and here. Neither of these approves the task you mention. – Quadell ( talk) 15:10, 27 May 2009 (UTC)
Hi. I blocked Jigbot ( talk · contribs · logs) yesterday for its username. Now the owner, Jigesh ( talk · contribs), requests that the account be unblocked, as he wants to use it as an interwiki bot. What is my best course of action? Should interwiki bots be approved? — Edokter • Talk • 18:39, 29 May 2009 (UTC)
Recently I have been trying to find a version of AWB to use for PascalBot. My search has led me to the conclusion that there is no current version of AWB that is safe for use as a bot with general fixes enabled. I am thinking it may be useful to have a centralized location to discuss which version(s) of AWB should be used as a bot, perhaps including "safe" and "unsafe" lists of AWB versions.
Versions of AWB before rev 4382 corrupt {{ Article issues}}. More recent versions add incorrect DEFAULTSORTs, remove valid orphan tags, and add commented out categories. -- Pascal 666 20:52, 31 May 2009 (UTC)
Version | Revision | {{ Article issues}} | DEFAULTSORT | Orphan tags | Commented-out categories
Gen fix to disable: | | | SetDefaultSort | |
4.5.0.0 | rev 3834 | | | |
4.5.1.0 | rev 3906 | | | |
4.5.2.0 | rev 4100 | | | |
4.5.3.2 | rev 4312 | | | |
4.5.3.3 | rev 4382 | | | |
| rev 4395 | | | |
| rev 4400 | | | |
| rev 4419 | | | |
Anyone know the names of the other gen fixes to disable? -- Pascal 666 00:22, 1 June 2009 (UTC)
The above table is skewed towards issues present in recent versions. Does anyone know of any reason 4.5.0.0 should not be used? -- Pascal 666 03:21, 1 June 2009 (UTC)
rev 4426 can now be downloaded here. -- Pascal 666 17:55, 2 June 2009 (UTC)
I would like to urge those who approve bots to ensure that bots like BJBot - which left an unwanted, long notice on my talk page, telling me that Adam Powell was listed at AfD because I had made a single edit to it - honour {{ nobots}}.
As a side note, this response is rather uncalled-for behaviour for a bot operator. I'm glad he struck it later, but it's still disappointing. Requests by users not to be notified should only be ignored if there is a good reason to do so. -- Ligulem ( talk) 19:00, 9 March 2008 (UTC)
I would like to see the approval for this task looked into further by BAG. Notifying people with very few edits to an article seems rather an annoyance, and the bot seems to be notifying a lot of people (IPs included) - I count about 50 notifications about the proposed deletion of Prussian Blue (duo) alone. This was probably a request that should have been scrutinised a little longer... WjB scribe 18:22, 10 March 2008 (UTC)
There were in fact two different bugs that allowed Ligulem to get a notice. The first was from me playing around with nobots early in the morning, and it had been fixed for hours (it was what he requested fixed on my talk); the second I didn't notice until he posted his rant here ("only one edit" got my interest), and I fixed that too. If anybody sees unwarranted notices, leave a message on the bot's talk page with a diff. I also plan to re-disable IP notices per a message on my talk, which should further reduce notices. BJ Talk 01:48, 11 March 2008 (UTC)
I've added a new section on the approval discussion page at Wikipedia:Bots/Requests for approval/BJBot 4, proposing to use an opt-in procedure for task 4 (delete notifications). I suggest to follow-up at Wikipedia:Bots/Requests for approval/BJBot 4#Opt-in instead of opt-out. -- Ligulem ( talk) 10:40, 12 March 2008 (UTC)
Is there an essay or guideline for how to deal with bot owners? I have, in the course of the past year, gotten comments and requests about my bot's behavior that range from polite through negative to downright abusive. I'm sure I've read something somewhere, but can someone point me to it? -- SatyrTN ( talk / contribs) 21:14, 9 March 2008 (UTC)
I propose to extend the {{ bots}} specification to allow easier restriction of particular bot types. This involves creating pseudo-usernames to be used in allow and deny parameters, for example, username "AWB" relates to all AWB-based bots (already supported), other bot framework names could include "pywikipedia", "perlwikipedia", "WikiAccess", etc. Additionally, we could classify bots by roles they perform: "interwiki", "recat", "fairuse", "antivandal", "notifier", "RETF", "AWB general fixes" and so on. For convenience, these roles should be case-insensitive. MaxSem( Han shot first!) 10:23, 12 March 2008 (UTC)
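To illustrate the proposal (purely hypothetical syntax, since none of this is implemented yet), a talk page might carry:

 {{bots|deny=interwiki,AWB}}

which would exclude all interwiki bots and all AWB-based bots while leaving every other bot unaffected.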
I'm interested in developing my bot skills, particularly in running bots which operate on a continuous basis, rather than the more script-oriented bots I'm already operating. I'm looking for a more experienced bot coder/operator who can help me get to grips with the extra knowledge and tools required to operate continuously-running bots - kind of an adopt-a-bot-owner system :D. I can work in C++ and VB, but all of my previous bot-coding experience has been in python. Anyone interested and willing to give me a hand? Happy‑melon 10:40, 18 March 2008 (UTC)
I use OS X Tiger and I'm trying to run bots for the Telugu Wikipedia. I downloaded the python framework from this page, and I created the user-config.py file, which reads:
mylang='te'
family='wikipedia'
usernames['wikipedia']['te']=u'Sai2020'
Sai2020 is my username. I open Terminal and type in python login.py
I get the error python: can't open file 'login.py'
Can someone help me please Σαι ( Talk) 12:19, 12 March 2008 (UTC)
That was the problem. Once I cd'd into the pywikipedia directory it worked, but I get a different error this time:
 Sais-MacBook:~/Desktop/pywikipedia Sai$ python login.py
 Traceback (most recent call last):
   File "login.py", line 49, in <module>
     import wikipedia, config
   File "/Users/Sai/Desktop/pywikipedia/wikipedia.py", line 127, in <module>
     import config, login
   File "/Users/Sai/Desktop/pywikipedia/config.py", line 364, in <module>
     execfile(_filename)
   File "./user-config.py", line 1
     {\rtf1\mac\ansicpg10000\cocoartf824\cocoasubrtf440
     ^
 SyntaxError: unexpected character after line continuation character
What's going on? I'm not very good at this kind of stuff. Σαι ( Talk) 01:27, 13 March 2008 (UTC)
Thank you very much people. I can now login :) Σαι ( Talk) 08:57, 13 March 2008 (UTC)
Any chance of a bot that automatically reverts any blanked page? One may already exist, but if so, I'm not familiar with it. I've been chasing a lot of blankings lately in my anti-vandalism crusade. Thanks either way. Jasynnash2 ( talk) 17:09, 14 March 2008 (UTC)
Is anyone having a problem with dotnetwikibot today? As of this morning, any attempt to FillAllFromCategory is not working. I changed nothing in my code, which was working fine yesterday.
I placed a query about this at the sourceforge.net dotnetwikibot framework forum, but it doesn't appear to get a lot of traffic.
Any help would be appreciated. -- Kbdank71 15:17, 20 March 2008 (UTC)
Please offer input there if you have any :). Mart inp23 19:33, 17 March 2008 (UTC)
Something changed in the format of history pages which broke pywikipedia's getVersionHistory. I've fixed it for my own needs, but heads up in case any other bots use this function. Gimmetrow 23:03, 10 March 2008 (UTC)
As a trial for the CorenANIBot, this page is now automatically archived into subpages when new sections are created. There is an automatically generated link right of the titles to edit or watch the subpages, allowing you to watch the individual threads.
Watching this page itself will allow you to see new threads.
Warn me if it breaks! — Coren (talk) 20:31, 21 March 2008 (UTC)
That's certainly interesting. I'm not sure whether I like it or not, but this is a good noticeboard to try it on. What are the perceived benefits? I can see 1) being able to watch individual threads and 2) a sort of 'instant archive', since they're already sorted by date. But I'm not sure how it would fit in with the archiving schemes currently in place at WP:AN, WP:ANI, etc, or what a newbie making their first post to WP:AN would make of a long list of page transclusions. Perhaps this system is best placed at boards which are frequented by regulars, like WP:ANI or WP:AN3RR. Happy‑ melon 10:46, 22 March 2008 (UTC)
I think this is a great idea for AN and ANI. Trying to watch for changes to any given thread there at the moment is rather impossible, especially on ANI. To address HappyMelon's concern, we'd just have to make it clear via some notices at the top not to try and edit the page itself and to use the edit and add section links. This could also be extremely useful for addressing vandalism attempts on those pages; the main pages themselves could be semiprotected or protected if necessary without shutting down discussions, the same could be done to the transcluded pages without disrupting other discussions.-- Dycedarg ж 22:24, 22 March 2008 (UTC)
Happy-melon: what do you mean the bot would hold the entire category tree in memory? There would be at most three categories to read: the list of pages forbidding all bots, the list permitting that particular bot, and the list forbidding that particular bot. This would mean (unless any of the lists is over 5000 entries long) only three HTTP queries, one time, to load the exclusions list. That's reasonable.
On the other hand, any system that requires an extra HTTP query for every edit that must be made is unreasonable because it is vastly inefficient. It would be possible to reduce the number of extra queries if you were just looking for page existence, but still every single bot.css file or whatever would have to be loaded, every time the bot wants to edit the corresponding page. That's far from ideal design. — Carl ( CBM · talk) 22:59, 9 March 2008 (UTC)
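To make the trade-off concrete, here is a rough sketch of that startup sequence against the API (hypothetical category names; continuation for lists over 500 members is omitted):

 import urllib
 import json

 API = 'http://en.wikipedia.org/w/api.php'

 def category_members(cat):
     params = urllib.urlencode({
         'action': 'query', 'list': 'categorymembers',
         'cmtitle': 'Category:' + cat, 'cmlimit': '500', 'format': 'json',
     })
     data = json.load(urllib.urlopen(API + '?' + params))
     return set(m['title'] for m in data['query']['categorymembers'])

 # Three HTTP fetches, once at startup.
 deny_all = category_members('Pages excluding all bots')  # hypothetical names
 deny_me  = category_members('Pages excluding MyBot')
 allow_me = category_members('Pages allowing MyBot')

 def may_edit(title):
     # After startup, every check is a cheap in-memory lookup.
     return title in allow_me or (title not in deny_all and title not in deny_me)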
Just a note: {{bots}} is only used on a handful of pages, so the point is moot. Rich Farmbrough, 20:19 7 September 2008 (GMT).
...just to see what happens to it under bot care... Franamax ( talk) 12:56, 22 March 2008 (UTC)
That was a little weird. First it didn't show up at all, then it showed as a redlink. I did a server purge and it showed up fine. Seems a little confusing, maybe I missed something? Franamax ( talk) 13:01, 22 March 2008 (UTC)
Clarify: first it was on the page normally (I could see it in the edit page), then it vanished and redlinked. Perhaps the bot could leave a "reformatting" message? Also, is this a failsafe method? What if there's an edit conflict along the way? Keeping in mind that the only thing important to me is my post, and I want to make sure it's there, because to me it's the most important thing in the world. :) Franamax ( talk) 13:13, 22 March 2008 (UTC)
Do you have any good ideas on how to build consensus in discussions like this? Whenever an autonomous interwiki bot links the article Monoicous, the bot owner gets angry comments from the article maintainers. I have tried to explain how interwiki bots work and how we can solve the problem by correcting all the links manually, but the discussion always seems to drift toward “just fix the bots”... -- Silvonen ( talk) 04:19, 11 April 2008 (UTC)
...just in case anybody didn't notice. It would be much easier to pretend that the rewrite has consensus, and attempt to gain consensus for more radical kinds of change, if we could get more people commenting there.-- Dycedarg ж 20:31, 12 April 2008 (UTC)
Are we going with this? This is directed at ST47 and Carnildo mainly, as I don't think Beta would follow it without force. I'm sure this has already been talked about, but I stopped reading that debate a while ago. BJ Talk 15:25, 16 April 2008 (UTC)
What does it do when faced with a random 4th-level header? SQL Query me! 10:55, 8 April 2008 (UTC)