This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
I have a quick question for everyone (since I know little about bots). A couple of months ago, I left the above user a note about his username. S/he mentioned they were in the process of getting approval to run their bot. OK, I figured that was legit. Today, however, I noticed two things. One, the bot was never approved (the request expired). Two, the account has continued to edit, including !voting at AfDs. Should this account be blocked as a username violation? TN X Man 14:21, 3 June 2009 (UTC)
This bot did something strange. It changed the redirect for "amœba", which should redirect to Amoeba, to James Woods. Look for yourself: amœba. The redirect page 68.248.233.52 ( talk) 02:07, 4 June 2009 (UTC)
Erwin85Bot ( talk · contribs) carries out the task of adding new BLP AFDs to Wikipedia:WikiProject Deletion sorting/Living people. New discussions should be added at the top, leaving those about to be closed at the bottom. Unfortunately, Erwin85Bot adds them to the bottom, sandwiching the old ones in the middle. This has caused some problems: two-week-old AFDs went unclosed because they weren't seen. I fixed the page and left Erwin a message, but he seems to be busy IRL perhaps, so the error has not been corrected.
I'm not going to block because it's not a big deal, just an annoyance. It's still better to be able to cut/paste the new from the edit window than have to manually search for the BLP AFDs. It would just be nice if this fairly minor error could be fixed if someone else has access to the code. لenna vecia 13:31, 4 June 2009 (UTC)
Done. There is a discussion of recommended settings for this module at Wikipedia:Village pump (technical)#Cosmetic changes (Wikitext cleanup options of pywikipediabot) -- User:Docu —Preceding undated comment added 11:57, 4 May 2009 (UTC).
Hey. Miszabot just put 12 threads into 12 archives. Closedmouth has suggested I get an admin to help clean up. Admin or not, does anybody know why Misza did this? -- I dream of horses ( talk) 16:54, 10 June 2009 (UTC)
I have asked in a couple of places about a bot that has created some 6000 algae articles. Every article I've reviewed has major errors of fact due to the way the bot was programmed to extract data from the database and the bot owner's lack of understanding of the organisms (this point from discussion on the bot owner's talk page). A discussion about problems with the articles the bot created is at [1]. That issue can be discussed there.
What I would like to know here is why a bot was programmed at all to
1. remove more specific redirects to create less specific ones? [2]
2. delete disambiguation pages to create single redirects instead of adding the redirect to the disambiguation page? [3]
Should a bot be deleting entire disambiguation articles to make room for redirects? Is there a set of rules for bots that covers this, and was it maybe missed when programming Anybot?
-- 69.226.103.13 ( talk) 21:12, 16 June 2009 (UTC)
I drew this up to hopefully save time and effort, whilst reducing the demoralising effect. Any contributions would be handy, particularly if you went for a bot whilst still inexperienced as a normal editor. - Jarry1250 ( t, c, rfa) 11:08, 20 June 2009 (UTC)
I have started a community RFC about a proposal for a bot to unlink dates. Please see Wikipedia:Full-date unlinking bot and comment here. -- Apoc2400 ( talk) 10:25, 22 June 2009 (UTC)
This seems to be an unapproved bot, making edits. I've left my own message on the talk page, as I'm not sure if there's a template for such things. I feel uneasy filing a username report (which would be because it has "bot" in its name while it's not one), so hopefully the owner will read my message. If someone more knowledgeable in such things could take a look at it, that would be appreciated. Cheers - Kingpin 13 ( talk) 20:27, 22 June 2009 (UTC)
I had to block SoxBot it was making a mess of WP:CHU. Q T C 01:50, 3 July 2009 (UTC)
r52190 changed the API so maxlag errors return with an HTTP status code 503 (where formerly they returned with a 200 and an API error message). If your bot code uses maxlag, properly handles HTTP errors, and doesn't treat the two in the same way, chances are you will need to update your code before the next scap. And if your bot code doesn't do all that, maybe it's time to update it. Anomie ⚔ 20:19, 20 June 2009 (UTC)
Because of this and other complaints on the mediawiki-api list, I've reverted the change in r53353. It never went live on Wikipedia.
\o/ Q T C 10:16, 16 July 2009 (UTC)
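For bot authors reviewing their error handling, the two signalling styles discussed above can be separated along these lines. This is a minimal sketch; the function name and return values are mine, not from any framework:

```python
def classify_api_failure(status, body, retry_after=None):
    """Decide how a bot should react to an API response.

    'retry' -> replication lag (maxlag); back off and try again
    'fail'  -> a genuine HTTP error; handle or abort
    'ok'    -> normal response
    """
    if status == 200 and '"code": "maxlag"' in body:
        return "retry"          # classic style: 200 plus an in-body API error
    if status == 503 and retry_after is not None:
        return "retry"          # r52190 style: 503 plus a Retry-After header
    if status >= 400:
        return "fail"           # any other HTTP error is a real failure
    return "ok"
```

The point of Anomie's note is the middle branch: a bot that treats every 503 as fatal would have started aborting on ordinary replication lag had the change stayed in.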
Bot operators who use the API to download lists of category members will want to watch bugzilla:19640. The 'cmnamespace' parameter is currently being ignored by the API until performance issues are sorted out. — Carl ( CBM · talk) 19:21, 10 July 2009 (UTC)
Hi, could someone verify whether this is an approved bot - or indeed a bot at all. It has no userpage or current talk page, and is tagging edits as minor, labelling them as robot:..... All it appears to be doing is adding ar: interwiki links, but as I don't read Arabic so well, I cannot verify their validity. Also, I seem to recall an editor should not claim to be a bot when not (or was that the other way round?) Addendum - sorry, quick link to contribs :-) --ClubOranje T 12:51, 11 July 2009 (UTC)
This bot is reverting legitimate edits and the owner refuses to fix it.
Reverting test edits should be done by a dedicated bot (and probably already is). Such a bot should match the entire text http://www.example.com link title as added by the editing buttons, not just indiscriminately revert any link that contains example.com. — Preceding unsigned comment added by 71.167.73.189 ( talk • contribs)
Per edit http://en.wikipedia.org/?title=User:ClueBot_III/Indices/User_talk:AzaToth&diff=302790668&oldid=302781497 I've blocked this bot. → Aza Toth 01:05, 19 July 2009 (UTC)
Hello there. Just to let you know that I (Kingpin13) have been nominated for BAG membership. Per the requirements, I'm "spamming" a number of noticeboards, to request input at my nomination, which can be found at Wikipedia:Bot Approvals Group/nominations/Kingpin13. Thanks - Kingpin 13 ( talk) 08:01, 20 July 2009 (UTC)
I often use the template sandboxes (" Template:X1", " Template:X2", etc.). However, ClueBot often clears the sandbox while I'm partway through testing. I drew this to the owner's attention here. Cobi informs me that the bot clears the template every 3 hours. The sandboxes indicate that they are cleared every 12 hours. Ironically, neither is correct; the cleaning seems to be rather erratic. The times of last clearance by the bot:-
I request that if possible, the bot can be adjusted so that it doesn't clear a template sandbox within, say, 15 minutes of the last edit. Is this possible? Thanks. Axl ¤ [Talk] 06:44, 3 August 2009 (UTC)
Very well, unless there are any opposes, I'm gonna ask Cobi to make his bot clean it after it's been unedited for 30 minutes, and we'll keep SoxBot cleaning it every 12 hours, and every time the header is removed (I think the bots will work together very well like that). - Kingpin 13 ( talk) 03:33, 18 August 2009 (UTC)
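The "don't clear while someone is testing" rule Axl asks for boils down to comparing the sandbox's last-edit timestamp against an idle threshold before resetting. A minimal sketch (helper name mine; timestamps assumed to be naive UTC datetimes as parsed from the API's timestamp field):

```python
from datetime import datetime, timedelta

def ok_to_clear(last_edit_utc, now_utc, idle_minutes=30):
    """Only reset a sandbox once it has gone untouched for idle_minutes."""
    return now_utc - last_edit_utc >= timedelta(minutes=idle_minutes)
```

A bot would call this just before clearing and skip the edit when it returns False, retrying on its next pass.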
I am concerned about User:DefaultsortBot, it appears the owner is not taking responsibility for the bot edits. Please see User talk:DefaultsortBot#Did I get it wrong?, Thanks for taking a look, good night. Jeepday ( talk) 00:46, 8 August 2009 (UTC)
Ser Amantio di Nicolao ( talk · contribs) seems to be automatically mass creating pages using AWB without a bot flag, someone might want to look at this? Peachey88 ( Talk Page · Contribs) 08:12, 16 August 2009 (UTC).
Rockfang suggested that bots should not edit copyvio pages. I am not convinced. However I have put {{ Nobots}} in the {{ Copyvio}} template. Comments? Rich Farmbrough, 18:35, 17 August 2009 (UTC).
In the case in point another bot was interwiki-ing. Seems that would have been useful if the page was stubbified. Rich Farmbrough, 21:37, 17 August 2009 (UTC).
A small addition to the tasks for KslotteBot.
Can this be approved? -- Kslotte ( talk) 15:11, 24 August 2009 (UTC)
Pls see Wikipedia_talk:Requests_for_adminship#RFA_closing_time — Rlevse • Talk • 00:05, 25 August 2009 (UTC)
Just want to get a wider opinion on this bot's recent edits. At first they seemed unrelated to any of its previous BRfAs [5] [6] [7] [8], so I gave him a warning that it was running an unapproved task. Now, after talking with the operator, it turns out the edits are going back to update its previous edits. After the clarification, I'd probably let it slide under the previously approved task, but wanted a wider opinion on it. Q T C 09:36, 31 August 2009 (UTC)
The use of this template in main-space seems to have spread. I suggested its use on copyvios (and created it in the first place), but I'd like to see maybe one or two articles protected in this way, while problems are dealt with by the bot owners. A list of protected articles and (provisional) details of one of the applications of the template are here. Rich Farmbrough, 14:57, 6 September 2009 (UTC).
I would like BAG members and other bot operators to look at the bugs filed against Citation bot and how they are being handled, including its operator's interactions with users reporting bugs. There are a lot of bugs and maybe poor communication, in my opinion.
I think additional input at this stage from experienced bot operators could help in making this a relatively bug-free bot.
In my opinion, all currently listed bugs should be investigated and fixed before the bot is run again. If the bot is operating, reported bugs should receive a response from the bot operator on the bug board in a reasonable time (less than a month? a few days? a week?) and be marked resolved where applicable. If there is a question about what the bot should be adding, the bot operator should post a link showing a prior discussion, or raise the discussion on an appropriate board, or not add/change that parameter. I understand this bot does a lot of useful and tedious work, but if there are problem areas it should be scaled back and the problems fixed, rather than keep running with bugs, in my opinion. I'm posting a link to this on Martin's page. [9] -- 69.225.12.99 ( talk) 22:19, 10 September 2009 (UTC)
[10] -- 69.225.12.99 ( talk) 09:33, 12 September 2009 (UTC)
This bot is cleaning the sandbox every few minutes. That makes it impossible to do any testing there. Furthermore, the owner is on a wikibreak and the bot's talkpage is editprotected. So this is basically a "wild" bot.
For these two reasons, each in itself being sufficient, I think this bot should be turned off. Debresser ( talk) 06:59, 13 September 2009 (UTC)
I don't know what the WP policy is on bots from bare IP addresses. But the IP address 75.69.0.58, for whom I assume good faith, appears to my untrained eye to possibly be a bot, so I thought I would bring it to your attention. 500 edits in 9 days; the edit rate seems higher than could be manually sustained given the typical edit complexity I noted. But the alleged bot appears to be doing good work, at least on the article where he cleaned up after me earlier today. Cheers. N2e ( talk) 18:15, 25 September 2009 (UTC)
Please see the talk between me and Cobi at User talk:ClueBot Commons/Archives/2009/September#Duplicate month headings. I noticed that ClueBot resets the monthly clock on vandalism warnings, and Cobi explained that this feature is intentional and has been discussed numerous times before. The reason that I'm bringing it up here is that it seems to me that the feature, although clearly well-intentioned, may be worth re-evaluating. It seems to me that human editors will generally continue the progression from first to second etc etc warning over the course of a longer time period than the bot is currently instructed to do. I suggest that this issue be given a fresh look. Thanks. -- Tryptofish ( talk) 22:00, 28 September 2009 (UTC)
In the interests of full disclosure:
Since CobraBot a little while ago completed its initial pass over all the articles containing a book infobox (yay!), I modified it to run manually on pages specified by the operator; I've been trawling the internal logs the bot produces when it skips a page to (if possible) manually fix the infobox formatting or add an ISBN where there was none, so the bot can then be run on the individual affected pages. Unfortunately, doing so caused a bug where the edit summary didn't get set, and as a result the default, generic pywikipedia edit summary was used. This has since been fixed and the bot did not otherwise malfunction (i.e. the edits were not misformatted or wrong), but I only noticed it after ~125 edits when I chanced to look at the bot's contribs page. My apologies; I've noted for the future that even seemingly minor changes deserve a level of further testing. -- Cybercobra (talk) 06:36, 1 October 2009 (UTC)
On five different occasions, bots have added the Swedish version of the 2009 Eurocup Formula Renault 2.0 season as an interwiki. Please can they be set to avoid this situation, as the two series are not the same. Regards. Cs-wolves (talk) 14:46, 9 October 2009 (UTC)
The Northern Artsakh ( talk | history | protect | delete | links | watch | logs | views) article in Russian WP was actually deleted on May 16, but some bots continue to add the false positive interwiki ru:Северный Арцах/Temp. Could we fix that? Brand[t] 07:26, 13 October 2009 (UTC)
FoxBot ( BRFA · contribs · actions log · block log · flag log · user rights) is making a series of changes to date pages, adding interwiki links such as bcl:Nobyembre 5, which is fine. However, at the same time this bot is replacing the agreed &ndash; with a dash.
The agreed format of the date pages is to use &ndash; because most people adding items to those pages will not know the difference between a hyphen and a dash. Using &ndash; to create a dash reduces the chance that people will use a hyphen. The use of a hyphen messes something up, various checking bots I think, but whatever the reason, the hyphen is NOT USED on those date pages.
Will somebody first stop this bot, and then will somebody with rollback please revert all the edits made in error by this bot. -- Drappel ( talk) 14:09, 16 October 2009 (UTC)
I am confused here; changes such as [11] are replacing the HTML entity &ndash; with the equivalent Unicode character U+2013. The rendered content is not changed at all. Even if FoxBot doesn't do this substitution, eventually someone will come across the articles with AWB, which also does this substitution. In short, there is essentially no way to prevent HTML entities from being converted to equivalent Unicode characters. — Carl ( CBM · talk) 14:42, 16 October 2009 (UTC)
Sorry for the late reply. I have currently disabled cosmetic changes. However, I strongly disagree with the way en wiki uses ndash and mdash. On all other wikis they are automatically replaced by the right Unicode character. Why should this wiki be different from all the other wikis? First of all, ndash and mdash are complicated for all users who aren't familiar with them, and even more so for "foreign" users who believe they should be changed. What one could do to prevent this from happening, as it certainly will since AWB also makes these changes automatically, is either to change the rules on when to use ndash or mdash (which would be the easiest), or to change the scripts of pywikipedia, AWB and AutoEd, or to create a bot that reverts these edits (which wouldn't be a real solution, I believe). - Foxie001 ( talk) 17:29, 16 October 2009 (UTC)
The &ndash was implemented on all of the date pages in April 2009 by User:Juliancolton. There was no discussion of the change, but when I asked him about it (while it was happening), the argument seemed sound. I am more convinced of the soundness of the argument now. It was never added to the project guidelines. Drappel has explained it pretty well. If there is to be a distinction in the MOS for use of dash vs. hyphen, the easiest way to implement this in a page with hundreds of dashes is to do it this way. I never knew the difference before and I would always have used a hyphen. This only rarely gets "fixed" by a bot or editor with AWB. To keep the pages consistent, using &ndash is the best way to do it. If we remove them, we will have a mix of dashes and hyphens that a bot would need to cleanup (if one could). -- Mufka (u) (t) (c) 22:09, 16 October 2009 (UTC)
I recently used the C# regex code from Template:Bots, but noticed that it gave a TRUE value for a page that contained {{bots|allow=SineBot}} (when my bot is ChzzBot).
I was doing a little independent check, and skipping pages with "{{bot" or "{{nobot" entirely in any case, to process those by hand later.
If my understanding of the compliance is correct, this should actually have been FALSE, ie the bot was not specifically permitted.
I'm not good at regex, hence I recoded it in what might be a bit of a long-winded fashion, which (for a month) you could check in this pastebin.
If someone thinks it's worth changing to that code on the page, suitably tidied up, then feel free.
Cheers, Chzz ► 03:32, 16 October 2009 (UTC)
bool DoesAllowBots(string botusername, string pagetext)
{
if (Regex.IsMatch(pagetext, "\\{\\{(\\s|)(bots|nobots)(\\s|)(\\|((.)+|)(allow)(\\s|)(=)((.)+|)(" + botusername + "))"))
return true;
return !Regex.IsMatch(pagetext, "\\{\\{(\\s|)(bots|nobots)(\\s|)(\\|((.)+|)((optout|deny)(\\s|)(=)(\\s|)(all)|(optout|deny)(\\s|)(=)((.)+|)(" + botusername + ")|(allow(\\s|)=(\\s|)none))|\\}\\})");
}
/// <summary>
/// checks if a user is allowed to edit this article
/// using bots and nobots tags
/// </summary>
/// <param name="articleText">The wiki text of the article.</param>
/// <param name="user">Name of this user</param>
/// <returns>true if you can edit, false otherwise</returns>
public static bool CheckNoBots(string articleText, string user)
{
return
!Regex.IsMatch(articleText,
@"\{\{(nobots|bots\|(allow=none|deny=(?!none).*(" + user.Normalize() +
@"|awb|all)|optout=all))\}\}", RegexOptions.IgnoreCase);
}
{{bots|deny=all|allow=BotName}}. IMO, my code is better :) - Kingpin 13 ( talk) 12:58, 16 October 2009 (UTC)
"{{bots|allow=SineBot}}" is not enough to deny any bots anyway. Since all bots are allowed by default, explicitly allowing one with this template doesn't seem to opt out anything. – xeno talk 13:12, 16 October 2009 (UTC)
"{{bots|allow=SineBot}}" on it. So what we're trying to do (I think) is replace the C# code at Template:Bots - Kingpin 13 ( talk) 13:16, 16 October 2009 (UTC)
{{bots|deny=all|allow=<bots to allow>}}, which makes more sense to me, but I'd be happy to modify my code accordingly? - Kingpin 13 ( talk) 13:37, 16 October 2009 (UTC)
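For comparison, here is a rough Python port of the exclusion logic being debated, following the reading discussed above: {{nobots}} bans everyone; deny=all (or allow=none) with an allow list only admits the listed bots; a plain allow list does not deny unlisted bots. This is a sketch of one interpretation, not the canonical Template:Bots implementation, and it only handles simple, lowercase-parameter cases:

```python
import re

def allow_bots(text, user):
    """Return True if bot `user` may edit a page containing `text`."""
    user = user.lower().strip()
    for m in re.finditer(r'\{\{\s*(no)?bots\s*([^}]*)\}\}', text, re.I):
        if m.group(1):                       # {{nobots}} excludes everyone
            return False
        params = dict(p.split('=', 1)
                      for p in m.group(2).lstrip('|').split('|') if '=' in p)
        allow = [b.strip().lower() for b in params.get('allow', '').split(',') if b.strip()]
        deny = [b.strip().lower() for b in params.get('deny', '').split(',') if b.strip()]
        if 'all' in deny or 'none' in allow:
            # {{bots|deny=all|allow=X}}: only an explicit allow overrides
            return user in allow
        if user in deny:
            return False
        # a plain allow list does not deny unlisted bots (per xeno above)
    return True
```

Under this reading, allow_bots('{{bots|allow=SineBot}}', 'ChzzBot') is True, which is exactly the behaviour Chzz found surprising in the C# code.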
As is clear above, the current method of bot exclusion is complicated and difficult to be compliant with. Perhaps it's time to redesign the system? So far, it seems the goals are:
One possibility:
The detector for this scheme is pretty simple:
function allowedToEdit($text, $botnames){
if(preg_match('!{{\s*nobots\s*}}!', $text)) return false;
if(!preg_match_all('!{{\s*(bots|nobots)\s*\|\s*except\s*=(.*?)}}!s', $text, $m)) return true;
$re='!/\s*(?:'.implode('|', array_map('preg_quote', $botnames)).')\s*/!';
for($i=0; $i<count($m[1]); $i++){
$found = preg_match($re, '/'.$m[2][$i].'/');
if($found && $m[1][$i] == 'bots') return false;
if(!$found && $m[1][$i] == 'nobots') return false;
}
return true;
}
The major drawback is that this scheme has no provision for bots trying to honor the existing syntax; they'll probably either ignore the new syntax completely or treat the nobots version as {{nobots}}.
Perhaps the best way to do #5 is to just declare that these templates must be placed in section 0, and then bots need only fetch section 0 to perform the check (possibly like this). Anomie ⚔ 20:46, 16 October 2009 (UTC)
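Fetching only section 0 needs nothing more than the rvsection parameter of prop=revisions. A sketch of building such a request (endpoint URL assumed; in practice a bot would also fetch and parse the JSON):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # endpoint assumed

def section0_query_url(title):
    """Build an API request for just the lead section (section 0) of a
    page, so an exclusion check doesn't download the whole article."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvsection": "0",   # only the lead section
        "titles": title,
        "format": "json",
    }
    return API + "?" + urlencode(params)
```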
(Aside) I await clarification/new code; for now, fortunately my bot edits are trivial, so I just deal with any page containing "{{bot*" or "{{nobot*" manually. Chzz ► 13:24, 19 October 2009 (UTC)
Could a BAGer please take a look at this its been open since September and has been tagged for BAG attention since October. -- Chris 08:53, 8 November 2009 (UTC)
A discussion that may affect some bots running on the toolserver is in progress at WP:VPT#Toolserver IP editing logged-out again. Please comment there. Anomie ⚔ 23:06, 25 November 2009 (UTC)
Does anyone know of a tool to help find abandoned userspace draft articles? I suppose it wouldn't be too hard to write something to analyze the latest database dump based on last edit date. Before I consider writing such a tool, I wanted to check if such a thing already existed. Thanks! -- ThaddeusB ( talk) 05:33, 3 December 2009 (UTC)
I'm curious as to the effectiveness and usefulness of this page, and secondly about the fact that the approved account is not the account doing the edits. The page currently has ~52k revisions; RonaldB has edited it 27414 times, RonaldBot 24323 (not since early 2008). In addition, RonaldB's total edit count is 27487.
I could see updating it once a day, or every hour/half hour, but not every time it finds a new request. I think it's pertinent to revisit the request. Q T C 21:20, 3 December 2009 (UTC)
Would love a second opinion on whether the conclusion I drew about ENWP bot policy was correct in this case. Q T C 01:36, 8 December 2009 (UTC)
Should I file another BRFA to start using PY, rather than AWB, for DrilBot, even if it would be doing essentially the same things (syntax cleanup, etc.)? – Drilnoth ( T • C • L) 19:06, 2 December 2009 (UTC)
There is currently talk at the Albums WikiProject about a possible widespread change across all album articles, and we need some expert advice on what is possible to accomplish with bots. Please see #Implementation of consensus on album reviews for our discussion. Thanks — Akrabbim talk 20:07, 9 December 2009 (UTC)
For Christmas this year I would like Wikipedia:Bots/Requests for approval/Orphaned image deletion bot to be approved. I know I haven't been the best boy this year, and have been slow to respond to queries etc related to that brfa, but seriously its been open since September and I would appreciate it if it was over and done with before New Year. -- Chris 09:19, 15 December 2009 (UTC)
Hi. I think this is the place to ask. :) Currently, Zorglbot does a few tasks at WP:CP, and I'm wondering if it is possible to have its functions there replaced by a bot operator who is a little more active on English Wikipedia. The bot's job is creating new pages each day for transclusion and moving pages up to the "Older than 7 days" section after, well, 7 days. Since there's already a bot doing this, a bot request seems inappropriate, but unfortunately it can be difficult to talk to Schutz about issues because he is not often active here.
For the last instance, see User talk:Schutz#Zorglbot: Expansion on the daily CP listings. We requested a change to the bot to take into account changes in CorenSearchBot on 10/31. I left him a note at his French talk page that same day. On 11/2, Schutz responded, saying he would look into it very soon. Despite follow-up requests on 11/11 and my asking if we should find somebody less busy on 11/12, that's the last we've heard of it. I asked Tizio on 11/16, but when he was also inactive, followed up on the recommendation of another bot operator on 11/22 by asking Schutz if we might have the code for Zorglbot so such small changes could be made by others if he is busy ( Fr:Discussion utilisateur:Schutz#Zorglbot request). Though Tizio came back and implemented our request in DumbBot's daily duties, I have not heard anything from Schutz on this, though he has edited both en:wiki and fr:wiki since.
As occasionally happens to any bot, I'm sure, Zorglbot didn't do its thing last night. Another contributor has notified Schutz (and I've manually performed its task), but it brings home to me the need to have an operator who is accessible. While Schutz is very helpful and accommodating when he is around, my previous communications at his talk page have also suffered lag.
Would it be possible to get a different bot to do this task? -- Moonriddengirl (talk) 12:42, 13 December 2009 (UTC)
The WP 1.0 bot is used to track the assessment templates that are put on the talk pages of articles. These are used by over 1,500 WikiProjects, with over 2.5 million articles tagged. A new version of the WP 1.0 bot is in initial beta testing.
The new version of the bot runs on the Wikimedia toolserver, using their databases and storing the data in its own database. This should open up opportunities for other bots to use the pre-parsed assessment data to do their own analysis. I am open to implementing APIs for this purpose, and I am very open to patches against my code.
I'd also like to find another person interested in working on the bot. Of course you can choose your own level of involvement. I would be happy to have a new developer at any experience level, and working with this bot would be a very nice way to learn database/web programming in the context of a real project. If you're interested, please contact me on my talk page. — Carl ( CBM · talk) 01:51, 17 December 2009 (UTC)
This is due notification that I have been nominated to become a member of the Bot Approvals Group. My nomination is here. @ harej 05:46, 29 December 2009 (UTC)
I'm currently requesting approval for a bot that will place a message on the talk page of any new namespace 0, 6, 10 or 14 article with ambiguous links. See Wikipedia:Bots/Requests_for_approval/WildBot. Josh Parris 03:00, 3 January 2010 (UTC)
Wikipedia:Bots/Requests for approval/hadoop0910 (created 3 December 2009) and Wikipedia:Bots/Requests for approval/mishumia (created 21 August 2009). -- Magioladitis ( talk) 08:10, 7 January 2010 (UTC)
Anyone know how to get a list of what pages redirect to a given page using pywikipedia? -- Cybercobra (talk) 06:56, 10 January 2010 (UTC)
import wikipedia  # the pywikipedia framework

def get_redirects():
    site = wikipedia.getSite()
    target = wikipedia.Page(site, "Pagename")
    for redirect in target.getReferences(follow_redirects=False, redirectsOnly=True):
        # Cache all redirects to this base name
        yield redirect
Josh Parris 10:05, 10 January 2010 (UTC)
I've proposed a minor change to {{ bot}} here, in case anyone has comments. Olaf Davis ( talk) 15:31, 14 January 2010 (UTC)
Hey Wikipedians, I am here to advertise my nomination to be on the Bot Approvals Group. Take a look if you have some time. Tim1357 ( talk) 02:25, 16 January 2010 (UTC)
(Copied from Wikipedia Talk:Stub at User:Xeno's suggestion)
SmackBot has now twice (if not more) removed my marking of stub templates, the latest being at Battle of Temesvár with this edit.
I have got on to the bot's maintainer recently about this when it did it at Battle of Pakozd. Unfortunately the matter there seems to have been dropped and archived before achieving consensus.
In my opinion, a bot should not make the decision to remove a stub, but WP:STUB says "Any editor can remove... without special permission". It depends, then, on whether a bot is regarded as an editor for the purpose of this sentence. I believe in practice, though, it is best to leave such removal to a human's decision. In particular, in these articles, which form part of a series:
{{Expand Hungarian}} and {{Expand section}} tags. WP:STUB says an article should not have both the {{expand}} tag and stub templates. I do not think by extension this should mean any template that happens to start with "expand". I am not asserting this was the reason for removing them, but I would regard it being so as very much too liberal an interpretation of WP:STUB, since those templates have quite distinct usages/meanings and are not merely artifacts of {{expand}}. Clarification there please.
{{underconstruction}} tag. While not of itself even intended to prevent/delay other editors' contributions, it may be taken as evidence indicating that the article is actively being expanded, and so hint that in its current state it could well be considered a stub. Again, I am not suggesting that it will always do so (otherwise it might as well come under the same guidance as {{expand}}) but it may give a hint to a human editor that it is indeed a stub.
In short, I think it entirely inappropriate for a bot to make this kind of decision. For a human editor, with the assistance of AWB, to do so, is entirely another matter, since in good faith I accept that an editor will consider the whole balance of the article and use AWB to find candidates, not automatically trust it (as a bot does).
Any opinions on this matter?
Best wishes Si Trew ( talk) 20:47, 16 December 2009 (UTC)
{{bots|deny=SmackBot}} to the article, to prevent them editing it again (if it is exclusion compliant). – xeno talk 20:53, 16 December 2009 (UTC)
{{expand}} template. [...] there are subjects about which a lot could be written - their articles may still be stubs even if they are a few paragraphs long. As such, it is impossible to state whether an article is a stub based solely on its length, and any decision on the article has to come down to an editor's best judgement (the user essay on the Croughton-London rule may be of use when trying to judge whether an article is a stub). Similarly, stub status usually depends on the length of prose text alone - lists, templates, images, and other such peripheral parts of an article are usually not considered when judging whether an article is a stub.
(outdent) Let's make it simple. Either he edited it for himself, and accidentally was on the SmackBot account – which I accept as a human error if he will only say so – or his bot edited it as part of what it would then do to thousands of other articles. All he has to do is say which, but if he does not, I think he is abusing the bot. Si Trew ( talk) 13:22, 27 December 2009 (UTC)
I have no idea, and little care, about the circumstances surrounding this dispute, but one fact is abundantly clear: a bot does not have the judgement to decide when an article has progressed beyond stub stage. A bot is not an editor, it is a mindless automaton; as such, a bot should not be removing stub templates. If a bot has approval for that task, that approval should, IMO, be withdrawn; if it does not have approval, it should not be doing it anyway. Happy‑ melon 17:32, 27 December 2009 (UTC)
I haven't read the whole discussion yet but I would like to add my opinion on Si Trew's concerns. As far as I understand:
On stub removal from AWB:
Thanks, Magioladitis ( talk) 08:10, 28 December 2009 (UTC)
I had pointed this out some time ago [13]. I have raised the issue again on the bot's talk page, as this is something that doesn't have bot approval and seems unlikely to gain bot approval (as Happy-Melon said above). Not all AWB features are suitable for bots. — Carl ( CBM · talk) 13:50, 28 December 2009 (UTC)
In case this matters for the discussion: AWB Tagger's bug for orphans has been fixed and will be available with the next release, probably next week. Check Wikipedia_talk:AutoWikiBrowser/Bugs/Archive_15#Orphan_tags_part_IV. Further tweaks will be done soon. Check Wikipedia_talk:Orphan#AWB_and_Orphans. After that, I don't see any reason why SmackBot should not be doing genfixes and tagging while doing other stuff. This saves watchlists from being cluttered by continuous bot edits, saves the servers, and saves a lot of work. If someone goes to WP:CHECKWIKI they will see that most errors are human-made and not related to AWB bots. Thanks, Magioladitis ( talk) 16:04, 14 January 2010 (UTC)
I am currently standing for BAG membership. Your input is appreciated. ⇌ Jake Wartenberg 02:53, 26 January 2010 (UTC)
The end of the discussion is that Rich got genfixes turned off. I didn't want that or ask for it. Never during this whole farrago did I turn his bot off or ask for it to be. I asked, more or less, for it to be turned off for stub removal, while the matter was decided. It still seems it has not been, and there is no consensus. I do think that SmackBot should be allowed to go about its general business of cleaning up with genfixes; I have no problem with that at all. Rich put a message on my talk page, to which I have replied, and I left a message at his.
It is bizarre, because really all both he and I want to do is make Wikipedia better. We clashed, the earth still goes around the sun. He should have SmackBot reinstated for genfixes. Anything I can do to help that I will, and have said so. The stub thing I still think is not resolved, but SB should do uncontroversial bot edits and should not be stopped from doing that. Certainly it was not me who asked for that. Si Trew ( talk) 18:26, 30 January 2010 (UTC)
Let me make something clear: AWB has two main procedures: auto-tagger and general fixes. The first really does still have problems in some parts; I can write them down if someone asks. We are working on a way out. SmackBot should keep doing general fixes in addition to its other functions. -- Magioladitis ( talk) 20:07, 30 January 2010 (UTC)
Further to Wikipedia:Bot_owners'_noticeboard/Archive_4#Bot_categories:_Grand_plan, it's about time finding bots became much easier. I have suggestions, but before I pollute the pool with my own thoughts, what are yours? Josh Parris 02:49, 1 February 2010 (UTC)
I have a new bot leaving a note on the talk pages of new articles where it finds links to disambiguation pages in the new article. During approval, consensus was that a message on a user's talk page or a tag within the article itself would be too disruptive. A study during the bot's trial showed that early tagging (on the talk page) was vital to generating editor action, and the reception has thus far been strongly positive.
However, it's been pointed out that there's an increased administrative burden from creating these talk pages, as 7.5% of them are subsequently deleted along with their associated article (under the CSD).
I see six possible solutions (in order of preference):
Is there anything else I can do? Josh Parris 14:49, 19 January 2010 (UTC)
Do we have a filter for empty talk pages? I think the templates should be removed once the dabs are corrected, and if the page is then empty, it should be deleted. I also really hope these messages are kept up to date. In the past we had bot messages that sat on talk pages for years, ending up out-of-date and sometimes disruptive for readers who wanted to read the discussions on those pages. -- Magioladitis ( talk) 17:01, 19 January 2010 (UTC)
Further to this: Wikipedia:Bot requests/Archive 33#db-author bot Josh Parris 00:31, 20 January 2010 (UTC) Also Wikipedia:Bots/Requests for approval/7SeriesBOT Josh Parris 11:17, 4 February 2010 (UTC)
I have been nominated for Bot Approvals Group membership by MBisanz, and I am posting a notification here as encouraged by the bot policy. If you have time, please comment at Wikipedia:Bot Approvals Group/nominations/The Earwig. Thanks, — The Earwig @ 03:39, 3 February 2010 (UTC)
I have accepted MBisanz's nomination of myself for membership of the Bot Approvals Group, and invite interested parties to participate in the discussion and voting. Josh Parris 03:02, 11 February 2010 (UTC)
If your bot has not been sending a User-Agent header, it will now be getting errors while trying to do anything on Wikimedia sites. The fix is to set a distinct User-Agent header for your bot (i.e. don't just copy the UA from IE or Firefox, or use the default in your HTTP library), like you should have been doing already. Anomie ⚔ 19:58, 16 February 2010 (UTC)
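For illustration, a minimal Python sketch of the fix (the bot name and contact URL are placeholders, not a real account):

```python
import urllib.request

# Hypothetical bot name and contact page; substitute your own details.
USER_AGENT = "ExampleBot/1.0 (https://en.wikipedia.org/wiki/User:ExampleBot)"

def build_request(url):
    """Return a Request carrying a distinct User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

req = build_request("https://en.wikipedia.org/w/api.php?action=query&format=json")
print(req.get_header("User-agent"))  # the distinct UA, not the library default
```

The same idea applies in any HTTP library: set the header once, globally, rather than per request.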
Did you notice changes in interwiki bots behaviour? There are some problems with it.
Basilicofresco ( msg) 13:37, 17 February 2010 (UTC)
Hi! Are approved bots allowed to add cosmetic changes (-cc parameter, see cosmetic_changes.py in pywikipedia) to their tasks without any specific authorization request? example Thanks. -- Basilicofresco ( msg) 15:56, 19 February 2010 (UTC)
I'm not entirely sure how to report a bot, but User:FaleBot seems to be doing odd things. It removes perfectly good interwiki links (for example, every interwiki link to 2010 in music). It seems to take offence at interwiki articles with non-ASCII names (such as Akasaka and the Japanese disambig page/ Korean disambig page) and just removes them seemingly randomly.-- Prosperosity ( talk) 18:41, 20 February 2010 (UTC)
Hello. I have opened a BRFA for a new anti-vandalism bot ( User:AVBOT). It reverts vandalism, blanking and test edits. This bot has been tested in Spanish Wikipedia for about 2 years, and it has reverted about 200,000 vandalisms.
Some more features:
Also, the code is available for reviews. Thanks. Regards. emijrp ( talk) 21:46, 22 February 2010 (UTC)
User:Muro Bot renames DEFAULTSORT to ORDENAR in its "cosmetic changes". Examples: here, here, here. Please someone block it until this is fixed. Thanks Hekerui ( talk) 17:04, 27 February 2010 (UTC)
Hey everyone. Does anyone know if this toolserver IP address was ever matched up with which one of the AIV helper bots has been logging out? It seems to be losing its log-in frequently lately. -- Nick— Contact/ Contribs 07:44, 4 March 2010 (UTC)
At Wikipedia:Bots/Requests for approval/Full-date unlinking bot 2 harej has requested another operator adopt his bot (and presumably the BRFA that goes with it). Are there any takers? Josh Parris 12:08, 5 March 2010 (UTC)
Here is another interwiki bot running in -auto and -force mode... [20] I believe it's time to ask developers to block the contemporary use of -force and -auto. What do you think? -- Basilicofresco ( msg) 06:47, 6 March 2010 (UTC)
Here is the answer of the operator. -- Basilicofresco ( msg) 10:48, 6 March 2010 (UTC)
Operator has been warned, but is still running the bot in -auto -force: [21]. There is consensus against these edits. Please take appropriate actions. -- Basilicofresco ( msg) 23:13, 9 March 2010 (UTC)
User talk:Evgeniychub. It seems most of us regular unblock reviewers are pretty clueless about the details of bot operations... Beeblebrox ( talk) 17:39, 11 March 2010 (UTC)
Chillum seems to be serious about retiring, and as a result we've had no bot generated username reports for the last five days. Anyone want to adopt this bot? Beeblebrox ( talk) 18:50, 12 March 2010 (UTC)
I thought I should call attention to the informal discussion at User_talk:Basilicofresco#FrescoBot_adding_WildBot_templates and User_talk:Josh_Parris#WildBot_template_clutter. I've weighed in somewhat assertively and didn't want to seem as if trying to short-cut a discussion that probably belongs on the table for the entire BAG community to work through. It appears FrescoBot has essentially turned Wildbot's DAB and section-link function from an opt-in to a mandatory function wiki-wide. Main problem seems to be that the mass prompting set up by FrescoBot is setting up full-size templates on countless busy talk pages where the top of the page is devoted to setting context for the talk page rather than minor maintenance functions that these bots perform. The tasks are of course potentially useful and impressive. But I wonder if this effect was fully contemplated in the last approvals request of FrescoBot? ... Kenosis ( talk) 02:52, 18 March 2010 (UTC)
Given that the honorable Mr. Wolterding seems to be dead (no edits since January 3rd), what is the Plan B for WolterBot ??? -- AlainR345 Techno-Wiki-Geek 04:44, 9 March 2010 (UTC)
I would like to note that user:Monegasque has recently been adding a host of ultra-detailed categories (such as category:People from Fiume Veneto, which I recently blanked) whose importance is really minor, and which usually contain just one entry. Can somebody intervene and, possibly, revert all his improper additions? Thanks and good work. -- '''Attilios''' ( talk) 15:36, 26 March 2010 (UTC)
The Southern Railway (Great Britain) article has had an interwiki link added to the pt:Southern San Paulo Railway article. The link has been added by Xqbot, TXikiBot, Volkovbot and Volkovbot again. No doubt a similar series of edits has been performed on the corresponding Portuguese article; I've not checked this, as that is a matter for pt:wiki. Further addition of this link will be seen as disruptive and the bots prevented from editing. Please can we get this sorted. Mjroots ( talk) 03:21, 27 March 2010 (UTC)
There are eight pages (one each for eight languages: ca, de, en, fr, ja, no, pt, simple), and I have examined each. Seven of the eight deal with what we in the UK know as Southern Railway (Great Britain); the other ( pt:Southern San Paulo Railway) deals with a railway in Brazil, and the latter was incorrectly being added as an interwiki to the other seven, and vice versa. User:EdJogg removed all seven interwikis from the pt page, but I was concerned that a 'bot would add them all back in, so I went to each of the other seven and manually removed the pt interwiki. This has cleared the immediate problem. I have since examined the histories of the eight pages in question, and built up the following chronological sequence. All times in UTC.
26 March 2010
27 March 2010
It seems to me that the original error was made by User:ZéroBot and it spread from there. To fix it properly required an edit to all eight pages -- Redrose64 ( talk) 14:59, 27 March 2010 (UTC)
Hello. I recently wrote the article Backfire (Cocktail) and User:Xtzou put a thingy in there to have it deleted. So I looked at his contribs and wow, this is something. Obviously this is not what I would call a normal user. Is this a sockpuppet account of an experienced WP user? And is this somebody partly using a bot?
His contrib list: [23]
Please have a look into it. Dropdead567 ( talk) 14:06, 2 April 2010 (UTC)
If your bot(s) use any specific categories, feel free to tag those categories with {{ Bot use warning}}. For examples of use, see Category:Candidates for speedy deletion and Category:Wikipedia files for deletion. עוד מישהו Od Mishehu 13:26, 6 April 2010 (UTC)
Please note, about 10-15 minutes ago there was a security change to the login system (which is now live on all WMF wikis), which will break the current login systems (API users included). For more information see:
bugzilla:23076.
Peachey88 ( Talk Page · Contribs) 00:39, 7 April 2010 (UTC)
Would any fix have to be implemented by individual bot owners or is there a general fix? I'm not very good at coding/programming, so I'm not sure if that was answered above or not. TN X Man 03:19, 8 April 2010 (UTC)
Hello, my Pywikipedia bot is correctly logged in via the API, but it no longer edits with template.py, nor with replace.py. Does anyone have an idea of the problem? (I use a completely standard setup.) Regards -- Hercule ( talk) 09:26, 8 April 2010 (UTC)
Overview of frameworks and their status since the update. If anyone else knows any other frameworks, please add them to the list: Peachey88 ( Talk Page · Contribs) 10:15, 8 April 2010 (UTC)
I have accepted Kingpin13's nomination for membership in the Bot Approvals Group, and per the instructions invite interested parties to participate in the discussion and voting. Thank you, – xeno talk 19:24, 17 April 2010 (UTC)
Who is running/owning/responsible for SQLBot-Hello ( talk · contribs)?
Admin SQL ( talk · contribs) hasn't edited Wikipedia in just over a year, but the bot is still welcoming people en masse.
(Context: I saw the welcome template (and broken signature) at this new user's talkpage, User talk:Lisa-maria syrett, and I assumed the editor named in the signature had added it, so informed him that his signature was breaking, and he pointed out that a bot had added the template, not him. Hence after a little digging, I'm here.) Thanks. -- Quiddity ( talk) 01:20, 25 April 2010 (UTC)
I know at least two welcome templates that use {{REVISIONUSER}} to fill in the welcoming user's talk page link during substitution (one has done so for a year, the other was recently changed to do so by me). Obviously, if bots go around and subst those templates, the bot's talk page will be linked instead.
Do we still have active bots substituting templates during their rounds? If so, I would suggest passing a parameter like botsubst=1 (noting that the parameter bot=... is already in use in some templates) so that the template can react to that. Not at all critical of course, and as far as I'm concerned we can just as well decide to simply not use any fancy magic words on bot-substituted templates. Current usage is not at all critical, as far as I'm aware, and could easily be removed.
Amalthea 17:18, 3 May 2010 (UTC)
I've temporarily blocked CommonsDelinker for the duration of the disruption at Commons to prevent further damage to the local project. WP:AN notice Q T C 01:26, 8 May 2010 (UTC)
At WP:ANI#User:X!'s adminbots there was a discussion on whether to desysop his two adminbots since he has resigned as a bureaucrat and admin. As of now several have agreed that this is unnecessary, but please reply there, especially if you disagree with that. PleaseStand (talk) 20:10, 9 May 2010 (UTC) 21:17, 9 May 2010 (UTC)
Programmer looking to work for food and shelter! - Hey folks. I've been inactive for a long while. I used to be a BAG member and a rather prolific bot operator, but my departure to college left me with no time and sporadic internet access. This summer I'm moving into my new apartment, which will hopefully give me more space to set up my linux laptop again. If there are any of my former bots that haven't been replaced, including the IRC ones for election times, I have the code and a system to run them on. I posted a longer bit on my talk page; if you'd direct any comments to me there, that'd be great. Let me know if there's anything I can do. ST47 ( talk) 05:50, 13 May 2010 (UTC)
I am adding this note here at the request of OlEnglish ( talk · contribs).
In a recent incident, after I had used a 'final warning', SineBot added a level-1 warning for failing to sign. This resulted in the admin not seeing my final warning and declining a request to block (following additional disruption) at AN/I.
Chzz ► 07:18, 18 May 2010 (UTC)
Hello,
I am working on a new release of the Perlwikibot framework with ST47. This release will contain some breaking changes, so if you use Perlwikibot, please be sure to check the documentation prior to upgrading. It will use the API wherever possible, and will handle some basic autoconfiguration to make authoring scripts much easier.
If you'd like to suggest improvements to the framework, please file a bug at http://perlwikipedia.googlecode.com – and patches are welcome, of course. You can check out our page on OpenHatch for ways to get involved.
— mikelifeguard@ enwiki:~$ 18:30, 22 May 2010 (UTC)
Bot-savvy users and BAG members may be interested by this thread on ANI - Kingpin 13 ( talk) 16:31, 26 May 2010 (UTC)
The bots BenzolBot ( talk · contribs), Dinamik-bot ( talk · contribs) and RibotBOT ( talk · contribs) keep adding links to Alforja, Spain to the article Pannier. The non-English pages an:Alforja, ca:Alforja, es:Alforja, eu:Alforja, fr:Alforja, it:Alforja, nl:Alforja, ru:Альфоржа, uk:Алфоржа, vi:Alforja and war:Alforja are all articles about the town in Spain. Alforja used to redirect to Saddlebag but this morning I changed it to a dab page. One of the bot owners previously explained to me that they could only fix their bot and that non-English bots would continue to replicate the error. I don't really understand it but it would be nice to fix this problem somehow. -- Dbratland ( talk) 19:02, 29 May 2010 (UTC)
You could put "{{nobots}}" on it - Kingpin13 ( talk) 20:00, 29 May 2010 (UTC)

Okay, that's an overstatement, but why does this regex eat ALL text on an article, rather than stopping at the end of the template's curly braces? (examples: [24], [25]) I've changed it after the linked revision to the following:
$content =~ s#\{\{current(.*?)\}\}##i;
But I can't entirely explain if this works, either. I've also put in a failsafe ("how many chars am I removing?") to keep it from doing so again. tedder ( talk) 06:02, 1 June 2010 (UTC)
You could use \{\{current([^{}]*?)\}\} instead, which will be fine unless these current event templates can take a nested template as an argument. Rjwilmsi 07:26, 1 June 2010 (UTC)
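To make the difference concrete, here is a small self-contained demonstration of the three regex variants discussed above (the sample wikitext is made up):

```python
import re

text = "{{current|type=sport}} Some article text. {{Infobox|a={{b}}}} More text."

# Greedy match: runs to the LAST "}}" in the text, eating everything between.
greedy = re.sub(r"\{\{current(.*)\}\}", "", text, flags=re.I)

# Non-greedy: stops at the first "}}"; safe unless the template itself nests.
lazy = re.sub(r"\{\{current(.*?)\}\}", "", text, flags=re.I)

# Negated class: cannot cross a brace at all, so braces elsewhere are harmless.
klass = re.sub(r"\{\{current([^{}]*?)\}\}", "", text, flags=re.I)

print(greedy)  # " More text." -- everything up to the last }} is gone
print(lazy)    # " Some article text. {{Infobox|a={{b}}}} More text."
```

In this example the non-greedy and negated-class forms give the same result; they differ only when the {{current}} template itself contains braces.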
TF2 Wiki is a community wiki covering topics surrounding the game Team Fortress 2. We are beginning a process of moving to a new domain and server. As none of the current administrators have access to the FTP site we are manually moving some 5,800 files across. There is no problem in downloading all these files for the transfer (as we have systems in place to automate it), but the thought of manually uploading them all to the new server gives me goosebumps. Would anyone here be willing to offer any bot-related assistance or advice? Is this the best place to ask such a question? — surlyanduncouth ( talk) 19:14, 4 June 2010 (UTC)
Hi, this is just a notice that I have opened a brfa for an adminbot to delete images that are available as identical copies on the Wikimedia Commons per WP:CSD#F8 -- Chris 10:13, 8 June 2010 (UTC)
I have not created a bot yet, but I would like to know if redundancy would be a desirable feature (prompted by the recent ClueBot downtime). The implementation I am thinking of would involve two copies of the bot running on separate computers and Internet connections (e.g. one on a home computer, the other on Toolserver). The two instances would periodically check each other's most recent edit timestamp, and if there is no activity for some time, the other bot would take over. Additionally, the active bot would periodically hand off its work to the inactive bot to prevent its failure from going unnoticed. Thoughts? PleaseStand (talk) 17:51, 13 June 2010 (UTC)
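The takeover decision itself can be sketched as a pure function (names and the 30-minute threshold are illustrative assumptions, not part of any real bot; the API polling around it is omitted):

```python
from datetime import datetime, timedelta

# Hypothetical threshold: how long the peer may be silent before we take over.
TAKEOVER_AFTER = timedelta(minutes=30)

def should_take_over(peer_last_edit, now, threshold=TAKEOVER_AFTER):
    """Return True if the peer instance has been idle past the threshold.

    peer_last_edit -- timestamp of the peer bot's most recent edit
    now            -- current time (passed in so the logic is testable)
    """
    return now - peer_last_edit > threshold

now = datetime(2010, 6, 13, 18, 0)
print(should_take_over(datetime(2010, 6, 13, 17, 0), now))   # True: idle 1 hour
print(should_take_over(datetime(2010, 6, 13, 17, 45), now))  # False: idle 15 min
```

The periodic hand-off described above would simply reset the peer's timestamp, keeping both instances' failure detection exercised.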
It seems like it would be good to create a very extensible, configurable PHP bot framework, capable of being run on any MediaWiki installation, so as to avoid needless duplication of coding. Rather than put every possible function under the sun into the core (including those added by obscure API extensions), it should have lots of hooks and allow for new bot functions to be added as plugins; hopefully a whole library of such plugins can be created, analogously to what has been done with m:pywikipediabot. So then the question arises, which bot framework should be the starting point? I created this table in an effort to determine which bot frameworks are the most powerful. wikibot.classes.php is pretty popular, and I put a version here that has been modified to be more readily configurable, or at least to allow configuration from a single file. Probably code from its many forks (including admin code from botclasses.php) could be merged into it to create one supremely all-powerful superbot like something out of Transformers. But Pillar looks pretty powerful too. Other than by having them square off, I'm not sure what criterion would be best for determining the bot that should rule them all. Their codebases are totally different so it does not appear there is much prospect of merging them. Tisane ( talk) 21:19, 2 June 2010 (UTC)
I think this is a good idea to remove the massive overlap of PHP bot frameworks we have (even my framework itself is forked into two different versions atm). This page might be useful in helping you to determine the different functions that each framework supports etc. I'll be happy to help in the coding, although the extent of my help will depend on the amount of free time I have -- Chris 03:07, 5 June 2010 (UTC)
Keeping the suggestions here in mind, I'm halfway through writing a new framework that incorporates ideas from wikitools, pillar, botclasses.php, SxWiki, and ClueBot, plus adding functionality such as plugins and Non-Wikipedia-Centricism. You can watch my progress here. It's still a MASSIVE work in progress, and fixmes are everywhere. ( X! · talk) · @635 · 14:14, 10 June 2010 (UTC)
(email redacted)) to the project? Also we should probably create a page onwiki for it, and maybe create a planning/discussion page as well? -- Chris 03:27, 12 June 2010 (UTC)
- Jarry1250 Humorous? Discuss. 13:32, 13 June 2010 (UTC)
So now that we've gotten all the base function calls laid out, what additional suggestions for plugins/hooks would you guys have? The more, the better. ( X! · talk) · @160 · 02:50, 15 June 2010 (UTC)
We'd need a bot which updates the number of articles under pending changes, and a bot which reports certain kinds of edits if possible; please see Wikipedia_talk:Pending_changes/Trial#Bots. The trial will begin in a few hours. Cenarium ( talk) 17:45, 15 June 2010 (UTC)
If you run a bot on nightshade.toolserver.org, you may be interested in this WMF bug: bugzilla:23982. — Carl ( CBM · talk) 19:27, 15 June 2010 (UTC)
Does anyone happen to know a good webhost on which to run a bot, other than the toolserver, that will let you run it 24/7 without killing it? I use Bluehost, but they have a known issue with processes getting mysteriously spontaneously killed all the time, even when you use a dedicated IP. Thanks, Tisane talk/ stalk 03:44, 15 June 2010 (UTC)
Sorry about this, but I really don't have the time, or regular enough internet access, to maintain the wubbot ( talk · contribs) anymore. Would anyone like to take it over? Its only task is to check the subpages of Wikipedia:WikiProject Deletion sorting twice a day, and remove any AfD's that have been closed. It's written in Python (using pywikipedia), and currently runs from the toolserver. The code is available at User:The wubbot/source - apologies for the mess! There's a few bugs and possible improvements listed on the bot's userpage that someone more skilled than me may be able to fix. Reply here if you're interested - I would be extremely grateful, and I apologise for not being available to take care of this over the past few months. the wub "?!" 15:51, 17 June 2010 (UTC)
I've written a JavaScript framework for bot scripting that I would like to invite feedback on. The motivation behind the framework is more-or-less solely that I think JavaScript is a wonderful and accessible little language for doing ad hoc pieces of code. The name I've given the framework, Luasóg, a diminutive form of the Gaelic word luas, meaning speed, is supposed to reflect this motivation.
A part of the over-arching project is a web-based IDE for writing and executing scripts using the framework. The project also includes a version of the IDE hosted in an Air application so that cross-domain scripting can be executed outside of the sandbox of the main browsers. If you are willing to give the framework a spin, the Air application is likely to be your best bet. You can download it here.
This is a development release and, though not very buggy, it is not stable, particularly the IDE element. It should execute scripts as expected however and so please tell me if it doesn't (or indeed any other problems you come across).
The release notes for this release contain a simple example script. The full API is documented here. The 'request' method gives access to any function of the MediaWiki API. Other methods act as convenient wrappers to common functions. So far the number of methods is essentially limited to "log in, get content, replace content, and log out again". Developing this was my first experience in bot scripting so I would particularly like to get feedback on the kind of common bot functions that should be included in later releases. If anyone wants to write methods for the framework then they are more than welcome to do so.
As a note of caution, scripts executed using the framework are throttled by default to one-call-every-10-seconds. This can be changed via the 'speed' property.
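As a rough illustration of that default one-call-every-10-seconds behaviour, here is a minimal rate limiter sketched in Python rather than the framework's JavaScript (class and attribute names are invented):

```python
import time

class Throttle:
    """Block so that successive calls are at least `interval` seconds apart."""

    def __init__(self, interval=10.0):
        self.interval = interval  # analogous to the framework's 'speed' property
        self._last = None

    def wait(self):
        now = time.monotonic()
        if self._last is not None:
            remaining = self.interval - (now - self._last)
            if remaining > 0:
                time.sleep(remaining)
        self._last = time.monotonic()

# Demonstrate with a short interval: three calls enforce two gaps.
t = Throttle(interval=0.05)
start = time.monotonic()
for _ in range(3):
    t.wait()
print(time.monotonic() - start >= 0.1)  # True
```

Calling `wait()` before every API request is all a bot script needs to stay under the limit.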
Thanks in advance, --RA ( talk) 09:08, 26 April 2010 (UTC)
prop=info|revisions&intoken=edit or the like). Getting it wrong screws up the "deleted since you started editing" check, which could possibly cause you to recreate a page that was just deleted or to fail to edit a page that was just undeleted. (Edit conflict handling is one of my pet peeves.) I am glad to see AssertEdit usage in there. If you really want to be safe regarding deletions and page creations, ensure that the data for the edit function always includes the prop=info output and pass "createonly=1" if prop=info includes "missing" or "nocreate=1" if it doesn't. Anomie ⚔ 02:04, 1 June 2010 (UTC)
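Anomie's createonly/nocreate advice reduces to a few lines; this hypothetical helper picks the safety flag from a prop=info result:

```python
def edit_safety_params(page_info):
    """Pick the safety flag described above from a prop=info result dict.

    If the page was missing when we fetched it, insist it still be missing
    (createonly=1); otherwise insist it still exist (nocreate=1).
    """
    if "missing" in page_info:
        return {"createonly": 1}
    return {"nocreate": 1}

print(edit_safety_params({"missing": ""}))           # {'createonly': 1}
print(edit_safety_params({"pageid": 123, "ns": 0}))  # {'nocreate': 1}
```

Merging the returned dict into the edit request makes the API reject the edit if the page's existence changed underneath the bot.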
I wrote a prototype bot using your framework. It is the unapproved bot User:PSBot – at this time it is only editing one page in its user space. The source code is at User talk:PSBot/Deprods. While coding, I noticed a lack of some important features:
I hope you keep up the good work, and if you can provide feedback on the design of my bot, please do so. This is my first one. PleaseStand (talk) 06:37, 20 June 2010 (UTC)
I asked for VP comments on a proposal to let bots fill in url |accessdate= automatically from the date the url was first added. I wasn't sure whether to post this under Bot Policy talk, so posting here. — Hellknowz ▎talk 16:58, 31 May 2010 (UTC)
I was curious, and pulled a list of accounts currently flagged with both "bot" and other groups.
Note that giving a bot "autoreviewer" is currently useless, as the bot group already has the autopatrol right. It looks like that's the only group totally redundant for a bot, as even "confirmed" has rights that a (non-autoconfirmed) bot lacks: patrol, move, movestable, reupload, collectionsaveascommunitypage, collectionsaveasuserpage, and upload. Anomie ⚔ 20:35, 22 June 2010 (UTC)
I just noticed that AntiAbuseBot's approval stated that it was only approved "until such time as the AbuseFilter extension or a substantially identical technical feature is turned on by the sysadmins at the English Wikipedia. At such a time, a report should be made to the bot owners' noticeboard requesting that Chris G either to turn off the bot or seek re-approval in a Wikipedia:BRFA". So technically, that should be done. But since the AbuseFilter's ability to block users has not been activated, and a discussion at ANI occurred in August 2009 with consensus for the bot to continue running, I intend to re-approve Wikipedia:Bots/Requests for approval/AntiAbuseBot as simply "approved" without condition, but with the suggestion that it be revisited if the AbuseFilter's block functionality is ever activated instead of forcing a new BRFA with a most likely foregone conclusion. Any objections? Anomie ⚔ 21:01, 22 June 2010 (UTC)
In my opinion RaptureBot is not checking sufficiently before replacing Wikipedia images with Commons images. I have reverted several that popped up in my watchlist which have incorrect attribution on Commons. [26] [27] [28] [29]. Incorrect attribution is cause for deletion on Commons which could leave the articles without images. The bot should check that the attribution on Commons matches the attribution here before replacing. At the very least it should be tagging the files with the information that they do not match to warn administrators not to delete the en.wiki version without checking, but probably the right thing is not to replace the image until the problem is resolved.
There are also several complaints on the bot's talk page that the bot has replaced an image with one at Commons threatened with deletion. This really should not be happening. This replacement of a fair use image immediately caused the Wikipedia image to be deleted. This was despite a challenge to the copyright tag on Commons. Potentially, this could have resulted in both images being deleted.
Since the bot owner is showing no inclination to pro-actively clean up these errors him/herself, leaving it to others to sort out, I think that the bot should be forbidden from doing any further runs until tighter checking is implemented and approved. SpinningSpark 18:04, 25 June 2010 (UTC)
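The kind of check being requested could be as simple as refusing the swap unless the two attribution strings agree; a hypothetical sketch (a real check would also compare licenses and upload history):

```python
def normalize(attribution):
    """Crude normalization: lowercase and collapse whitespace."""
    return " ".join(attribution.lower().split())

def safe_to_replace(local_attribution, commons_attribution):
    """Only allow the swap when the attributions agree after normalization."""
    return normalize(local_attribution) == normalize(commons_attribution)

print(safe_to_replace("John  Smith", "john smith"))  # True
print(safe_to_replace("John Smith", "Jane Doe"))     # False
```

On a mismatch the bot could tag the pair for human review instead of replacing, as suggested above.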
So after a few weeks of work, the new PHP framework that was called for above is reaching a public beta point. It's got most of the common functions that bots use, and those are fairly stable. The only things left to do are to fill in the remaining obscure functions and plugins, do bugfixing, etc before the stable 1.0 is released. In the beta stage, we need multiple people to test out the framework using their own bots, as this is the only way many of the bugs will be found. It can be downloaded here, and the manual is here. When bugs are found, they can be listed here, and feel free to submit a patch. I hope it works well for all of you. :) ( X! · talk) · @082 · 00:57, 28 June 2010 (UTC)
The User:ArticleAlertbot has been down for a few months now, due to some sort of login problem. The coder, User:B. Wolterding, has not logged in since March, and thus is not available to fix this. I believe in April, a couple of users discussed the issue with the bot's operator, User:Legoktm, and User:Tedder agreed to try to fix this. However, he appears not to have a toolserver account (or something like that), so he has been unable to do this. Is anyone here willing to give it a try? I'm sure many WikiProjects would appreciate it. (I posted a similar request in the village pump in late March, but nobody replied). Bramble claw x 00:47, 12 June 2010 (UTC)
Does anyone happen to know of a BOT that removes Red Links from articles. -- intraining Jack In 23:14, 2 July 2010 (UTC)
ArbCom is considering lifting the restriction imposed in Wikipedia:Requests for arbitration/Date delinking#Lightmouse automation, subject to BAG approval of Wikipedia:Bots/Requests for approval/Lightbot 4. As part of BAG's mandate is to gauge community consensus for proposed bot tasks and Lightbot's former activities were highly controversial, I invite all interested editors to join that discussion to ensure that community consensus is in fact in favor of this task. Thanks. Anomie ⚔ 17:32, 13 July 2010 (UTC)
During the code review for the removal of this preference setting, it was noted that it may affect bots that were using this setting. In this case, the bots would need to be modified to explicitly mark their edits as minor. I'm sure this would be a fairly simple code addition, but it could also be accomplished through javascript as explained in this VPT thread. – xeno talk 12:50, 15 July 2010 (UTC) [with thanks to User:Department of Redundancy Department for pointing this out to me]
Looking at bug 17450, it would appear there is some possibility that we could get an XMPP-based XML-format recent changes feed that would include revision text. I wouldn't suppose this bug would be too hard to fix, considering that MediaWiki already has IRC recent changes feed capability. Anyway, the revision text would be quite useful, since it would eliminate the need for bots to hit Wikipedia's API for said text. Therefore, I'll probably end up creating an extension, or finishing this one. Does anyone have a particular XMPP PHP library they think is particularly good, and would recommend using for this project? I found this one, but their SVN repository doesn't seem to be working at the moment. Hopefully, with the right library, we can make that Jabber shit happen soon! Thanks, Tisane talk/ stalk 15:47, 16 July 2010 (UTC)
Do you think there would be support for a bot to replace barelinks with bot-generated titles, much as DumZiBoT used to? Tisane talk/ stalk 19:10, 25 July 2010 (UTC)
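The core of a DumZiBoT-style task is extracting the &lt;title&gt; of the fetched page; a minimal sketch (real bots must also handle character encodings, redirects and error pages):

```python
import html
import re

def extract_title(page_html):
    """Pull the contents of the first <title> element, or None."""
    m = re.search(r"<title[^>]*>(.*?)</title>", page_html, re.I | re.S)
    if not m:
        return None
    return html.unescape(m.group(1)).strip()

print(extract_title("<html><head><title>Example &amp; Co.</title></head></html>"))
# Example & Co.
```

The bot would then rewrite the bare link as a titled citation, much as DumZiBoT did.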
An hour or so ago, we were apparently updated to r70061. This seems to have broken the use of "max" as a value for the various limit parameters in the API; attempting to use them will now give a fatal error. This has already been reported as T26564, hopefully they fix it soon. If necessary, a workaround is to use explicit limit values in your API queries (typically 5000 for flagged bots, 500 for some expensive queries). Anomie ⚔ 03:49, 28 July 2010 (UTC)
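Concretely, the workaround replaces limit=max with a computed value. The bot-flag figures below come from the note above; the unflagged 500/50 values are the standard MediaWiki defaults, so verify them against your wiki:

```python
def explicit_limit(has_bot_flag, expensive_query=False):
    """Choose an explicit API limit instead of 'max'.

    Flagged bots get apihighlimits: 5000 normally, 500 for expensive
    queries; other accounts get the standard 500/50.
    """
    if has_bot_flag:
        return 500 if expensive_query else 5000
    return 50 if expensive_query else 500

print(explicit_limit(True))                        # 5000
print(explicit_limit(True, expensive_query=True))  # 500
print(explicit_limit(False))                       # 500
```

Once T26564 is fixed, bots can go back to sending "max" and letting the server clamp the value.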
Comments are invited at the above-linked thread. – xeno talk 14:46, 4 August 2010 (UTC)
Hi guys, in the past couple of weeks I implemented a bot to easily mass move files from the English Wikipedia to Commons. The bot focuses on self-published works. The bot is now beta and I'm looking for people to test it. See Commons:User:Multichill/Imagecopy for more information. multichill ( talk) 16:34, 8 August 2010 (UTC)
Hello bot operators! I have been nominated for the bot approval group and would appreciate input at Wikipedia:Bot Approvals Group/nominations/EdoDodo. Thanks. - EdoDodo talk 02:46, 17 August 2010 (UTC)
User:Coren has been missing since July 20th, and User:CorenSearchBot is down. (I e-mailed Coren to see if he was all right on August 10th or 11th and have received no answer, which worries me. Coren is always responsive! :/)
I consider this pretty urgent for copyright cleanup, as CorenSearchBot typically finds dozens of valid copyright problems in a given day. I don't know how many of those will be found by new article reviewers. Some of them may be being tagged for WP:CSD#G12, but I'm afraid that a good many are likely to be overlooked. User:Xeno tells me that the source code for CorenSearchBot is published at [30]. Is it possible to get a temporary replacement bot or one that can run in parallel with CorenSearchBot?
There is some urgency to identifying and eradicating copyright problems quickly. As we all know, Wikipedia is very widely mirrored and reused around the internet, and this content doesn't have to be published long before substantial damage can be done to the interests of copyright holders...and the reputation of Wikipedia. -- Moonriddengirl (talk) 14:43, 19 August 2010 (UTC)
# End of customizable exclusions
#
return "already-tagged" if $text =~ m/{{csb-/;
Ah, clearly I missed that part of the code. I'll go ahead and fire mine back up again and we'll see how a race between CSBot and VWBot goes... VernoWhitney ( talk) 16:23, 21 August 2010 (UTC)
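For anyone running a parallel copyvio bot, the skip-if-already-tagged guard quoted above is easy to mirror; a Python sketch of the same check:

```python
import re

def already_tagged(page_text):
    """True if the page already carries a CSBot suspected-copyvio tag
    ({{csb-...}}), mirroring the Perl guard quoted above."""
    return re.search(r"\{\{csb-", page_text) is not None
```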
I am proud to announce the first major release of the Peachy MediaWiki Bot Framework, version 1.0!
After almost 3 months of hard work, I believe we are at a point where the framework is stable enough to be officially released to the public. In those three months, multiple bots, including SoxBot, MessageDeliveryBot, RaptureBot, and many others, have been operating nonstop on top of the Peachy framework. I can only hope that other PHP bot operators follow along.
New features since the public beta include...
Upgrading from 0.1beta should be for the most part seamless. Very few breaking changes have been implemented. The minimum PHP version has been bumped to 5.2.1, as many of the internal features use functions introduced in this version. Other than that, scripts that were written for 0.1beta should work in 1.0.
If you have not yet written anything in Peachy, now would be a good time to learn! The Peachy manual has been redesigned for this release, and with an intuitive guide for getting started, you should be writing your first bot in minutes!
Naturally, there may be a few bugs that will arise in the first release. The issue tracker is located at http://code.google.com/p/mw-peachy/issues/list. Don't hesitate to report something that doesn't seem right! We can't fix something we don't know about.
To download version 1.0, see the Peachy Wiki. Instructions for downloading the nightly compressed archives and SVN repos are located there.
Thank you for supporting Peachy, and enjoy!
( X! · talk) · @155 · 02:42, 30 August 2010 (UTC)
I'm not sure if this is the right place to put this, but per [31] the archive indexer bot has been running while logged out. I remember there was some API change a couple months ago that caused breakage like this, but I thought all affected bots would have been fixed by now. 67.122.209.135 ( talk) 21:54, 1 September 2010 (UTC)
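One defensive measure against this class of breakage is to check login state before every write. A sketch using the userinfo result from action=query&meta=userinfo (the anon key is present for anonymous sessions; the surrounding HTTP fetch is omitted):

```python
def is_logged_in(userinfo):
    """userinfo: the 'userinfo' dict from action=query&meta=userinfo.
    Anonymous sessions carry an 'anon' key, so refuse to edit then."""
    return "anon" not in userinfo
```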
Hello. Per the ArbCom conditions on my talk page (#4 in particular), I seek some advice.
I currently use no bots/scripts which automate any API-WRITE functionality. However, for purposes of my anti-vandal tool STiki, I do have scripts making a large quantity of API READ requests. Am I correct to assume that such READ scripts require no approval from this group?
On a different note, is there any way to get in contact with API folks and provide them my IP addresses, etc. -- so they know my requests are purpose-driven? Thanks, West.andrew.g ( talk) 18:12, 11 August 2010 (UTC)
We're now discussing 'bot-assisted solutions to this cleanup problem. Uncle G ( talk) 13:05, 6 September 2010 (UTC)
We're now at the stage where the 'bot is ready to roll, and no-one has voiced an objection. (Indeed, to the contrary: Several people want to go further, and mass delete the articles.)
If the 'bot goes ahead, this will probably light up some people's watchlists like Diwali. Be warned. Uncle G ( talk) 04:33, 10 September 2010 (UTC)
Comments are invited at Wikipedia:Administrators' noticeboard/Incidents#VolkovBot overly eager to remove interwiki links, especially from parties familiar with the m:interwiki.py function of the pywikipedia framework. – xeno talk 15:15, 9 September 2010 (UTC)
Ever since DASHBot became an anti-vandal bot, I notice that some users are getting double warnings after DASHBot reverts one edit.
Here are five recent examples:
What I want is the issue of double warnings to be fixed. mechamind 9 0 22:23, 18 August 2010 (UTC)
Untagged, that's because my bot has been down for 13 hours. Cobi, I had that error too. What I ended up doing was querying the API right after reverting an edit, to see if the top revision was listed as DASHBot's. Tim 1357 talk 17:32, 19 August 2010 (UTC)
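Tim's check can be sketched as follows (field names follow action=query&prop=revisions; the HTTP plumbing is left out): only warn if the newest revision is still the bot's own revert.

```python
def reverted_by_us(revisions, my_bot_name):
    """revisions: newest-first revision dicts for the page.
    Warn the vandal only if the newest revision is our own revert;
    otherwise another anti-vandal bot got there first."""
    return bool(revisions) and revisions[0].get("user") == my_bot_name
```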
I suppose it's
mechamind 9 0 02:47, 13 September 2010 (UTC)
Someone needs to take a look at Nubian Jak; the interwikis are a complete disaster and it's above my skill to fix. ΔT The only constant 13:05, 17 September 2010 (UTC)
My bot fixes a wide range of wikilink syntax problems and some redundancies. Recently user Magioladitis asked me to also add the syntax consolidation "[[architect|architects]]" --> "[[architect]]s" (already fixed by AWB, for example). What do you think? -- Basilicofresco ( msg) 12:24, 14 September 2010 (UTC)
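For reference, the requested consolidation is a one-line substitution; a hedged Python sketch (this is not FrescoBot's or AWB's actual code):

```python
import re

def consolidate_piped_links(text):
    """Turn [[architect|architects]] into [[architect]]s whenever the
    piped text is the link target plus a lowercase suffix."""
    return re.sub(r"\[\[([^|\]]+)\|\1([a-z]+)\]\]", r"[[\1]]\2", text)
```

Links whose piped text differs from the target (e.g. [[Paris|the city]]) are left untouched.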
Your comments are welcome: Wikipedia:Bots/Requests for approval/FrescoBot 7 -- Basilicofresco ( msg) 10:15, 20 September 2010 (UTC)
Could someone look at Wikipedia:Bots/Requests_for_approval/CleanupListingBot and either pass or fail the bot? It's been a few days. Smallman12q ( talk) 22:17, 23 September 2010 (UTC)
I'm having some trouble at the diglyceride article, in that the article has apparently become associated with non-identical articles on other language wikipedias, and now a bot ( User:TXiKiBoT, though I'm guessing other bots would do the same) is reinserting the links when I try to remove them. I guess this kind of bot looks at associated articles across many different language versions, so maybe the issue has to be addressed everywhere simultaneously. I don't know all the languages involved, particularly because some of them are in non-Latin alphabets, so that kind of fix is beyond my abilities. Also, the bot owner doesn't seem to be responding, at least not as quickly as his own bot. So I'm hoping someone here can help fix this. Sakkura ( talk) 15:17, 1 October 2010 (UTC)
{{bots|deny=TXiKiBoT}} while fixing the interwiki map. – xeno talk 15:51, 1 October 2010 (UTC)
I've done this for monecious or whatever it is a number of times. There are also a bunch of articles where there is no simple bijective mapping, but interwikis are still useful; see below for an example. Not high on my list of priorities, but something I have given a fair amount of thought over the years. Interwiki maintenance should drop to the square root of its current cost, i.e. O(n) edits rather than O(n²), once the Summer of Code "Reasonably efficient interwiki transclusion" work goes live. Rich Farmbrough, 11:50, 6 October 2010 (UTC).
...how do all of the "most well known interwiki bots" run all day through all of the Wikimedia wikis? I'm not any good with scripting (well, I can do basic things) and just learned to use the Unix shell (after using Windows all of my life). Is there something like a shell script or a special function of the pywikipedia scripts to keep the bot running continuously? P.S.: My bot ( User:Diego Grez Bot) runs from the Toolserver, if that is worth something. -- Diego Grez ( talk) 17:06, 2 October 2010 (UTC)
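On the Toolserver the usual answer is a cron job (optionally under screen), but a simple keep-alive wrapper also works. A sketch with an illustrative command line; the pause keeps a crash loop from hammering the servers:

```python
import subprocess
import time

def run_repeatedly(cmd, pause_seconds=300, max_runs=None):
    """Re-launch `cmd` whenever it exits; max_runs=None means forever."""
    runs = 0
    while max_runs is None or runs < max_runs:
        subprocess.call(cmd)
        runs += 1
        time.sleep(pause_seconds)
    return runs

# e.g. run_repeatedly(["python", "interwiki.py", "-autonomous", "-all"])
```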
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Query articles in "Category:Bad category" do {{edit categories|swap|Bad category|Good category}};
Also, in the future I can see opening up the bot's source code so other developers can add features to it. So think of it as a wiki-bot. :) So what do you all think: do you see any issues with it, and will editors actually use it and contribute to it? d'oh! talk 09:33, 14 September 2010 (UTC)
See below for the updated proposal. -- d'oh! [talk] 16:22, 29 October 2010 (UTC)
Sounds interesting. -- Chris 13:38, 15 September 2010 (UTC)
This sounds a bit like an old project of mine. The language I adopted is a dialect of Lisp called Scheme, and I did at one point have a prototype running. Since then I've done little work on the bot side apart from building a bot framework in Scheme based on the API.
I also tried to make the job control aspects of the system independent of the language implementation, and that aspect of the project may be of interest. In essence, once a user is authorized to use the bot they can invoke it by typing a command on a page in their user space (the talk page would be okay for low volume users). Because it's script-based, it enabled editors to develop a script on the Wiki, and then other authorized editors who may not have any programming skills can invoke the script. The bot would typically run in a jail on a host server that monitored pages, performed security checks to filter out unauthorized edits for instance, and then placed results into the invoking user's userspace. I felt that at least initially it would be too risky to permit the bot to perform changes outside userspace, and I envisaged it being used mainly to perform statistical analysis and construct lists of articles for work by a human user or another bot.
You could also implement scripts in Perl, PHP, Java, even C, but the main issue there is ensuring that the code cannot mess with the host system, remembering that anybody once authorized can write a script for the bot. The extreme reprogrammability of Scheme makes it ideal for this (though some people have an allergy to all the brackets). Creating a safe environment for running scripts is quite easy in R5R Scheme.
If anybody is interested, please leave a message on my user talk page. -- TS 13:55, 29 October 2010 (UTC)
object item {
    method doSomethingCool ( #something ) {
        ... do something to #something ...
        return self.doSomethingCoolToo(#something);
    }
    method doSomethingCoolToo ( #something ) {
        ... do something to #something ...
        return #something;
    }
}

#pages = pages.category('Jargon');
foreach (#pages as #page) {
    #website = website.url('http://example.com/' + #page.title);
    #result = item.doSomethingCool(#website.content);
    #page.content = #result;
}
The updated idea is a platform to run code (scripts) from Wikipedia created by editors, the code will be picked up from pages on Wikipedia (via a template like {{ User:D'oh!/Coding}}). The code syntax will be object-oriented and will look very similar to PHP, C and Ruby, e.g.:
#pages = articles.category('Computer jargon').limit(10);
foreach (#pages as #page) {
    delete #page.categories.find('Computer jargon');
}
The platform will ask for and use the editor's bot account to make edits to Wikipedia; this makes the editor responsible for the edits their code makes. Also, the editing privileges are transferred over to the platform, so editors can only make edits which they could make with their main account. The platform will run the code either continuously or as a one-off. -- d'oh! [talk] 16:22, 29 October 2010 (UTC)
Hey. Miszabot just put 12 threads into 12 archives. Closedmouth has suggested I get an admin to help clean up. Admin or not, does anybody know why Misza did this? -- I dream of horses ( talk) 16:54, 10 June 2009 (UTC)
I have asked in a couple of places about a bot that has created some 6000 algae articles. Every article I've reviewed has major errors of fact due to the way the bot was programmed to extract data from the data base and the bot owner's lack of understanding of the organisms (this point from discussion on the bot owner's talk page). A discussion about problems with the articles the bot created is at [1]. That issue can be discussed there.
What I would like to know here is why a bot was programmed at all to
1. remove more specific redirects to create less specific ones? [2]
2. delete disambiguation pages to create single redirects instead of adding the redirect to the disambiguation page? [3]
Should a bot be deleting entire disambiguation pages to make room for redirects? Is there a set of rules for bots that covers this, and was it maybe missed when programming Anybot?
-- 69.226.103.13 ( talk) 21:12, 16 June 2009 (UTC)
I drew this up to hopefully save time and effort, whilst reducing the demoralising effect. Any contributions would be handy, particularly if you went for a bot whilst still inexperienced as a normal editor. - Jarry1250 ( t, c, rfa) 11:08, 20 June 2009 (UTC)
I have started a community RFC about a proposal for a bot to unlink dates. Please see Wikipedia:Full-date unlinking bot and comment here. -- Apoc2400 ( talk) 10:25, 22 June 2009 (UTC)
This seems to be an unapproved bot, making edits. I've left my own message on the talk page, as I'm not sure if there's some template for such things. I feel uneasy with a username report (which would be because it has "bot" in its name while it's not one), so hopefully the owner will read my message. If someone more knowledgeable in such things could take a look at it, that would be appreciated. Cheers - Kingpin 13 ( talk) 20:27, 22 June 2009 (UTC)
I had to block SoxBot; it was making a mess of WP:CHU. Q T C 01:50, 3 July 2009 (UTC)
r52190 changed the API so maxlag errors return with an HTTP status code 503 (where formerly they returned with a 200 and an API error message). If your bot code uses maxlag, properly handles HTTP errors, and doesn't treat the two in the same way, chances are you will need to update your code before the next scap. And if your bot code doesn't do all that, maybe it's time to update it Anomie ⚔ 20:19, 20 June 2009 (UTC)
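The distinction to implement is roughly this (a sketch, not any framework's actual API): maxlag backoffs now arrive as a 503 with a Retry-After header and must not be lumped in with genuine server failures.

```python
def classify_api_response(status, headers):
    """Classify an API HTTP response after r52190."""
    if status == 503 and "Retry-After" in headers:
        return "maxlag"        # lag backoff: sleep, then retry the request
    if status >= 500:
        return "server-error"  # genuine failure: back off or give up
    return "ok"
```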
Because of this and other complaints on the mediawiki-api list, I've reverted the change in r53353. It never went live on Wikipedia.
\o/ Q T C 10:16, 16 July 2009 (UTC)
Bot operators who use the API to download lists of category members will want to watch bugzilla:19640. The 'cmnamespace' parameter is currently being ignored by the API until performance issues are sorted out. — Carl ( CBM · talk) 19:21, 10 July 2009 (UTC)
Hi, could someone verify this is an approved bot - or indeed a bot at all. It has no userpage or current talk page, and is tagging edits as minor, labelling them as robot:..... All it appears to be doing is adding ar: interwiki links, but as I don't read Arabic so well, I cannot verify their validity. Also, I seem to recall an editor should not claim to be a bot when not (or was that the other way round?) Addendum - sorry, quick link to contribs :-) --ClubOranje T 12:51, 11 July 2009 (UTC)
This bot is reverting legitimate edits and the owner refuses to fix it.
Reverting test edits should be done by a dedicated bot (and probably already is). Such a bot should match the entire text "[http://www.example.com link title]" as added by the editing buttons, not just indiscriminately revert any link that contains example.com. — Preceding unsigned comment added by 71.167.73.189 ( talk • contribs)
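The IP's suggestion amounts to anchoring on the exact boilerplate the toolbar inserts rather than on the domain; a sketch (assuming the toolbar's default text, not any bot's actual rule):

```python
import re

BUTTON_BOILERPLATE = re.compile(r"\[http://www\.example\.com link title\]")

def is_button_test_edit(added_text):
    """True only for the untouched editing-button boilerplate, not for
    every link that merely contains example.com."""
    return BUTTON_BOILERPLATE.search(added_text) is not None
```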
Per edit http://en.wikipedia.org/?title=User:ClueBot_III/Indices/User_talk:AzaToth&diff=302790668&oldid=302781497 I've blocked this bot. → Aza Toth 01:05, 19 July 2009 (UTC)
Hello there. Just to let you know that I (Kingpin13) have been nominated for BAG membership. Per the requirements, I'm "spamming" a number of noticeboards, to request input at my nomination, which can be found at Wikipedia:Bot Approvals Group/nominations/Kingpin13. Thanks - Kingpin 13 ( talk) 08:01, 20 July 2009 (UTC)
I often use the template sandboxes (" Template:X1", " Template:X2", etc.). However ClueBot often clears the sandbox while I'm partway through testing. I drew this to the owner's attention here. Cobi informs me that the bot clears the template every 3 hours. The sandboxes indicate that they are cleared every 12 hours. Ironically, neither is correct. The cleaning seems to be rather erratic. The times of last clearance by the bot:-
I request that if possible, the bot can be adjusted so that it doesn't clear a template sandbox within, say, 15 minutes of the last edit. Is this possible? Thanks. Axl ¤ [Talk] 06:44, 3 August 2009 (UTC)
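The requested guard is straightforward; a sketch with timestamps as timezone-aware datetimes (15 minutes per Axl's suggestion):

```python
from datetime import datetime, timedelta, timezone

def should_clear(last_edit, now, quiet_minutes=15):
    """Only reset a sandbox that has sat unedited for quiet_minutes."""
    return now - last_edit >= timedelta(minutes=quiet_minutes)

now = datetime(2009, 8, 3, 12, 0, tzinfo=timezone.utc)
quiet = should_clear(now - timedelta(minutes=45), now)  # quiet long enough
busy = should_clear(now - timedelta(minutes=5), now)    # someone is testing
```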
Very well, unless there are any opposes, I'm gonna ask Cobi to make his bot clean it after it's been unedited for 30 minutes, and we'll keep SoxBot cleaning it every 12 hours, and every time the header is removed (I think the bots will work together very well like that). - Kingpin 13 ( talk) 03:33, 18 August 2009 (UTC)
I am concerned about User:DefaultsortBot, it appears the owner is not taking responsibility for the bot edits. Please see User talk:DefaultsortBot#Did I get it wrong?, Thanks for taking a look, good night. Jeepday ( talk) 00:46, 8 August 2009 (UTC)
Ser Amantio di Nicolao ( talk · contribs) seems to be automatically mass creating pages using AWB without a bot flag, someone might want to look at this? Peachey88 ( Talk Page · Contribs) 08:12, 16 August 2009 (UTC).
Rockfang suggested that bots should not edit copyvio pages. I am not convinced. However I have put {{ Nobots}} in the {{ Copyvio}} template. Comments? Rich Farmbrough, 18:35, 17 August 2009 (UTC).
In the case in point another bot was interwiki-ing. Seems that would have been useful if the page was stubbified. Rich Farmbrough, 21:37, 17 August 2009 (UTC).
A small addition to the tasks for KslotteBot.
Can this be approved? -- Kslotte ( talk) 15:11, 24 August 2009 (UTC)
Pls see Wikipedia_talk:Requests_for_adminship#RFA_closing_time — Rlevse • Talk • 00:05, 25 August 2009 (UTC)
Just want to get a wider opinion on this bot's recent edits. At first they seemed unrelated to any of its previous BRfAs [5] [6] [7] [8], so I gave him a warning that it was running an unapproved task. Now, after talking with the operator, it turns out the edits are going back to update its previous edits. After the clarification, I'd probably let it slide under the previous approved task, but wanted a wider opinion on it. Q T C 09:36, 31 August 2009 (UTC)
The use of this template in main-space seems to have spread. I suggested its use on copyvios (and created it in the first place), but I'd like to see maybe only one or two articles protected in this way while problems are dealt with by the bot owners. A list of protected articles and (provisional) details of one of the applications of the template are here. Rich Farmbrough, 14:57, 6 September 2009 (UTC).
I would like BAG members and other bot operators to look at the bugs by Citation bot and how they are being handled, including its operator's interactions with users reporting bugs. There are a lot of bugs and maybe poor communication, in my opinion.
I think additional input at this stage from experienced bot operators could help in making this a relatively bug-free bot.
In my opinion all currently listed bugs should be fixed and investigated before the bot is run again. If the bot is operating, reported bugs should receive a response from the bot operator on the bug board in a reasonable time (less than a month? a few days? a week?) and be marked resolved where applicable. If there is a question about what the bot should be adding, the bot operator should post a link showing a prior discussion, or raise the discussion on an appropriate board, or not add/change that parameter. I understand this bot does a lot of useful and tedious work, but if there are problem areas it should be scaled back and the problems fixed, rather than run with bugs, in my opinion. I'm posting a link to this on Martin's page. [9] -- 69.225.12.99 ( talk) 22:19, 10 September 2009 (UTC)
[10] -- 69.225.12.99 ( talk) 09:33, 12 September 2009 (UTC)
This bot is cleaning the sandbox every few minutes. That makes it impossible to do any testing there. Furthermore, the owner is on a wikibreak and the bot's talkpage is editprotected. So this is basically a "wild" bot.
For these two reasons, each in itself being sufficient, I think this bot should be turned off. Debresser ( talk) 06:59, 13 September 2009 (UTC)
I don't know what the WP policy is on bots from bare IP addresses. But the IP address 75.69.0.58, for whom I assume good faith, appears to my untrained eye to possibly be a bot. So I thought I would bring it to your attention: 500 edits in 9 days, an edit rate that seems higher than could be manually sustained given the typical edit complexity I noted. But the alleged bot appears to be doing good work, at least on the article where he cleaned up after me earlier today. Cheers. N2e ( talk) 18:15, 25 September 2009 (UTC)
Please see the talk between me and Cobi at User talk:ClueBot Commons/Archives/2009/September#Duplicate month headings. I noticed that ClueBot resets the monthly clock on vandalism warnings, and Cobi explained that this feature is intentional and has been discussed numerous times before. The reason that I'm bringing it up here is that it seems to me that the feature, although clearly well-intentioned, may be worth re-evaluating. It seems to me that human editors will generally continue the progression from first to second etc etc warning over the course of a longer time period than the bot is currently instructed to do. I suggest that this issue be given a fresh look. Thanks. -- Tryptofish ( talk) 22:00, 28 September 2009 (UTC)
In the interests of full disclosure:
Since CobraBot a little while ago completed its initial pass over all the articles containing a book infobox (yay!), I modified it to run manually on pages specified by the operator; I've been trawling the internal logs the bot produces when it skips a page to (if possible) manually fix the infobox formatting or add an ISBN where there was none, so the bot can then be run on the individual affected pages. Unfortunately, doing so caused a bug where the edit summary didn't get set, and as a result the default, generic pywikipedia edit summary was used. This has since been fixed and the bot did not otherwise malfunction (i.e. the edits were not misformatted or wrong), but I only noticed it after ~125 edits when I chanced to look at the bot's contribs page. My apologies; I've noted, and resolved for the future, that even seemingly minor changes deserve a level of further testing. -- Cybercobra (talk) 06:36, 1 October 2009 (UTC)
On five different occasions, bots have added the Swedish version of the 2009 Eurocup Formula Renault 2.0 season as an interwiki. Please can they be set to avoid this situation, as the two series are not the same. Regards. Cs-wolves (talk) 14:46, 9 October 2009 (UTC)
The Northern Artsakh ( | talk | history | protect | delete | links | watch | logs | views) in Russian WP was actually deleted on May 16, but some bots continue to add the false-positive interwiki ru:Северный Арцах/Temp. Could we fix that? Brand [t] 07:26, 13 October 2009 (UTC)
FoxBot ( BRFA · contribs · actions log · block log · flag log · user rights) is making a series of changes to date pages, adding interwiki links such as bcl:Nobyembre 5, which is fine. However, at the same time this bot is replacing the agreed &ndash; with a literal dash.
The agreed format of the date pages is to use &ndash; because most people adding items to those pages will not know the difference between a hyphen and a dash. Using &ndash; to create a dash reduces the chance that people will use a hyphen. The use of a hyphen messes up something (various checking bots, I think), but whatever the reason, the hyphen is NOT USED on those date pages.
Will somebody first stop this bot, and then will somebody with rollback please revert all the edits made in error by this bot. -- Drappel ( talk) 14:09, 16 October 2009 (UTC)
I am confused here; the changes such as [11] are replacing the HTML entity &ndash; with the equivalent Unicode character U+2013. The rendered content is not changed at all. Even if FoxBot doesn't do this substitution, eventually someone will come across the articles with AWB, which also does this substitution. In short, there is essentially no way to prevent HTML entities from being converted to equivalent Unicode characters. — Carl ( CBM · talk) 14:42, 16 October 2009 (UTC)
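The equivalence Carl describes can be checked directly (using Python's html module as a stand-in for the renderer): the entity and the raw character produce identical text.

```python
import html

# "&ndash;" and the literal U+2013 character render as the same string.
assert html.unescape("2001&ndash;2009") == "2001\u20132009"
```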
Sorry for the late reply. I have currently disabled cosmetic changes. However, I strongly disagree with the way en wiki uses ndash and mdash. On all other wikis they are automatically replaced by the right Unicode character. Why should this wiki be different from all the others? First of all, ndash and mdash are complicated for all users who aren't familiar with them, and even more so for "foreign" users who believe they should be changed. What one could do to prevent this from happening (as it certainly will, since AWB also makes these changes automatically) is either to change the rules on when to use ndash or mdash (which would be the easiest), or to change the scripts of pywikipedia, AWB and AutoEd, or to create a bot that reverts these edits (which wouldn't be a real solution, I believe). - Foxie001 ( talk) 17:29, 16 October 2009 (UTC)
The &ndash; was implemented on all of the date pages in April 2009 by User:Juliancolton. There was no discussion of the change, but when I asked him about it (while it was happening), the argument seemed sound. I am more convinced of the soundness of the argument now. It was never added to the project guidelines. Drappel has explained it pretty well. If there is to be a distinction in the MOS for use of dash vs. hyphen, the easiest way to implement this in a page with hundreds of dashes is to do it this way. I never knew the difference before and I would always have used a hyphen. This only rarely gets "fixed" by a bot or editor with AWB. To keep the pages consistent, using &ndash; is the best way to do it. If we remove them, we will have a mix of dashes and hyphens that a bot would need to clean up (if one could). -- Mufka (u) (t) (c) 22:09, 16 October 2009 (UTC)
I recently used the C# regex code from Template:Bots, but noticed that it gave a TRUE value for a page that contained {{bots|allow=SineBot}} (when my bot is ChzzBot).
I was doing a little independent check, and skipping pages with "{{bot" or "{{nobot" entirely in any case, to process those by hand later.
If my understanding of the compliance is correct, this should actually have been FALSE, ie the bot was not specifically permitted.
I'm not good at regex, hence I recoded it in what might be a bit of a long-winded fashion, which (for a month) you could check in this pastebin.
If someone thinks it's worth changing to that code on the page, suitably tidied up, then feel free.
Cheers, Chzz ► 03:32, 16 October 2009 (UTC)
bool DoesAllowBots(string botusername, string pagetext)
{
    if (Regex.IsMatch(pagetext, "\\{\\{(\\s|)(bots|nobots)(\\s|)(\\|((.)+|)(allow)(\\s|)(=)((.)+|)(" + botusername + "))"))
        return true;
    return !Regex.IsMatch(pagetext, "\\{\\{(\\s|)(bots|nobots)(\\s|)(\\|((.)+|)((optout|deny)(\\s|)(=)(\\s|)(all)|(optout|deny)(\\s|)(=)((.)+|)(" + botusername + ")|(allow(\\s|)=(\\s|)none))|\\}\\})");
}
/// <summary>
/// checks if a user is allowed to edit this article
/// using bots and nobots tags
/// </summary>
/// <param name="articleText">The wiki text of the article.</param>
/// <param name="user">Name of this user</param>
/// <returns>true if you can edit, false otherwise</returns>
public static bool CheckNoBots(string articleText, string user)
{
return
!Regex.IsMatch(articleText,
@"\{\{(nobots|bots\|(allow=none|deny=(?!none).*(" + user.Normalize() +
@"|awb|all)|optout=all))\}\}", RegexOptions.IgnoreCase);
}
{{bots|deny=all|allow=BotName}}. IMO, my code is better :) - Kingpin 13 ( talk) 12:58, 16 October 2009 (UTC)
"{{bots|allow=SineBot}}" is not enough to deny any bots anyway. Since all bots are allowed by default, explicitly allowing one with this template doesn't seem to opt out anything. – xeno talk 13:12, 16 October 2009 (UTC)
"{{bots|allow=SineBot}}" on it. So what we're trying to do (I think) is replace the C# code at Template:Bots - Kingpin 13 ( talk) 13:16, 16 October 2009 (UTC)
{{bots|deny=all|allow=<bots to allow>}}, which makes more sense to me, but I'd be happy to modify my code accordingly? - Kingpin 13 ( talk) 13:37, 16 October 2009 (UTC)
As is clear above, the current method of bot exclusion is complicated and difficult to be compliant with. Perhaps it's time to redesign the system? So far, it seems the goals are:
One possibility:
The detector for this scheme is pretty simple:
function allowedToEdit($text, $botnames){
    if(preg_match('!{{\s*nobots\s*}}!', $text)) return false;
    if(!preg_match_all('!{{\s*(bots|nobots)\s*\|\s*except\s*=(.*?)}}!s', $text, $m)) return true;
    $re='!/\s*(?:'.implode('|', array_map('preg_quote', $botnames)).')\s*/!';
    for($i=0; $i<count($m[1]); $i++){
        $found = preg_match($re, '/'.$m[2][$i].'/');
        if($found && $m[1][$i] == 'bots') return false;
        if(!$found && $m[1][$i] == 'nobots') return false;
    }
    return true;
}
The major drawback is that this scheme has no provision for bots trying to honor the existing syntax; they'll probably either ignore the new syntax completely or treat the nobots version as {{nobots}}.
Perhaps the best way to do #5 is to just declare that these templates must be placed in section 0, and then bots need only fetch section 0 to perform the check (possibly like this). Anomie ⚔ 20:46, 16 October 2009 (UTC)
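For bots not written in PHP, the proposed detector ports directly; here is a Python sketch of the same logic (same semantics as the PHP detector above, and just as hypothetical until the scheme is adopted):

```python
import re

def allowed_to_edit(text, botnames):
    """{{nobots}} bans everyone; {{bots|except=/A/B/}} bans only the
    listed bots; {{nobots|except=/A/B/}} allows only the listed bots."""
    if re.search(r"\{\{\s*nobots\s*\}\}", text):
        return False
    tags = re.findall(r"\{\{\s*(bots|nobots)\s*\|\s*except\s*=(.*?)\}\}",
                      text, re.S)
    if not tags:
        return True
    name_re = re.compile(r"/\s*(?:%s)\s*/" %
                         "|".join(map(re.escape, botnames)))
    for kind, listed in tags:
        found = name_re.search("/" + listed + "/") is not None
        if found and kind == "bots":
            return False
        if not found and kind == "nobots":
            return False
    return True
```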
(Aside) I await clarification/new code; for now, fortunately my bot edits are trivial, so I just deal with any page containing "{{bot*" or "{{nobot*" manually. Chzz ► 13:24, 19 October 2009 (UTC)
Could a BAGer please take a look at this its been open since September and has been tagged for BAG attention since October. -- Chris 08:53, 8 November 2009 (UTC)
A discussion that may affect some bots running on the toolserver is in progress at WP:VPT#Toolserver IP editing logged-out again. Please comment there. Anomie ⚔ 23:06, 25 November 2009 (UTC)
Does anyone know of a tool to help find abandoned userspace draft articles? I suppose it wouldn't be too hard to write something to analyze the latest database dump based on last edit date. Before I consider writing such a tool, I wanted to check whether such a thing already existed. Thanks! -- ThaddeusB ( talk) 05:33, 3 December 2009 (UTC)
I'm curious about the effectiveness and usefulness of this page, and secondly about the fact that the approved account is not the account doing the edits. The page currently has ~52k revisions; RonaldB has edited it 27414 times, RonaldBot 24323 (not since early 2008). In addition, RonaldB's total edit count is 27487.
I could see updating it once a day, or every hour/half hour, but not every time it finds a new request. I think it's pertinent to revisit the request. Q T C 21:20, 3 December 2009 (UTC)
Would love a second opinion on whether the conclusion I drew about ENWP bot policy was correct in this case. Q T C 01:36, 8 December 2009 (UTC)
Should I file another BRFA to start using PY, rather than AWB, for DrilBot, even if it would be doing essentially the same things (syntax cleanup, etc.)? – Drilnoth ( T • C • L) 19:06, 2 December 2009 (UTC)
There is currently talk at the Albums WikiProject about a possible widespread change across all album articles, and we need some expert advice on what is possible to accomplish with bots. Please see #Implementation of consensus on album reviews for our discussion. Thanks — Akrabbim talk 20:07, 9 December 2009 (UTC)
For Christmas this year I would like Wikipedia:Bots/Requests for approval/Orphaned image deletion bot to be approved. I know I haven't been the best boy this year, and have been slow to respond to queries etc. related to that brfa, but seriously it's been open since September and I would appreciate it if it was over and done with before New Year. -- Chris 09:19, 15 December 2009 (UTC)
Hi. I think this is the place to ask. :) Currently, Zorglbot does a few tasks at WP:CP, and I'm wondering if it is possible to have its functions there replaced by a bot operator who is a little more active on English Wikipedia. The bot's job is creating new pages each day for transclusion and moving pages up to the "Older than 7 days" section after, well, 7 days. Since there's already a bot doing this, a bot request seems inappropriate, but unfortunately it can be difficult to talk to Schutz about issues because he is not often active here.
For the most recent instance, see User talk:Schutz#Zorglbot: Expansion on the daily CP listings. We requested a change to the bot to take into account changes in CorenSearchBot on 10/31. I left him a note at his French talk page that same day. On 11/2, Schutz responded, saying he would look into it very soon. Despite follow-up requests on 11/11 and my asking if we should find somebody less busy on 11/12, that's the last we've heard of it. I asked Tizio on 11/16, but when he was also inactive, I followed up on the recommendation of another bot operator on 11/22 by asking Schutz if we might have the code for Zorglbot so such small changes could be made by others if he is busy ( Fr:Discussion utilisateur:Schutz#Zorglbot request). Though Tizio came back and implemented our request in DumbBot's daily duties, I have not heard anything from Schutz on this, though he has edited both en:wiki and fr:wiki since.
As occasionally happens to any bot, I'm sure, Zorglbot didn't do its thing last night. Another contributor has notified Schutz (and I've manually performed its task), but it brings home to me the need to have an operator who is accessible. While Schutz is very helpful and accommodating when he is around, my previous communications at his talk page have also suffered lag.
Would it be possible to get a different bot to do this task? -- Moonriddengirl (talk) 12:42, 13 December 2009 (UTC)
The WP 1.0 bot is used to track the assessment templates that are put on the talk pages of articles. These are used by over 1,500 WikiProjects, with over 2.5 million articles tagged. A new version of the WP 1.0 bot is in initial beta testing.
The new version of the bot runs on the Wikimedia toolserver, using their databases and storing the data in its own database. This should open up opportunities for other bots to use the pre-parsed assessment data to do their own analysis. I am open to implementing APIs for this purpose, and I am very open to patches against my code.
I'd also like to find another person interested in working on the bot. Of course you can choose your own level of involvement. I would be happy to have a new developer at any experience level, and working with this bot would be a very nice way to learn database/web programming in the context of a real project. If you're interested, please contact me on my talk page. — Carl ( CBM · talk) 01:51, 17 December 2009 (UTC)
This is due notification that I have been nominated to become a member of the Bot Approvals Group. My nomination is here. @ harej 05:46, 29 December 2009 (UTC)
I'm currently requesting approval for a bot that will place a message on the talk page of any new namespace 0, 6, 10 or 14 article with ambiguous links. See Wikipedia:Bots/Requests_for_approval/WildBot. Josh Parris 03:00, 3 January 2010 (UTC)
Wikipedia:Bots/Requests for approval/hadoop0910 (created 3 December 2009) and Wikipedia:Bots/Requests for approval/mishumia (created 21 August 2009). -- Magioladitis ( talk) 08:10, 7 January 2010 (UTC)
Anyone know how to get a list of what pages redirect to a given page using pywikipedia? -- Cybercobra (talk) 06:56, 10 January 2010 (UTC)
def get_redirects(site, pagename):
    target = wikipedia.Page(site, pagename)
    # Cache all redirects to this base name
    for redirect in target.getReferences(follow_redirects=False, redirectsOnly=True):
        yield redirect
Josh Parris 10:05, 10 January 2010 (UTC)
I've proposed a minor change to {{ bot}} here, in case anyone has comments. Olaf Davis ( talk) 15:31, 14 January 2010 (UTC)
Hey Wikipedians, I am here to advertise my nomination to be on the Bot Approvals Group. Take a look if you have some time. Tim1357 ( talk) 02:25, 16 January 2010 (UTC)
(Copied from Wikipedia Talk:Stub at User:Xeno's suggestion)
SmackBot has now twice (if not more) removed my marking of stub templates, the latest being at Battle of Temesvár with this edit.
I have got on to the bot's maintainer recently about this when it did it at Battle of Pakozd. Unfortunately the matter there seems to have been dropped and archived before achieving consensus.
In my opinion, a bot should not make the decision to remove a stub, but WP:STUB says "Any editor can remove... without special permission". It depends, then, whether a bot is regarded as an editor for the purpose of this sentence. I believe in practice, though, it is best to leave such removal to a human's decision. In particular, in these articles which form part of a series:
The {{Expand Hungarian}} and {{Expand section}} tags. WP:STUB says an article should not have both the {{expand}} tag and stub templates. I do not think by extension this should mean any template that happens to start with "expand". I am not asserting this was the reason for removing them, but I would regard it being so as very much too liberal an interpretation of WP:STUB, since those templates have quite distinct usages/meanings and are not merely artifacts of {{expand}}. Clarification there please.
The {{underconstruction}} tag. While not of itself even intended to prevent/delay other editors' contributions, it may be taken as evidence indicating that the article is actively being expanded, and so hint that in its current state it could well be considered a stub. Again I am not suggesting that it will always do so (otherwise it might as well come under the same guidance as {{expand}}) but it may give a hint to a human editor that it is indeed a stub.
In short, I think it entirely inappropriate for a bot to make this kind of decision. For a human editor, with the assistance of AWB, to do so, is entirely another matter, since in good faith I accept that an editor will consider the whole balance of the article and use AWB to find candidates, not automatically trust it (as a bot does).
Any opinions on this matter?
Best wishes Si Trew ( talk) 20:47, 16 December 2009 (UTC)
{{bots|deny=SmackBot}}
to the article, to prevent them editing it again (if it is exclusion compliant). – xeno talk 20:53, 16 December 2009 (UTC)
{{expand}} template. [...] there are subjects about which a lot could be written - their articles may still be stubs even if they are a few paragraphs long. As such, it is impossible to state whether an article is a stub based solely on its length, and any decision on the article has to come down to an editor's best judgement (the user essay on the Croughton-London rule may be of use when trying to judge whether an article is a stub). Similarly, stub status usually depends on the length of prose text alone - lists, templates, images, and other such peripheral parts of an article are usually not considered when judging whether an article is a stub.
(outdent) Let's make it simple. Either he edited it for himself, and accidentally was on the SmackBot account – which I accept as a human error if he will only say so – or his bot edited it as part of what it would then do to thousands of other articles. All he has to do is say which, but if he does not, I think he is abusing the bot. Si Trew ( talk) 13:22, 27 December 2009 (UTC)
I have no idea, and little care, about the circumstances surrounding this dispute, but one fact is abundantly clear: a bot does not have the judgement to decide when an article has progressed beyond stub stage. A bot is not an editor, it is a mindless automaton; as such, a bot should not be removing stub templates. If a bot has approval for that task, that approval should, IMO, be withdrawn; if it does not have approval, it should not be doing it anyway. Happy‑ melon 17:32, 27 December 2009 (UTC)
I haven't read the whole discussion yet but I would like to add my opinion on Si Trew's concerns. As far as I understand:
On stub removal from AWB:
Thanks, Magioladitis ( talk) 08:10, 28 December 2009 (UTC)
I had pointed this out some time ago [13]. I have raised the issue again on the bot's talk page, as this is something that doesn't have bot approval and seems unlikely to gain bot approval (as Happy-Melon said above). Not all AWB features are suitable for bots. — Carl ( CBM · talk) 13:50, 28 December 2009 (UTC)
In case this matters for the discussion: AWB Tagger's bug for orphans has been fixed and will be available with the next release, probably next week. Check Wikipedia_talk:AutoWikiBrowser/Bugs/Archive_15#Orphan_tags_part_IV. Further tweaks will be done soon. Check Wikipedia_talk:Orphan#AWB_and_Orphans. After that, I don't see any reason why SmackBot should not be doing genfixes and tagging while doing other stuff. This saves watchlists from being cluttered by continuous bot edits, saves the servers, and saves a lot of work. If someone goes to WP:CHECKWIKI, they will see that most errors are human-made and not related to AWB bots. Thanks, Magioladitis ( talk) 16:04, 14 January 2010 (UTC)
I am currently standing for BAG membership. Your input is appreciated. ⇌ Jake Wartenberg 02:53, 26 January 2010 (UTC)
The end of the discussion is that Rich got genfixes turned off. I didn't want that or ask for it. Never during this whole farrago did I turn his bot off or ask for it to be. I asked, more or less, for it to be turned off for stub removal, while the matter was decided. It still seems it has not been, and there is no consensus. I do think that SmackBot should be let to go about its general business cleaning up genfixes; I have no problem with that at all. Rich put a message on my talk page, to which I have replied, and I left a message at his.
It is bizarre, because really all both he and I want to do is make Wikipedia better. We clashed, the earth still goes around the sun. He should have SmackBot reinstated for genfixes. Anything I can do to help that I will, and have said so. The stub thing I still think is not resolved, but SB should do uncontroversial bot edits and should not be stopped from doing that. Certainly it was not me who asked for that. Si Trew ( talk) 18:26, 30 January 2010 (UTC)
Let me make something clear: AWB has two main procedures: auto-tagger and general fixes. The first part really does still have problems in some parts. I can write them down if someone asks. We are working a way out. SmackBot should keep doing general fixes in addition to its other functions. -- Magioladitis ( talk) 20:07, 30 January 2010 (UTC)
Further to: Wikipedia:Bot_owners'_noticeboard/Archive_4#Bot_categories:_Grand_plan, it's about time finding bots became much easier. I have suggestions, but before I pollute the pool with my own thoughts, what are yours? Josh Parris 02:49, 1 February 2010 (UTC)
I have a new bot leaving a note on the talk pages of new articles where it finds links to disambiguation pages in the new article. During approval, consensus was that a message on a user's talk page or a tag within the article itself would be too disruptive. A study during the trial of the bot showed that early tagging (on the talk page) was vital to generating editor action, and the reception has thus far been strongly positive.
However, it's been pointed out that there's an increased administrative burden from creating these talk pages, as 7.5% of them are subsequently deleted along with their associated article (under the CSD).
I see six possible solutions (in order of preference):
Is there anything else I can do? Josh Parris 14:49, 19 January 2010 (UTC)
Do we have a filter for empty talk pages? I think after the dabs are corrected the templates should be removed, and if the page is then empty it should be deleted. In fact, I really hope that these messages are kept updated. In the past we had bot messages that stood on talk pages for years, ending up out-of-date and sometimes disruptive for readers who wanted to read the discussions on the talk page. -- Magioladitis ( talk) 17:01, 19 January 2010 (UTC)
Further to this: Wikipedia:Bot requests/Archive 33#db-author bot Josh Parris 00:31, 20 January 2010 (UTC) Also Wikipedia:Bots/Requests for approval/7SeriesBOT Josh Parris 11:17, 4 February 2010 (UTC)
I have been nominated for Bot Approvals Group membership by MBisanz, and I am posting a notification here as encouraged by the bot policy. If you have time, please comment at Wikipedia:Bot Approvals Group/nominations/The Earwig. Thanks, — The Earwig @ 03:39, 3 February 2010 (UTC)
I have accepted MBisanz's nomination of myself for membership of the Bot Approvals Group, and invite interested parties to participate in the discussion and voting. Josh Parris 03:02, 11 February 2010 (UTC)
If your bot has not been sending a User-Agent header, it will now get errors when trying to do anything on Wikimedia sites. The fix is to set a distinct User-Agent header for your bot (i.e. don't just copy the UA from IE or Firefox, or use the default in your HTTP library), as you should have been doing already. Anomie ⚔ 19:58, 16 February 2010 (UTC)
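A minimal Python sketch of setting such a header; the bot name, URL, and contact address here are placeholders, not a required format:

```python
import urllib.request

# Hypothetical bot identity: a distinct UA should name the bot and give
# contact info, rather than copying a browser string or using the
# HTTP library's default.
USER_AGENT = "ExampleBot/1.0 (https://en.wikipedia.org/wiki/User:ExampleBot; operator@example.com)"

req = urllib.request.Request(
    "https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json",
    headers={"User-Agent": USER_AGENT},
)
# urllib.request.urlopen(req) would now send the distinct User-Agent
# with the request instead of the default "Python-urllib/x.y".
```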
Did you notice changes in interwiki bots' behaviour? There are some problems with them.
Basilicofresco ( msg) 13:37, 17 February 2010 (UTC)
Hi! Are approved bots allowed to add cosmetic changes (-cc parameter, see cosmetic_changes.py in pywikipedia) to their tasks without any specific authorization request? example Thanks. -- Basilicofresco ( msg) 15:56, 19 February 2010 (UTC)
I'm not entirely sure how to report a bot, but User:FaleBot seems to be doing odd things. It removes perfectly good interwiki links (for example, every interwiki link to 2010 in music). It seems to take offence at interwiki articles with non-ASCII names (such as Akasaka and the Japanese disambig page/ Korean disambig page) and just removes them seemingly randomly.-- Prosperosity ( talk) 18:41, 20 February 2010 (UTC)
Hello. I have opened a BRFA for a new anti-vandalism bot ( User:AVBOT). It reverts vandalism, blanking and test edits. This bot has been tested in Spanish Wikipedia for about 2 years, and it has reverted about 200,000 vandalisms.
Some more features:
Also, the code is available for reviews. Thanks. Regards. emijrp ( talk) 21:46, 22 February 2010 (UTC)
User:Muro Bot renames DEFAULTSORT to ORDENAR in its "cosmetic changes". Examples: here, here, here. Please someone block it until this is fixed. Thanks Hekerui ( talk) 17:04, 27 February 2010 (UTC)
Hey everyone. Does anyone know if this toolserver IP address was ever matched up with which one of the AIV helper bots has been logging out? It seems to be losing its log-in frequently lately. -- Nick— Contact/ Contribs 07:44, 4 March 2010 (UTC)
At Wikipedia:Bots/Requests for approval/Full-date unlinking bot 2 harej has requested another operator adopt his bot (and presumably the BRFA that goes with it). Are there any takers? Josh Parris 12:08, 5 March 2010 (UTC)
Here is another interwiki bot running in -auto and -force mode... [20] I believe it's time to ask developers to block the contemporary use of -force and -auto. What do you think? -- Basilicofresco ( msg) 06:47, 6 March 2010 (UTC)
Here is the answer of the operator. -- Basilicofresco ( msg) 10:48, 6 March 2010 (UTC)
Operator has been warned, but is still running the bot in -auto -force: [21]. There is consensus against these edits. Please take appropriate actions. -- Basilicofresco ( msg) 23:13, 9 March 2010 (UTC)
User talk:Evgeniychub. It seems most of us regular unblock reviewers are pretty clueless about the details of bot operations... Beeblebrox ( talk) 17:39, 11 March 2010 (UTC)
Chillum seems to be serious about retiring, and as a result we've had no bot generated username reports for the last five days. Anyone want to adopt this bot? Beeblebrox ( talk) 18:50, 12 March 2010 (UTC)
I thought I should call attention to the informal discussion at User_talk:Basilicofresco#FrescoBot_adding_WildBot_templates and User_talk:Josh_Parris#WildBot_template_clutter. I've weighed in somewhat assertively and didn't want to seem as if trying to short-cut a discussion that probably belongs on the table for the entire BAG community to work through. It appears FrescoBot has essentially turned Wildbot's DAB and section-link function from an opt-in to a mandatory function wiki-wide. Main problem seems to be that the mass prompting set up by FrescoBot is setting up full-size templates on countless busy talk pages where the top of the page is devoted to setting context for the talk page rather than minor maintenance functions that these bots perform. The tasks are of course potentially useful and impressive. But I wonder if this effect was fully contemplated in the last approvals request of FrescoBot? ... Kenosis ( talk) 02:52, 18 March 2010 (UTC)
Given that the honorable Mr. Wolterding seems to be dead (no edits since January 3rd), what is the Plan B for WolterBot ??? -- AlainR345 Techno-Wiki-Geek 04:44, 9 March 2010 (UTC)
I would like to note that user:Monegasque has recently been adding a host of ultra-detailed categories (such as category:People from Fiume Veneto, which I recently blanked) whose importance is really minor, and which usually contain just one entry. Can somebody intervene and, possibly, revert all his improper additions? Thanks and good work. -- '''Attilios''' ( talk) 15:36, 26 March 2010 (UTC)
The Southern Railway (Great Britain) article has had an interwiki link added to the pt:Southern San Paulo Railway article. The link has been added by Xqbot, TXikiBot, Volkovbot and Volkovbot again. No doubt a similar series of edits will have been performed to the corresponding Portuguese article. I've not checked this as that is a matter for pt:wiki. Further addition of this link will be seen as disruptive and the bots prevented from editing. Please can we get this sorted. Mjroots ( talk) 03:21, 27 March 2010 (UTC)
There are eight pages (one each for eight languages: ca, de, en, fr, ja, no, pt, simple), and I have examined each. Seven of the eight deal with what we in the UK know as Southern Railway (Great Britain); the other ( pt:Southern San Paulo Railway) deals with a railway in Brazil, and the latter was incorrectly being added as an interwiki to the other seven, and vice versa. User:EdJogg removed all seven interwikis from the pt page, but I was concerned that a 'bot would add them all back in, so I went to each of the other seven and manually removed the pt interwiki. This has cleared the immediate problem. I have since examined the histories of the eight pages in question, and built up the following chronological sequence. All times in UTC.
26 March 2010
27 March 2010
It seems to me that the original error was made by User:ZéroBot and it spread from there. To fix it properly required an edit to all eight pages -- Redrose64 ( talk) 14:59, 27 March 2010 (UTC)
Hello. I recently wrote the article Backfire (Cocktail) and User:Xtzou put a thingy in there to have it deleted. So I looked at his contribs and wow, this is something. Obviously this is not what I would call a normal user. Is this a sockpuppet account of an experienced WP user? And is this somebody partly using a bot?
His contrib list: [23]
Please have a look into it. Dropdead567 ( talk) 14:06, 2 April 2010 (UTC)
If your bot(s) use any specific categories, feel free to tag those categories with {{ Bot use warning}}. For examples of use, see Category:Candidates for speedy deletion and Category:Wikipedia files for deletion. עוד מישהו Od Mishehu 13:26, 6 April 2010 (UTC)
Please note, about 10-15 minutes ago there was a security change to the login system (which is now live on all WMF wikis) that will break the current login systems (API users included). For more information see bugzilla:23076. Peachey88 ( Talk Page · Contribs) 00:39, 7 April 2010 (UTC)
Would any fix have to be implemented by individual bot owners or is there a general fix? I'm not very good at coding/programming, so I'm not sure if that was answered above or not. TN X Man 03:19, 8 April 2010 (UTC)
Hello, my Pywikipedia bot is correctly logged in via the API, but it no longer edits with template.py or replace.py. Does anyone have an idea of the problem? (I use completely standard code.) Regards -- Hercule ( talk) 09:26, 8 April 2010 (UTC)
Overview of frameworks and their status since the update, If anyone else knows any other frameworks, please add them to the list: Peachey88 ( Talk Page · Contribs) 10:15, 8 April 2010 (UTC)
I have accepted Kingpin13's nomination for membership in the Bot Approvals Group, and per the instructions invite interested parties to participate in the discussion and voting. Thank you, – xeno talk 19:24, 17 April 2010 (UTC)
Who is running/owning/responsible for SQLBot-Hello ( talk · contribs)?
Admin SQL ( talk · contribs) hasn't edited Wikipedia in just over a year, but the bot is still welcoming people en-masse.
(Context: I saw the welcome template (and broken signature) at this new user's talkpage, User talk:Lisa-maria syrett, and I assumed the editor named in the signature had added it, so informed him that his signature was breaking, and he pointed out that a bot had added the template, not him. Hence after a little digging, I'm here.) Thanks. -- Quiddity ( talk) 01:20, 25 April 2010 (UTC)
I know at least two welcome templates that use {{REVISIONUSER}} to fill in the welcoming user's talk page link during substitution (one has done so for a year, the other was recently changed to do so by me). Obviously, if bots go around and subst those templates, the bot's talk page will be linked instead.
Do we still have active bots substituting templates during their rounds? If so, I would suggest having them pass a parameter such as botsubst=1 (noting that the parameter bot=... is already in use in some templates) so that the template can react to that.
Not at all critical of course, and as far as I'm concerned we can just as well decide to simply not use any fancy magic words on bot-substituted templates. Current usage is not at all critical, as far as I'm aware, and could easily be removed.
Amalthea 17:18, 3 May 2010 (UTC)
I've temporarily blocked CommonsDelinker for the duration of the disruption at Commons to prevent further damage to the local project. WP:AN notice Q T C 01:26, 8 May 2010 (UTC)
At WP:ANI#User:X!'s adminbots there was a discussion on whether to desysop his two adminbots since he has resigned as a bureaucrat and admin. As of now several have agreed that this is unnecessary, but please reply there, especially if you disagree with that. PleaseStand (talk) 20:10, 9 May 2010 (UTC) 21:17, 9 May 2010 (UTC)
Programmer looking to work for food and shelter! - Hey folks. I've been inactive for a long while. I used to be a BAG member and a rather prolific bot operator, but my departure to college left me with no time and sporadic internet access. This summer I'm moving into my new apartment, which will hopefully give me more space to set up my linux laptop again. If there are any of my former bots that haven't been replaced, including the IRC ones for election times, I have the code and a system to run them on. I posted a longer bit on my talk page, if you'd direct any comments to me there, that'd be great. Let me know if there's anything I can do. ST47 ( talk) 05:50, 13 May 2010 (UTC)
I am adding this note here at the request of OlEnglish ( talk · contribs).
In a recent incident, after I had used a 'final warning', SineBot added a level-1 warning for failing to sign. This resulted in the admin not seeing my final warning and declining a request to block (following additional disruption) at AN/I.
Chzz ► 07:18, 18 May 2010 (UTC)
Hello,
I am working on a new release of the Perlwikibot framework with ST47. This release will contain some breaking changes, so if you use Perlwikibot, please be sure to check the documentation prior to upgrading. It will use the API wherever possible, and will handle some basic autoconfiguration to make authoring scripts much easier.
If you'd like to suggest improvements to the framework, please file a bug at http://perlwikipedia.googlecode.com – and patches are welcome, of course. You can check out our page on OpenHatch for ways to get involved.
— mikelifeguard@ enwiki:~$ 18:30, 22 May 2010 (UTC)
Bot-savvy users and BAG members may be interested by this thread on ANI - Kingpin 13 ( talk) 16:31, 26 May 2010 (UTC)
The bots BenzolBot ( talk · contribs), Dinamik-bot ( talk · contribs) and RibotBOT ( talk · contribs) keep adding links to Alforja, Spain to the article Pannier. The non-English pages an:Alforja, ca:Alforja, es:Alforja, eu:Alforja, fr:Alforja, it:Alforja, nl:Alforja, ru:Альфоржа, uk:Алфоржа, vi:Alforja and war:Alforja are all articles about the town in Spain. Alforja used to redirect to Saddlebag but this morning I changed it to a dab page. One of the bot owners previously explained to me that they could only fix their bot and that non-English bots would continue to replicate the error. I don't really understand it but it would be nice to fix this problem somehow. -- Dbratland ( talk) 19:02, 29 May 2010 (UTC)
{{nobots}}
" on it - Kingpin13 ( talk) 20:00, 29 May 2010 (UTC)
Okay, that's an overstatement, but why does this regex eat ALL text on an article, rather than stopping at the end of the template's curly braces? (examples: [24], [25]) I've changed it after the linked revision to the following:
$content =~ s#\{\{current(.*?)\}\}##i;
But I can't entirely explain if this works, either. I've also put in a failsafe ("how many chars am I removing?") to keep it from doing so again. tedder ( talk) 06:02, 1 June 2010 (UTC)
\{\{current([^{}]*?)\}\}
instead, which will be fine unless these current event templates can take a nested template as an argument. Rjwilmsi 07:26, 1 June 2010 (UTC)
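The difference between the patterns can be seen in a quick test (Python here, but the regex behaviour matches Perl's; the sample text is made up):

```python
import re

text = "{{current|date=June 2010}} Article text... {{Infobox foo}} end"

# Greedy (.*): runs to the LAST "}}" in the text, eating everything between --
# this is how a bot can come to blank most of an article.
greedy = re.sub(r'\{\{current(.*)\}\}', '', text, flags=re.I)

# Non-greedy (.*?): stops at the first "}}" after the match starts.
lazy = re.sub(r'\{\{current(.*?)\}\}', '', text, flags=re.I)

# Negated class [^{}]*?: can never cross a brace, so it cannot swallow
# other templates even in pathological cases (though it also cannot match
# a nested template used as an argument).
safe = re.sub(r'\{\{current([^{}]*?)\}\}', '', text, flags=re.I)
```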
TF2 Wiki is a community wiki covering topics surrounding the game Team Fortress 2. We are beginning a process of moving to a new domain and server. As none of the current administrators have access to the FTP site we are manually moving some 5,800 files across. There is no problem in downloading all these files for the transfer (as we have systems in place to automate it), but the thought of manually uploading them all to the new server gives me goosebumps. Would anyone here be willing to offer any bot-related assistance or advice? Is this the best place to ask such a question? — surlyanduncouth ( talk) 19:14, 4 June 2010 (UTC)
Hi, this is just a notice that I have opened a brfa for an adminbot to delete images that are available as identical copies on the Wikimedia Commons per WP:CSD#F8 -- Chris 10:13, 8 June 2010 (UTC)
I have not created a bot yet, but I would like to know if redundancy would be a desirable feature (prompted by the recent ClueBot downtime). The implementation I am thinking of would involve two copies of the bot running on separate computers and Internet connections (e.g. one on a home computer, the other on Toolserver). The two instances would periodically check each other's most recent edit timestamp, and if there is no activity for some time, the other bot would take over. Additionally, the active bot would periodically hand off its work to the inactive bot to prevent its failure from going unnoticed. Thoughts? PleaseStand (talk) 17:51, 13 June 2010 (UTC)
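One way to structure the takeover logic described above (the names and the 30-minute threshold are illustrative assumptions): each instance periodically fetches its partner's latest edit timestamp, e.g. via the API's list=usercontribs with uclimit=1, and a pure decision function keeps the failover rule easy to test:

```python
from datetime import datetime, timedelta

# Hypothetical threshold: how long the partner may be silent before takeover.
TAKEOVER_THRESHOLD = timedelta(minutes=30)

def should_take_over(partner_last_edit, now, threshold=TAKEOVER_THRESHOLD):
    """Return True if this standby instance should become the active bot.

    partner_last_edit: datetime of the partner's most recent edit, as
    reported by the MediaWiki API (list=usercontribs, uclimit=1).
    """
    return now - partner_last_edit > threshold

now = datetime(2010, 6, 13, 18, 0)
# Partner last edited 45 minutes ago: take over.
takeover = should_take_over(now - timedelta(minutes=45), now)
# Partner last edited 10 minutes ago: stay on standby.
standby = should_take_over(now - timedelta(minutes=10), now)
```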
It seems like it would be good to create a very extensible, configurable PHP bot framework, capable of being run on any MediaWiki installation, so as to avoid needless duplication of coding. Rather than put every possible function under the sun into the core (including those added by obscure API extensions), it should have lots of hooks and allow for new bot functions to be added as plugins; hopefully a whole library of such plugins can be created, analogously to what has been done with m:pywikipediabot. So then the question arises, which bot framework should be the starting point? I created this table in an effort to determine which bot frameworks are the most powerful. wikibot.classes.php is pretty popular, and I put a version here that has been modified to be more readily configurable, or at least to allow configuration from a single file. Probably code from its many forks (including admin code from botclasses.php) could be merged into it to create one supremely all-powerful superbot like something out of Transformers. But Pillar looks pretty powerful too. Other than by having them square off, I'm not sure what criterion would be best for determining the bot that should rule them all. Their codebases are totally different so it does not appear there is much prospect of merging them. Tisane ( talk) 21:19, 2 June 2010 (UTC)
I think this is a good idea to remove the massive overlap of PHP bot frameworks we have (even my framework itself is forked into two different versions atm). This page might be useful in helping you to determine the different functions that each framework supports etc. I'll be happy to help in the coding, although the extent of my help will depend on the amount of free time I have -- Chris 03:07, 5 June 2010 (UTC)
Keeping the suggestions here in mind, I'm halfway through writing a new framework that incorporates ideas from wikitools, pillar, botclasses.php, SxWiki, and ClueBot, plus adding functionality such as plugins and Non-Wikipedia-Centricism. You can watch my progress here. It's still a MASSIVE work in progress, and fixmes are everywhere. ( X! · talk) · @635 · 14:14, 10 June 2010 (UTC)
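The plugin/hook idea can be sketched in a few lines (Python for brevity, though the frameworks under discussion are PHP; the event name "pre_save" is made up): the core fires named events, and plugins register callbacks without the core knowing anything about them.

```python
class BotCore:
    """Minimal hook dispatcher: plugins register callbacks for named events."""

    def __init__(self):
        self.hooks = {}  # event name -> list of callbacks

    def register(self, event, callback):
        self.hooks.setdefault(event, []).append(callback)

    def fire(self, event, *args):
        # Run every callback registered for this event, collecting results.
        return [cb(*args) for cb in self.hooks.get(event, [])]

# A hypothetical typo-fixing plugin hooks the (made-up) "pre_save" event:
bot = BotCore()
bot.register("pre_save", lambda title, text: text.replace("teh", "the"))
results = bot.fire("pre_save", "Example page", "teh quick brown fox")
```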
(email redacted)) to the project? Also we should probably create a page onwiki for it, and maybe create a planning/discussion page as well? -- Chris 03:27, 12 June 2010 (UTC)
- Jarry1250 Humorous? Discuss. 13:32, 13 June 2010 (UTC)
So now that we've gotten all the base function calls laid out, what additional suggestions for plugins/hooks would you guys have? The more, the better. ( X! · talk) · @160 · 02:50, 15 June 2010 (UTC)
We'd need a bot which updates the number of articles under pending changes and a bot which reports certain kinds of edits if possible; please see Wikipedia_talk:Pending_changes/Trial#Bots. The trial will begin in a few hours. Cenarium ( talk) 17:45, 15 June 2010 (UTC)
If you run a bot on nightshade.toolserver.org, you may be interested in this WMF bug: bugzilla:23982. — Carl ( CBM · talk) 19:27, 15 June 2010 (UTC)
Does anyone happen to know a good webhost on which to run a bot, other than the toolserver, that will let you run it 24/7 without killing it? I use Bluehost, but they have a known issue with processes getting mysteriously spontaneously killed all the time, even when you use a dedicated IP. Thanks, Tisane talk/ stalk 03:44, 15 June 2010 (UTC)
Sorry about this, but I really don't have the time, or regular enough internet access, to maintain the wubbot ( talk · contribs) anymore. Would anyone like to take it over? Its only task is to check the subpages of Wikipedia:WikiProject Deletion sorting twice a day and remove any AfDs that have been closed. It's written in Python (using pywikipedia) and currently runs from the toolserver. The code is available at User:The wubbot/source - apologies for the mess! There are a few bugs and possible improvements listed on the bot's userpage that someone more skilled than me may be able to fix. Reply here if you're interested - I would be extremely grateful, and I apologise for not being available to take care of this over the past few months. the wub "?!" 15:51, 17 June 2010 (UTC)
I've written a JavaScript framework for bot scripting that I would like to invite feedback on. The motivation behind the framework is more or less solely that I think JavaScript is a wonderful and accessible little language for doing ad hoc pieces of code. The name I've given the framework, Luasóg, a diminutive form of the Gaelic word luas (meaning speed), is supposed to reflect this motivation.
A part of the overarching project is a web-based IDE for writing and executing scripts using the framework. The project also includes a version of the IDE hosted in an Adobe AIR application so that cross-domain scripting can be executed outside of the sandbox of the main browsers. If you are willing to give the framework a spin, the AIR application is likely to be your best bet. You can download it here.
This is a development release and, though not very buggy, it is not stable, particularly the IDE element. It should execute scripts as expected however and so please tell me if it doesn't (or indeed any other problems you come across).
The release notes for this release contain a simple example script. The full API is documented here. The 'request' method gives access to any function of the MediaWiki API. Other methods act as convenient wrappers to common functions. So far the number of methods is essentially limited to "log in, get content, replace content, and log out again". Developing this was my first experience in bot scripting so I would particularly like to get feedback on the kind of common bot functions that should be included in later releases. If anyone wants to write methods for the framework then they are more than welcome to do so.
As a note of caution, scripts executed using the framework are throttled by default to one-call-every-10-seconds. This can be changed via the 'speed' property.
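The default throttle described above is easy to sketch in any language. Here is a minimal Python rate limiter illustrating the same one-call-every-10-seconds idea; the class and method names are illustrative, not part of Luasóg:

```python
import time

class Throttle:
    """One-call-every-N-seconds limiter, like Luasóg's default of 10s.

    The clock and sleep functions are injectable so the logic can be
    exercised without real waiting."""

    def __init__(self, interval=10.0, clock=time.monotonic, sleep=time.sleep):
        self.interval = interval  # corresponds to Luasóg's 'speed' property
        self._clock = clock
        self._sleep = sleep
        self._last = None

    def wait(self):
        """Block until at least `interval` seconds since the previous call."""
        now = self._clock()
        if self._last is not None:
            remaining = self.interval - (now - self._last)
            if remaining > 0:
                self._sleep(remaining)
                now = self._clock()
        self._last = now
```

A framework would call `wait()` before each API request; lowering `interval` corresponds to raising the 'speed' property.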
Thanks in advance, --RA ( talk) 09:08, 26 April 2010 (UTC)
prop=info|revisions&intoken=edit
or the like). Getting it wrong screws up the "deleted since you started editing" check, which could possibly cause you to recreate a page that was just deleted or to fail to edit a page that was just undeleted. (Edit conflict handling is one of my pet peeves.) I am glad to see AssertEdit usage in there. If you really want to be safe regarding deletions and page creations, ensure that the data for the edit function always includes the prop=info output, and pass "createonly=1" if prop=info includes "missing" or "nocreate=1" if it doesn't.
Anomie ⚔ 02:04, 1 June 2010 (UTC)
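Anomie's createonly/nocreate advice boils down to a small decision on the prop=info data fetched together with the page text. A hedged Python sketch (the dict shape mirrors the API's prop=info output; the function name is made up):

```python
def edit_safety_params(page_info):
    """Choose the safety parameter for action=edit based on the prop=info
    data fetched along with the page text (via intoken=edit).

    If the page was missing when we read it, pass createonly=1 so the edit
    fails if someone creates (or undeletes) it in the meantime; otherwise
    pass nocreate=1 so we never silently recreate a page that was deleted
    after we started editing."""
    if 'missing' in page_info:
        return {'createonly': 1}
    return {'nocreate': 1}
```

The returned parameter would simply be merged into the action=edit request alongside the usual token and timestamp fields.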
I wrote a prototype bot using your framework. It is the unapproved bot User:PSBot – at this time it is only editing one page in its user space. The source code is at User talk:PSBot/Deprods. While coding, I noticed a lack of some important features:
I hope you keep up the good work, and if you can provide feedback on the design of my bot, please do so. This is my first one. PleaseStand (talk) 06:37, 20 June 2010 (UTC)
I asked for VP comments on a proposal to let bots fill in url |accessdate= automatically from the date the url was first added. I wasn't sure whether to post this under Bot Policy talk, so posting here. — Hellknowz ▎ talk 16:58, 31 May 2010 (UTC)
I was curious, and pulled a list of accounts currently flagged with both "bot" and other groups.
Note that giving a bot "autoreviewer" is currently useless, as the bot group already has the autopatrol right. It looks like that's the only group totally redundant for a bot, as even "confirmed" has rights that a (non-autoconfirmed) bot lacks: patrol, move, movestable, reupload, collectionsaveascommunitypage, collectionsaveasuserpage, and upload. Anomie ⚔ 20:35, 22 June 2010 (UTC)
I just noticed that AntiAbuseBot's approval stated that it was only approved "until such time as the AbuseFilter extension or a substantially identical technical feature is turned on by the sysadmins at the English Wikipedia. At such a time, a report should be made to the bot owners' noticeboard requesting that Chris G either turn off the bot or seek re-approval in a Wikipedia:BRFA". So technically, that should be done. But since the AbuseFilter's ability to block users has not been activated, and a discussion at ANI in August 2009 reached consensus for the bot to continue running, I intend to re-approve Wikipedia:Bots/Requests for approval/AntiAbuseBot as simply "approved" without condition, but with the suggestion that it be revisited if the AbuseFilter's block functionality is ever activated, instead of forcing a new BRFA with a most likely foregone conclusion. Any objections? Anomie ⚔ 21:01, 22 June 2010 (UTC)
In my opinion RaptureBot is not checking sufficiently before replacing Wikipedia images with Commons images. I have reverted several that popped up in my watchlist which have incorrect attribution on Commons. [26] [27] [28] [29]. Incorrect attribution is cause for deletion on Commons which could leave the articles without images. The bot should check that the attribution on Commons matches the attribution here before replacing. At the very least it should be tagging the files with the information that they do not match to warn administrators not to delete the en.wiki version without checking, but probably the right thing is not to replace the image until the problem is resolved.
There are also several complaints on the bots talk page that the bot has replaced an image with one at Commons threatened with deletion. This really should not be happening. This replacement of a fair use image immediately caused the Wikipedia image to be deleted. This was despite a challenge to the copyright tag on Commons. Potentially, this could have resulted in both images being deleted.
Since the bot owner is showing no inclination to pro-actively clean up these errors him/herself, leaving it to others to sort out, I think that the bot should be forbidden from doing any further runs until tighter checking is implemented and approved. SpinningSpark 18:04, 25 June 2010 (UTC)
So after a few weeks of work, the new PHP framework that was called for above is reaching a public beta point. It's got most of the common functions that bots use, and those are fairly stable. The only things left to do are to fill in the remaining obscure functions and plugins, do bugfixing, etc before the stable 1.0 is released. In the beta stage, we need multiple people to test out the framework using their own bots, as this is the only way many of the bugs will be found. It can be downloaded here, and the manual is here. When bugs are found, they can be listed here, and feel free to submit a patch. I hope it works well for all of you. :) ( X! · talk) · @082 · 00:57, 28 June 2010 (UTC)
The User:ArticleAlertbot has been down for a few months now, due to some sort of login problem. The coder, User:B. Wolterding, has not logged in since March, and thus is not available to fix this. I believe in April, a couple of users discussed the issue with the bot's operator, User:Legoktm, and User:Tedder agreed to try to fix this. However, he appears not to have a toolserver account (or something like that), so he has been unable to do this. Is anyone here willing to give it a try? I'm sure many WikiProjects would appreciate it. (I posted a similar request in the village pump in late March, but nobody replied). Bramble claw x 00:47, 12 June 2010 (UTC)
Does anyone happen to know of a bot that removes red links from articles? -- intraining Jack In 23:14, 2 July 2010 (UTC)
ArbCom is considering lifting the restriction imposed in Wikipedia:Requests for arbitration/Date delinking#Lightmouse automation, subject to BAG approval of Wikipedia:Bots/Requests for approval/Lightbot 4. As part of BAG's mandate is to gauge community consensus for proposed bot tasks and Lightbot's former activities were highly controversial, I invite all interested editors to join that discussion to ensure that community consensus is in fact in favor of this task. Thanks. Anomie ⚔ 17:32, 13 July 2010 (UTC)
During the code review for the removal of this preference setting, it was noted that it may affect bots that were using this setting. In this case, the bots would need to be modified to explicitly mark their edits as minor. I'm sure this would be a fairly simple code addition, but it could also be accomplished through javascript as explained in this VPT thread. – xeno talk 12:50, 15 July 2010 (UTC) [with thanks to User:Department of Redundancy Department for pointing this out to me]
Looking at bug 17450, it would appear there is some possibility that we could get an XMPP-based XML-format recent changes feed that would include revision text. I wouldn't suppose this bug would be too hard to fix, considering that MediaWiki already has IRC recent changes feed capability. Anyway, the revision text would be quite useful, since it would eliminate the need for bots to hit Wikipedia's API for said text. Therefore, I'll probably end up creating an extension, or finishing this one. Does anyone have a particular XMPP PHP library they think is particularly good, and would recommend using for this project? I found this one, but their SVN repository doesn't seem to be working at the moment. Hopefully, with the right library, we can make that Jabber shit happen soon! Thanks, Tisane talk/ stalk 15:47, 16 July 2010 (UTC)
Do you think there would be support for a bot to replace barelinks with bot-generated titles, much as DumZiBoT used to? Tisane talk/ stalk 19:10, 25 July 2010 (UTC)
An hour or so ago, we were apparently updated to r70061. This seems to have broken the use of "max" as a value for the various limit parameters in the API; attempting to use them will now give a fatal error. This has already been reported as T26564, hopefully they fix it soon. If necessary, a workaround is to use explicit limit values in your API queries (typically 5000 for flagged bots, 500 for some expensive queries). Anomie ⚔ 03:49, 28 July 2010 (UTC)
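The workaround amounts to rewriting parameters before each query. A minimal Python helper, assuming the usual convention that API limit parameters end in "limit" (the function name and defaults are illustrative):

```python
def explicit_limits(params, flagged_bot=True, expensive=False):
    """Replace limit='max' with an explicit number, as a workaround for
    the r70061 regression: typically 5000 for flagged bots on ordinary
    queries and 500 for expensive queries (or unflagged clients). API
    limit parameters conventionally end in 'limit' (aplimit, cmlimit,
    rvlimit, ...)."""
    cap = 5000 if (flagged_bot and not expensive) else 500
    return {k: (cap if k.endswith('limit') and v == 'max' else v)
            for k, v in params.items()}
```

Once the upstream fix lands, the helper can simply be removed and "max" used again.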
Comments are invited at the above-linked thread. – xeno talk 14:46, 4 August 2010 (UTC)
Hi guys, in the past couple of weeks I implemented a bot to easily mass-move files from the English Wikipedia to Commons. The bot focuses on self-published works. The bot is now in beta and I'm looking for people to test it. See Commons:User:Multichill/Imagecopy for more information. multichill ( talk) 16:34, 8 August 2010 (UTC)
Hello bot operators! I have been nominated for the bot approval group and would appreciate input at Wikipedia:Bot Approvals Group/nominations/EdoDodo. Thanks. - EdoDodo talk 02:46, 17 August 2010 (UTC)
User:Coren has been missing since July 20th, and User:CorenSearchBot is down. (I e-mailed Coren to see if he was all right on August 10th or 11th and have received no answer, which worries me. Coren is always responsive! :/)
I consider this pretty urgent for copyright cleanup, as CorenSearchBot typically finds dozens of valid copyright problems in a given day. I don't know how many of those will be found by new article reviewers. Some of them may be getting tagged for WP:CSD#G12, but I'm afraid that a good many are likely to be overlooked. User:Xeno tells me that the source code for CorenSearchBot is published at [30]. Is it possible to get a temporary replacement bot, or one that can run in parallel with CorenSearchBot?
There is some urgency to identifying and eradicating copyright problems quickly. As we all know, Wikipedia is very widely mirrored and reused around the internet, and this content doesn't have to be published long before substantial damage can be done to the interests of copyright holders...and the reputation of Wikipedia. -- Moonriddengirl (talk) 14:43, 19 August 2010 (UTC)
# End of customizable exclusions
#
return "already-tagged" if $text =~ m/{{csb-/;
Ah, clearly I missed that part of the code. I'll go ahead and fire mine back up again and we'll see how a race between CSBot and VWBot goes... VernoWhitney ( talk) 16:23, 21 August 2010 (UTC)
I am proud to announce the first major release of the Peachy MediaWiki Bot Framework, version 1.0!
After almost 3 months of hard work, I believe we are at a point where the framework is stable enough to be officially released to the public. In those three months, multiple bots, including SoxBot, MessageDeliveryBot, RaptureBot, and many others have been operating nonstop on top of the Peachy framework. I can only hope that other PHP bot operators follow along.
New features since the public beta include...
Upgrading from 0.1beta should be for the most part seamless. Very few breaking changes have been implemented. The minimum PHP version has been bumped to 5.2.1, as many of the internal features use functions introduced in that version. Other than that, scripts that were written for 0.1beta should work in 1.0.
If you have not yet written anything in Peachy, now would be a good time to learn! The Peachy manual has been redesigned for this release, and with an intuitive guide for getting started, you should be writing your first bot in minutes!
Naturally, there may be a few bugs that will arise in the first release. The issue tracker is located at http://code.google.com/p/mw-peachy/issues/list. Don't hesitate to report something that doesn't seem right! We can't fix something we don't know about.
To download version 1.0, see the Peachy Wiki. Instructions for downloading the nightly compressed archives and SVN repos are located there.
Thank you for supporting Peachy, and enjoy!
( X! · talk) · @155 · 02:42, 30 August 2010 (UTC)
I'm not sure if this is the right place to put this, but per [31] the archive indexer bot has been running while logged out. I remember there was some API change a couple months ago that caused breakage like this, but I thought all affected bots would have been fixed by now. 67.122.209.135 ( talk) 21:54, 1 September 2010 (UTC)
Hello. Per the ArbCom conditions on my talk page (#4 in particular), I seek some advice.
I currently use no bots/scripts which automate any API-WRITE functionality. However, for purposes of my anti-vandal tool STiki, I do have scripts making a large quantity of API READ requests. Am I correct to assume that such READ scripts require no approval from this group?
On a different note, is there any way to get in contact with API folks and provide them my IP addresses, etc. -- so they know my requests are purpose-driven? Thanks, West.andrew.g ( talk) 18:12, 11 August 2010 (UTC)
We're now discussing 'bot-assisted solutions to this cleanup problem. Uncle G ( talk) 13:05, 6 September 2010 (UTC)
We're now at the stage where the 'bot is ready to roll, and no-one has voiced an objection. (Indeed, to the contrary: Several people want to go further, and mass delete the articles.)
If the 'bot goes ahead, this will probably light up some people's watchlists like Diwali. Be warned. Uncle G ( talk) 04:33, 10 September 2010 (UTC)
Comments are invited at Wikipedia:Administrators' noticeboard/Incidents#VolkovBot overly eager to remove interwiki links, especially from parties familiar with the m:interwiki.py function of the pywikipedia framework. – xeno talk 15:15, 9 September 2010 (UTC)
Ever since DASHBot became an anti-vandal bot, I've noticed that some users get double warnings after DASHBot reverts a single edit.
Here are five recent examples:
What I want is the issue of double warnings to be fixed. mechamind 9 0 22:23, 18 August 2010 (UTC)
Untagged; that's because my bot has been down for 13 hours. Cobi, I had that error too. What I ended up doing was querying the API right after reverting an edit, to see if the top revision was listed as DASHBot's. Tim 1357 talk 17:32, 19 August 2010 (UTC)
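Tim's check can be sketched as follows. This Python fragment uses an injected API object with illustrative method names; it is not actual pywikipedia or DASHBot code:

```python
def revert_and_warn(api, page, bad_revision, bot_username):
    """Revert, then re-check the page's top revision before warning.

    If another anti-vandal bot beat us to the revert, its revert (and
    warning) supersede ours, so we skip our warning and avoid the
    double-warning problem. The `api` object is a stand-in with
    illustrative method names."""
    api.rollback(page, bad_revision)
    top = api.top_revision(page)
    if top['user'] == bot_username:
        api.warn_user(bad_revision['user'])
        return True
    return False  # someone else reverted first; stay silent
```

The key point is that the warning decision is made from the post-revert state of the page, not from whether our own rollback call succeeded.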
I suppose it's
mechamind 9 0 02:47, 13 September 2010 (UTC)
Someone needs to take a look at Nubian Jak; the interwikis are a complete disaster and it's above my skill to fix. ΔT The only constant 13:05, 17 September 2010 (UTC)
My bot fixes a wide range of wikilink syntax problems and some redundancies. Recently user Magioladitis asked me to also add the syntax consolidation "[[architect|architects]]" --> "[[architect]]s" (already fixed by AWB, e.g.). What do you think? -- Basilicofresco ( msg) 12:24, 14 September 2010 (UTC)
Your comments are welcome: Wikipedia:Bots/Requests for approval/FrescoBot 7 -- Basilicofresco ( msg) 10:15, 20 September 2010 (UTC)
Could someone look at Wikipedia:Bots/Requests_for_approval/CleanupListingBot and either pass or fail the bot; it's been a few days. Smallman12q ( talk) 22:17, 23 September 2010 (UTC)
I'm having some trouble at the diglyceride article, in that the article has apparently become associated with non-identical articles on other language wikipedias, and now a bot ( User:TXiKiBoT, though I'm guessing other bots would do the same) is reinserting the links when I try to remove them. I guess this kind of bot looks at associated articles across many different language versions, so maybe the issue has to be addressed everywhere simultaneously. I don't know all the languages involved, particularly because some of them are in non-Latin alphabets, so that kind of fix is beyond my abilities. Also, the bot owner doesn't seem to be responding, at least not as quickly as his own bot. So I'm hoping someone here can help fix this. Sakkura ( talk) 15:17, 1 October 2010 (UTC)
{{bots|deny=TXiKiBoT}} while fixing the interwiki map. – xeno talk 15:51, 1 October 2010 (UTC)
I've done this for monecious or whatever it is a number of times. There are also a bunch of articles where there is no simple bijective mapping, but interwikis are still useful; see below for an example. Not high on my list of priorities, but something I have given a fair amount of thought over the years. Interwiki maintenance should become the square root of the current effort in terms of edits, i.e. O(n) rather than O(n²), once the Summer of Code "Reasonably efficient interwiki transclusion" project goes live. Rich Farmbrough, 11:50, 6 October 2010 (UTC).
...how do all of the best-known interwiki bots run all day long through all of the Wikimedia wikis? I'm not any good with scripting (well, I can do basic things) and just learned to use the Unix shell (after using Windows all my life). Is there something like a shell script or a special feature of the pywikipedia scripts to keep the bot running continuously? P.S. My bot ( User:Diego Grez Bot) runs from the Toolserver, if that is worth something. -- Diego Grez ( talk) 17:06, 2 October 2010 (UTC)
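One common answer is a small supervisor loop that restarts the bot whenever it exits; on the Toolserver you would typically launch such a loop once from cron or inside a screen session. A hedged Python sketch, with all names and intervals purely illustrative:

```python
import subprocess
import time

def restart_delay(uptime_seconds, min_uptime=60.0, backoff=30.0):
    """If the bot ran for a healthy stretch, restart right away; if it
    crashed almost immediately, back off so a broken script doesn't spin
    in a tight restart loop."""
    return 0.0 if uptime_seconds >= min_uptime else backoff

def run_forever(cmd):
    """Supervisor loop: keep restarting the bot process whenever it exits.
    `cmd` might be something like ['python', 'interwiki.py', '-auto']."""
    while True:
        started = time.time()
        subprocess.call(cmd)
        time.sleep(restart_delay(time.time() - started))
```

The same effect can be had with a two-line shell `while` loop, but the backoff logic above is what keeps a crashing bot from hammering the server.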
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Query articles in "Category:Bad category" do {{edit categories|swap|Bad category|Good category}}; Also, in the future I can see opening up the bot's source code so other developers can add features to it. So think of it as a wiki-bot. :) So what do you all think: do you see any issues with it, and will editors actually use it and contribute to it? d'oh! talk 09:33, 14 September 2010 (UTC)
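The "swap" operation in the example above boils down to a wikitext rewrite per article. A minimal Python sketch of what the bot would do; the regex and function name are illustrative, and real category handling has more corner cases (templates, localized namespace names, etc.):

```python
import re

def swap_category(wikitext, old, new):
    """Rewrite [[Category:Old]] links (optionally with a sort key) to the
    new category name, i.e. the 'swap' edit from the proposal above."""
    pattern = re.compile(
        r'\[\[\s*Category\s*:\s*' + re.escape(old) + r'\s*(\|[^\]]*)?\]\]',
        re.IGNORECASE)
    return pattern.sub(
        lambda m: '[[Category:%s%s]]' % (new, m.group(1) or ''), wikitext)
```

The query side would fetch the members of "Category:Bad category" via the API and apply this rewrite to each page's text.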
See below for the updated proposal. -- d'oh! [talk] 16:22, 29 October 2010 (UTC)
Sounds interesting. -- Chris 13:38, 15 September 2010 (UTC)
This sounds a bit like an old project of mine. The language I adopted is a dialect of Lisp called Scheme, and I did at one point have a prototype running. Since then I've done little work on the bot side apart from building a bot framework in Scheme based on the API.
I also tried to make the job control aspects of the system independent of the language implementation, and that aspect of the project may be of interest. In essence, once a user is authorized to use the bot, they can invoke it by typing a command on a page in their user space (the talk page would be okay for low-volume users). Because it's script-based, it enabled editors to develop a script on the wiki, and then other authorized editors who may not have any programming skills could invoke the script. The bot would typically run in a jail on a host server that monitored pages, performed security checks to filter out unauthorized edits, for instance, and then placed results into the invoking user's userspace. I felt that at least initially it would be too risky to permit the bot to perform changes outside userspace, and I envisaged it being used mainly to perform statistical analysis and construct lists of articles for work by a human user or another bot.
You could also implement scripts in Perl, PHP, Java, even C, but the main issue there is ensuring that the code cannot mess with the host system, remembering that anybody, once authorized, can write a script for the bot. The extreme reprogrammability of Scheme makes it ideal for this (though some people have an allergy to all the brackets). Creating a safe environment for running scripts is quite easy in R5RS Scheme.
If anybody is interested, please leave a message on my user talk page. -- TS 13:55, 29 October 2010 (UTC)
object item {
    method doSomethingCool ( #something ) {
        ... do something to #something ...
        return self.doSomethingCoolToo(#something);
    }
    method doSomethingCoolToo ( #something ) {
        ... do something to #something ...
        return #something;
    }
}

#pages = pages.category('Jargon');
foreach (#pages as #page) {
    #website = website.url('http://example.com/' + #page.title);
    #result = item.doSomethingCool(#website.content);
    #page.content = #result;
}
The updated idea is a platform to run code (scripts) from Wikipedia created by editors, the code will be picked up from pages on Wikipedia (via a template like {{ User:D'oh!/Coding}}). The code syntax will be object-oriented and will look very similar to PHP, C and Ruby, e.g.:
#pages = articles.category('Computer jargon').limit(10);
foreach (#pages as #page) {
    delete #page.categories.find('Computer jargon');
}
The platform will ask for and use the editor's bot account to make edits to Wikipedia; this makes the editor responsible for the edits their code performs. The editing privileges are also transferred over to the platform, so editors can only make edits that they could make with their own account. The platform will either run the code continuously or as a once-off. -- d'oh! [talk] 16:22, 29 October 2010 (UTC)