From Wikipedia, the free encyclopedia


Changing the format of an infobox

Hello, I would like to change the format of {{ Infobox Taiwan Station}}. The current template code on each page looks like this:

{{Infobox_Taiwan_Station|
Title=|
SubTitle=|
EngTitle=|
EngSubTitle=|
ImageName=|
ImageSize=|
ImageCaption=|
Style=|
Place=|
Coordinates=|
Code=|
Operator=|
Line=|
StartDate=|
RouteType=|
TraLevel=|
TraElecCode=|
TraStartLocal=|
TraMile=|
}}

It is very disorganized. Based on other station infoboxes, I propose changing it to the following (note: the name of the infobox is changed too):

{{Infobox Taiwan station
| title =
| en-title =
| image = example.jpg
| image_size = 250px
| image_caption =
| type = 
| address = 
| coordinates = {{coord||}}
| code =
| operator =
| line =
| opened =
| tra_level = 
| tra_code =
| tra_start = 
| tra_mileage = 
}}

Notice that the subtitles and route type are deleted because they are mostly unnecessary, and several parameter names are changed accordingly. Thanks, waiting for approval... impact F = check this 05:38, 25 January 2009 (UTC)
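If a bot carries out this conversion, the parameter renaming could be sketched along these lines (a Python sketch; the mapping is taken from the proposal above, and handling of non-empty parameter values is left out):

```python
import re

# Mapping from old Infobox Taiwan Station parameters to the proposed
# new names, taken from the proposal above.
PARAM_MAP = {
    "Title": "title",
    "EngTitle": "en-title",
    "ImageName": "image",
    "ImageSize": "image_size",
    "ImageCaption": "image_caption",
    "Style": "type",
    "Place": "address",
    "Coordinates": "coordinates",
    "Code": "code",
    "Operator": "operator",
    "Line": "line",
    "StartDate": "opened",
    "RouteType": "",       # deleted per the proposal
    "TraLevel": "tra_level",
    "TraElecCode": "tra_code",
    "TraStartLocal": "tra_start",
    "TraMile": "tra_mileage",
}

def rename_params(wikitext: str) -> str:
    """Rename old-style empty parameters; drop the deleted ones."""
    dropped = {"SubTitle", "EngSubTitle", "RouteType"}

    def repl(m):
        name = m.group(1)
        if name in dropped:
            return ""  # parameter removed entirely
        return "| {} =".format(PARAM_MAP.get(name, name))

    # Matches lines like "ParamName=|" used in the old template calls.
    return re.sub(r"^\s*(\w+)\s*=\s*\|?\s*$", repl, wikitext, flags=re.M)
```

The surrounding `{{Infobox_Taiwan_Station|` / `}}` lines would be rewritten separately to the new template name.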

Bot needed to fix Year in Baseball template please

Hello. See Template talk:Year in baseball#suggestion. We need a bot that can replace the sidebar {{ Year in baseball}} with: 1) {{ Year in baseball top}}; and 2) at the bottom of the page the new footer {{ Year in baseball}}. This would be done on all of the "[YEAR] in baseball" articles, about 140 or so pages. Right before this is done, the code from the talk page needs to be copied onto the template so the current sidebar template becomes a navbox. Any takers? If so, please follow up at the talk page Template talk:Year in baseball#suggestion. Rgrds. -- Tombstone ( talk) 15:16, 21 January 2009 (UTC)

Doing... see talk page in previous comment. Richard 0612 20:25, 22 January 2009 (UTC)
Y Done. Richard 0612 15:10, 25 January 2009 (UTC)

In the template {{ Infobox Television film}}, I would like to make it so that only the filename is required when adding images. This is easily accomplished by swapping:

this:

CURRENT
{{#if: {{{image|<noinclude>-</noinclude>}}} |
{{!}} style="text-align: center;" colspan="2" 
{{!}} {{{image}}} {{#if: {{{caption|<noinclude>-</noinclude>}}} 
| <br/><span style="font-size: 95%; line-height:1.5em;">{{{caption}}}</span> }}

with this:

NEW
{{#if:{{{image|}}}| 
{{!}} style="font-size: 95%; line-height:1.5em; text-align: center;" colspan="2" 
{{!}} [[File:{{{image}}}|{{#if:{{{image_size|}}}|<!--then:-->{{px|{{{image_size}}} }}
|<!--else:-->220px}}|]] {{#if:{{{caption|}}}|<br />{{{caption}}}}}


However, each article (>500) using this template will need to be cleaned of the wiki markup in that field.

 
from this: 

| image     = [[file:example.jpg|220px]]

to this:

| image     = example.jpg
| imagesize = 220px

.....submitted for your approval. -- emerson7 23:00, 23 January 2009 (UTC)
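The field conversion described above is a single regular-expression rewrite per article; a hedged sketch in Python (it handles both File: and Image: prefixes, since articles may use either):

```python
import re

# Converts "| image = [[File:example.jpg|220px]]" into the bare-filename
# form plus a separate imagesize parameter, per the examples above.
PATTERN = re.compile(
    r"\|\s*image\s*=\s*\[\[(?:[Ff]ile|[Ii]mage):([^|\]]+)\|?(\d+px)?\]\]"
)

def split_image_field(wikitext: str) -> str:
    def repl(m):
        filename, size = m.group(1).strip(), m.group(2)
        out = "| image     = {}".format(filename)
        if size:
            out += "\n| imagesize = {}".format(size)
        return out
    return PATTERN.sub(repl, wikitext)
```

Articles with no size specified simply get the bare filename, leaving the template's 220px default to apply.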

Yes, I'm going to have a go at it, but it's quite possible someone's already got a bot approved for this sort of thing and I will be happy to let them take over. Is it also possible that articles are using image and not file? Coding... - Jarry1250 ( t, c) 09:48, 24 January 2009 (UTC)
 Done - any problems? - Jarry1250 ( t, c) 18:54, 25 January 2009 (UTC)

Pride events

Is there a bot that can place move requests on every article in a template? All 60+ articles in Template:Pride Events need to be moved (through one discussion page, of course), so is there a bot that can place the same move template on them all, or will I have to do it manually? TJ Spyke 07:24, 25 January 2009 (UTC)

This would be an easy task to do with AWB. §hepTalk 20:02, 25 January 2009 (UTC)

Spelling change: Kveta Peschke to Květa Peschke

Can a bot change all the links from Kveta Peschke to Květa Peschke, please, as this is her correct spelling? There might be hundreds more similar requests for Czech names, so maybe I will have to make a bot myself. -- Voletyvole ( talk) 11:28, 25 January 2009 (UTC)

I think (though I've been over-ruled many a time) that it might fall foul of WP:R#NOTBROKEN, but I'm sure a more experienced bot writer would give you a better answer. - Jarry1250 ( t, c) 11:41, 25 January 2009 (UTC)
Declined Not a good task for a bot.: Jarry is right, it's usually not a good idea to make hundreds of edits to bypass a redirect, as it has little benefit. As long as it's an accepted alternate spelling updating the links doesn't really seem necessary here. Richard 0612 12:00, 25 January 2009 (UTC)
In addition, Kveta Peschke is by far the most common spelling in English sources so the article should probably be moved to that per Wikipedia:Naming conventions (use English). PrimeHunter ( talk) 12:29, 25 January 2009 (UTC)
OK. Thanks for the info. I think the message here is "it doesn't matter so much about spelling. You can do more worthy things with your time on WP". I will do more worthy things ;) -- Voletyvole ( talk) 16:01, 25 January 2009 (UTC)
I have added a rule to RegExTypoFix to make the change from "Kveta Peschke" to "Květa Peschke" here. We already have several similar rules to correct foreign names, so I don't see any reason not to add this one to the list. -- ThaddeusB ( talk) 16:39, 25 January 2009 (UTC)
Will you be fed up if I add a humongous bunch of Czech, Slovak and Polish (and other Slavic) names to this AWB Typos list? I have made an offline list featuring a shedload of them. I beg your pardon if this isn't quite the right place to ask (n00b errors, ya know dudes, what can y'all do?) Cheers for the links though. -- Voletyvole ( talk) 21:43, 25 January 2009 (UTC)
I personally wouldn't mind, but there are those who feel the list shouldn't be clogged with "worthless entries." As long as the error occurs at least 5-10 times on Wikipedia, it's probably safe to add. -- ThaddeusB ( talk) 02:28, 26 January 2009 (UTC)
This would be better than the first suggestion as (a) it could fix several misspellings per edit and (b) it would also fix ones which are not part of a link. — CharlotteWebb 02:52, 26 January 2009 (UTC)
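For reference, the effect of such a rule is just a literal replacement; a minimal Python equivalent of the RegExTypoFix entry (the rule list shown is illustrative):

```python
# Minimal equivalent of a RegExTypoFix rule: a literal replacement that,
# unlike bypassing individual redirects, also fixes unlinked occurrences
# and can apply several rules in a single edit.
RULES = {
    "Kveta Peschke": "Květa Peschke",
    # further Czech/Slovak/Polish name rules could be added here
}

def apply_typo_rules(text: str) -> str:
    for find, replace in RULES.items():
        text = text.replace(find, replace)
    return text
```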

Need a bot to update a table

I need a bot to update this table on a monthly basis. It's not too hard to do by hand, but then it's even easier for a bot. :) The required information for the quality/importance columns can just be copied directly from here. The AfD columns require that the bot parse the archive of this page for the previous month and count the number of bullets in the lists (the "*" character), as well as the number of instances of the words "Delete", "Redirected", "Keep" and "Merged". Thanks! SharkD ( talk) 05:19, 24 January 2009 (UTC)
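The archive-parsing half of this request is plain text counting; a sketch, assuming one bullet per AfD nomination in the archive wikitext:

```python
import re

def afd_counts(archive_text: str) -> dict:
    """Count bullets and outcome keywords in an AfD archive page,
    per the request above (one "*" bullet per nomination assumed)."""
    counts = {"nominations": 0}
    for line in archive_text.splitlines():
        if line.lstrip().startswith("*"):
            counts["nominations"] += 1
    for word in ("Delete", "Redirected", "Keep", "Merged"):
        counts[word] = len(re.findall(word, archive_text))
    return counts
```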

Unless someone has an already existing bot that can handle this, I will be happy to incorporate this request into WikiStatsBOT. -- ThaddeusB ( talk) 06:25, 24 January 2009 (UTC)
Thanks very much! The structure of the table has not been finalized: a fourth set of statistics may appear shortly. However, none of the existing columns will be removed; so if you want to start figuring out how to create the bot, go ahead. SharkD ( talk) 11:28, 25 January 2009 (UTC)
OK, I added the fourth set of statistics. The stats can be gotten from here. You just need to enter the correct page link and date and then copy the total. SharkD ( talk) 11:58, 25 January 2009 (UTC)
I removed them again since they're duplicates of information that appears in another page already. SharkD ( talk) 20:00, 26 January 2009 (UTC)

Template:Swiss Presidents

Please update {{Swiss Presidents}} to {{Presidents of the Swiss Confederation}}: according to Special:WhatLinksHere/Template:Swiss_Presidents, about 80 articles on former presidents still use the old template name, which causes problems, like those articles still showing up at the disambiguation Special:WhatLinksHere/Eduard_Müller for some reason, instead of at the proper article Special:WhatLinksHere/Eduard_Müller_(Swiss_politician). The admin that made the move a year ago had not bothered to fix this. --  Matthead   Discuß   22:25, 28 January 2009 (UTC)

Need another bot to update a different table

I need a bot to update the tables in this page. The information can be gotten using the article traffic statistics tool. There's also a backlog of about 8 months that would need to be filled on the bot's first run. If the bot could also update the graphs it would be doubleplusgood. Ideally I would like the graphs to be SVG images instead of PNG, but I'm not really sure how this would be accomplished. Thanks! SharkD ( talk) 20:02, 26 January 2009 (UTC)

Coding... Lego Kontribs TalkM 01:09, 29 January 2009 (UTC)

Put stub template on stub articles

Can anyone replace {{ Taiwan-stub}} and {{ Film-stub}} with {{ Taiwan-film-stub}} on Taiwanese film stubs? The list is here. Thanks! :)

Can anyone do the request I made above? impact F = check this 00:32, 27 January 2009 (UTC)

You might want to also try requesting this over at Wikipedia:WikiProject Stub sorting, since I think that's one of the reasons why the project exists. SharkD ( talk) 01:49, 27 January 2009 (UTC)
I did. We had consensus to create the stub and stub category. Here. Thanks for your concern. impact F = check this 01:59, 27 January 2009 (UTC)
This is probably a task better suited for AWB. You can request it be done by someone with AWB here. -- ThaddeusB ( talk) 05:01, 27 January 2009 (UTC)
Doing... with AWB. §hepTalk 22:03, 27 January 2009 (UTC)
Y Done §hepTalk 03:01, 29 January 2009 (UTC)

WikiProject Triathlon banner tagging

Could we get the WikiProject Triathlon banner ( Template:WP Triathlon) on all articles in Category:Triathlon and Category:Duathlon and all subcategories of both (I have checked for exceptions, but there are none). A number of articles have already been tagged. Thanks. Yboy83 ( talk) 09:42, 28 January 2009 (UTC)

Possible : TinucherianBot ( talk · contribs · count) can do this for you -- Tinu Cherian - 11:52, 28 January 2009 (UTC)
Doing... : TinucherianBot ( talk · contribs) working on this -- Tinu Cherian - 19:09, 28 January 2009 (UTC)
Y Done : The bot task is completed. Tagged around 455 article/cat talk pages -- Tinu Cherian - 02:24, 29 January 2009 (UTC)
Many thanks. Regards, Yboy83 ( talk) 08:39, 29 January 2009 (UTC)

Uncategorized articles

The backlog at Wikipedia:WikiProject Categories/uncategorized is running low. The category is usually populated by Alaibot directly from database dumps, but it seems that Alai is AWOL. I don't know if anyone has code available to perform a similar task, nor do I know if Alai ever published his code. In any case, I'd appreciate it if someone can at least tag articles of Special:UncategorizedPages which are indeed uncategorized. Note that in fact most of the articles appearing on the special page are categorized because of cache issues, which is why Alaibot was so useful. Pichpich ( talk) 21:37, 30 January 2009 (UTC)

PS: anybody know what's up with Alai? Pichpich ( talk) 21:37, 30 January 2009 (UTC)
I currently run through the special page with User:UnCatBot, though I suppose I could use the API to generate the data manually, which would alleviate the cache problem. — Nn123645 ( talk) 02:03, 31 January 2009 (UTC)
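Verifying that a page listed on Special:UncategorizedPages is genuinely uncategorized only requires scanning its wikitext; a sketch (regex detection is approximate, since it misses categories added through templates):

```python
import re

# True if the wikitext itself contains no category link. This misses
# categories transcluded from templates, so it is only a first filter
# before tagging a page as uncategorized.
CATEGORY_LINK = re.compile(r"\[\[\s*Category\s*:", re.IGNORECASE)

def looks_uncategorized(wikitext: str) -> bool:
    return CATEGORY_LINK.search(wikitext) is None
```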

Prodbot

Not sure if there's any bot that already fulfills this function, but I monitor articles proposed for deletion from time to time and find it quite tiresome to add {{ oldprodfull}} to talkpages. What about having a bot search old revisions of articles for prod notices and updating talkpages with the appropriate details accordingly? Skomorokh 02:39, 31 January 2009 (UTC)

Betacommandbot

PLEASE reinstate betacommandbot! he was sooooooooooo cool! please reinstate him soon! why did you get rid of him??????? —Preceding unsigned comment added by 216.160.167.169 ( talk) 21:19, 31 January 2009 (UTC)

for these reasons, in short. - Jarry1250 ( t, c) 21:27, 31 January 2009 (UTC)

Delsort

Would someone have the time and inclination to create a one-step WP:DELSORT process? User:Hrafn suggested that a bot periodically pick up on delsort tags placed directly in an AfD and automagically transclude the AfD in question on to the target page if not already present. It would cut the work of deletion sorting roughly in half. Jclemens ( talk) 18:48, 29 January 2009 (UTC)

So the request is that the bot check active AFDs for links to any subpage of Wikipedia:WikiProject Deletion sorting, and add a transclusion to the page in question (just under the "<!-- New AFD's should be placed on top of the list, directly below this line -->") if it's not already present? Anomie 00:59, 30 January 2009 (UTC)
Exactly. It would allow delsort tagging without needing to depart from the AfD/t page. Jclemens ( talk) 01:32, 30 January 2009 (UTC)
The detection algorithm would likely be very similar to the one that WP:RFC uses to detect {{RFC[topic]}} templates in article talk pages (but simplified by the fact that it would only have to check the AfD subspace), but instead look for {{subst:delsort|[topic]}} (may need to be altered slightly and/or turned into a non-subst template). Hrafn Talk Stalk( P) 03:30, 30 January 2009 (UTC)
Coding... No need to unsubst the template, it just needs to look for a link to any WP:DELSORT subpages in any active AFD and compare that to the list of templates transcluded in those subpages. Anomie 03:42, 30 January 2009 (UTC)
I'm having a slight issue: the finding-of-the-AFDs part is working fine, but it seems every sorting page has a slightly different "insert new AFDs here" line (and some lack the line completely). Anomie 12:06, 30 January 2009 (UTC)
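For what it's worth, the detection half (finding delsort links in an AfD) reduces to a single pattern; a sketch, assuming the links point at subpages of Wikipedia:WikiProject Deletion sorting as described above:

```python
import re

# Finds links to deletion-sorting subpages in an AfD's wikitext, per the
# approach above ("look for a link to any WP:DELSORT subpages").
DELSORT_LINK = re.compile(
    r"\[\[Wikipedia:WikiProject Deletion sorting/([^|\]]+)"
)

def delsort_targets(afd_wikitext: str) -> set:
    """Return the set of sorting-list names an AfD has been tagged for."""
    return set(m.strip() for m in DELSORT_LINK.findall(afd_wikitext))
```

The bot would then compare each name against the AfDs already transcluded on that sorting page and add any that are missing.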
Sounds like a good idea, though I would strongly recommend Jayvdb's awesome script to anyone unaware of it - it already makes sorting a one step process. the wub "?!" 13:28, 1 February 2009 (UTC)

IMDb links

Would it be possible to have a bot check through all articles using the |imdb_id= parameter in {{ Infobox Film}} to see which of these do not otherwise contain a link to IMDb, i.e. through the use of {{ imdb title}} (or any of its redirects), and present this data in the form of a numbered list? Such information would be useful in an ongoing debate over the use of such parameters in the infobox. Thanks in advance for any help! :) PC78 ( talk) 15:34, 2 February 2009 (UTC)

User pages in article space categories

Can a bot remove all user pages from article space categories and monitor any new additions? I am often removing user pages from categories, since:

  • users create draft articles in user space for eventual moving to article space; and
  • editors sometimes userfy articles and forget to remove or comment out the categories.

It seems a feasible task for a bot. Hmm. Had a think about it. How do you define a non-article category? Everything under Category:Wikipedia administration? All cats (with some exceptions) that have Wikipedia or template in the category name? -- Alan Liefting ( talk) - 02:56, 4 February 2009 (UTC)

Annoying as it may be when using a category, a user busy fettling a copy of an article in userspace will not thank you for coming along and stomping all over the categories in the copy article. If you are going to do anything, at best, comment out the cats, don't delete them. -- Tagishsimon (talk) 03:33, 4 February 2009 (UTC)
Yes. That is what I have been doing. I have not received any flak yet, even though I have deleted some on occasion. If it was done by a bot, I would no longer be in the cyber firing line. Some users may not be aware of their subpages even though there is now the "subpages" link in the contributions page. -- Alan Liefting ( talk) - 05:07, 4 February 2009 (UTC)

OrphanBot

Could someone please write a bot to patrol this page to tag orphaned articles as such? This would help WikiProject Orphanage in our work at de-orphaning articles. Thanks, ErikTheBikeMan ( talk) 21:47, 3 February 2009 (UTC)

Tell me that you are not going to put a visible tag onto an orphan article? The result of this will be thousands of articles so tagged for the next n years. Tagging on this scale is akin to vandalism. -- Tagishsimon (talk) 03:34, 4 February 2009 (UTC)
The hyperbole is uncalled for and completely incorrect. Tagging orphan articles as orphans is standard practice and has been done for years. It is a standard maintenance tag that assists the encyclopedia. -- JLaTondre ( talk) 03:51, 4 February 2009 (UTC)
That's your view. Mine is that such tags disfigure articles. There is little prospect that many articles will be un-orphaned. The corollary is that such tags will persist, to no good effect, for years to come. Being an orphan is not the most important thing about an article. Indeed it barely rates. Mass tagging of such articles would be a crime. -- Tagishsimon (talk) 03:54, 4 February 2009 (UTC)
And clearly everybody disagrees with you. BJ Talk 03:55, 4 February 2009 (UTC)
Possibly, though I always like a little evidence to back up a wild assertion like that. Take it to Wikipedia:Village pump (policy)#Policy on Article Tags and put the bot request on hold until it is settled. -- Tagishsimon (talk) 04:08, 4 February 2009 (UTC)
The very fact of the template's existence and continued use is very clear evidence. BJ Talk 04:19, 4 February 2009 (UTC)
As I said, take it to Wikipedia:Village pump (policy)#Policy on Article Tags. -- Tagishsimon (talk) 04:21, 4 February 2009 (UTC)
And oddly for your thesis, BJ, Template talk:Orphan sees a 5:5 split between those who think spamming articles with the orphan tag is a good idea, and those who do not. Clearly your "clearly everybody disagrees with you" angle was just invention. -- Tagishsimon (talk) 04:43, 4 February 2009 (UTC)

Excellent idea. Saves us the trouble of clicking "What links here" on a bunch of stubs. Over time these will be de-orphaned, get more attention, and thus improve. Tag away!

Replacing accessmonthday parameters in the cite web template

Could someone (or something) replace all the outdated accessmonthday fields in Ayumi Hamasaki with whatever the correct field(s) is/are? I r t3h n00b when it comes to this kind of thing, so sorry if this is in the wrong section or whatever. Thanks! Ink Runner ( talk) 18:30, 3 February 2009 (UTC)

It would be quite difficult for a bot to determine whether or not the url still contains the content we are intending to cite. In some cases it may be replaced by new content, with the old stuff being moved elsewhere (needing to be located again and perma-linked if possible). In other cases it may be deleted altogether but the host server does not provide a proper 404 notice which such a bot might depend on. Overall this would create too many false positives and defeat the purpose of tracking the most recent date on which the correct content was known to be accessible. — CharlotteWebb 19:07, 3 February 2009 (UTC)
Ah, okay. Thanks muchly! Ink Runner ( talk) 19:12, 3 February 2009 (UTC)
The only way this could work well is if the bot had a specific string of text to check for, such as the exact quotation being cited in the ref (of course everybody knows footnoted quotes are bad juju, so proceed with caution). — CharlotteWebb 20:16, 3 February 2009 (UTC)
I may have misunderstood Ink Runner ( talk · contribs)'s request (this was a topic of discussion over at Talk:Ayumi Hamasaki first); but I don't think we're looking to revalidate the reference. Rather, just change from using accessmonthday (and accessyear), instead using accessdate. For example, if a reference now has:
accessmonthday=December 2|accessyear=2008
We'd like it instead to have:
accessdate=2008-12-2
If I've got that correct, that seems like a task that would be amenable to botting. Ink Runner, did I read you correctly? TJRC ( talk) 21:41, 3 February 2009 (UTC)
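Assuming that reading is correct, the rewrite is mechanical; a Python sketch (the "Month D" date format is taken from the example above, and the output is zero-padded ISO, which differs cosmetically from the hand-written "2008-12-2"):

```python
import re
from datetime import datetime

# Rewrites "accessmonthday=December 2|accessyear=2008" into
# "accessdate=2008-12-02", assuming the "Month D" format shown above.
PAIR = re.compile(
    r"accessmonthday\s*=\s*([A-Za-z]+ \d{1,2})\s*\|\s*accessyear\s*=\s*(\d{4})"
)

def merge_accessdate(citation: str) -> str:
    def repl(m):
        date = datetime.strptime("{} {}".format(m.group(1), m.group(2)),
                                 "%B %d %Y")
        return "accessdate={}".format(date.strftime("%Y-%m-%d"))
    return PAIR.sub(repl, citation)
```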
Wikipedia:Bots/Requests for approval/MelonBot 12 might be of some interest. - Jarry1250 ( t, c) 21:44, 3 February 2009 (UTC)
Okay, that would actually be a good idea and easy to do too. Oops, I read "outdated" to mean "too many days ago" etc. — CharlotteWebb 02:38, 4 February 2009 (UTC)
Looks like MelonBot's on it. It's hit three articles on my watchlist since this discussion (including Days/Green, Ink Runner). I imagine it's just a matter of time until it gets to Ayumi Hamasaki. TJRC ( talk) 07:33, 4 February 2009 (UTC)

Archiving with delay

Since we often discuss highly contentious issues in our project ( WP:SLR), we agreed to wait some time after marking topics as "resolved", before archiving them, so people get a chance to say: "No, this isn't resolved yet!" I asked Cobi, whose well documented bots I would have loved to use, but they can't do that. Earlier, we had a purely time triggered bot, but the problem with that was that it also archived sections that were just at a momentary standstill. — Sebastian 09:35, 31 January 2009 (UTC)

Something that I've seen used to delay archiving is to sign the section with a timestamp in the future. I'm not sure if that works for all archive bots, though. -- Carnildo ( talk) 09:54, 31 January 2009 (UTC)
Great idea! We could start from a timestamp like {{#time: Y-m-d h}}. [1] The two things that are missing are: (1) how to add a day or two to that timestamp, and (2) which bot can look for a different day each time? — Sebastian 10:16, 31 January 2009 (UTC)
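On question (1): ParserFunctions' #time accepts a relative offset as its second argument, e.g. {{#time: Y-m-d H|+2 days}}. The same arithmetic on the bot side, as a sketch:

```python
from datetime import datetime, timedelta

def archive_after(marked: datetime, delay_days: int = 2) -> datetime:
    """Earliest time a 'resolved' section may be archived: the timestamp
    of the resolved tag plus a grace period, so people get a chance to
    object before the section disappears."""
    return marked + timedelta(days=delay_days)
```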
With ClueBot III you could set the archive time to something extremely long (e.g. 100 days) and, when the discussion was done, tag it with {{ resolved}}. Or, WP:GL/I uses DyceBot, operated by Dycedarg. The bot tags a section as stale after 2 weeks of no activity, then archives it a week later. When a section is marked with {{ resolved|1=~~~~~}}, it archives the section after the tag has been placed for three days. He might be able to do something similar for you. §hepTalk 03:45, 2 February 2009 (UTC)
Cool, it sounds like DyceBot does just what we need! I'll ask Dycedarg. Thanks a lot for the lead! — Sebastian 19:41, 5 February 2009 (UTC)

Category tagging requests

Could someone tag all the following categories with their proper banners?

-- Jeremy ( Blah blah...) 20:29, 4 February 2009 (UTC)

You're quite sure that you want Category talk:Red Bull Air Race World Series pilots tagged with {{ WikiProject Soft Drinks}}? How about Category talk:Snooker and Category talk:Cocktails with WPBeer? The other two look alright, but the paragraph at the top of the page was added for a reason: tagging by category is hit-and-miss at the best of times, and just tagging the actual categories doesn't decrease the risk that much. Happymelon 21:04, 4 February 2009 (UTC)
  1. Soft drink advertising is a sub cat of soft drinks, so yeah.
  2. I took care of the Beer/Drinking establishments entanglements, they are sibling cats not parent/child cats.
  3. As far as I can tell, Category:Cocktails is not a child of category:Beer, there is an overlapping Category:Cocktails with beer but not a parent/child relationship.
I have looked at these categories down about three branches and can see some overlap but no major entanglements other than the beer/bar one that I fixed. So, I believe that this is good to go. -- Jeremy ( Blah blah...) 21:38, 4 February 2009 (UTC)

Remove flag

Can someone have a bot find {{flagicon|Ireland|rugby}} and replace it with [[Ireland national rugby union team|Ireland]], as per Wikipedia_talk:WikiProject_Rugby_union#Wider_opinion_needed? Gnevin ( talk)

Also {{ru|IRE}} Gnevin ( talk) 23:35, 2 February 2009 (UTC)

Identifying virtually identical redirects with different targets

I found these two redirect pages redirecting to two DIFFERENT articles:

That should not happen and I fixed it. I've seen this situation maybe a couple of dozen times. In another case, I found these three redirecting to three DIFFERENT articles:

A bot cannot decide what pages things like this ought to redirect to, if any, but I would think a bot could be constructed to

  • find things like this;
  • make a list of them so that Wikipedians can go down the list and find those within their competence and fix them;
  • possibly call them to the attention of the appropriate WikiProjects based on the target articles' category tags. Michael Hardy ( talk) 18:39, 4 February 2009 (UTC)
  • Could you specify a little more precisely how "things like this" are to be defined? -- R'n'B ( call me Russ) 13:57, 6 February 2009 (UTC)
  • Differing only in choice of dash, internal spacing with the dash capitalization, and a final "s"? Just a wild guess.... — Arthur Rubin (talk) 02:30, 7 February 2009 (UTC)
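Taking Arthur Rubin's guess as the working definition, "things like this" could be found by normalising titles and grouping; a sketch (the normalisation rules are an assumption, covering dashes, spacing, case, and a trailing "s"):

```python
import re
from collections import defaultdict

def normalise(title: str) -> str:
    """Collapse dash variants, spacing around dashes, case, and a
    trailing 's' into one canonical form."""
    t = title.replace("–", "-").replace("—", "-")
    t = re.sub(r"\s*-\s*", "-", t).lower().strip()
    return t[:-1] if t.endswith("s") else t

def conflicting_redirects(redirects: dict) -> list:
    """Given {redirect_title: target_article}, group redirects by
    normalised form and report groups pointing at more than one target,
    for Wikipedians to review."""
    groups = defaultdict(dict)
    for title, target in redirects.items():
        groups[normalise(title)][title] = target
    return [g for g in groups.values() if len(set(g.values())) > 1]
```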

Internal project page tagging

Could someone tag all the following projects with their proper banners and proper categories?

  • All pages under Wikipedia:WikiProject Soft drinks with {{WikiProject Soft Drinks |class=project |importance=na}}, categorize as [[category:WikiProject Soft Drinks|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Spirits with {{WikiProject Spirits |class=project |importance=na}}, categorize as [[category:WikiProject Spirits|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Wine with {{WikiProject Wine |class=project |importance=na}}, categorize as [[category:WikiProject Wine|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Beer with {{WikiProject Beer |class=project |importance=na}}, categorize as [[category:WikiProject Beer|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Food and drink with {{WikiProject Food and drink |class=project |importance=na}}, categorize as [[category:WikiProject Food and drink|{{PAGENAME}}]]?

Thanks, Jeremy ( Blah blah...) 20:04, 5 February 2009 (UTC)

Do you mean subpages of the project page? Lego Kontribs TalkM 04:45, 6 February 2009 (UTC)

yes. -- Jeremy ( Blah blah...) 06:31, 6 February 2009 (UTC)

Coding... Lego Kontribs TalkM 03:57, 7 February 2009 (UTC)

Can someone add an automatic bot that patrols the pages in the Heroes template? Raiku Lucifer Samiyaza 04:00, 7 February 2009 (UTC)

Redundant As X! mentioned earlier, ClueBot, VoABot II, SoxBot III are already running. There is no need for a bot to "patrol" such a small set of pages. Anomie 04:09, 7 February 2009 (UTC)

Can someone add an automatic bot that patrols featured articles? Raiku Lucifer Samiyaza 04:04, 7 February 2009 (UTC)

Redundant As X! mentioned earlier, ClueBot, VoABot II, SoxBot III are already running. Anomie 04:09, 7 February 2009 (UTC)

Stub templates not updating

Greetings all! This is my first bot request, so be kind. Over at WP:SFD we need something that will update stub categories on hundreds of articles, as the templates have been either renamed, deleted, or redirected. Alaibot used to do this, but sadly Alai has vanished off the radar since Dec 13, and work is backing up. There are currently, for example, articles linked to the non-existent Category:European organization stubs which should fall into the new Category:European organisation stubs, since the template's category was renamed. I hope I'm explaining myself all right. Can we recruit a bot to go through the stub cats on a regular basis and fix this? I assume it's due to server lag, but no one wants to null-edit a gazillion stub articles (I'm sure that's what caused Grutness' arthritis...;). Cheers, Pegship ( talk) 05:37, 8 February 2009 (UTC)

Forgive the huge gap in my understanding of Wikipedia, but is there not some sort of redirection from one category name to another? If there were, you might come unstuck on WP:R#NOTBROKEN, but for my own laziness I have no idea whether there is or isn't. - Jarry1250 ( t, c) 09:49, 8 February 2009 (UTC)
I'm not sure whether that's the issue (aren't we a pair?)...What happened is the template {{ Euro-org-stub}} used to place articles in Category:European organization stubs. The category was renamed Category:European organisation stubs and changed accordingly in the template code. I assume that if there were no server lag, articles tagged with {{ Euro-org-stub}} should start moving from Category:European organization stubs to Category:European organisation stubs. That wouldn't require a redirect, I think. Pegship ( talk) 21:07, 8 February 2009 (UTC)
This will get fixed automatically by the job queue. All the pages will be updated so that they now reflect the proper category. Lego Kontribs TalkM 03:21, 9 February 2009 (UTC)
That has been my experience in the past, but since about mid-December I have noticed a longer and longer wait for the work to get caught up. Right now the job queue is at 1.5 million - is this high, normal, abnormal? Pegship ( talk) 03:45, 9 February 2009 (UTC)
Manual:Job queue says a few million isn't abnormal during peak hours; I generally see it at around 500,000 – 1+ million. §hepTalk 02:32, 10 February 2009 (UTC)
If it turns out that you do need a bot, my AWB bot can assist you. Let me know and I'll get started on the approval process. Robert Skyhawk So sue me! ( You'll lose) 04:58, 9 February 2009 (UTC)

Not exactly a bot request, but you guys are the computer geniuses. A new way to communicate.

Please take a look at Wikipedia:Village_pump_(proposals)#IM_and_VOIP. Thanks. - Peregrine Fisher ( talk) ( contribs) 06:42, 11 February 2009 (UTC)

Speedy-delete bot

Hi, I was just wondering if it would be possible to make a bot that would check, when someone removes a speedy delete tag, whether it was the page creator, and if it was, revert the edit? It would also be nice if the bot could move {{ hangon}} templates to the proper location. This would make monitoring new pages much easier, since at the moment every time I CSD an article I have to watch it until it gets deleted, because there is a 25-30% chance that the creator will remove the tag. -Zeus- u c 01:34, 10 February 2009 (UTC)

I thought there was a bot that already handled this task. — Nn123645 ( talk) 20:41, 10 February 2009 (UTC)
I don't think so; it's happened a lot to me and I always have to revert it. I don't know if there's something for AfD, but that would be nice too. -Zeus- u c 20:52, 10 February 2009 (UTC)
Ah, here it is. I guess it didn't get approved. — Nn123645 ( talk) 20:55, 10 February 2009 (UTC)
Doing... While I hate to take on another project because I feel like I'm overextending myself, I think I can do this fairly easily by modifying code I have already written. — Nn123645 ( talk) 20:59, 10 February 2009 (UTC)
Wow, thanks. It looks like the creator withdrew it because he couldn't get it to work. After reading some of the concerns in the original thread, maybe it would be better if the bot just warned whoever placed the CSD tag when it gets removed by the creator. I don't know what the protocol is for bots. -Zeus- u c 22:09, 10 February 2009 (UTC)

If the original article falls into a class like "nocontent" or "nocontext", and the author fixes the article and then removes the speedy tag, reverting these edits would essentially be vandalism by the bot and might cause us to lose a good article (if the CSD-patrolling admin doesn't properly check the history). Wikipedia is not a bureaucracy, let's not turn it into one by making a strong but ignorable rule on CSD tags into bot-enforced policy. Kusma ( talk) 12:59, 12 February 2009 (UTC)

I see your point. I'm thinking I will set up the bot so it will only revert new editors (say, those with less than 200 mainspace edits), and be 1RR compliant. In the case of an experienced editor, I think the best option would be to place a notice on the talk page of the person who placed the tag, not on the page creator's page. As for that scenario: the CSD tag is only a request to delete the page; it is up to the admin to verify that the page really does meet CSD, and if the admin is wrong there is always WP:DRV for the page creator to get the page recreated. The speedy tag is pretty clear that only another user is allowed to remove it; if the page creator wants to argue his/her case, he/she should do so with {{ hangon}}. — Nn123645 ( talk) 13:19, 12 February 2009 (UTC)
In any case, the bot should not revert the edit that removed the CSD template, but rather re-add the CSD template to the (possibly expanded) article (the new page might no longer be an A1 or A7). Another case where reverting might not be the best course of action is the fairly common scenario where a page is created, then CSD tagged, and then the author blanks the page. It is not necessary to further embarrass the author by reverting this edit; instead, a {{ db-blanked}} should be added. A good bot for this is probably fairly complicated, and a bad bot is too scary for newcomers. Kusma ( talk) 15:23, 12 February 2009 (UTC)
I can see this is going to get more complicated than I originally thought (which is pretty much the story of anything). I'm thinking I will have the bot use both the IRC and RSS feeds to get the diffs of exactly what was removed. If it was just the template, I will have the bot use undo to save some bandwidth (not rollback, so it doesn't revert all edits by that user); if it's something other than just the templates, I will have the bot re-add the tags using &section=0. I suppose I could try to determine the type of CSD using various methods (size of page, number of sections, etc.; see this list for the (general) types of things I'm thinking of) to make sure it does not re-add a wrong CSD, and notify the person who placed it. Depending on whether the page may still meet the criteria, I can have it decide whether to only notify the person who requested speedy deletion or to undo the edit and notify. — Nn123645 ( talk) 15:37, 12 February 2009 (UTC)
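The undo-versus-re-add decision described just above could be sketched roughly like this (a hypothetical helper, not the actual bot code; the template pattern is an illustrative guess at common CSD tag names):

```python
import re

# Hypothetical sketch: given the wikitext removed in a diff, decide
# whether the bot should simply undo the edit (only CSD templates were
# removed) or re-add the tag to section 0 (other content changed too).
# The template-name pattern below is illustrative, not exhaustive.
CSD_TEMPLATE = re.compile(r"\{\{\s*(?:db-[\w-]+|delete|speedy)[^{}]*\}\}",
                          re.IGNORECASE)

def choose_action(removed_text):
    """Return 'undo' if the removal was nothing but CSD tags, else 'readd'."""
    leftover = CSD_TEMPLATE.sub("", removed_text).strip()
    return "undo" if not leftover else "readd"
```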

Removing and replacing inappropriately placed protection templates

The recent addition of the {{PROTECTIONLEVEL}} magic word has allowed the protection templates to output a category instead of visible material when they are placed inappropriately, that is, when the protection level of the page does not match the protection template. This category is visible at Category:Wikipedia pages with incorrect protection templates. I initially thought that the speed at which this category would fill would be low enough that it would be manageable; I was wrong.

Therefore, I request that someone create a bot to handle most cases, using basic logic along the lines of:

  1. IF the page is not protected, THEN remove all protection templates from the page
  2. IF the page is edit-protected but the move-protection is autoconfirmed, THEN remove all move-protection templates from the page
  3. IF the page is move-protected AND the move-protection is sysop AND the move-protection expiry is "indefinite", THEN add {{ pp-move-indef}} to the page

This bot could be run once daily to eliminate virtually all the backlog for the aforementioned category. The logic could be improved to make more intelligent decisions and provide other benefits, but this minimal workflow would be sufficient. Thank you for considering this request. {{ Nihiltres| talk| log}} 03:46, 8 February 2009 (UTC)

Coding... Lego Kontribs TalkM 04:53, 12 February 2009 (UTC)
BRFA filed Wikipedia:Bots/Requests for approval/Legobot III Lego Kontribs TalkM 16:40, 13 February 2009 (UTC)
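For reference, the three rules in the request above can be sketched as a small decision function. The dict-free argument style and level names here are simplifying assumptions loosely modeled on what the MediaWiki API reports, not Legobot's actual code:

```python
# Minimal sketch of the three rules from the request above.
# Levels are None, 'autoconfirmed', or 'sysop'; expiry is 'indefinite'
# or a timestamp string. Returns the actions to take on one page.
def decide(edit_level, move_level, move_expiry):
    actions = []
    if edit_level is None and move_level is None:
        actions.append("remove all protection templates")      # rule 1
    elif edit_level is not None and move_level == "autoconfirmed":
        actions.append("remove move-protection templates")     # rule 2
    if move_level == "sysop" and move_expiry == "indefinite":
        actions.append("add {{pp-move-indef}}")                # rule 3
    return actions
```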

Quick request

Could we have a bot change all the links to Buddha Records to point to the correct spelling, Buddah Records? Thanks Chubbles ( talk) 16:45, 12 February 2009 (UTC)

Declined Not a good task for a bot, or an editor for that matter. Policy states you should just have a redirect. Now, if it's a spelling issue, the best way to fix it would be WP:AWB, as spell-checking/spell-changing bots are listed among frequently denied bots. — Nn123645 ( talk) 17:14, 12 February 2009 (UTC)
Yeah, it'd be great if someone with AWB could do that, then. Chubbles ( talk) 17:21, 12 February 2009 (UTC)
Wikipedia:AutoWikiBrowser/Tasks -- Tagishsimon (talk) 17:28, 12 February 2009 (UTC)

Cleaning up ISBN entries in infoboxes

It was formerly the case that filling in the |isbn= field with a raw number in infoboxes such as {{ Infobox book}} did not activate the ISBN magic coding linking to Special:Booksources. This meant that "ISBN" had to be entered into the field as well: that is "|isbn=ISBN 1412806461" rather than simply |isbn=1412806461. The code seems to have been fixed now, meaning that there are a lot of entries with redundant "ISBN" coding. It's obviously an issue of minor style/presentation importance, but should be a relatively easy task to code for. All a bot would have to do is check whether "ISBN" followed "|isbn=" and if so, remove the former (with appropriate spacing). Anyone willing to take this on? Skomorokh 16:33, 12 February 2009 (UTC)

How "important" is it - just how bad does it look with the extra ISBN in it? If it's worth doing, I'd be more than happy to have LivingBot do it. - Jarry1250 ( t, c) 17:38, 12 February 2009 (UTC)
Not very important, it just makes us look a little amateurish. See the infobox here for an example. Skomorokh 21:50, 13 February 2009 (UTC)
Some of them are also enclosed in quote marks (isbn = "ISBN ###"). - Jarry1250 ( t, c) 10:42, 14 February 2009 (UTC)
BRFA filed Wikipedia:Bots/Requests for approval/LivingBot 7
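A rough sketch of the replacement described in this request, including the quoted variant noted above (illustrative only, not LivingBot's actual code):

```python
import re

# Sketch: strip a redundant "ISBN" prefix (and optional surrounding
# quote marks) from the isbn field of an infobox, so
# |isbn=ISBN 1412806461 becomes |isbn=1412806461.
ISBN_FIELD = re.compile(
    r'(\|\s*isbn\s*=\s*)"?ISBN\s+([\dXx][\d\sXx-]*)"?',
    re.IGNORECASE)

def clean_isbn(wikitext):
    return ISBN_FIELD.sub(r"\1\2", wikitext)
```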

Change of administration means massive cleanup project needed

  1. The Obama administration has completely scotched the websites of the Bush administration. No redirects, no nothing, everything except the pre-2001 history pages is just down the memory hole. Hey, it's their computers, they can do what they want, but for us, this means that thousands of links in the encyclopedia to www.whitehouse.gov/* are now dead links, and need to be changed to http://georgewbush-whitehouse.archives.gov/ -- but surely in the last few months people have added links to legitimate www.whitehouse.gov pages. So care is needed.
  2. Somewhat more trivially, there are about 200 articles about cases pending in United States courts that are now all mistitled as X v. Bush. Because Bush was sued in his official capacity, rather than in his personal capacity, the cases have been renamed under Federal Rule of Civil Procedure 25(d), and articles need to be moved to X v. Obama pages.

I raise it here in case there's an easy way to set up a project or bot to tag all these dead links and move all these outdated/mistitled pages. (Some of the v. Bush cases are closed, and thus correctly titled; some of the whitehouse.gov pages work, so not all the links are dead, so perhaps not.) THF ( talk) 06:25, 13 February 2009 (UTC), updated 14:06, 13 February 2009 (UTC)

If this were brought to the administration's attention, could the fact that there are thousands of such links and the fact (if it is a fact?) that Wikipedia is an important institution influence them to put back some links en masse? Michael Hardy ( talk) 21:48, 13 February 2009 (UTC)
There might be legal reasons why the old site was moved; and, to be fair, it might have been low-level staffers in the Bush administration that did it. But if we get a bot going fast, it will be less of a problem. NB that editors unaware of what has happened are simply removing dead links instead of tagging them, so we're already starting to lose information. THF ( talk) 13:51, 14 February 2009 (UTC)

So save the list of articles and urls now, and work on developing a bot in the meantime. That way you'll be able to get a list of pages where you'd need to manually review the edit history after the bot finishes (ones where the bot cannot find a link to modify). — CharlotteWebb 20:52, 14 February 2009 (UTC)

Wouldn't even know where to begin with that. I'm flagging this for some soul who wants to save Wikipedia from a real problem. THF ( talk) 21:22, 14 February 2009 (UTC)
The bot could go around and check every link; if it comes up with a 404 error for both the original and the replacement, it skips the link. X clamation point 22:04, 14 February 2009 (UTC)

Coding... for the first part; the plan is to check the target link for a 404 response, and then check the replacement for a 200 before replacing. I'll probably throw in a log of links without a 200 on the replacement for human processing. If someone can tell me how to determine which cases need to be moved and which don't, I could look into the second part too. Anomie 00:12, 15 February 2009 (UTC) (X! already started coding Anomie 00:33, 15 February 2009 (UTC))

Sounds right. For the first part, if there's a way to instruct the bot to add a {{dead link}} tag, that would also alert editors.
For the second part, if the case is pending, it needs to be renamed; if there was a Supreme Court decision before Jan. 20, 2009, then the Supreme Court decision should be the page name. Unfortunately, the articles are quite a mess: some say "George W. Bush", others say "Bush"; and they haven't been maintained to show what the current status is. That might have to be done by hand, though a bot could add "update" tags to everything in the category. THF ( talk) 00:18, 15 February 2009 (UTC)
Coding... I was already writing this, but forgot to add the tag; I remembered mere minutes after Anomie put his on. I have talked with Anomie about it, and they have conceded the programming to me. X clamation point 00:32, 15 February 2009 (UTC)
I don't think there's any need for an update tag, someone just has to go through everything in the category and check the status if it needs to be done by hand. If there is any sort of online database where the bot could look up the status of the case, that would be sufficient for a bot to do it. Anomie 00:33, 15 February 2009 (UTC)
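The URL-rewriting part of the plan could look roughly like the sketch below. It is illustrative only: the 404/200 status checks described above are omitted so the rewriting logic stays self-contained, and a real bot would perform them before editing.

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch: map a dead www.whitehouse.gov URL onto the Bush-era archive
# host. A production bot would first verify the old URL returns 404 and
# the archived copy returns 200, and log any replacement that doesn't.
ARCHIVE_HOST = "georgewbush-whitehouse.archives.gov"

def rewrite(url):
    parts = urlsplit(url)
    if parts.hostname in ("www.whitehouse.gov", "whitehouse.gov"):
        return urlunsplit(("http", ARCHIVE_HOST,
                           parts.path, parts.query, parts.fragment))
    return url  # leave unrelated links alone
```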

Bot tagging for WP:FILM

Is it possible to have a bot run through all subcategories of Category:Films by year (excluding Category:The Wizard of Oz (1939 film), Category:Dragnet, Category:Dragnet episodes, Category:Monty Python and the Holy Grail, Category:Sholay, Category:Donnie Darko, Category:Chak De India, Category:Enchanted (film) and Category:Songs from Enchanted) and ensure that all articles have the {{ Film}} project banner on their talk page, keeping all existing assessments and other parameters where they currently exist?

In addition, can the bot add the appropriate task force parameters to the banner in the following categories:

I made this proposal at Wikipedia talk:WikiProject Films#Bot tagging articles for WP:FILM last week, where it met with approval. Let me know if you need anything else, this is my first such request here. :) PC78 ( talk) 23:07, 7 January 2009 (UTC)

Doing... Lego Kontribs TalkM 01:02, 8 January 2009 (UTC)
Any progress with this? PC78 ( talk) 23:29, 14 January 2009 (UTC)
Yes, I was wondering what had happened to this too! Lugnuts ( talk) 12:46, 16 January 2009 (UTC)
Off wiki issues came up. I am finishing the code for this now. Lego Kontribs TalkM 02:03, 19 January 2009 (UTC)
Cool, no worries. :) PC78 ( talk) 02:11, 19 January 2009 (UTC)

Thanks, I see Legobot did a run on this earlier today. I see that only articles starting with A and B were tagged; will the bot therefore be doing this in stages, rather than a single sweep? PC78 ( talk) 19:41, 22 January 2009 (UTC)

It hit an error while running so it only got to the B's. I have restarted it, so it should start going again. Lego Kontribs TalkM 05:03, 23 January 2009 (UTC)
Ah, OK. Can you please let me know when it's finished doing its thing? Cheers! PC78 ( talk) 01:11, 24 January 2009 (UTC)

Y Done Lego Kontribs TalkM 02:16, 28 January 2009 (UTC)

One more request: can the bot add the appropriate task force parameters to the banner in the following categories:

These are two recent task forces of ours for which doing a manual tagging run would be extremely difficult, especially given the two countries' prodigious output and the English Wikipedia's natural systemic bias towards more comprehensive coverage of these national cinemas. Many thanks! Girolamo Savonarola ( talk) 18:58, 2 February 2009 (UTC)

Do the above excluded categories still have to be excluded? Lego Kontribs TalkM 00:22, 3 February 2009 (UTC)
Exclude the following: Category:United Kingdom film biography stubs, Category:Shaft, Category:High School Musical, Category:The Cheetah Girls, Category:T*Witches, Category:Zenon, Category:Police Squad! episodes, Category:National Lampoon's Animal House, Category:Beauty and the Beast, Category:Saw, Category:Lilo & Stitch, Category:The Lion King, Category:Enchanted (film), Category:Films distributed by Disney, Category:The Incredibles video games, Category:Walt Disney movie posters, Category:Asian American filmmakers, Category:English-language South Asian films, Category:Disney franchises, Category:Films distributed by Buena Vista International, Category:Shrek, Category:Evil Dead, Category:Looney Tunes games, Category:Images from Tom and Jerry, Category:Tom and Jerry video games, Category:Austin Powers games, Category:Final Destination, Category:Hercules: The Legendary Journeys, Category:Xena: Warrior Princess, Category:Tron video games, and Category:X-Men music. Thanks again! Girolamo Savonarola ( talk) 07:05, 3 February 2009 (UTC)

Reviving Any progress with this? Many thanks, Girolamo Savonarola ( talk) 13:28, 14 February 2009 (UTC)

As mentioned above, blindly recursing (even with a blacklist) is not acceptable. You need to specify exactly which categories need tagging. Q T C 21:21, 17 February 2009 (UTC)

With respect, that did not stop the previous request from going ahead. PC78 ( talk) 21:28, 17 February 2009 (UTC)
With respect, it shouldn't have. It causes more problems than it fixes when it goes wrong. Using a list of categories that need the tag is simpler and has no chance of making a complete mess of things. Q T C 21:56, 17 February 2009 (UTC)
How so? The categories to be tagged will be the same in either case. PC78 ( talk) 22:28, 17 February 2009 (UTC)
They may be the same at one point, but won't always be. Say you go through the cats and subcats and think it's good. You post this botreq. Then, in the time between you going through the list and somebody running the job, somebody makes a change somewhere in any of those n levels of categorization. Can you be 100% sure wrong categories won't get marked? No. In the time it takes you to go through the list the first time, you could specify them in this botreq and reduce the chance of incorrect tagging to zero. Q T C 23:30, 17 February 2009 (UTC)

Semi-automated translation of missing articles

After posting a more detailed description of my idea here, I would like to drop it by to see what kind of reaction I get. I would like to create a process where a bot goes to pages on the many other-language wikis, runs a translation script on each article title, and looks for an article on the en wiki under both titles. When there is no article, it outputs to a list, with links to the other-language articles as well as to multiple free language-specific translation tools. Volunteers would then follow up on the list, and create articles/remove them from the list when appropriate.

This bot would be run periodically to refresh the en wiki's shortcomings, and a blacklist can also be set up for pages that are determined not to be wanted. There are some other bits, but that's the gist of it. Anyway, although the bot would do several specific tasks, the most important is the first one: is it possible to retrieve interwiki data from the sidebar? -- Nick Penguin( contribs) 04:49, 17 February 2009 (UTC)

I don't think this is a good idea. Probably should let the discussion run at the village pump before getting a bot underway... Calliopejen1 ( talk) 18:02, 18 February 2009 (UTC) Didn't read thoroughly, will follow up. Calliopejen1 ( talk) 18:09, 18 February 2009 (UTC)
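On the question of retrieving interwiki data: the MediaWiki API exposes the sidebar links via action=query&prop=langlinks. A minimal sketch that parses that call's default JSON shape (where the title sits under the "*" key) into a language-to-title dict; the sample response in the test is abbreviated, and error handling is omitted:

```python
# Sketch: flatten a prop=langlinks API response into {language: title}.
# Assumes the older default JSON format, in which each link looks like
# {"lang": "es", "*": "Titulo"}; formatversion=2 uses "title" instead.
def extract_langlinks(api_response):
    result = {}
    for page in api_response.get("query", {}).get("pages", {}).values():
        for link in page.get("langlinks", []):
            result[link["lang"]] = link["*"]
    return result
```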

Just pointing out something...

I would like to request a bot to run at a future date to change all instances of ƿ to w and all instances of ȝ to g for the Anglo-Saxon wiki. There may be need of one to switch accents to macrons (á to ā). How easy would this be? --JJohnson1701 (talk) 07:33, 11 January 2009 (UTC)
(Please excuse my ignorance of the Anglo-Saxon language if the following sounds stupid.) Technically, this is a very simple task. However, it is unlikely to be a desirable change. While the majority of the uses of these characters are probably wrong, it is unlikely they are all wrong. Articles often contain snippets of foreign-language text (on purpose) and even references to the characters themselves.
So even if these characters are never used in Anglo-Saxon words, they might still be used correctly in articles. --ThaddeusB (talk) 00:37, 12 January 2009 (UTC)
The purpose in the Anglo Saxon Wiki would be a stylistic (MoS) change, and if it's agreed by consensus there I would be happy to do this task. Rich Farmbrough, 09:27 15 January 2009 (UTC).

This snippet is from archive number 24, I believe. While his request sounds innocent enough, what he didn't disclose is that it is rooted in one side of a very heated dispute over orthography.

Ultimately, after all the blood and sweat, the users decided to make use of duplicate page versions, in a manner similar to the Gothic Wikipedia. When I say "the users," I mean regular contributors; James/JJohnson1701 is not an active contributor, only surfaces once every several months, and has not written a single page that is more than half a page in length.

That in itself is meaningless, but the point is, in making this move, he is attempting to defy our consensus to make use of both practices, as that is the only decision which gave us peace, and the ability to continue the project.

Discovering this, after all of that was over, is appalling quite frankly. Do not grant this bot request. — ᚹᚩᛞᛖᚾᚻᛖᛚᛗ ( talk) 14:05, 17 February 2009 (UTC)

This is the bot request page for the English Wikipedia; not the Anglo-Saxon Wikipedia. Neither the original request nor this objection belong on the English Wikipedia. Bot issues for the Anglo-Saxon Wikipedia need to be worked out on the Anglo-Saxon Wikipedia. -- JLaTondre ( talk) 04:14, 18 February 2009 (UTC)
Very well then, my apologies are in order. — ᚹᚩᛞᛖᚾᚻᛖᛚᛗ ( talk) 18:38, 18 February 2009 (UTC)

Broken translation requests

I've just developed a suite of templates that make requesting expansion from other-language Wikipedias much easier. See {{ Expand Spanish}} for example. One feature of the templates may be problematic, though. When you tag an article, there is an optional parameter for the name of the article in the other language. If no title is specified, it defaults to assuming that the article in the other language has the same name as the English article. This is generally fine because most translation requests are geographic places and biographies that have the same name in both languages. If the article names are different, this causes a problem. Can someone make a bot that goes through all the articles tagged with a template generated by {{ Expand language}}, checks whether the corresponding interwiki article exists, and if it doesn't, either notifies the tagger, puts a notice on the article talk page, or adds it to a list so translation-project people can fix them manually? Thanks! Calliopejen1 ( talk) 17:53, 18 February 2009 (UTC)

Oh wait, an even better idea! I'll change it so that it defaults to no link, and a bot could come by and automatically add the parameter if an article exists! Calliopejen1 ( talk) 18:07, 18 February 2009 (UTC) It would do this by looking at the interwiki matching the language of the tag. That is, if the bot sees {{ Expand Spanish}} it looks for [[es:XXXX]] and then changes it so it's {{Expand Spanish|XXXXX}}. Calliopejen1 ( talk) 18:42, 18 February 2009 (UTC)
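The fill-in step could be sketched like this (the language-name-to-code table is a tiny illustrative subset, and the function is hypothetical rather than anything an approved bot runs):

```python
# Sketch: given the language of an {{Expand <Language>}} tag and the
# page's interwiki links as {code: title}, return the template with its
# title parameter filled in, or unchanged when no interwiki exists.
LANG_CODES = {"Spanish": "es", "French": "fr", "German": "de"}  # subset

def fill_expand_tag(language, langlinks):
    code = LANG_CODES.get(language)
    title = langlinks.get(code) if code else None
    if title:
        return "{{Expand %s|%s}}" % (language, title)
    return "{{Expand %s}}" % language
```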

Replacing calendar templates

Templates of the style {{ FebruaryCalendar2008Source}} are correctly tagged T3 speedy deletion candidates because they are redundant to templates of the style {{ FebruaryCalendar|year=2008}}. I'd like to request a bot (should be easy to do) that replaces all instances of all templates of the first kind with those of the second kind, i.e. {{ MayCalendar2007Source}} with {{ MayCalendar|year=2007}} etc.

It should also be able to replace constructs like {{{{CURRENTMONTH}}Calendar{{CURRENTYEAR}}Source}} with {{{{CURRENTMONTH}}Calendar|year={{CURRENTYEAR}}}}.

And finally, it should tag all those former templates that are now duplicates (i.e. of the style {{MonthCalendarYearSource}}) with {{ db-t3}} (don't forget the <noinclude> tags for that) and list them on a subpage in my userspace so I can delete them after the waiting period is over. So who wants to code me that little thing? Regards So Why 11:37, 18 February 2009 (UTC)
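The requested replacement is mechanical enough to sketch with a regex (illustrative only; it assumes the per-year templates all follow the Month + "Calendar" + year + "Source" naming shown above):

```python
import re

# Sketch: rewrite {{MayCalendar2007Source}} as {{MayCalendar|year=2007}},
# plus the {{CURRENTMONTH}}/{{CURRENTYEAR}} construct as a literal string
# substitution. Other template names are left untouched.
LITERAL = re.compile(r"\{\{\s*([A-Z][a-z]+Calendar)(\d{4})Source\s*\}\}")
DYNAMIC = "{{{{CURRENTMONTH}}Calendar{{CURRENTYEAR}}Source}}"

def replace_calendars(wikitext):
    wikitext = LITERAL.sub(r"{{\1|year=\2}}", wikitext)
    return wikitext.replace(
        DYNAMIC, "{{{{CURRENTMONTH}}Calendar|year={{CURRENTYEAR}}}}")
```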

I'll do it, but I'm wary of bot-marking a bunch of templates for WP:CSD. Has a consensus for this been reached elsewhere? Anomie 12:21, 18 February 2009 (UTC)
I don't see the problem. After all, T3 is a very simple criterion that only applies if the template is redundant and unused. But if you are wary about that, I'm happy if it just generates the tag-worthy list. I'll take care of it then. But thanks for your offer, I knew your bot would be perfect for such things :-) So Why 12:27, 18 February 2009 (UTC)
I'll have to think about this T3 thing. What about the {{MonthCalendarYear}} templates (without "Source")? Also, are the new templates really completely compatible with the old? I note that the old ones have endnote and note as aliases for EndNote, for example, that seems to be lacking in the new.
Hmmm... Why not just make one template that takes both the month and year as parameters? Make the parameter names sane ("1a" is really a horrible name) and AnomieBOT can translate those at the same time too. Anomie 02:15, 19 February 2009 (UTC)
I'm sorry, but I lack the insight as to why they are in this style, I just stumbled across the old ones on CSD duty. I invited Zzyzx11 ( talk · contribs) who tagged them to contribute here. Regards So Why 11:18, 19 February 2009 (UTC)

Ok, User:SoWhy asked me to provide some background: Before parserfunctions such as #time: and #if: were created in 2006 or 2007 (I can't remember), we had to create separate templates for each year. Thus the existence of {{ MayCalendar2004Source}}, {{ MayCalendar2005Source}}, {{ MayCalendar2006Source}}, etc. So with the existence of the parserfunctions, we could make general calendar templates that are more self-maintaining. So there have been discussions such as Wikipedia talk:WikiProject Days of the year/Archive 7#The calendar on the date pages and Template talk:JanuaryCalendar to have those kind of templates.

Well, we finally had the time to merge all the parameters into a few templates such as {{ MayCalendar}}. I know it might look like spaghetti code, but it will have to do for now so that it remains backward compatible with all the templates whose functionality was merged.

I believe I have already done most of the replacements. The problem now is that these templates were on so many pages, transcluded and cascading on multiple pages at a time, that I am currently waiting for the job queue to fully update all the backlinks so the "What Links Here" lists are fully accurate. I mean, if you look at Special:WhatLinksHere/Template:MayCalendar2008Source, it lists a bunch of subpages of Portal:Music/DateOfBirth, but the template was actually only directly on the transcluded page Portal:Music/DateOfBirth/May.

Thus, any bot run here is premature for the next month or two (last I heard, the job queue takes about 40 days to fully complete a round of all the pages on Wikipedia). Cheers. Zzyzx11 (Talk) 16:37, 19 February 2009 (UTC)

Update table of bot edits

This page may have some use, but only if it's updated (it's currently a year out of date). Would someone who's got a minute look at it? - Jarry1250 ( t, c) 14:29, 19 February 2009 (UTC)

 Done from the API. Happymelon 18:13, 19 February 2009 (UTC)

Foundation and similar dates, in infoboxes

[I'm relisting this August 2008 request (including subsequent revisions), as the editor who said he would make the edits has not done so, nor replied to many enquiries as to progress (due at least in part to understandable family matters).]

To add "founded", "founded date" or "foundation", "released", "first broadcast" and similar dates to an infobox' hCard microformat, existing dates need to be converted to use {{ Start date}}.

I've compiled a list of relevant infoboxes at User:Pigsonthewing/to-do#Date conversions.

Thank you.

Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 21:20, 2 February 2009 (UTC)
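For illustration, the mechanical part of such a conversion might look like the sketch below. The accepted input formats are assumptions; real article dates are far messier (ranges, "circa", Julian-calendar dates), and anything unparsed is left untouched rather than guessed at:

```python
from datetime import datetime

# Hypothetical converter: turn a plain infobox date such as
# "12 August 1981" or "August 12, 1981" into {{Start date|1981|8|12}}.
# Month-name parsing assumes an English (C) locale.
FORMATS = ("%d %B %Y", "%B %d, %Y", "%Y-%m-%d")

def to_start_date(text):
    for fmt in FORMATS:
        try:
            d = datetime.strptime(text.strip(), fmt)
        except ValueError:
            continue
        return "{{Start date|%d|%d|%d}}" % (d.year, d.month, d.day)
    return text  # leave anything unparsed untouched
```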

It should be pointed out that conversion of these dates into a form that needlessly imposes restrictions on editors is a controversial proposal for which there is no consensus. Microformat dates can be supported without the cumbersome {{ start date}} template. See discussion at the WikiProject Time talk page. - J JMesserly ( talk) 20:49, 3 February 2009 (UTC)
This request has already been agreed; it is simply that the editor who undertook, some months ago, to do it has been unavailable. The many manual replacements on a number of pages have been utterly uncontroversial. There are no "needlessly imposed restrictions on editors"; and {{ start date}} is successfully used in an even greater number of articles (well over 10K). There is currently no viable alternative. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 21:26, 3 February 2009 (UTC)

It is a controversial proposal. Perhaps it would be prudent to await consensus on this matter after the Time wikiproject has time to properly consider the desirability of needlessly encoding dates in an arcane format. - J JMesserly ( talk) 21:39, 3 February 2009 (UTC)

Please provide evidence of this supposed "controversy". Note that " I don't like it" does not constitute such evidence, and, as I said above that this change has already been agreed, with consensus. In referring to debate on the Time Wikiproject, you appear to be promoting a "rival" (sic) template which you yourself created only a couple of days ago, and which has no demonstrable community support. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 23:05, 3 February 2009 (UTC)
No evidence having been provided; I suggest we proceed. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 22:01, 6 February 2009 (UTC)
Anyone looking into this matter will know this isn't the case. For example, consider the following post on the microformats talk page:

I support the proposal of J JMesserly and favor the {{ start-date}}: Before all, Wikitext must remain human readable. (BTW: There's in fact currently no chance - even for programmers - to enter a date like "7 December 1941 8AM HST" using {{ Start date}}: I vainly tried {{Start date|1941|12|7|18|||Z}}, {{Start date|1941|12|7|18||Z}}, {{Start date|1941|12|7|18|Z}}). -- Geonick ( talk) 00:05, 5 February 2009 (UTC) (UTC) source

Pigsonthewing's assertion of no controversy is not accurate. There is no pressing need to encode dates in a wonky format that makes it more difficult for contributors to understand. Other contributors agree on this point. - J JMesserly ( talk) 02:41, 10 February 2009 (UTC)
You have been asked to provide evidence of this supposed "controversy". You have not done so. You have provided an out-of-context citation of just one editor liking one aspect of your experimental template, but not objecting to the above proposal. As I have already said twice before, the example of {{ Start date}} you give above is not one of the supported formats for that template; GIGO applies. Your pejorative use of the epithet "wonky" is fallacious. If you object to the use of {{ Start date}}, then nominate it for deletion. As you say: "It is true [{{ Start-date}}; your suggested alternative] is a new untested template and there may be bugs to fix with it"; as indeed there are. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 13:14, 10 February 2009 (UTC)

I propose consideration of this proposal be suspended until

  1. The documentation for {{ Start date}}, {{ End date}}, and the bot is improved to explain how the bot and the templates will deal with dates that are in the Julian calendar or the Roman Republican calendar.
  2. Andy Mabbett states that he has read the ISO 8601 standard. One should never state or imply that one complies with a standard unless one has read it. -- Gerry Ashton ( talk) 20:07, 10 February 2009 (UTC)
Please provide a reference for the Wikipedia policy which you imagine requires me to satisfy this bizarre demand. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 20:08, 13 February 2009 (UTC)

Other discussion

For those interested in the nature of the controversy, please see Manual of style- dates discussion on the unnecessary obscurity and error prone nature of the {{ start date}} template compared to alternatives that achieve the same goal. - J JMesserly ( talk) 15:47, 12 February 2009 (UTC)

There is no controversy. Please avoid unnecessary drama. There is no obscurity and the template is not "error prone"; unlike the supposed alternative. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 20:06, 13 February 2009 (UTC)

Controversy shown

It has been demonstrated by the thread above and at the Manual of style- dates page that bot runs employing {{ start date}} are controversial as evidenced by the responses from multiple other contributors. - J JMesserly ( talk) 19:07, 18 February 2009 (UTC)

No such controversy has been demonstrated, as anyone can see. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 18:23, 19 February 2009 (UTC)

Italicizing foreign words - such as kata (as in karate kata)

I wonder if it would be possible to get a bot to italicize all instances of the word kata. I tried doing it in AWB but couldn't figure out how to get the program to ignore the word if it was already italicized. Thus, a command like "change all instances of kata to ''kata''" would look only at the word itself, not at any double single quotes already around it; it would find it, add more double single quotes, and end up changing every instance of ''kata'' to ''''kata'''', which is no good.

There is a potential for false positives, but I think that as long as the bot is confined to Category:Martial arts and its subcategories, it shouldn't be a problem.

Can anyone help out with this? Thanks. LordAmeth ( talk) 20:13, 19 February 2009 (UTC)

Did you try telling AWB to skip any pages containing ''kata'' (those are single apostrophes)? It seems like this would cover it for you, since it is arguable that if a page has one instance of the word formatted correctly, then all instances on said page are formatted correctly. That of course assumes that you're willing to accept that induction, and it might leave you with some pages where the non-formatted word is still there. There is also probably a way to do exactly what you want using regex, but I'm really bad with that, so you may have better luck either from someone else here or at another page like WP:VPT. As for automation, if you think that the above-mentioned skip parameter is acceptable, I would be happy to file a BRFA and do this with my bot. Robert Skyhawk So sue me! ( You'll lose) 04:18, 20 February 2009 (UTC)
Totally untested, but a Perl-style replacement regex should be s/([^'][^'])(kata)([^'][^'])/${1}''${2}''${3}/i. No idea how to translate it into an AWB regex. Two important points about this: it won't italicize already-bolded instances, and it will make a hash of wikilinks. -- Carnildo ( talk) 10:41, 20 February 2009 (UTC)
AWB uses the same regex as Perl. However, I don't think you want to use that regex as it will also replace occurrences of kata within words (ex. 11kata22). I don't know, but I'd assume that might cause false positives. Something like s/\b(?:'')?kata(?:'')?\b/''kata''/i; would probably be better. -- JLaTondre ( talk) 13:14, 20 February 2009 (UTC)
I don't believe this is a job for a bot as the risk of false positives seems high. AWB would be more suitable. Its find & replace supports regex. -- JLaTondre ( talk) 13:14, 20 February 2009 (UTC)

Detection of articles that are better in other wikipedias

Following up on my previous request, I would like to be able to find which articles could benefit from translation. A bot could compare en.wiki articles (probably only stubs at this stage) with articles they are linked to via interwikis. Where the linked article is significantly long (a rudimentary measure of article quality), the bot could slap an {{ Expand Spanish}} (or another language) tag on the en.wiki. Or the bot could just output a list of these articles so they could be reviewed manually. Calliopejen1 ( talk) 18:02, 18 February 2009 (UTC)

This is more for the actual template usage but stub articles should not also have an expand tag, that is overkill. Expand tags are already excessively used on Wikipedia, to also have a bot place them on articles.... Garion96 (talk) 13:51, 19 February 2009 (UTC)
But the expand tag actually points to more information, and where the article is better in another Wikipedia the auto-translate link lets people read it in imperfect but often acceptable English. I think stub tags are worthless anyway, so I think the redundancy should be resolved in the template's favor.... (I also think the ordinary {{ expand}} tag is pointless, but I suppose that is beside the point.) Calliopejen1 ( talk) 03:15, 20 February 2009 (UTC)
Stub tags at least are small. Imagine, you have a perfectly ok little article with a template about a third of the article size pointing editors to the Spanish Wikipedia where they might find information to expand the article..... (Or it points readers to a computer translation. I am not sure which is worse) That kind of information belongs on the article's talk page or at a list page at Wikiproject Spain. Garion96 (talk) 20:24, 20 February 2009 (UTC)
Except the point is that it isn't perfectly ok! Stubs generally are awful articles for people who actually want to learn about the subject matter at hand. Which is more useful for the reader- this stub article, or this machine translation from es.wiki? Obviously the english isn't perfect, but I really don't think there's any comparison as to which contains more useful information. Calliopejen1 ( talk) 21:40, 20 February 2009 (UTC)
It is a small article, therefore a stub. It will grow in time. No need for a huge template for that. (talk page would be fine of course). The same counts for República Cromagnon nightclub fire or Hospital de Sant Pau, two other articles I saw using this template. The more tags like these are being made, the more I understand User:Shanes/Why tags are evil. Garion96 (talk) 10:54, 21 February 2009 (UTC)

What you are not seeing, Garion, is that these tags are markedly ***different*** to other tags on Wikipedia. It is precisely because of editors like yourself, who dismiss these articles as OK, that very little has been done about it. Dismissing these tags with "all tags are evil" is just typical of the kind of narrow-minded attitude that many have on here in regards to our potential and ways in which we can improve. They are not administered for cleanup etc.; they are administered to produce a net result in expansion through direct translation, which eventually will undoubtedly start to produce massive results in the content of English Wikipedia. They serve as a direct gateway between English and the other language, keep track, and also make others aware that the article is in the process of being improved in correspondence. Fact is, listing articles for translation behind the workspace and on the talk page at people's request failed miserably for years. It never brought to somebody's urgent attention that the articles could easily be expanded in minutes with the link provided, so the articles would just be lying about in some barely ever used log in the workspace gathering dust, and people would visit the article and move on with no results. I don't think you quite understand the purpose of this proposal. Yes, tags are ugly, which is partly why they are so useful: they prompt people to try to quickly sort out whatever perceived problem the article is experiencing so it can be removed asap. As for size, I don't see a huge template at all; it looks no bigger than most of the templates we have. It is essential in my view that we dramatically increase the coordination of translation on Wikipedia, root out the articles which have far superior articles on other wikipedias, and begin to draw people's attention to doing something about it. Dr. Blofeld White cat 19:43, 22 February 2009 (UTC)

Oh, I think I am seeing it perfectly, I just don't agree with it. And if you read my earlier comments, my objections are not simply summarized as "tags are evil", so stop using that "kind of narrow minded attitude that many have on here" towards opinions they don't agree with. For a project like this one could use WikiProjects to expand these articles. A bot could help for sure with creating a list to work with. I just think of readers; I don't think readers benefit from a huge tag on an article stating the article could be expanded. I also don't think readers benefit from a computer translation. Garion96 (talk) 20:06, 22 February 2009 (UTC)

Not always, but then the tag isn't always intended to say to use Google (which is far superior to most computer-generated packages online). It is there as a background, as are language groups and learning about translation. More often than not the editor is likely intelligent enough to either spot mistranslations by it or be able to proof-read the foreign article themselves and translate manually. Dr. Blofeld White cat 21:33, 22 February 2009 (UTC)

A perfect example of its purpose is Westerstetten for instance. Dr. Blofeld White cat 21:45, 22 February 2009 (UTC)

Asteroids

If you have a look at Category:Wikipedia pages with broken references you'll see hundreds and sometimes even thousands of asteroid stubs. I personally hold that they should be deleted no slower than they are created. But at least they should include {{reflist}} to keep them from cluttering up this category. The category was down to almost 2000 and improving, and I was about to do some serious work on it. But these thousands of asteroids came along. Perhaps a bot, or a small remark to the right person, could help us out.

Please keep me posted (I mean, please tell me how you propose to delete all of them in one day, joking). Debresser ( talk) 22:37, 19 February 2009 (UTC)

Doing... Lego Kontribs TalkM 01:54, 20 February 2009 (UTC)

You're a hero. What do you do with them? Add reflist, or delete? Debresser ( talk) 09:09, 20 February 2009 (UTC)

If you just add reflist there is a request I'd like to ask from you. Could you teach me how to write a bot that adds "prod" to all of them? Debresser ( talk) 11:29, 20 February 2009 (UTC)

Y Done I finished adding {{ reflist}} to all of them. I'm not sure if it is a good idea to add prod tags to over 500 articles. Lego Kontribs TalkM 02:04, 21 February 2009 (UTC)

That's just great. Perhaps you would know how to go about recommending all of them for deletion? Debresser ( talk) 17:03, 21 February 2009 (UTC)

You're right. We have a discussion now at Wikipedia_talk:WikiProject_Astronomical_objects#main_belt_asteroids. I am arguing that there is consensus for turning all those stubs into redirects to a big list. You'd like to comment? Debresser ( talk) 22:27, 23 February 2009 (UTC)

Archive search box adder

Some articles' talk pages have huge archives. Adding an auto search box to each one of them is an excellent duty for a bot, or could be an additional task of an existing bot. It would just add {{Archive box|auto=yes|search=yes}} to the proper line of talk pages. Logos5557 ( talk) 22:44, 22 February 2009 (UTC)

This would definitely make searching a number of talk pages easier. Smallman12q ( talk) 22:47, 24 February 2009 (UTC)

Bot to update project statistics

I'd like to request the creation of a bot to update these two project pages: Wikipedia:WikiProject Video games/Traffic statistics, Wikipedia:WikiProject Video games/Article statistics. It's not necessary for the bot to update the charts; but if it can then it's an added bonus. Thanks! SharkD ( talk) 02:22, 25 February 2009 (UTC)

i want a bot for theninja-rpg

sir, i want a bot for theninja-rpg.com it is a text based online game i want it to create ryo (in-game currency) and to train my character please help me sir —Preceding unsigned comment added by Rajansh mamoria ( talkcontribs) 15:55, 26 February 2009 (UTC)

Impossible This page is for requesting bots that do tasks on the English Wikipedia, not for requesting cheats for multiplayer games. -- Nn123645 ( talk) 17:27, 26 February 2009 (UTC)

Different name on Commons and Superseded Image

Is there a bot that fixes links to files that were uploaded to Commons under a different name (regular links too, not just image links)? I could have sworn that there was, but I have not seen anything at File:FlagTrujillo.JPG for three days. If not, could there be? The same for Template:Superseded-Image. ~ JohnnyMrNinja 09:01, 27 February 2009 (UTC)

Needs X Infobox

To reduce clutter on talk pages and to make sure pages are categorized properly (most of the Needs X Infobox templates just place them in the Requested Templates category, while most WikiProjects have specialized categories that make this easier), I suggest the following templates be replaced on article talk pages with the WikiProject banners carrying the appropriate needs-infobox switch, or, if the WikiProject banner already exists, that the template be removed and the banner updated with the switch.

Template: {{ Needs television infobox}}
WikiProject: WikiProject Television
Replace with: {{WikiProject Television|needs-infobox=yes}}

Template: {{ Needs football biography infobox}}
WikiProjects: WikiProject Football and WikiProject Biography
Replace with: {{WPBiography|sports-work-group=yes|needs-infobox=yes}} and {{Football|needs-infobox=yes}}

It might also be nice if the bot could check to see whether {{Infobox....}} already exists in the article and then list those separately so that they can be manually checked, but that isn't really needed. Peachey88 ( Talk Page | Contribs) 10:53, 23 February 2009 (UTC)

Coding... Are there any other templates, or is it just these two? Sam Korn (smoddy) 11:31, 23 February 2009 (UTC)
Just those two at the moment that I'm aware of (and could find; well, except for the general needs-infobox one, but that one shouldn't be done). Peachey88 ( Talk Page | Contribs) 11:52, 23 February 2009 (UTC)
BRFA filed. Wikipedia:Bots/Requests for approval/Sambot 2. Sam Korn (smoddy) 16:03, 23 February 2009 (UTC)
Y Done. List of pages that need attention at User:Sambot/Tasks/Football infoboxes. {{ Needs television infobox}} is now orphaned. Sam Korn (smoddy) 17:43, 27 February 2009 (UTC)

Change links related to recent page move

I need a bot to accomplish one fairly simple task:

  1. Change links to Fukuoka, Fukuoka into links to Fukuoka

This is across all namespaces, if possible. Thanks! ··· 日本穣 ? · Talk to Nihonjoe 02:51, 23 February 2009 (UTC)

Sorry, WP:R2D overrides here unless "Fukuoka, Fukuoka" will eventually be made into a separate article and not always be a redirect. §hep Talk 03:31, 23 February 2009 (UTC)
If you are going to change it, you should change it to "[[Fukuoka]]" or "[[Fukuoka]], [[Fukuoka Prefecture]]", etc., not to "[[Fukuoka|Fukuoka, Fukuoka]]" as that would be stupid, cumbersome, and redundant. — CharlotteWebb 12:45, 28 February 2009 (UTC)

IoE links - changed string in URL

I don't know whether it will be possible for a bot to take this on, but it affects thousands of articles and would take months or years by hand. The web site Images of England (IoE) lists all of the listed buildings in England and is frequently used as a reference, including in many FA- and GA-class articles. They have recently changed the format of the URLs returned by their database: each unique building number is the same, but any URL containing the string "search/details" will only work if the reader is already logged in to IoE; for anyone else it presents a blank screen. If this section of the URL is replaced with "Details/Default" it works for everyone with no need to log in. As an example, try comparing this with this one, which both target information about St Andrews Church in Chew Stoke with the item number 32965, but the first one fails and the second one works. If a bot were able to do this replacement that would be great. If I've not explained it properly or you need further information please don't hesitate to contact me.— Rod talk 18:16, 28 February 2009 (UTC)
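The substitution described above is a fixed string swap inside each IoE URL; sketched below (the full sample URL is hypothetical — only the "search/details" to "Details/Default" replacement comes from the request):

```python
# Swap the login-only path segment for the public one, leaving the
# unique building number (here 32965, from the example above) intact.
def fix_ioe_url(url):
    return url.replace("search/details", "Details/Default")

old = "http://www.imagesofengland.org.uk/search/details.aspx?id=32965"
print(fix_ioe_url(old))
```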

Coding... -- JLaTondre ( talk) 18:59, 28 February 2009 (UTC)
BRFA filed at Wikipedia:Bots/Requests for approval/JL-Bot 4. -- JLaTondre ( talk) 19:30, 28 February 2009 (UTC)

WP Bosnia and Herzegovina

I need a bot to modify talk pages that have {{WikiProject Europe|BiH=yes}} in them to change to {{WikiProject Bosnia and Herzegovina}} PRODUCER ( talk) 19:34, 28 February 2009 (UTC)

I'll put in a BRFA. What (if any) category does that template and parameter combination put the talk page in?-- Rockfang ( talk) 20:48, 28 February 2009 (UTC)
From the template's doc it looks like Category:WikiProject Bosnia and Herzegovina articles. §hep Talk 20:52, 28 February 2009 (UTC)
Gracias.-- Rockfang ( talk) 21:20, 28 February 2009 (UTC)
Producer, do you want all types of talk pages, or only certain namespaces?-- Rockfang ( talk) 21:20, 28 February 2009 (UTC)
All of them I suppose PRODUCER ( talk) 22:10, 28 February 2009 (UTC)
Ok. I filed a BRFA. Now, we wait.-- Rockfang ( talk) 03:10, 1 March 2009 (UTC)

Creative Commons Flickr Bot

Proposal 1

Make a bot to compile a list of images on Flickr that are licenced under the Creative Commons attribution licence and can replace image:replace this image male.svg and image:replace this image female.svg. These would then be sighted to check they are not blatant copyright violations, then uploaded to Commons.

Proposal 2

Make a bot that transfers all images on Flickr that are licenced under the Creative Commons attribution licence (but, crucially, not images of people) to Commons.

See related discussion here

-- DFS454 ( talk) 14:04, 28 February 2009 (UTC)

As this is a request for bot work that will occur on Wikimedia Commons, this needs to done via their bot process. They have their own bot request page. You should make your request there. -- JLaTondre ( talk) 15:08, 28 February 2009 (UTC)
There is a long established mechanism on Commons for transferring images from Flickr. For a technical place to jump in, try here: Commons:User:Flickr upload bot/step-by-step - J JMesserly ( talk) 16:10, 28 February 2009 (UTC)
I am aware of Flickr upload bot; what I meant was an automated process, which scans media by itself, that is only sighted (Flagged revisions) by users. Technically, how hard is it to code something like this? -- DFS454 ( talk) 21:48, 28 February 2009 (UTC)
I dunno, how hard is it to understand something like this?
r'\[(?P<url>https?://[^\|\] ]+?(\.pdf|\.html|\.htm|\.php|\.asp|\.aspx|\.jsp)) *\| *(?P<label>[^\|\]]+?)\]'
Actually, that one is pretty easy, as they go. You have to fetch HTML pages and scan them for what you want, then execute more page fetches depending on what those pages tell you, and then of course all your code is broken the following week when someone decides to change their HTML page, breaking one of your routines whose workings you no longer recall, so you have to rewrite it. Other than that, it is a piece of cake. - J JMesserly ( talk) 08:06, 1 March 2009 (UTC)

List of Latin phrases links

This is an AfD from earlier this year that resulted in the deletion of a few disambig pages from an old scheme of organizing that list. There are still quite a few links to them, but nobody followed up on the author's suggestion to have a bot change them. It'd probably take 10 minutes at most in AWB, but I no longer have Windows, so I'm asking here. From a quick count it's only about 200 links in total. Changing the links to List of Latin phrases or List of Latin phrases (full) would probably be fine, though if you're feeling really ambitious you could actually look at each link and point it to the appropriate one of the six pages the list is now broken into. Thanks, -- Rory096 16:01, 1 March 2009 (UTC)

Not sure why anyone would have directly linked to these pages anyway. It would make more sense to link to the actual Latin phrase, then let the phrase redirect to whatever subdivision of the list currently contains that phrase. This is a case where pre-emptively bypassing redirects is actively harmful. Also I'm concerned about GFDL attribution issues if material was originally added to "A–E" but then cut and pasted to A–B when the former page grew too large. All of these obsolete segment titles should at least be undeleted and redirected to the main list page. If there is some way to find a list of Latin phrases which are currently a deleted redirect (because some bot discovered that they pointed to a deleted section of the list), these should be undeleted too. I'd estimate that the cleanup process will be more complicated than you think. — CharlotteWebb 16:54, 1 March 2009 (UTC)

IMDb links

Would it be possible to have a bot check through all articles using the |imdb_id= parameter in {{ Infobox Film}} to see which of these do not otherwise contain a link to IMDb, i.e. through the use of {{ imdb title}} (or any of its redirects), and present this data in the form of a numbered list? Such information would be useful in an ongoing debate over the use of such parameters in the infobox. Thanks in advance for any help! :) PC78 ( talk) 15:34, 2 February 2009 (UTC)

Bump. Is this request feasible or not? PC78 ( talk) 16:24, 9 February 2009 (UTC)
This request is certainly feasible, I'll get coding and see what I can come up with. As the bot isn't actually going to be editing - only reading - pages it won't need approval, so I should be able to run through the transclusions later today. I'll put the data in a subpage of my bot's userspace (or anywhere else sensible if you'd prefer). Richard 0612 10:11, 13 February 2009 (UTC)
I'm having a few issues with non-Latin characters in page titles, but I haven't given up, it'll just take a bit longer! Richard 0612 21:05, 13 February 2009 (UTC)
Perhaps someone else could take a look at this. Pywikipedia seems not to like Unicode characters... Richard 0612 22:22, 17 February 2009 (UTC)
I'll take a look. Anomie 03:29, 18 February 2009 (UTC)
Y Done Unless I screwed something up in my coding, this should be the list. There are 6112 article pages, and a handful of others. Feel free to copy it elsewhere if a permanent link into my sandbox isn't sufficient. Anomie 12:05, 18 February 2009 (UTC)
Many thanks to you both! PC78 ( talk) 15:58, 18 February 2009 (UTC)

How feasible would it be for a bot to remove the link from the infobox and add it to the relevant "External links" section of the article? PC78 ( talk) 17:04, 21 February 2009 (UTC)

Coding... It would be quite feasible, but I suggest having the bot process the other obsolete external link parameters (website and amg_id) fields at the same time, to get all three in one edit. I'd also have the bot generate a list of pages that need manual fixing or extra attention. Anomie 16:43, 22 February 2009 (UTC)
Yes, of course. There seems to be consensus at Template talk:Infobox Film for removing these parameters from the infobox, so all three will need to be processed. Whatever you think best. PC78 ( talk) 17:24, 22 February 2009 (UTC)
BRFA filed Wikipedia:Bots/Requests for approval/AnomieBOT 24. Anomie 03:41, 23 February 2009 (UTC)
Thanks to PC78 for getting this started. Lugnuts ( talk) 14:00, 1 March 2009 (UTC)
Why is it preferable to have them in external links rather than in the infobox? Шизомби ( talk) 15:40, 1 March 2009 (UTC)
We are so not going to get into another debate here. You want Template_talk:Infobox_Film#External_links_.28imdb.2C_amg.2C_etc.29 for all the info. - Jarry1250 ( t, c) 15:55, 1 March 2009 (UTC)

You are doing one of the most controversial things I've ever seen a bot do. Please bot revert and wait for a real discussion on the matter. Very poor form. - Peregrine Fisher ( talk) ( contribs) 07:53, 2 March 2009 (UTC)

That looks like consensus to me. §hep Talk 09:01, 2 March 2009 (UTC)
This is a big change. Not quite as big as saying "no fair use images in wikipedia", but similar. One that should not be enforced by a bot. - Peregrine Fisher ( talk) ( contribs) 09:04, 2 March 2009 (UTC)
Removing external links from a single infobox is not the "big change" you seem to think it is. This was discussed at Template talk:Infobox Film, the discussion was advertised at the relevant WikiProject, and a consensus was reached; it is not so important or controversial that it requires a centralised discussion for the whole of Wikipedia, nor is this the venue for reopening the discussion. Please direct your comments to the template talk page. Thank you. PC78 ( talk) 12:25, 2 March 2009 (UTC)

WikiProject/Taskforce Spammer.

Sometimes it would be incredibly useful to be able to contact all WikiProjects and taskforces at once. I've looked for bots that can do this, and I haven't found any that is currently able to contact all projects and taskforces in one fell swoop. Anyone willing to code this? Headbomb { ταλκ κοντριβς –  WP Physics} 04:35, 21 February 2009 (UTC)

If you can give me a specific list of categories/member lists to hit, I think I should be able to do it with AWB pretty efficiently (10 epm). Robert Skyhawk So sue me! ( You'll lose) 05:35, 21 February 2009 (UTC)
Sure, see the list (the last column on the right gives the link to the WikiProjects/taskforces).
There are a lot of inactive projects, but if they are inactive no one should care that they are contacted. Some projects and taskforces probably aren't listed, but that's the best list I know of. Headbomb { ταλκ κοντριβς –  WP Physics} 05:49, 21 February 2009 (UTC)
Wow...you really mean all of them don't you? But yes, if you have a message to distribute to all of these people, then I think AWB should be able to get it done in a reasonable amount of time...we could even organize multiple bots to split the workload. There is one potential issue though...you'll definitely need approval for this task, and I can already see that it will be hard to convince the Bot Approvals Group that you have a message that needs to go to this many people. Keep in mind that almost every user on the project is a member of at least one WikiProject. May I inquire as to exactly what kind of message you are needing to broadcast to all of these people? Perhaps WikiProject talk pages are a better way to go... - Robert Skyhawk So sue me! ( You'll lose) 06:00, 21 February 2009 (UTC) Retracted, see below.
Yes I really do mean all. :P Obviously I'm not delusional enough to think that one could spam all the WikiProjects and Taskforces without approval of some form, but there should at least be a bot that could deliver one-time messages when required. As for the message, it would be to let projects know about a new feature called WP:Article Alerts (which is basically a way for WikiProjects to know about the AFDs, PRODs, WP:FACs, etc., relevant to them), so I'm not really worried about the BAG being against this message being spread out (and the ~1500 or so resulting edits). There are currently about 75 or so subscribers to Article Alerts right now, and feedback has been uniformly positive. User:ArticleAlertbot has been thoroughly tested and we've recently overhauled the page to get ready for a massive influx of projects subscribing at once, as well as an increase in bug reports and feature requests. All that is left to be done is to make the projects aware that this exists. Headbomb { ταλκ κοντριβς –  WP Physics} 06:12, 21 February 2009 (UTC)
I have a lot of sympathy - I spent a good hour on AWB manually posting my alert about my project, which is quite similar to yours (not in competition though, don't worry) to ~200 WikiProjects. It's long (at 4 epm), boring, but it does get the job done without the need for a BRFA, so if all else fails, you could consider it. - Jarry1250 ( t, c) 11:54, 21 February 2009 (UTC)
That does make sense. I am thoroughly satisfied with the job that ArticleAlertBot does. However, I think it would be much more efficient to simply post this message on every active WikiProject's talk page. That would reduce the amount of edits that need to be made dramatically, and would make BRFA approval much easier, while still ensuring that active project members (who presumably watch their projects' talk pages) will be notified of this bot's services. If you do in fact decide that this is what you want to do, then we can go from there. If you still think you want to go with member lists, then I hope we can make that work too. Robert Skyhawk So sue me! ( You'll lose) 15:42, 21 February 2009 (UTC)
If you just want a bot to get a message out to people's talk pages quickly, I have a bot that is already coded to do this and is very efficient. If you want it to go out to WikiProject talk pages it should be able to be modified easily. It would need a list / cat / page of links or something similar to run from, but I think this would be quicker and more efficient than AWB. (Already approved for posting on talk pages) ·Add§hore· Talk To Me! 18:59, 21 February 2009 (UTC)
  • Robert Skyhawk: I'm not quite sure I understand what the difference is between "post[ing] this message on every active wikiprojects" and what I'm proposing.
  • Addshore: The lists are already given above. The last column gives the WikiProject links (the bot would obviously post to their talk pages). Headbomb { ταλκ κοντριβς –  WP Physics} 21:41, 21 February 2009 (UTC)
Oh...I understand now. I don't think I quite saw what you were trying to do, but now that I realize that just using the WikiProjects' talk page is what you were trying to do all along, this seems much more reasonable. Robert Skyhawk So sue me! ( You'll lose) 23:33, 21 February 2009 (UTC)
Now, I'm kinda curious about what you thought it was I was asking for. Headbomb { ταλκ κοντριβς –  WP Physics} 05:09, 22 February 2009 (UTC)
It would be nice if the bot skipped the projects that already use AAB. Shouldn't be that hard to remove them from the list as there's only a handful. §hep Talk 05:14, 22 February 2009 (UTC)
Yes, they are all in Category:ArticleAlertbot subscriptions. However, it's been four months since the first subscribers, and quite a lot changed at WP:AAlerts during that time, so they might not be aware of the recent changes. I think I'd still push for all WikiProjects and taskforces regardless of subscription, if only to give the links to the newly created bug report and feature request pages, but I'll leave this up to the BAG's judgement once they see the actual message. Headbomb { ταλκ κοντριβς –  WP Physics} 05:48, 22 February 2009 (UTC)
If you must know, HeadBomb, for a while I actually thought you wanted to take the member lists of the projects and spam every member's talk page...gross misunderstanding on my part. Robert Skyhawk So sue me! ( You'll lose) 05:44, 22 February 2009 (UTC)
Ugh, that'd be horrible and way out of line. I can see why you'd be concerned. Headbomb { ταλκ κοντριβς –  WP Physics} 05:48, 22 February 2009 (UTC)
So Headbomb you want a message on all the wikiproject talk pages listed there? If so yes my bot can do it and I will throw up a BRFA as soon as you say yes :P ·Add§hore· Talk To Me! 08:20, 22 February 2009 (UTC)
Yuppers. If the BAG wants to wait for the message before it approves, I can have it ready by the end of the day. Headbomb { ταλκ κοντριβς –  WP Physics} 08:25, 22 February 2009 (UTC)
Please see Wikipedia:Bots/Requests_for_approval/Addbot_19. ·Add§hore· Talk To Me! 08:30, 22 February 2009 (UTC)
I have 20 trial edits; if someone would care to give me the message, I will send it to the first 20 pages. ·Add§hore· Talk To Me! 17:11, 24 February 2009 (UTC)
Alright, here goes.

This is a notice to let you know about Article alerts, a fully-automated subscription-based news delivery system designed to notify WikiProjects and Taskforces when articles tagged by their banner enter a workflow such as Articles for deletion, Requests for comment, and Peer review ( full list). The reports are updated on a daily basis, and provide brief summaries of what happened, with relevant links to discussions or results when possible. A certain degree of customization is available; WikiProjects and Taskforces can choose which workflows to include, have individual reports generated for each workflow, have deletion discussions transcluded on the reports, and so on. An example of a customized report can be found here.

If you are already subscribed to Article Alerts, it is now easier to report bugs and request new features. The developers also note that some subscribing WikiProjects and Taskforces use the display=none parameter, but forget to give a link to their alert page. Your alert page should be located at "Wikipedia:PROJECT-OR-TASKFORCE-HOMEPAGE/Article alerts".

Headbomb { ταλκ κοντριβς –  WP Physics} 20:49, 24 February 2009 (UTC)

Any updates? Headbomb { ταλκ κοντριβς –  WP Physics} 01:37, 28 February 2009 (UTC)

Sorry, I was away for a few days. I now have a small trial and I should use it up tonight. ·Add§hore· Talk To Me! 20:40, 1 March 2009 (UTC)
See this link for the trial edits.
Sorry I meant here ·Add§hore· Talk To Me! 21:04, 1 March 2009 (UTC)
Could someone from the BAG approve or decline this? The sooner this rolls out the better. Headbomb { ταλκ κοντριβς –  WP Physics} 22:07, 3 March 2009 (UTC)
Request approved. When do you want me to go through this list and add this message? Feel free to add your own sig to the message and to change it in any way before I run the bot. Send me a message on my talk page to confirm that you want the message to go out. Thanks. ·Add§hore· Talk To Me! 09:09, 7 March 2009 (UTC)

Lossless Image Optimization and Compression Bot

I have looked around Wikipedia and noticed that most images are uncompressed (including the actual logo, file:wiki.png). I believe that a bot that compresses images would help save bandwidth and reduce page download time. While some people may argue that the savings would be nominal, they would indeed help. Reduced bandwidth would save the Wikimedia Foundation money (remember, your donations pay for that bandwidth) and the reduced page load time would make people with slower connections happier.

On average, I have been able to compress some images by ~25%. Some more (5 KB for the Wikipedia text logo on www.wikipedia.org), some less (23 bytes for file:wiki.png). Compression can be accomplished in several ways. First, changing the color scale (such as from RGB to greyscale) can save kilobytes. Second is the file type, such as JPEG, PNG, and GIF; in some cases JPEG is better, while PNG is in others. Lastly, there is the actual compression through tools such as pngcrush, PNGGauntlet, and PNGOUTWin. The only downside is that, like all compression, it is extremely computationally expensive. Together, these can compress an image by a quarter or more.

I suggest that the bot begin with the standard MediaWiki images, followed by the top 1000 most viewed images. After that, it would simply work in order of "most bandwidth used" images. While I don't have the actual image download statistics (if someone could put them up, it would be nice), there can be savings. As I haven't programmed in years, I don't think I can write an adequate bot, but I can help. If you support this idea or would like to comment, please post below. Smallman12q ( talk) 18:06, 22 February 2009 (UTC)

Edit 1: I would like to clarify that what I have in mind is lossless compression. There is also a discussion at Wikipedia:Village_pump_(technical)#Smaller Wikipedia Logo files Smallman12q ( talk) 22:22, 23 February 2009 (UTC)

Here is a site that offers online optimization: http://tools.dynamicdrive.com/imageoptimizer/ 72.90.135.45 ( talk) 18:54, 22 February 2009 (UTC)

Wow thanks, didn't know they had an online image "optimizer." That link is very useful! Smallman12q ( talk) 19:39, 22 February 2009 (UTC)

This won't do much good, if any. I think the minimal benefit of compressing the originals would be lost in the thumbnailing process. Plus you seem to ignore the possibility that anyone would want to download the uncompressed originals. The images that we upload are usually not the images that are displayed in articles. For example, the Image:Felix Pedro.jpg I uploaded was 483×620px, 79,054 bytes:

http://upload.wikimedia.org/wikipedia/commons/1/1e/Felix_Pedro.jpg

You might argue that this is poorly compressed, with a 3.788 pixel–byte ratio, but most readers won't see this. The thumbnail you see on this page is rendered at 100×128px and uses 3,575 bytes:

http://upload.wikimedia.org/wikipedia/commons/thumb/1/1e/Felix_Pedro.jpg/100px-Felix_Pedro.jpg

Here the compression ratio is actually lower at 3.580. So even if we compressed the hell out of the full-size image (punishing anyone who wanted to print the original photo), the server would likely still generate thumbnails at the same file size as before (probably because it's intended to be fast, rather than efficient—your thumbnails have to be ready instantly when you hit the preview button to ask yourself "how does it look at this size") but probably be of measurably poorer quality. What would be the point of that?

If image loading times are a concern it would be better to use more aggressive compression (different software, or different settings within the same software) for the thumbnailing process rather than adulterating the originals, which shouldn't need to be compressed anyway. — CharlotteWebb 20:16, 22 February 2009 (UTC)
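For reference, the pixel-per-byte figures quoted above are just width × height divided by file size:

```python
def px_per_byte(width, height, file_bytes):
    """Pixels encoded per byte of file — higher means denser compression."""
    return width * height / file_bytes

# Figures from the Felix Pedro example above
print(f"{px_per_byte(483, 620, 79054):.3f}")  # original: 3.788
print(f"{px_per_byte(100, 128, 3575):.3f}")   # thumbnail: 3.580
```

Which reproduces both ratios in the comment: the thumbnail is already packed about as tightly as the original.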

Charlotte is right: the images people actually see when looking at articles are generated by ImageMagick (the image processing software MediaWiki uses) with predefined compression settings. The software does provide access to the original image by clicking on the image on its File: page. This original should be left unadulterated. Chillum 20:18, 22 February 2009 (UTC)

Declined Not a good task for a bot. There is nothing wrong with recompressing a PNG, and nothing in particular wrong with changing to palettized or greyscale if it results in no change to the image (note that some programs don't "like" palettized images with an alpha channel), although as noted above it would not do a whole lot to reduce the bandwidth used in articles. But converting to greyscale when the image uses non-grey colors would be a bad idea, as would reducing the number of colors used while palettizing. Recompressing JPEGs (or converting PNG to or from JPEG) would be a bad idea to do automatically (and not a very good idea in general unless you know what you're doing), as JPEG normally uses lossy compression. Anomie 21:57, 22 February 2009 (UTC)

Yes, I doubt a bot could reliably determine whether or not the colors in an image are "close enough" to grey that the viewer wouldn't notice a difference (especially if compression artifacts—often phantom shapes of false color—are present). The thumbnails will render at more or less the same file size regardless of whether the originals are compressed/corrupted. Forget the baby and bathwater, this would be about like pouring half the vodka down the sink (storing it in a smaller bottle), then adding water because you are still serving it in the same size glass. — CharlotteWebb 03:16, 23 February 2009 (UTC)

I believe I forgot to mention that the compression would be lossless. A bot wouldn't need to recognize whether the colors "were close enough", only the number of colors present. For example, for about a year, the file:wiki.png file was uploaded as RGB rather than greyscale. And the actual "compression" would only be for PNGs, so it would be lossless. Please assume that the compression is lossless. Also, file conversion such as PNG to GIF could save additional bytes without any quality loss. Please let me know what you think of lossless compression. Smallman12q ( talk) 22:21, 23 February 2009 (UTC)

This isn't a good idea for a bot. WP:PERFORMANCE should be added to the above. Also, PNG to GIF is a bad idea...that's why we have WP:PIFU. §hep Talk 22:24, 23 February 2009 (UTC)
WP:PIFU is wrong regarding file types (especially GIFs). Small GIFs can be significantly smaller than PNGs of similar size and quality. And while SVGs can scale better than PNGs, PNGs are smaller. As for WP:PERFORMANCE, pictures do cost a lot of bandwidth, so minor improvements are multiplied. Please see several examples at Wikipedia:Village_pump_(technical)#Smaller Wikipedia Logo files. You will notice that with the appropriate compression, a number of pictures (including the Wikipedia logo) can be compressed without sacrificing quality. This in turn will reduce bandwidth usage (which your donations pay for). Smallman12q ( talk) 21:23, 24 February 2009 (UTC)
The savings are usually trivial. For example, if the numbers on the VP are correct, the reduced size of the logo will save the Foundation roughly 3 cents a year in bandwidth costs. -- Carnildo ( talk) 01:19, 25 February 2009 (UTC)
Comment I would like to say that I don't know what the actual statistics are. Perhaps if someone could provide a link or request them from the wikimedia foundation, then that can be argued. But no statistics(and hence no empirical evidence) means that we just believe what we want to believe (and that is very subjective). Smallman12q ( talk) 23:26, 25 February 2009 (UTC)
Even if we uploaded smaller lossless copies and deleted the originals, deleted images are not actually deleted; they are still there for an admin to view or undelete. This is intended, as we always want the original version because many of the licenses we use require it. No space would be saved. Chillum 01:28, 25 February 2009 (UTC)
Comment I don't believe I said the originals would be deleted. I'm not here to save space. (The entire Wikipedia is still less than 1TB, so there isn't really much space to save.) Smallman12q ( talk) 23:26, 25 February 2009 (UTC)
Also it would not save bandwidth, as all images shown on pages are created by MediaWiki using ImageMagick; the originals aren't sent unless you go and download them. If there were a need to save bandwidth, the compression settings could be changed there. Also, since when are PNGs smaller than SVGs? I suppose if it were a very complex drawing it could be, but SVGs are normally pretty small. Chillum 01:31, 25 February 2009 (UTC)
Comment, then perhaps the MediaWiki software should be modified. Small images stored as optimized PNG are generally smaller than SVG. This can be seen at the WP:V thread. Smallman12q ( talk) 23:26, 25 February 2009 (UTC)

←Not sure what Smallman is trying to say, but maybe he meant that the SVG→PNG thumbnails created by ImageMagick (or rsvg or whatever) have a larger file size than a visually similar PNG that was created manually. I can believe that, but that doesn't mean we should scrap the image-conversion software and leave a small man inside the server in charge of creating thumbnails. It… wouldn't scale.

Comment small man...clever ^.^. I don't mean manually created PNGs, I mean automated ones. It's not hard: create a PNG from an SVG and then compress the PNG. It will (generally) be smaller than the SVG. (I'm referring to small PNGs; the larger the PNG, the less likely it is to be smaller than its SVG counterpart, not to mention the quality degradation.)

Seriously, something that actually would save bandwidth would be to tell the server to embed SVGs directly when the file size is smaller than that of the thumbnail that would otherwise be shown for the selected dimensions. But I suspect the outcry against this would be horrific. — CharlotteWebb 02:52, 25 February 2009 (UTC)

That's basically what I want...convert the SVGs to PNGs when the file size is smaller...why there would be an outcry, I don't understand. Smallman12q ( talk) 23:26, 25 February 2009 (UTC)
SVGs are text based and much easier to correct if there are errors; you can also translate them for sister projects without any problems. §hep Talk 23:43, 25 February 2009 (UTC)
No, what I mean is this is already being done in all cases, even when the PNG is much larger than the SVG. It would be difficult to get this changed, because certain users/browsers cannot or do not want to directly view the SVG files. — CharlotteWebb 12:33, 28 February 2009 (UTC)
Perhaps there was such a bot before? I found User:Pngbot on File:Pinguim Crystal 2000.png. Does anyone remember a PNG-optimizing bot? Smallman12q ( talk) 02:11, 2 March 2009 (UTC)
That bot wasn't approved here, it was approved on Commons over 2 years ago. And is no longer active there. §hep Talk 17:06, 2 March 2009 (UTC)
Any reason why it was shut down? It seemed to have helped a bit. And if it was approved on Commons, why won't it be approved here? Smallman12q ( talk) 20:15, 2 March 2009 (UTC)
The BRFA process here is a lot less quiet, and I see theirs as less strict; 2 years ago it wasn't a real force like BRFA here. The bottom line is multiple bot ops have said it's a bad idea for a bot to do; it probably wouldn't get approved by BAG even if someone said they'd take on the task. There are some things that humans have to monitor. As a side thought, since all versions of an image are stored on the servers (even if deleted), wouldn't adding another image to that list just increase the amount of content we have to hold? §hep Talk 20:19, 2 March 2009 (UTC)
Well, the idea is to reduce bandwidth and page loading time; storage these days is very, very low cost. Now what I don't understand is why it would need to be monitored by a person (for lossless compression). What is there to monitor? Smallman12q ( talk) 02:06, 3 March 2009 (UTC)

As noted on the VP, even your lossless changes weren't lossless. §hep Talk 04:21, 3 March 2009 (UTC)

That was a mistake on my behalf when I tried to optimize the images manually by reducing the color palette size.

Above is a simple example of lossless PNG compression that can be recreated with a simple 10-trial run on PNGOUTWin or PNGGauntlet. Smallman12q ( talk) 12:45, 3 March 2009 (UTC)

Here is an excellent example in which compression could save some notable bandwidth. This image appeared on March 3, 2009 on the front page at [2]

The front page gets an average of 5 million views a day. Every 175 thousand times the image is viewed, 1GB would be saved. If the image is viewed 1 million times, then 5.5GB bandwidth would be saved. Smallman12q ( talk) 00:57, 4 March 2009 (UTC)
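Working backwards from those figures (and assuming the full-size image, rather than a thumbnail, is actually being served — the caveat raised in the reply below), the implied saving comes out to roughly 6 KiB per view:

```python
GIB = 2 ** 30

# From the post: 1 GB saved per 175,000 views of the compressed image
per_view = GIB / 175_000              # bytes saved per single view
million = 1_000_000 * per_view / GIB  # GiB saved per million views

print(f"{per_view / 1024:.1f} KiB saved per view")
print(f"{million:.2f} GiB saved per million views")
```

That works out to about 5.7 GiB per million views, in the same ballpark as the 5.5 GB quoted.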

As has been stated several times before in this discussion, usually the original images are not downloaded when the front page is viewed, only the thumbnail is. You pointed out a valid case above in which this is not true. According to this active discussion at Commons, the thumbnailer for GIFs was disabled some time ago for server performance reasons. Apparently they cannot be converted to PNGs using an automatic process because animated GIFs cannot be distinguished from single-frame GIFs in software. Wronkiew ( talk) 01:45, 4 March 2009 (UTC)
Yes...it's best to give an example, as I've done. I would like to point out that the logic behind distinguishing single frames and multiple frames is flawed: software can distinguish between single-frame and multi-frame GIFs. Perhaps a software upgrade is in order? Smallman12q ( talk) 13:17, 4 March 2009 (UTC)
I'd just like to point out the overhead from HTTP headers is ½ KB. Your efforts would be far better spent improving the crappy thumbnailer. — Dispenser 15:18, 4 March 2009 (UTC)
Since the thumbnailer creates images from the actual images, both need work (but you are right: the thumbnailer used here is inefficient, and doesn't properly compress the PNGs when converting from SVGs). Smallman12q ( talk) 22:38, 4 March 2009 (UTC)

Redirect tagger

Since Article alerts has launched, its scope now includes workflows such as WP:RfD. However, redirects are very rarely tagged, which makes this feature less useful than it could be. So how about having a bot browse articles, check the "what links here", then tag the redirects with the same banners as the target article?

For example, quark has one redirect, quarks. The redirect tagger would copy the banners from talk:quark, and assess talk:quarks as redirect-class / NA-importance. It could run on a per-project basis, or continuously, whichever makes more sense to the BAG. I know WP:PHYS would be interested, and I'm sure other projects will show interest as well. Headbomb { ταλκ κοντριβς –  WP Physics} 03:29, 26 February 2009 (UTC)

WikiProject tagging of redirects is apparently controversial, see some of the discussion at Wikipedia:Bots/Requests for approval/MelonBot 11 for example. Anomie 03:47, 26 February 2009 (UTC)
I've place a notice on that BRFA to make sure bots aren't undoing or preventing each other's work. Headbomb { ταλκ κοντριβς –  WP Physics} 03:58, 26 February 2009 (UTC)
On a per-project-basis...I'm pretty sure almost any of the auto-assessor bots categorized above could do this. Most projects just don't use redirect class and some even discourage the tagging of redirects. §hep Talk 03:49, 26 February 2009 (UTC)
On a per-project basis then. Headbomb { ταλκ κοντριβς –  WP Physics} 03:58, 26 February 2009 (UTC)
The tool should be changed so it automatically checks all the redirects (this should be easy enough to do); tagging them is unproductive. — Dispenser 06:58, 4 March 2009 (UTC)
What tool? ArticleAlertbot? It depends on categories (populated by banner tagging) and runs on a daily basis. Making it check all redirects for a project like WP:BIO is just not feasible nor desirable, and would introduce an unnecessary strain on the servers (and considerably slow down the bot). Headbomb { ταλκ κοντριβς –  WP Physics} 07:42, 4 March 2009 (UTC)
Yet this is exactly what you're asking for; it takes a considerable amount of resources for an edit. This is to say nothing of MediaWiki maintaining the category, compared to a simple JOIN statement done on the toolserver or with the API. I would seriously consider putting the redirect class up for deletion, just to prevent your line of thought. — Dispenser 14:08, 4 March 2009 (UTC)
The difference is AABot would do it everyday with your proposal, vs. once in a blue moon with the redirect tagger, only when a project feels like it's worth doing. Headbomb { ταλκ κοντριβς –  WP Physics} 20:40, 4 March 2009 (UTC)
Also, I'm not asking for a redirect fixer, I'm asking for a redirect tagger, so that ArticleAlertbot can pick them up when they enter workflows such as WP:RfD. Headbomb { ταλκ κοντριβς –  WP Physics} 01:50, 5 March 2009 (UTC)
I don't think you want to try to prove your point by nomming all of these for deletion. I suggest that Headbomb just ask any WikiProject tagger to do the run for him; but since it's on a per-project basis, almost every project that wants their redirects tagged already has them tagged. §hep Talk 22:35, 4 March 2009 (UTC)
WikiProject taggers can't do it, because for taggers to work, the pages need to be put in categories. The categories you've just pointed to are those that are already tagged, so it's pointless to do anything with them. What I want is a bot that places redirects in categories according to what WikiProject their targets are part of. Headbomb { ταλκ κοντριβς –  WP Physics} 01:57, 5 March 2009 (UTC)
They most certainly can. They would work off of Category:X articles by quality or similar. Load all of the articles in the cat, get the redirects of each article (What Links Here or similar), and tag their talk pages with {{ WikiProject X|class=Redirect|importance=NA}}. It's that simple. §hep Talk 02:12, 5 March 2009 (UTC)
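The per-article step §hep describes could look something like the sketch below (redirect_banners is a hypothetical helper; the banner and parameter names follow the {{WikiProject X|class=Redirect|importance=NA}} example above):

```python
import re

def redirect_banners(target_talk_wikitext):
    """Build banners for a redirect's talk page by copying the WikiProject
    banners found on the target article's talk page, with class/importance
    reset for a redirect."""
    banners = re.findall(r'\{\{\s*WikiProject [^{}|]+', target_talk_wikitext)
    return '\n'.join(b.strip() + '|class=Redirect|importance=NA}}'
                     for b in banners)

print(redirect_banners('{{WikiProject Physics|class=B|importance=Top}}'))
# {{WikiProject Physics|class=Redirect|importance=NA}}
```

The surrounding loop (load the category, follow What Links Here, edit each redirect's talk page) is the part an approved framework would supply.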
I wasn't aware that existing taggers had that feature in them. Which of them can do it? Headbomb { ταλκ κοντριβς –  WP Physics} 02:17, 5 March 2009 (UTC)

(←)Not sure, I'm sure some do though. I'd do it for you, but AWB has been on the fritz for me recently. If you have AWB it's a simple matter of 3 steps or so to get a complete list of all redirects for a project. §hep Talk 02:20, 5 March 2009 (UTC)


You seem to have missed the point. The cost of the categories is nearly equal to the JOINs. It would take about 2 years using the JOIN method every day for it to equal the cost of the same tagging run. In addition, page regeneration will happen as the template updates, continuing to increase the cost. So the tagging method has higher initial and running costs. — Dispenser 23:26, 4 March 2009 (UTC)
JOIN? Headbomb { ταλκ κοντριβς –  WP Physics} 01:51, 5 March 2009 (UTC)
Bots tag talk pages every day. If a project wants their redirects categorized we're not ones to stop them. §hep Talk 01:53, 5 March 2009 (UTC)
Many bots also do a poor job of what they're supposed to do. Redirect tagging falls under "just because you can do something doesn't mean that you should". — Dispenser 05:33, 5 March 2009 (UTC)
JOIN is the command to (temporarily) merge two database tables. This is done on an almost constant basis with page titles in MediaWiki. — Dispenser 05:33, 5 March 2009 (UTC)

And of course, there is the small matter of updating every redirect talk page whenever a banner is added to or removed from the target page. The whole point of redirects and templates is to avoid duplication, not perpetuate it. The MediaWiki architecture is specifically designed to be as quick and efficient as possible in outputting data, with corresponding sacrifices on inputting it. Almost never will an argument that "editing page X once is better than reading data Y times" prove genuinely valid. Happymelon 15:22, 5 March 2009 (UTC)

Need a bot to change a template link

Hello,

I need a bot to make a change to a template location on about 250 or so portal pages. The templates that these portals use were created in the wrong namespace, "Portal:". As part of some housecleaning, I moved the templates to the proper namespace and need a bot to update the links on all of the pages so that they avoid the redirect.

The templates are:

Thank you, -- Jeremy ( blah blah 08:34, 27 February 2009 (UTC)

If it's purely to avoid the redirect - and you don't need the name for some other purpose - then it falls under WP:R#NOTBROKEN. - Jarry1250 ( t, c) 08:54, 27 February 2009 (UTC)

Actually I was more concerned with the problem covered in the next section: Aliases for templates can cause confusion and make migrations of template calls more complicated. For example, assume calls to T1 are to be changed ("migrated") to some new template TN1. To catch all calls, articles must be searched for {{T1}} and all aliases of T1 (T2 in this case). -- Jeremy ( blah blah 09:11, 27 February 2009 (UTC)

There is some sense to the idea of removing the cross-namespace redirects that have been created here. [[Sam Korn]] (smoddy) 18:16, 27 February 2009 (UTC)
I still would like to have this done, per the reasons I and Sam Korn have put forth. Thanks again, -- Jeremy ( blah blah 08:20, 1 March 2009 (UTC)
BRFA filed at Wikipedia:Bots/Requests for approval/Erik9bot 4. Erik9 ( talk) 23:35, 1 March 2009 (UTC)
 Done [3]. Erik9 ( talk) 01:18, 2 March 2009 (UTC)

Updating bot

A lot of the To-Do templates for the task forces of WP:MILHIST have blue links in the requested articles section, meaning that those articles are no longer requested. Could a bot go through and remove these as the articles are created, perhaps run every 24 hours? Just a question; I have a little bot programming experience, but not enough for it to help. TARTARUS talk 01:23, 28 February 2009 (UTC)

Coding... Lego Kontribs TalkM 01:36, 28 February 2009 (UTC)
This is such a great help, thank you very much! TARTARUS talk 01:39, 28 February 2009 (UTC)
BRFA filed WP:Bots/Requests for approval/Legobot 11 Lego Kontribs TalkM 01:32, 5 March 2009 (UTC)

Deleted template removal

Resolved

Template:Infobox movie certificates was recently deleted at WP:TFD, but the template link is still present in many articles - too many to be easily removed individually. [4] I tried to remove them myself using AutoWikiBrowser, but because of the parameter within the template, it could not be done using the program, and therefore a bot would do much good here. – Dream out loud ( talk) 18:39, 1 March 2009 (UTC)

Why couldn't it be done with AWB? AWB does regex, so "\{\{Template:Infobox movie certificates[^}]*\}\}" with the ignore case box checked should work just as well as any bot could. - Jarry1250 ( t, c) 18:57, 1 March 2009 (UTC)
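For anyone wanting to check a regex like Jarry1250's outside AWB, the same substitution can be prototyped in Python first. (I've made the Template: prefix optional here, since transclusions usually omit it — that's an addition beyond the pattern quoted above.)

```python
import re

# Strip {{Infobox movie certificates|...}} calls, with or without the
# Template: prefix, including any parameters and a trailing newline
pat = re.compile(
    r'\{\{\s*(?:Template:)?Infobox movie certificates[^{}]*\}\}\n?',
    re.IGNORECASE)

text = 'Intro text.\n{{Infobox movie certificates|usa=PG|uk=12}}\nMore text.'
print(pat.sub('', text))
# Intro text.
# More text.
```

Note the `[^{}]*` stops at nested templates, so any transclusion containing another template inside its parameters would need manual attention.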
 Done. I did it with AWB; as Jarry1250 said you can do things like this easy enough with it if you know how to construct the regex.-- Dycedarg ж 21:04, 1 March 2009 (UTC)
Ok, thanks guys. I don't know much about constructing regex through AWB, otherwise I would have done it myself. – Dream out loud ( talk) 21:12, 1 March 2009 (UTC)

WikiProject Indiana

Hello! I was about to do something that would probably take me a week, and I thought: maybe a bot can help! I would like to have a bot look at the talk page of every article in Category:Indiana and all subcategories and make sure there is a {{WikiProject Indiana}} tag on the talk page. If there is not, I want it to add one without any parameters. This will put them all into the unassessed Indiana articles category. Then the project's members (probably all me) will be able to go through and assess them more quickly, without having to hunt for them first! I have a hunch that there are a couple hundred in there that are not tagged... maybe more. Is this something that can be done by a bot? Charles Edward ( Talk) 03:02, 2 March 2009 (UTC)

Not like that. You have to specify specific categories to check; going by subcats is too vague and always leads to chaos. Someone could generate a complete category tree for you to filter, but just going straight through the category isn't allowed anymore. §hep Talk 03:06, 2 March 2009 (UTC)
Ok. I will go the old fashioned way. Thanks for your help! Charles Edward ( Talk) 03:14, 2 March 2009 (UTC)
We can still do it if you give us a solid list of categories that we run through. Lego Kontribs TalkM 04:58, 2 March 2009 (UTC)

Spacing around <ref>s

IIRC the recommended style for spaces and punctuation around footnotes is:

word<ref> not word <ref> (no space before the <ref>) and
word,<ref> and word.<ref> not word<ref>, and word<ref>. (punctuation before the <ref>, not after). Many articles have the spacing wrong; would a bot be the right way to fix this? Shreevatsa ( talk) 22:54, 2 March 2009 (UTC)
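For what it's worth, the two rules reduce to a pair of substitutions — a sketch only, since (as the rest of this thread notes) any automated run would first have to confirm the article already predominantly uses the punctuation-first style:

```python
import re

text = 'Example claim <ref>source</ref>. Another<ref name="a"/>,'

# Rule 1: no whitespace before a footnote
text = re.sub(r'\s+(<ref)', r'\1', text)
# Rule 2: move trailing punctuation in front of the footnote
# (handles both <ref>...</ref> and self-closing <ref .../>)
text = re.sub(r'(<ref[^>/]*(?:/>|>.*?</ref>))([,.;:])', r'\2\1', text)

print(text)
# Example claim.<ref>source</ref> Another,<ref name="a"/>
```

Dashes and refs spanning multiple lines are deliberately left alone here; a production run would need to handle both.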
I think WP:AWB would probably be the best way to fix this. — Nn123645 ( talk) 01:52, 3 March 2009 (UTC)
You do not recall correctly; see WP:REFPUNC. Anomie 02:06, 3 March 2009 (UTC)
Thanks for the link, but I don't see how it's different from what I said. (No spaces before the footnotes, and punctuation—other than dashes—occurs before the footnote. I wasn't thinking about dashes; maybe that's what you meant? In which case thanks again for pointing it out.) Shreevatsa ( talk) 03:14, 3 March 2009 (UTC)
I was referring to this:

Some editors prefer the in-house style of journals such as Nature, which place references before punctuation. If an article has evolved using predominantly one style of ref tag placement, the whole article should conform to that style unless there is a consensus to change it.

It's unlikely that a bot could make that determination, and errors would result in excessive controversy. Personally, I like the "refs after punctuation" style, but... Anomie 03:25, 3 March 2009 (UTC)
Oh, thanks! I don't know how I missed that. :) Shreevatsa ( talk) 04:16, 3 March 2009 (UTC)
commonfixes.py gets around that requirement by only operating if refs-after-punctuation is the majority style (>50%). But the routine is very tricky to implement if you want to take newlines into consideration. I still haven't gotten it fixed up, and have about 20 diffs of issues sitting on my desktop. It's used in PDFbot and my tools, but I wouldn't recommend reusing it until I get the newline issues resolved. — Dispenser 04:32, 4 March 2009 (UTC)

Broken ESPN links

A recent redesign of http://espn.go.com has broken many of the links. The only information I have seen from ESPN itself is an unhelpful message at broken links, for example http://espn.go.com/nba/news/1999/1012/110905.html. See Wikipedia:Help desk#Missing footnote links for a discussion. Special:Linksearch currently displays 1795 links to http://espn.go.com in this search. Manual experimentation on a limited number of cases shows that many links to http://espn.go.com still work but if they are broken then it works to insert "static." or "assets." before espn.go.com, for example http://static.espn.go.com/nba/news/1999/1012/110905.html or http://assets.espn.go.com/nba/news/1999/1012/110905.html. Could a bot go through the links to test them and if they are broken then test whether a replacement works? Both "static." and "assets." worked in the cases I tried but I don't know whether it will always work. PrimeHunter ( talk) 01:35, 3 March 2009 (UTC)
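A bot along these lines would mostly be URL bookkeeping; generating the fallback candidates is the easy part (hostnames taken from the examples above — the actual HTTP checking of each candidate is omitted from this sketch):

```python
from urllib.parse import urlsplit, urlunsplit

def candidate_urls(url, prefixes=('static.', 'assets.')):
    """Yield fallback URLs with each subdomain prefix added to the host."""
    parts = urlsplit(url)
    for prefix in prefixes:
        yield urlunsplit(parts._replace(netloc=prefix + parts.netloc))

for u in candidate_urls('http://espn.go.com/nba/news/1999/1012/110905.html'):
    print(u)
# http://static.espn.go.com/nba/news/1999/1012/110905.html
# http://assets.espn.go.com/nba/news/1999/1012/110905.html
```

The bot would fetch the original first, and only try the candidates (and rewrite the article) on a non-200 response.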

Coding... X clamation point 01:36, 3 March 2009 (UTC)
BRFA filed X clamation point
Thanks. I guess you meant http://static.espn.go.com and not http://static.go.com. PrimeHunter ( talk) 05:40, 3 March 2009 (UTC)

"Year" removal

It has been proposed at Wikipedia talk:WikiProject Years#"Year" that the word "Year", apparently added to all year articles at the outset, be eliminated, as specified in Wikipedia:WikiProject Years#Intro Section. As far as I can tell, "Year" was never in a proposed template in that project. Project approval is expected, but has not yet reached consensus.

The detailed proposal would be: for each Year article, replace, at most once, at the beginning of a line,

Year '''(article name)'''
Year ''(article name)''

or

Year (article name)

by

'''(article name)'''

As this will hit approximately 3700 articles (I manually changed 1921–1923 and 1963–2059, using WP:AWB and other test edits), I wanted to give the bot programmer a chance to code it efficiently. This is a run-once task, so it may not be necessary to code it efficiently. — Arthur Rubin (talk) 03:07, 3 March 2009 (UTC)
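The replacement described above amounts to one multiline regex per article; a sketch (strip_year_prefix is a hypothetical helper name):

```python
import re

def strip_year_prefix(wikitext, title):
    """Replace a leading "Year '''title'''" / "Year ''title''" / "Year title"
    at the start of a line with '''title''', at most once."""
    pat = re.compile(r"^Year (?:'''|'')?%s(?:'''|'')?" % re.escape(title),
                     re.M)
    return pat.sub("'''%s'''" % title, wikitext, count=1)

print(strip_year_prefix("Year '''1921''' was a common year.", "1921"))
# '''1921''' was a common year.
```

Note all three forms come out bolded, matching the target format in the proposal.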

If you do get consensus and a clear set of rules for the bot to follow, I can do this with Sambot. [[Sam Korn]] (smoddy) 17:42, 5 March 2009 (UTC)
OK, it's still being discussed, and morphing into a general discussion of reformatting the opening line. Still, if a consensus develops, I'll repost. This request will probably be archived before a consensus is reached, anyway. — Arthur Rubin (talk) 22:45, 6 March 2009 (UTC)

Photo required bot

Just as an experiment, I'm wondering if anyone would be kind enough to devote some time to putting together a bot to do the following: create a useful page indicating which parts of the county need photos. Something like:

  • Parse all articles in category:Northumberland and child categories
  • For each article which uses {{ coord}} (i.e. has a geo-coordinate), evaluate whether there is a file: or image: tag on the page (i.e. is there a photo)
  • If no file or image, list the article name and the coordinate on a single results page (optionally grouped by category) such that {{ GeoGroupTemplate}} can be used to visualise all of the locations on a map.
  • Note that a couple of pieces of coordinate mangling might be required:
    • in the output page, coord should be display=inline ... in the source page it will probably be display=title
    • an additional parameter |name= should be added, with article_name being used as the argument
    • if the coordinate in the article is in an infobox and not in a regular {{ coord}}, then you're on your own...
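The test in the second and third bullets is cheap to express as a wikitext heuristic (a sketch; needs_photo is a hypothetical name, and — as the last bullet warns — infobox-supplied coordinates or images would slip past it):

```python
import re

def needs_photo(wikitext):
    """True if the page has a {{coord}} but no [[File:...]]/[[Image:...]]."""
    has_coord = re.search(r'\{\{\s*coord\s*[|}]', wikitext, re.I)
    has_image = re.search(r'\[\[\s*(?:File|Image)\s*:', wikitext, re.I)
    return bool(has_coord) and not has_image

print(needs_photo("{{coord|55|2|N|2|6|W|display=title}}"))             # True
print(needs_photo("{{coord|55|2|N|2|6|W}} [[File:Castle.jpg|thumb]]")) # False
```

Articles passing this test would then be listed with their coordinates (switched to display=inline, plus a |name= parameter) on the results page for {{ GeoGroupTemplate}}.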

By way of explanation, I /think/ that Northumberland articles are fairly well geo-coded, and so looking for coord gives us all places & things capable of being photographed. That said, a variant of the same thing which simply looks for Northumberland articles without images might be just as interesting.

As is the way of these things, a) were such a thing on the toolserver or b) capable of being run as a bot for any project, it might be a useful thing. Right now I'm interested to see if it yields useful results for me in my neck of the woods. thanks -- Tagishsimon (talk) 03:42, 3 March 2009 (UTC)

If you are interested in lakes, you can browse Category:Wikipedia infobox lake articles without image with Google maps, e.g. [5]. -- User:Docu
That's just exactly what I'm looking for. (Only for Northumberland.) How can production of maps like these for, say, every country, state & county, be a bad thing? -- Tagishsimon (talk) 00:36, 6 March 2009 (UTC)
It also works with Category:Wikipedia requested photographs in Northumberland, but that requires the articles to be tagged first.
Another solution would be http://toolserver.org/~magnus/fist.php which can list all articles w/o images. One could match that with a list of coordinates from Northumberland, e.g. [6]. -- User:Docu

Need a Bot to Help School

My school needs a new field really badly, and Kellogg's is having a contest where you need the most supporters. Every time you click and put in a code that is shown, it counts as a supporter. I really need a bot that clicks on the button and then puts in the code. I don't know, but maybe it might require a password searcher? Thank you for your time. —Preceding unsigned comment added by 76.126.18.79 ( talk) 19:36, 5 March 2009 (UTC)

This is the place to request bots to improve Wikipedia - that request is both irrelevant and illegal. We cannot help you. Dendodge Talk Contribs 19:51, 5 March 2009 (UTC)

this:

CURRENT
{{#if: {{{image|<noinclude>-</noinclude>}}} |
{{!}} style="text-align: center;" colspan="2" 
{{!}} {{{image}}} {{#if: {{{caption|<noinclude>-</noinclude>}}} 
| <br/><span style="font-size: 95%; line-height:1.5em;">{{{caption}}}</span> }}

with this:

NEW
{{#if:{{{image|}}}| 
{{!}} style="font-size: 95%; line-height:1.5em; text-align: center;" colspan="2" 
{{!}} [[File:{{{image}}}|{{#if:{{{image_size|}}}|<!--then:-->{{px|{{{image_size}}} }}
|<!--else:-->220px}}|]] {{#if:{{{caption|}}}|<br />{{{caption}}}}}


however, each article (>500) using this template will need to have the wiki markup stripped from that field.

 
from this: 

| image     = [[file:example.jpg|220px]]

to this:

| image     = example.jpg
| imagesize = 220px

.....submitted for your approval. -- emerson7 23:00, 23 January 2009 (UTC)
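For what it's worth, the substitution the bot needs is regular enough to sketch. A minimal Python sketch (the function name, the exact spacing, and the use of a separate imagesize parameter follow the example above; the actual bot's implementation may differ):

```python
import re

# Matches e.g. "| image = [[File:Example.jpg|220px]]"; accepts File/file
# and Image/image prefixes, with an optional pixel size.
IMAGE_RE = re.compile(
    r"\|\s*image\s*=\s*\[\[(?:[Ff]ile|[Ii]mage):([^|\]]+?)\s*(?:\|(\d+px))?\]\]")

def simplify_image(wikitext: str) -> str:
    """Reduce a wikilinked image parameter to a bare filename, moving any
    pixel size into a separate imagesize parameter."""
    def repl(m):
        out = "| image     = " + m.group(1).strip()
        if m.group(2):
            out += "\n| imagesize = " + m.group(2)
        return out
    return IMAGE_RE.sub(repl, wikitext)
```

Articles without a size in the link simply get the bare filename and no imagesize line.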

Yes, I'm going to have a go at it, but it's quite possible someone's already got a bot approved for this sort of thing, and I will be happy to let them take over. Is it also possible that articles are using the Image: prefix rather than File:? Coding... - Jarry1250 ( t, c) 09:48, 24 January 2009 (UTC)
 Done - any problems? - Jarry1250 ( t, c) 18:54, 25 January 2009 (UTC)

Pride events

Is there a bot that can place move requests on every article in a template? All 60+ articles in Template:Pride Events need to be moved (through one discussion page, of course), so is there a bot that can place the same move template on them all, or will I have to do it manually? TJ Spyke 07:24, 25 January 2009 (UTC)

This would be an easy task to do with AWB. §hepTalk 20:02, 25 January 2009 (UTC)

Spelling change: Kveta Peschke to Květa Peschke

Can a bot change all the links from Kveta Peschke to Květa Peschke please, as this is her correct spelling? There might be hundreds more requests similar to this one for Czech names, so maybe I will have to make a bot myself. -- Voletyvole ( talk) 11:28, 25 January 2009 (UTC)

I think (though I've been over-ruled many a time) that it might fall foul of WP:R#NOTBROKEN, but I'm sure a more experienced bot writer would give you a better answer. - Jarry1250 ( t, c) 11:41, 25 January 2009 (UTC)
Declined Not a good task for a bot.: Jarry is right, it's usually not a good idea to make hundreds of edits to bypass a redirect, as it has little benefit. As long as it's an accepted alternate spelling updating the links doesn't really seem necessary here. Richard 0612 12:00, 25 January 2009 (UTC)
In addition, Kveta Peschke is by far the most common spelling in English sources so the article should probably be moved to that per Wikipedia:Naming conventions (use English). PrimeHunter ( talk) 12:29, 25 January 2009 (UTC)
OK. Thanks for the info. I think the message here is "it doesn't matter so much about spelling. You can do more worthy things with your time on WP". I will do more worthy things ;) -- Voletyvole ( talk) 16:01, 25 January 2009 (UTC)
I have added a rule to RegExTypoFix to make the change from "Kveta Peschke" to "Květa Peschke" here. We already have several similar rules to correct foreign names, so I don't see any reason not to add this one to the list. -- ThaddeusB ( talk) 16:39, 25 January 2009 (UTC)
Will you be fed up if I add a humungous bunch of Czech, Slovak and Polish (and other Slavic) names to this AWB Typos list? I have made an offline list featuring a shedload of them. I beg your pardon if this isn't quite the right place to ask (n00b errors, ya know dudes, what can y'all do?) Cheers 4 the links though. -- Voletyvole ( talk) 21:43, 25 January 2009 (UTC)
I personally wouldn't mind, but there are those who feel the list shouldn't be clogged with "worthless entries." As long as the error occurs at least 5-10 times on Wikipedia, it's probably safe to add. -- ThaddeusB ( talk) 02:28, 26 January 2009 (UTC)
This would be better than the first suggestion as (a) it could fix several misspellings per edit and (b) it would also fix ones which are not part of a link. — CharlotteWebb 02:52, 26 January 2009 (UTC)
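The substitution such a rule performs can be sketched in Python. This is my own illustration of a whole-word, case-sensitive replacement, not the literal RegExTypoFix entry (those live in their own rule format on the AWB typo list):

```python
import re

# One typo-fixing rule of the kind described above: match the misspelling
# as a whole word and substitute the correct diacritic form.
RULE = (re.compile(r"\bKveta Peschke\b"), "Květa Peschke")

def apply_rule(text: str) -> str:
    """Apply the single replacement rule to a chunk of article text."""
    pattern, replacement = RULE
    return pattern.sub(replacement, text)
```

Because the pattern is the misspelled form only, text already using "Květa Peschke" passes through unchanged, which is what makes such rules safe to run broadly.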

Need a bot to update a table

I need a bot to update this table on a monthly basis. It's not too hard to do by hand, but then it's even easier for a bot. :) The required information for the quality/importance columns can just be copied directly from here. The AfD columns require that the bot parse the archive of this page for the previous month and count the number of bullets in the lists (the "*" character), as well as the number of instances of the words "Delete", "Redirected", "Keep" and "Merged". Thanks! SharkD ( talk) 05:19, 24 January 2009 (UTC)

Unless someone has an already existing bot that can handle this, I will be happy to incorporate this request into WikiStatsBOT. -- ThaddeusB ( talk) 06:25, 24 January 2009 (UTC)
Thanks very much! The structure of the table has not been finalized: a fourth set of statistics may appear shortly. However, none of the existing columns will be removed; so if you want to start figuring out how to create the bot, go ahead. SharkD ( talk) 11:28, 25 January 2009 (UTC)
OK, I added the fourth set of statistics. The stats can be gotten from here. You just need to enter the correct page link and date and then copy the total. SharkD ( talk) 11:58, 25 January 2009 (UTC)
I removed them again since they're duplicates of information that appears in another page already. SharkD ( talk) 20:00, 26 January 2009 (UTC)
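The counting half of the request above is simple string work. A hedged sketch (it naively counts bullets and outcome words anywhere in the archive text, as the request describes, rather than parsing each discussion's closing statement):

```python
import re

OUTCOME_WORDS = ("Delete", "Redirected", "Keep", "Merged")

def afd_stats(archive_wikitext: str) -> dict:
    """Count bulleted nominations and outcome words in one monthly
    AfD archive page."""
    stats = {"nominations": sum(
        1 for line in archive_wikitext.splitlines()
        if line.lstrip().startswith("*"))}
    for word in OUTCOME_WORDS:
        stats[word] = len(re.findall(r"\b%s\b" % word, archive_wikitext))
    return stats
```

A real bot would fetch the previous month's archive page, run this, and write the numbers into the target table's columns.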

Template:Swiss Presidents

Please update {{Swiss Presidents}} to {{Presidents of the Swiss Confederation}}: according to Special:WhatLinksHere/Template:Swiss_Presidents, about 80 articles on former presidents still use the old template name. This causes problems, like those articles still showing up at the disambiguation Special:WhatLinksHere/Eduard_Müller for some reason, instead of at the proper article Special:WhatLinksHere/Eduard_Müller_(Swiss_politician). The admin who made the move a year ago did not bother to fix this. --  Matthead   Discuß   22:25, 28 January 2009 (UTC)

Need another bot to update a different table

I need a bot to update the tables in this page. The information can be gotten using the article traffic statistics tool. There's also a backlog of about 8 months that would need to be filled on the bot's first run. If the bot could also update the graphs it would be doubleplusgood. Ideally I would like the graphs to be SVG images instead of PNG, but I'm not really sure how this would be accomplished. Thanks! SharkD ( talk) 20:02, 26 January 2009 (UTC)

Coding... Lego Kontribs TalkM 01:09, 29 January 2009 (UTC)

Put stub template on stub articles

Can anyone add {{ Taiwan-film-stub}} to Taiwanese film stubs, replacing {{ Taiwan-stub}} and {{ Film-stub}}? The list is here. Thanks! :)

Can anyone do the request I had above??? impact F = check this 00:32, 27 January 2009 (UTC)

You might want to also try requesting this over at Wikipedia:WikiProject Stub sorting, since I think that's one of the reasons why the project exists. SharkD ( talk) 01:49, 27 January 2009 (UTC)
I did. We had a consensus to create the stub and stub category. Here. Thanks for your concern. impact F = check this 01:59, 27 January 2009 (UTC)
This is probably a task better suited for AWB. You can request it be done by someone with AWB here. -- ThaddeusB ( talk) 05:01, 27 January 2009 (UTC)
Doing... with AWB. §hepTalk 22:03, 27 January 2009 (UTC)
Y Done §hepTalk 03:01, 29 January 2009 (UTC)
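The retagging itself is a small find-and-replace, which is why AWB suits it. A sketch of the rule, assuming both old tags must be present before combining them (whitespace handling simplified; the helper name is my own):

```python
import re

TAIWAN = re.compile(r"\{\{\s*Taiwan-stub\s*\}\}\n?", re.I)
FILM = re.compile(r"\{\{\s*Film-stub\s*\}\}\n?", re.I)

def retag(wikitext: str) -> str:
    """Swap the {{Taiwan-stub}} + {{Film-stub}} pair for the combined
    {{Taiwan-film-stub}}; articles carrying only one of the two tags
    are left alone."""
    if not (TAIWAN.search(wikitext) and FILM.search(wikitext)):
        return wikitext
    wikitext = FILM.sub("", TAIWAN.sub("", wikitext))
    return wikitext.rstrip() + "\n{{Taiwan-film-stub}}\n"
```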

WikiProject Triathlon banner tagging

Could we get the WikiProject Triathlon banner ( Template:WP Triathlon) on all articles in Category:Triathlon and Category:Duathlon and all subcategories of both (I have checked for exceptions, but there are none). A number of articles have already been tagged. Thanks. Yboy83 ( talk) 09:42, 28 January 2009 (UTC)

Possible: TinucherianBot ( talk · contribs · count) can do this for you -- Tinu Cherian - 11:52, 28 January 2009 (UTC)
Doing... : TinucherianBot ( talk · contribs) working on this -- Tinu Cherian - 19:09, 28 January 2009 (UTC)
Y Done : The bot task is completed. Tagged around 455 article/cat talk pages -- Tinu Cherian - 02:24, 29 January 2009 (UTC)
Many thanks. Regards, Yboy83 ( talk) 08:39, 29 January 2009 (UTC)

Uncategorized articles

The backlog at Wikipedia:WikiProject Categories/uncategorized is running low. The category is usually populated by Alaibot directly from database dumps but it seems that Alai is AWOL. I don't know if anyone has code available to perform a similar task, nor do I know if Alai ever published his code. In any case, I'd appreciate if someone can at least tag articles of Special:UncategorizedPages which are indeed uncategorized. Note that in fact most of the articles appearing on the special page are categorized because of cache issues which is why Alaibot was so useful. Pichpich ( talk) 21:37, 30 January 2009 (UTC)

PS: anybody know what's up with Alai? Pichpich ( talk) 21:37, 30 January 2009 (UTC)
I currently run through the special page with User:UnCatBot, though I suppose I could use the API to generate the data manually, which would alleviate the cache problem. — Nn123645 ( talk) 02:03, 31 January 2009 (UTC)

Prodbot

Not sure if there's any bot that already fulfills this function, but I monitor articles proposed for deletion from time to time and find it quite tiresome to add {{ oldprodfull}} to talkpages. What about having a bot search old revisions of articles for prod notices and updating talkpages with the appropriate details accordingly? Skomorokh 02:39, 31 January 2009 (UTC)

Betacommandbot

PLEASE reinstate betacommandbot! he was sooooooooooo cool! please reinstate him soon! why did you get rid of him??????? —Preceding unsigned comment added by 216.160.167.169 ( talk) 21:19, 31 January 2009 (UTC)

for these reasons, in short. - Jarry1250 ( t, c) 21:27, 31 January 2009 (UTC)

Delsort

Would someone have the time and inclination to create a one-step WP:DELSORT process? User:Hrafn suggested that a bot periodically pick up on delsort tags placed directly in an AfD and automagically transclude the AfD in question on to the target page if not already present. It would cut the work of deletion sorting roughly in half. Jclemens ( talk) 18:48, 29 January 2009 (UTC)

So the request is that the bot check active AFDs for links to any subpage of Wikipedia:WikiProject Deletion sorting, and add a transclusion to the page in question (just under the "<!-- New AFD's should be placed on top of the list, directly below this line -->") if it's not already present? Anomie 00:59, 30 January 2009 (UTC)
Exactly. It would allow delsort tagging without needing to depart from the AfD/t page. Jclemens ( talk) 01:32, 30 January 2009 (UTC)
The detection algorithm would likely be very similar to the one that WP:RFC uses to detect {{RFC[topic]}} templates in article talk pages (but simplified by the fact that it would only have to check the AfD subspace), but instead look for {{subst:delsort|[topic]}} (may need to be altered slightly and/or turned into a non-subst template). Hrafn Talk Stalk( P) 03:30, 30 January 2009 (UTC)
Coding... No need to unsubst the template, it just needs to look for a link to any WP:DELSORT subpages in any active AFD and compare that to the list of templates transcluded in those subpages. Anomie 03:42, 30 January 2009 (UTC)
I'm having a slight issue, the finding of the AFDs part is working fine but it seems every sorting page has a slightly different "insert new AFDs here" line (and some lack the line completely). Anomie 12:06, 30 January 2009 (UTC)
Sounds like a good idea, though I would strongly recommend Jayvdb's awesome script to anyone unaware of it - it already makes sorting a one step process. the wub "?!" 13:28, 1 February 2009 (UTC)
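The detection step Anomie describes amounts to scanning each active AfD for links to delsort subpages. A minimal sketch of that step only (the transclusion edit and the duplicate check against the target page are omitted):

```python
import re

# A delsort tag ultimately leaves behind a link to a subpage of
# Wikipedia:WikiProject Deletion sorting; finding those links in an
# AfD's wikitext is the detection half of the task.
DELSORT_LINK = re.compile(
    r"\[\[Wikipedia:WikiProject Deletion sorting/([^|\]]+)")

def delsort_targets(afd_wikitext: str) -> set:
    """Return the deletion-sorting topics an AfD links to, so a bot can
    transclude the AfD onto each listed page if not already present."""
    return {m.group(1).strip() for m in DELSORT_LINK.finditer(afd_wikitext)}
```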

IMDb links

Would it be possible to have a bot check through all articles using the |imdb_id= parameter in {{ Infobox Film}} to see which of these do not otherwise contain a link to IMDb, i.e. through the use of {{ imdb title}} (or any of its redirects), and present this data in the form of a numbered list? Such information would be useful in an ongoing debate over the use of such parameters in the infobox. Thanks in advance for any help! :) PC78 ( talk) 15:34, 2 February 2009 (UTC)

User pages in article space categories

Can a bot remove all user pages from article space categories and monitor any new additions? I am often removing such categories, since users will:

  • create draft articles in user space for eventual moving to article space; and
  • userfy articles and forget to remove or comment out the categories.

It seems a feasible task for a bot. Hmm. Had a think about it. How do you define a non-article category? Everything under Category:Wikipedia administration? All cats (with some exceptions) that have Wikipedia or template in the category name? -- Alan Liefting ( talk) - 02:56, 4 February 2009 (UTC)

Annoying as it may be when using a category, a user busy fettling a copy of an article in userspace will not thank you for coming along and stomping all over the categories in the copy. If you are going to do anything, at best comment out the cats; don't delete them. -- Tagishsimon (talk) 03:33, 4 February 2009 (UTC)
Yes. That is what I have been doing. I have not received any flak yet, even though I have deleted some on occasion. If it was done by a bot, I would no longer be in the cyber firing line. Some users may not be aware of their subpages, even though there is now the "subpages" link on the contributions page. -- Alan Liefting ( talk) - 05:07, 4 February 2009 (UTC)
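On the "feasible task" question: given a category's member list with namespaces, picking out the user-space entries is trivial; deciding which categories count as content categories is the hard part. A sketch of the easy half (namespace numbers per MediaWiki's standard numbering; the function name is my own):

```python
def userspace_members(category_members):
    """Given (title, namespace) pairs from one content category, return
    the user-space entries (namespace 2 = User, 3 = User talk) that a
    bot would comment out rather than delete, per the thread above."""
    return [title for title, ns in category_members if ns in (2, 3)]
```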

OrphanBot

Could someone please write a bot to patrol this page to tag orphaned articles as such? This would help WikiProject Orphanage in our work at de-orphaning articles. Thanks, ErikTheBikeMan ( talk) 21:47, 3 February 2009 (UTC)

Tell me that you are not going to put a visible tag onto an orphan article? The result of this will be thousands of articles so tagged for the next n years. Tagging on this scale is akin to vandalism. -- Tagishsimon (talk) 03:34, 4 February 2009 (UTC)
The hyperbole is uncalled for and completely incorrect. Tagging orphan articles as orphans is standard practice and has been done for years. It is a standard maintenance tag that assists the encyclopedia. -- JLaTondre ( talk) 03:51, 4 February 2009 (UTC)
That's your view. Mine is that such tags disfigure articles. There is little prospect that many articles will be un-orphaned. The corollary is that such tags will persist, to no good effect, for years to come. Being an orphan is not the most important thing about an article. Indeed it barely rates. Mass tagging of such articles would be a crime. -- Tagishsimon (talk) 03:54, 4 February 2009 (UTC)
And clearly everybody disagrees with you. BJ Talk 03:55, 4 February 2009 (UTC)
Possibly, though I always like a little evidence to back up a wild assertion like that. Take it to Wikipedia:Village pump (policy)#Policy on Article Tags and put the bot request on hold until it is settled. -- Tagishsimon (talk) 04:08, 4 February 2009 (UTC)
The very fact of the template's existence and continued use is very clear evidence. BJ Talk 04:19, 4 February 2009 (UTC)
As I said, take it to Wikipedia:Village pump (policy)#Policy on Article Tags. -- Tagishsimon (talk) 04:21, 4 February 2009 (UTC)
And oddly for your thesis, BJ, Template talk:Orphan sees a 5:5 split between those who think spamming articles with the orphan tag is a good idea, and those who do not. Clearly your "clearly everybody disagrees with you" angle was just invention. -- Tagishsimon (talk) 04:43, 4 February 2009 (UTC)

Excellent idea. Saves us the trouble of clicking "What link here" on a bunch of stubs. Over time these will be de-orphaned, and thus get more attention, thus improve. Tag away!

Replacing accessmonthday parameters in citeweb template

Could someone (or something) replace all the outdated accessmonthday fields in Ayumi Hamasaki with whatever the correct field(s) is/are? I r t3h n00b when it comes to this kind of thing, so sorry if this is in the wrong section or whatever. Thanks! Ink Runner ( talk) 18:30, 3 February 2009 (UTC)

It would be quite difficult for a bot to determine whether or not the url still contains the content we are intending to cite. In some cases it may be replaced by new content, with the old stuff being moved elsewhere (needing to be located again and perma-linked if possible). In other cases it may be deleted altogether but the host server does not provide a proper 404 notice which such a bot might depend on. Overall this would create too many false positives and defeat the purpose of tracking the most recent date on which the correct content was known to be accessible. — CharlotteWebb 19:07, 3 February 2009 (UTC)
Ah, okay. Thanks muchly! Ink Runner ( talk) 19:12, 3 February 2009 (UTC)
The only way this could work well is if the bot had a specific string of text to check for, such as the exact quotation being cited in the ref (of course everybody knows footnoted quotes are bad juju, so proceed with caution). — CharlotteWebb 20:16, 3 February 2009 (UTC)
I may have misunderstood Ink Runner ( talk · contribs)'s request (this was a topic of discussion over at Talk:Ayumi Hamasaki first); but I don't think we're looking to revalidate the reference. Rather, we just want to change from using accessmonthday (and accessyear) to using accessdate. For example, if a reference now has:
accessmonthday=December 2|accessyear=2008
We'd like it instead to have:
accessdate=2008-12-02
If I've got that correct, that seems like a task that would be amenable to botting. Ink Runner, did I read you correctly? TJRC ( talk) 21:41, 3 February 2009 (UTC)
Wikipedia:Bots/Requests for approval/MelonBot 12 might be of some interest. - Jarry1250 ( t, c) 21:44, 3 February 2009 (UTC)
Okay, that would actually be a good idea and easy to do too. Oops, I read "outdated" to mean "too many days ago" etc. — CharlotteWebb 02:38, 4 February 2009 (UTC)
Looks like MelonBot's on it. It's hit three articles on my watchlist since this discussion (including Days/Green, Ink Runner). I imagine it's just a matter of time until it gets to Ayumi Hamasaki. TJRC ( talk) 07:33, 4 February 2009 (UTC)
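The conversion TJRC describes is mechanical enough to sketch. Assumptions of this illustration: full English month names, four-digit years, and output zero-padded to ISO form; MelonBot's actual rules may differ:

```python
import re
from datetime import datetime

# Matches "accessmonthday=December 2|accessyear=2008" with flexible spacing.
DATE_RE = re.compile(
    r"accessmonthday\s*=\s*([A-Za-z]+ \d{1,2})\s*\|\s*accessyear\s*=\s*(\d{4})")

def merge_accessdate(citation: str) -> str:
    """Fold accessmonthday/accessyear into one ISO-format accessdate."""
    def repl(m):
        d = datetime.strptime(m.group(1) + " " + m.group(2), "%B %d %Y")
        return "accessdate=" + d.strftime("%Y-%m-%d")
    return DATE_RE.sub(repl, citation)
```

Going through a real date parse, rather than raw string surgery, rejects malformed dates instead of silently mangling them.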

Archiving with delay

Since we often discuss highly contentious issues in our project ( WP:SLR), we agreed to wait some time after marking topics as "resolved" before archiving them, so people get a chance to say: "No, this isn't resolved yet!" I asked Cobi, whose well-documented bots I would have loved to use, but they can't do that. Earlier, we had a purely time-triggered bot, but the problem with that was that it also archived sections that were just at a momentary standstill. — Sebastian 09:35, 31 January 2009 (UTC)

Something that I've seen used to delay archiving is to sign the section with a timestamp in the future. I'm not sure if that works for all archive bots, though. -- Carnildo ( talk) 09:54, 31 January 2009 (UTC)
Great idea! We could start from a timestamp like {{#time: Y-m-d h}}. [1] The two things that are missing are (1) How to add a day or two to that timestamp? (2) Which bot can look for a different day each time? — Sebastian 10:16, 31 January 2009 (UTC)
With ClueBot III you could set the archive time to something extremely old (ie 100 days) and when the discussion was done tag it with {{ resolved}}. Or, WP:GL/I uses DyceBot operated by Dycedarg. The bot tags a section as stale after 2 weeks of no activity, then archives them a week later. When a section is marked with {{ resolved|1=~~~~~}} it archives the section after the tag has been placed for three days. He might be able to do something similar for you. §hepTalk 03:45, 2 February 2009 (UTC)
Cool, it sounds like DyceBot does just what we need! I'll ask Dycedarg. Thanks a lot for the lead! — Sebastian 19:41, 5 February 2009 (UTC)
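The DyceBot-style behaviour described above (archive only once a {{resolved}} tag has aged a few days) can be sketched as a timestamp check. The template form and timestamp format below are assumptions based on standard signature output:

```python
import re
from datetime import datetime, timedelta

# Wikipedia signature timestamps look like "19:41, 5 February 2009 (UTC)".
STAMP_RE = re.compile(r"(\d{2}:\d{2}, \d{1,2} \w+ \d{4}) \(UTC\)")

def ready_to_archive(section: str, now: datetime, delay_days: int = 3) -> bool:
    """True once a {{resolved|1=~~~~~}} tag has sat in the section for
    delay_days, giving editors time to object before archiving."""
    m = re.search(r"\{\{\s*[Rr]esolved\s*\|1=([^}]*)\}\}", section)
    if not m:
        return False
    stamp = STAMP_RE.search(m.group(1))
    if not stamp:
        return False
    tagged = datetime.strptime(stamp.group(1), "%H:%M, %d %B %Y")
    return now - tagged >= timedelta(days=delay_days)
```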

Category tagging requests

Could someone tag all the following categories with their proper banners?

-- Jeremy ( Blah blah...) 20:29, 4 February 2009 (UTC)

You're quite sure that you want Category talk:Red Bull Air Race World Series pilots tagged with {{ WikiProject Soft Drinks}}? How about Category talk:Snooker and Category talk:Cocktails with WPBeer? The other two look alright, but the paragraph at the top of the page was added for a reason: tagging by category is hit-and-miss at the best of times, and just tagging the actual categories doesn't decrease the risk that much. Happymelon 21:04, 4 February 2009 (UTC)
  1. Soft drink advertising is a sub cat of soft drinks, so yeah.
  2. I took care of the Beer/Drinking establishments entanglements, they are sibling cats not parent/child cats.
  3. As far as I can tell, Category:Cocktails is not a child of category:Beer, there is an overlapping Category:Cocktails with beer but not a parent/child relationship.
I have looked at these categories down about three branches and can see some overlap, but no major entanglements other than the beer/bar one that I fixed. So, I believe that this is good to go. -- Jeremy ( Blah blah...) 21:38, 4 February 2009 (UTC)

Remove flag

Can someone have a bot find {{flagicon|Ireland|rugby}} and replace it with [[Ireland national rugby union team|Ireland]], as per Wikipedia_talk:WikiProject_Rugby_union#Wider_opinion_needed? Gnevin ( talk)

Also {{ru|IRE}} Gnevin ( talk) 23:35, 2 February 2009 (UTC)

Identifying virtually identical redirects with different targets

I found these two redirect pages redirecting to two DIFFERENT articles:

That should not happen and I fixed it. I've seen this situation maybe a couple of dozen times. In another case, I found these three redirecting to three DIFFERENT articles:

A bot cannot decide what pages things like this ought to redirect to, if any, but I would think a bot could be constructed to

  • find things like this;
  • make a list of them so that Wikipedians can go down the list and find those within their competence and fix them;
  • possibly call them to the attention of the appropriate WikiProjects based on the target articles' category tags. Michael Hardy ( talk) 18:39, 4 February 2009 (UTC)
  • Could you specify a little more precisely how "things like this" are to be defined? -- R'n'B ( call me Russ) 13:57, 6 February 2009 (UTC)
  • Differing only in choice of dash, internal spacing around the dash, capitalization, and a final "s"? Just a wild guess.... — Arthur Rubin (talk) 02:30, 7 February 2009 (UTC)
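Arthur Rubin's guess at the matching rule can be made concrete with a normalization key. A sketch (which variations to collapse is a judgment call; space-vs-hyphen differences, for instance, are deliberately not collapsed here):

```python
import re

def variant_key(title: str) -> str:
    """Collapse dash choice, spacing around dashes, capitalization, and a
    trailing 's' so that near-identical titles compare equal."""
    t = title.lower()
    t = re.sub(r"[\u2010-\u2015]", "-", t)  # any Unicode dash -> hyphen
    t = re.sub(r"\s*-\s*", "-", t)          # drop spacing around dashes
    t = re.sub(r"\s+", " ", t).strip()
    return t[:-1] if t.endswith("s") else t

def conflicting_groups(redirects: dict) -> list:
    """redirects maps title -> target page; return the keys of groups of
    near-identical redirects whose targets differ, for human review."""
    groups = {}
    for title, target in redirects.items():
        groups.setdefault(variant_key(title), set()).add(target)
    return sorted(k for k, targets in groups.items() if len(targets) > 1)
```

As the thread says, a bot can only surface these groups; deciding the correct target stays with editors.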

Internal project page tagging

Could someone tag all the following projects with their proper banners and proper categories?

  • All pages under Wikipedia:WikiProject Soft drinks with {{WikiProject Soft Drinks |class=project |importance=na}}, categorize as [[category:WikiProject Soft Drinks|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Spirits with {{WikiProject Spirits |class=project |importance=na}}, categorize as [[category:WikiProject Spirits|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Wine with {{WikiProject Wine |class=project |importance=na}}, categorize as [[category:WikiProject Wine|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Beer with {{WikiProject Beer |class=project |importance=na}}, categorize as [[category:WikiProject Beer|{{PAGENAME}}]]?
  • All pages under Wikipedia:WikiProject Food and drink with {{WikiProject Food and drink |class=project |importance=na}}, categorize as [[category:WikiProject Food and drink|{{PAGENAME}}]]?

Thanks, Jeremy ( Blah blah...) 20:04, 5 February 2009 (UTC)

Do you mean subpages of the project page? Lego Kontribs TalkM 04:45, 6 February 2009 (UTC)

yes. -- Jeremy ( Blah blah...) 06:31, 6 February 2009 (UTC)

Coding... Lego Kontribs TalkM 03:57, 7 February 2009 (UTC)

Can someone add an automatic bot that patrols the pages in the Heroes template? Raiku Lucifer Samiyaza 04:00, 7 February 2009 (UTC)

Redundant As X! mentioned earlier, ClueBot, VoABot II, SoxBot III are already running. There is no need for a bot to "patrol" such a small set of pages. Anomie 04:09, 7 February 2009 (UTC)

Can someone add an automatic bot that patrols featured articles? Raiku Lucifer Samiyaza 04:04, 7 February 2009 (UTC)

Redundant As X! mentioned earlier, ClueBot, VoABot II, SoxBot III are already running. Anomie 04:09, 7 February 2009 (UTC)

Stub templates not updating

Greetings all! This is my first bot request, so be kind. Over at WP:SFD we need something that will update stub categories on hundreds of articles, as the templates have been either renamed, deleted, or redirected. Alaibot used to do this, but sadly Alai has vanished off the radar since Dec 13, and work is backing up. There are currently, for example, articles linked to the non-existent Category:European organization stubs which should fall into the new Category:European organisation stubs, since the template's category was renamed. I hope I'm explaining myself all right. Can we recruit a bot to go through the stub cats on a regular basis and fix this? I assume it's due to server lag, but no one wants to null-edit a gazillion stub articles (I'm sure that's what caused Grutness' arthritis...;). Cheers, Pegship ( talk) 05:37, 8 February 2009 (UTC)

Forgive the huge gap in my understanding of Wikipedia, but is there not some sort of redirection from one category name to another? If there were, you might come unstuck on WP:R#NOTBROKEN, but for my own laziness I have no idea whether there is or isn't. - Jarry1250 ( t, c) 09:49, 8 February 2009 (UTC)
I'm not sure whether that's the issue (aren't we a pair?)... What happened is the template {{ Euro-org-stub}} used to place articles in Category:European organization stubs. The category was renamed Category:European organisation stubs and the template code was changed accordingly. I assume that if there were no server lag, articles tagged with {{ Euro-org-stub}} would start moving from Category:European organization stubs to Category:European organisation stubs. That wouldn't require a redirect, I think. Pegship ( talk) 21:07, 8 February 2009 (UTC)
This will get fixed automatically by the job queue. All the pages will be updated so that they now reflect the proper category. Lego Kontribs TalkM 03:21, 9 February 2009 (UTC)
That has been my experience in the past, but since about mid-December I have noticed a longer and longer wait for the work to get caught up. Right now the job queue is at 1.5 million - is this high, normal, abnormal? Pegship ( talk) 03:45, 9 February 2009 (UTC)
Manual:Job queue says a few million isn't abnormal during peak hours; I generally see it at around 500,000 – 1+ million. §hepTalk 02:32, 10 February 2009 (UTC)
If it turns out that you do need a bot, my AWB bot can assist you. Let me know and I'll get started on the approval process. Robert Skyhawk So sue me! ( You'll lose) 04:58, 9 February 2009 (UTC)
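If it does come to a bot, the classic workaround is a null edit: save each page back unchanged so MediaWiki re-parses it and refreshes its category links without waiting on the job queue. A sketch of the two API calls involved, not executed here (using action=edit with an empty appendtext is my assumption for the cheapest way to trigger the refresh):

```python
API = "https://en.wikipedia.org/w/api.php"

def null_edit_requests(title: str, edit_token: str) -> list:
    """Return the (method, params) pairs for one null edit: fetch the
    current wikitext, then save the page back unchanged, which makes
    MediaWiki re-parse it and update its category membership."""
    fetch = ("GET", {"action": "query", "prop": "revisions",
                     "rvprop": "content", "titles": title, "format": "json"})
    save = ("POST", {"action": "edit", "title": title, "appendtext": "",
                     "token": edit_token, "format": "json"})
    return [fetch, save]
```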

Not exactly a bot request, but you guys are the computer geniuses. A new way to communicate.

Please take a look at Wikipedia:Village_pump_(proposals)#IM_and_VOIP. Thanks. - Peregrine Fisher ( talk) ( contribs) 06:42, 11 February 2009 (UTC)

Speedy-delete bot

Hi, I was just wondering if it would be possible to make a bot that would check, when someone removes a speedy delete tag, that it wasn't the page creator, and if it was, revert the edit? It would also be nice if the bot could move {{ hangon}} templates to the proper location. This would make monitoring new pages much easier, since ATM every time I csd an article I have to watch it until it gets deleted, because there is a 25-30% chance that the creator will remove the tag. -Zeus- u c 01:34, 10 February 2009 (UTC)

I thought there was a bot that already handled this task. — Nn123645 ( talk) 20:41, 10 February 2009 (UTC)
I don't think so, it's happened a lot to me and always I have to revert it. I don't know if there's something for AfD, but that would be nice too. -Zeus- u c 20:52, 10 February 2009 (UTC)
Ahh here it is I guess it didn't get approved. — Nn123645 ( talk) 20:55, 10 February 2009 (UTC)
Doing... While I hate to take on another project because I feel like I'm overextending myself, I think I can do this fairly easily by modifying code I have already written. — Nn123645 ( talk) 20:59, 10 February 2009 (UTC)
Wow, thanks. It looks like the creator withdrew it because he couldn't get it to work. After reading some of the concerns in the original thread, maybe it would be better if the bot just warned whoever placed the CSD tag when it gets removed by the creator. I don't know what's protocol for bots. -Zeus- u c 22:09, 10 February 2009 (UTC)

If the original article falls into a class like "nocontent" or "nocontext", and the author fixes the article and then removes the speedy tag, reverting these edits would essentially be vandalism by the bot and might cause us to lose a good article (if the CSD-patrolling admin doesn't properly check the history). Wikipedia is not a bureaucracy, let's not turn it into one by making a strong but ignorable rule on CSD tags into bot-enforced policy. Kusma ( talk) 12:59, 12 February 2009 (UTC)

I see your point. I'm thinking I will set up the bot so it will only revert new editors (say, with less than 200 mainspace edits), and be 1RR compliant. In the case of an experienced editor, I think the best option would be to place a notice on the talk page of the person who placed the tag, and not on the page creator's page. As for that scenario, the CSD tag is only a request to delete the page; it is up to the admin to verify that the page really does meet CSD, and if the admin is wrong there is always WP:DRV for the page creator to get the page recreated. The speedy tag is pretty clear that only another user is allowed to remove the tag; if the actual page creator wants to argue his/her case, he/she should do so with {{ hangon}}. — Nn123645 ( talk) 13:19, 12 February 2009 (UTC)
In any case, the bot should not revert the edit that removed the CSD template, but rather re-add the CSD template to the (possibly expanded) article (the new page might no longer be an A1 or A7). Another case where reverting might not be the best course of action is the fairly common scenario where a page is created, then CSD tagged, and then the author blanks the page. It is not necessary to further embarrass the author by reverting this edit; instead, a {{ db-blanked}} should be added. A good bot for this is probably fairly complicated, and a bad bot is too scary for newcomers. Kusma ( talk) 15:23, 12 February 2009 (UTC)
I can see this is going to get more complicated than I originally thought (which is pretty much the story of anything). I'm thinking I will have the bot use both the IRC and RSS feeds to get the diffs of what exactly was removed. If it was just the template, I will have the bot use undo to save some bandwidth (not rollback, so it doesn't revert all edits by that user); if it's something other than just the templates, I will have the bot re-add the tags using &section=0. I suppose I could try to determine the type of CSD using various methods (size of page, number of sections, etc.; see this list for the (general) types of things I'm thinking of) to make sure it doesn't re-add a wrong CSD, and notify the person who placed it. Depending on whether the page may still meet the criteria, I can have it decide whether to only notify the person who requested speedy deletion, or to undo the edit and notify. — Nn123645 ( talk) 15:37, 12 February 2009 (UTC)
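The detection half of this request reduces to comparing the old and new revision text for a speedy tag. A sketch (the pattern is a rough match for the {{db-...}} family; what the bot then does with a hit, revert or notify, is the policy question discussed above):

```python
import re

# Rough match for speedy templates: {{db-a7}}, {{db-spam}}, {{db|reason}}, ...
SPEEDY_RE = re.compile(r"\{\{\s*[Dd]b(?:-[\w-]+)?\s*[|}]")

def creator_removed_speedy(old_text: str, new_text: str,
                           editor: str, page_creator: str) -> bool:
    """True when the page creator's own edit dropped a speedy tag: the
    case the proposed bot would act on."""
    if editor != page_creator:
        return False
    return bool(SPEEDY_RE.search(old_text)) and not SPEEDY_RE.search(new_text)
```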

Removing and replacing inappropriately placed protection templates

The recent addition of the {{PROTECTIONLEVEL}} magic word has allowed the protection templates to output a category instead of visible material when they are placed inappropriately, that is, when the protection level of the page does not match the protection template. This category is visible at Category:Wikipedia pages with incorrect protection templates. I initially thought that the speed at which this category would fill would be low enough that it would be manageable; I was wrong.

Therefore, I request that someone create a bot to handle most cases, using basic logic along the lines of:

  1. IF the page is not protected, THEN remove all protection templates from the page
  2. IF the page is edit-protected but the move-protection is autoconfirmed, THEN remove all move-protection templates from the page
  3. IF the page is move-protected AND the move-protection is sysop AND the move-protection expiry is "indefinite", THEN add {{ pp-move-indef}} to the page

This bot could be run once daily to eliminate virtually all the backlog for the aforementioned category. The logic could be improved to make more intelligent decisions and provide other benefits, but this minimal workflow would be sufficient. Thank you for considering this request. {{ Nihiltres| talk| log}} 03:46, 8 February 2009 (UTC)
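The three rules above amount to a small decision function per page; a sketch, where the field names are simplified stand-ins for what the MediaWiki API would actually report about a page's protection:

```python
def plan_action(edit_level, move_level, move_expiry, has_protection_templates):
    """Decide what to do with one page from the backlog category.
    edit_level/move_level are None when unprotected; other values mirror
    (in simplified form) what the MediaWiki API reports for protection."""
    if edit_level is None and move_level is None:
        # Rule 1: page is not protected at all.
        return "remove all protection templates" if has_protection_templates else "nothing"
    if edit_level is not None and move_level == "autoconfirmed":
        # Rule 2: edit-protected, but only autoconfirmed move-protection.
        return "remove move-protection templates"
    if move_level == "sysop" and move_expiry == "infinity":
        # Rule 3: indefinite sysop move-protection.
        return "add {{pp-move-indef}}"
    return "leave for human review"
```

Anything that matches none of the three rules stays in the category for a human to look at, which keeps the bot conservative.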

Coding... Lego Kontribs TalkM 04:53, 12 February 2009 (UTC)
BRFA filed Wikipedia:Bots/Requests for approval/Legobot III Lego Kontribs TalkM 16:40, 13 February 2009 (UTC)

Quick request

Could we have a bot change all the links to Buddah Records to point to the correct spelling, Buddah Records? Thanks Chubbles ( talk) 16:45, 12 February 2009 (UTC)

Declined Not a good task for a bot, or an editor for that matter. Policy states you should just have a redirect. Now if it's a spelling issue, the best way to fix that would be WP:AWB, as spell checking/spell changing bots are listed on frequently denied bots. — Nn123645 ( talk) 17:14, 12 February 2009 (UTC)
Yeah, it'd be great if someone with AWB could do that, then. Chubbles ( talk) 17:21, 12 February 2009 (UTC)
Wikipedia:AutoWikiBrowser/Tasks -- Tagishsimon (talk) 17:28, 12 February 2009 (UTC)

Cleaning up ISBN entries in infoboxes

It was formerly the case that filling in the |isbn= field with a raw number in infoboxes such as {{ Infobox book}} did not activate the ISBN magic coding linking to Special:Booksources. This meant that "ISBN" had to be entered into the field as well: that is "|isbn=ISBN 1412806461" rather than simply |isbn=1412806461. The code seems to have been fixed now, meaning that there are a lot of entries with redundant "ISBN" coding. It's obviously an issue of minor style/presentation importance, but should be a relatively easy task to code for. All a bot would have to do is check whether "ISBN" followed "|isbn=" and if so, remove the former (with appropriate spacing). Anyone willing to take this on? Skomorokh 16:33, 12 February 2009 (UTC)
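A minimal sketch of the substitution, assuming a simple infobox field pattern (real templates may need more careful parsing); it also strips the quoted variant ("ISBN ...") noted later in the thread:

```python
import re

# "|isbn=ISBN 1412806461" -> "|isbn=1412806461", with or without quotes.
ISBN_FIELD_RE = re.compile(r'(\|\s*isbn\s*=\s*)"?ISBN\s+([0-9Xx][0-9Xx -]*[0-9Xx])"?')

def clean_isbn(wikitext: str) -> str:
    """Remove a redundant "ISBN " prefix from infobox isbn fields."""
    return ISBN_FIELD_RE.sub(r"\1\2", wikitext)
```

Fields that are already clean are left untouched, so the bot can safely re-run over the same pages.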

How "important" is it - just how bad does it look with the extra ISBN in it? If it's worth doing, I'd be more than happy to have LivingBot do it. - Jarry1250 ( t, c) 17:38, 12 February 2009 (UTC)
Not very important, it just makes us look a little amateurish. See the infobox here for an example. Skomorokh 21:50, 13 February 2009 (UTC)
Some of them are also enclosed in quote marks (isbn = "ISBN ###"). - Jarry1250 ( t, c) 10:42, 14 February 2009 (UTC)
BRFA filed Wikipedia:Bots/Requests for approval/LivingBot 7

Change of administration means massive cleanup project needed

  1. The Obama administration has completely scotched the websites of the Bush administration. No redirects, no nothing, everything except the pre-2001 history pages is just down the memory hole. Hey, it's their computers, they can do what they want, but for us, this means that thousands of links in the encyclopedia to www.whitehouse.gov/* are now dead links, and need to be changed to http://georgewbush-whitehouse.archives.gov/ -- but surely in the last few months people have added links to legitimate www.whitehouse.gov pages. So care is needed.
  2. Somewhat more trivially, there are about 200 articles about cases pending in United States courts that are now all mistitled as X v. Bush. Because Bush was sued in his official capacity, rather than in his personal capacity, the cases have been renamed under Federal Rule of Civil Procedure 25(d), and articles need to be moved to X v. Obama pages.

I raise it here in case there's an easy way to set up a project or bot to tag all these dead links and move all these outdated/mistitled pages. (Some of the v. Bush cases are closed, and thus correctly titled; some of the whitehouse.gov pages work, so not all the links are dead, so perhaps not.) THF ( talk) 06:25, 13 February 2009 (UTC), updated 14:06, 13 February 2009 (UTC)

If this were brought to the administration's attention, could the fact that there are thousands of such links and the fact (if it is a fact?) that Wikipedia is an important institution influence them to put back some links en masse? Michael Hardy ( talk) 21:48, 13 February 2009 (UTC)
There might be legal reasons why the old site was moved; and, to be fair, it might have been low-level staffers in the Bush administration that did it. But if we get a bot going fast, it will be less of a problem. NB that editors unaware of what has happened are simply removing dead links instead of tagging them, so we're already starting to lose information. THF ( talk) 13:51, 14 February 2009 (UTC)

So save the list of articles and urls now, and work on developing a bot in the meantime. That way you'll be able to get a list of pages where you'd need to manually review the edit history after the bot finishes (ones where the bot cannot find a link to modify). — CharlotteWebb 20:52, 14 February 2009 (UTC)

Wouldn't even know where to begin with that. I'm flagging this for some soul who wants to save Wikipedia from a real problem. THF ( talk) 21:22, 14 February 2009 (UTC)
The bot could go around, and check every link. If it comes up with a 404 error for both, it skips the link. X clamation point 22:04, 14 February 2009 (UTC)

Coding... for the first part; the plan is to check the target link for a 404 response, and then check the replacement for a 200 before replacing. I'll probably throw in a log of links without a 200 on the replacement for human processing. If someone can tell me how to determine which cases need to be moved and which don't, I could look into the second part too. Anomie 00:12, 15 February 2009 (UTC) (X! already started coding Anomie 00:33, 15 February 2009 (UTC))
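The plan just described (404 check on the old link, 200 check on the replacement, log everything else) could be sketched like this; the status checker is injected so the logic is testable without network access:

```python
OLD_PREFIX = "http://www.whitehouse.gov/"
NEW_PREFIX = "http://georgewbush-whitehouse.archives.gov/"

def fix_link(url, get_status):
    """Decide what to do with one external link.
    get_status(url) -> HTTP status code; inject a real HTTP check in
    production, or a stub in tests. Returns (url_to_use, note)."""
    if not url.startswith(OLD_PREFIX):
        return url, "not a whitehouse.gov link"
    if get_status(url) != 404:
        return url, "still live, leave alone"
    replacement = NEW_PREFIX + url[len(OLD_PREFIX):]
    if get_status(replacement) == 200:
        return replacement, "replaced"
    return url, "archive copy missing, log for human review"
```

The "log for human review" branch corresponds to the promised log of links whose replacements do not return a 200.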

Sounds right. For the first part, if there's a way to instruct the bot to add a {{dead link}} tag, that would also alert editors.
For the second part, if the case is pending, it needs to be renamed; if there was a Supreme Court decision before Jan. 20, 2009, then the Supreme Court decision should be the page name. Unfortunately, the articles are quite a mess: some say "George W. Bush", others say "Bush"; and they haven't been maintained to show what the current status is. That might have to be done by hand, though a bot could add "update" tags to everything in the category. THF ( talk) 00:18, 15 February 2009 (UTC)
Coding... I was already writing this, but forgot to add the tag; I remembered mere minutes after Anomie put it on. I have talked with Anomie about it, and they have conceded the programming to me. X clamation point 00:32, 15 February 2009 (UTC)
I don't think there's any need for an update tag, someone just has to go through everything in the category and check the status if it needs to be done by hand. If there is any sort of online database where the bot could look up the status of the case, that would be sufficient for a bot to do it. Anomie 00:33, 15 February 2009 (UTC)

Bot tagging for WP:FILM

Is it possible to have a bot run through all subcategories of Category:Films by year (excluding Category:The Wizard of Oz (1939 film), Category:Dragnet, Category:Dragnet episodes, Category:Monty Python and the Holy Grail, Category:Sholay, Category:Donnie Darko, Category:Chak De India, Category:Enchanted (film) and Category:Songs from Enchanted) and ensure that all articles have the {{ Film}} project banner on their talk page, keeping all existing assessments and other parameters where they currently exist?

In addition, can the bot add the appropriate task force parameters to the banner in the following categories:

I made this proposal at Wikipedia talk:WikiProject Films#Bot tagging articles for WP:FILM last week, where it met with approval. Let me know if you need anything else, this is my first such request here. :) PC78 ( talk) 23:07, 7 January 2009 (UTC)

Doing... Lego Kontribs TalkM 01:02, 8 January 2009 (UTC)
Any progress with this? PC78 ( talk) 23:29, 14 January 2009 (UTC)
Yes, I was wondering what had happened to this too! Lugnuts ( talk) 12:46, 16 January 2009 (UTC)
Off wiki issues came up. I am finishing the code for this now. Lego Kontribs TalkM 02:03, 19 January 2009 (UTC)
Cool, no worries. :) PC78 ( talk) 02:11, 19 January 2009 (UTC)

Thanks, I see Legobot did a run on this earlier today. I see that only articles starting with A and B were tagged; will the bot therefore be doing this in stages, rather than a single sweep? PC78 ( talk) 19:41, 22 January 2009 (UTC)

It hit an error while running so it only got to the B's. I have restarted it, so it should start going again. Lego Kontribs TalkM 05:03, 23 January 2009 (UTC)
Ah, OK. Can you please let me know when it's finished doing its thing? Cheers! PC78 ( talk) 01:11, 24 January 2009 (UTC)

Y Done Lego Kontribs TalkM 02:16, 28 January 2009 (UTC)

One more request: can the bot add the appropriate task force parameters to the banner in the following categories:

These are two recent task forces of ours for which doing a manual tagging run would be extremely difficult, especially given the two countries' prodigious output and the English Wikipedia's natural systemic bias towards more comprehensive coverage of these national cinemas. Many thanks! Girolamo Savonarola ( talk) 18:58, 2 February 2009 (UTC)

Do the above excluded categories still have to be excluded? Lego Kontribs TalkM 00:22, 3 February 2009 (UTC)
Exclude the following: Category:United Kingdom film biography stubs, Category:Shaft, Category:High School Musical, Category:The Cheetah Girls, Category:T*Witches, Category:Zenon, Category:Police Squad! episodes, Category:National Lampoon's Animal House, Category:Beauty and the Beast, Category:Saw, Category:Lilo & Stitch, Category:The Lion King, Category:Enchanted (film), Category:Films distributed by Disney, Category:The Incredibles video games, Category:Walt Disney movie posters, Category:Asian American filmmakers, Category:English-language South Asian films, Category:Disney franchises, Category:Films distributed by Buena Vista International, Category:Shrek, Category:Evil Dead, Category:Looney Tunes games, Category:Images from Tom and Jerry, Category:Tom and Jerry video games, Category:Austin Powers games, Category:Final Destination, Category:Hercules: The Legendary Journeys, Category:Xena: Warrior Princess, Category:Tron video games, and Category:X-Men music. Thanks again! Girolamo Savonarola ( talk) 07:05, 3 February 2009 (UTC)

Reviving Any progress with this? Many thanks, Girolamo Savonarola ( talk) 13:28, 14 February 2009 (UTC)

As mentioned above, blindly recursing (even with a blacklist) is not acceptable. You need to specify exactly which categories need to be tagged. Q T C 21:21, 17 February 2009 (UTC)

With respect, that did not stop the previous request from going ahead. PC78 ( talk) 21:28, 17 February 2009 (UTC)
With respect, it shouldn't have. It causes more problems than it fixes when it goes wrong. Using a list of categories that need the tag is simpler and has no chance of causing a complete mess of things. Q T C 21:56, 17 February 2009 (UTC)
How so? The categories to be tagged will be the same in either case. PC78 ( talk) 22:28, 17 February 2009 (UTC)
They may be the same at one point, but won't always be. Say you go through the cats and subcats and think it's good. You post this botreq. Then, in the time between you going through the list and somebody running the job, somebody makes a change somewhere in any of those n levels of categorization. Can you be 100% sure wrong categories won't get marked? No. In the time it takes you to go through the list the first time, you could specify them in this botreq and turn the chance of incorrect tagging to 0. Q T C 23:30, 17 February 2009 (UTC)

Semi-automated translation of missing articles

After posting a more detailed description of my idea here, I would like to drop it by here to see what kind of reaction I will get. I would like to create a process where a bot goes to pages on the many other language wikis, runs a translate script on the article title, and looks for an article on the en wiki with both titles. When there's no article, it outputs to a list, with links to the other language articles, as well as multiple free language specific translate tools. Volunteers would then follow up on the list, and create article/remove them from the list when appropriate.

This bot would be run periodically to refresh the en wiki's shortcomings, and a blacklist can also be set up for pages that are determined to not be wanted. There are some other bits, but that's the gist of it. Anyways, although the bot would do several specific tasks, the most important is the first one: is it possible to retrieve interwiki data from the sidebar? -- Nick Penguin( contribs) 04:49, 17 February 2009 (UTC)

I don't think this is a good idea. Probably should let the discussion run at the village pump before getting a bot underway... Calliopejen1 ( talk) 18:02, 18 February 2009 (UTC) Didn't read thoroughly, will follow up. Calliopejen1 ( talk) 18:09, 18 February 2009 (UTC)

Just pointing out something...

I would like to request a bot to run at a future date to change all instances of ƿ to w and all instances of ȝ to g for the Anglo-Saxon wiki. There may be need of one to switch accents to macrons (á to ā). How easy would this be? --JJohnson1701 (talk) 07:33, 11 January 2009 (UTC)
(Please excuse my ignorance of the Anglo-Saxon language if the following sounds stupid). Technically, this is a very simple task. However, it is unlikely to be a desirable change. While the majority of the uses of these characters are probably wrong, it is unlikely they are all wrong. Articles often contain snippets of foreign language text (on purpose) and even references to characters themselves.
So even if these characters are never used in Anglo-Saxon words they might still be used correctly in articles. --ThaddeusB (talk) 00:37, 12 January 2009 (UTC)
The purpose in the Anglo Saxon Wiki would be a stylistic (MoS) change, and if it's agreed by consensus there I would be happy to do this task. Rich Farmbrough, 09:27 15 January 2009 (UTC).

This snippet is from archive number 24, I believe. While his request sounds innocent enough, what he didn't furnish was the fact that his request is rooted in one side of a very heated war over orthography.

Ultimately, after all the blood and sweat, the users decided to make use of duplicate page versions, in a similar manner to the Gothic Wikipedia. When I say "the users," I mean regular contributors; James/JJohnson1701 has never been an active contributor, only surfaces once every several months, and has not written a single page which is any more than half a page in length.

That in itself is meaningless, but the point is, in making this move, he is attempting to defy our consensus to make use of both practices, as that is the only decision which gave us peace, and the ability to continue the project.

Discovering this, after all of that was over, is appalling quite frankly. Do not grant this bot request. — ᚹᚩᛞᛖᚾᚻᛖᛚᛗ ( talk) 14:05, 17 February 2009 (UTC)

This is the bot request page for the English Wikipedia; not the Anglo-Saxon Wikipedia. Neither the original request nor this objection belong on the English Wikipedia. Bot issues for the Anglo-Saxon Wikipedia need to be worked out on the Anglo-Saxon Wikipedia. -- JLaTondre ( talk) 04:14, 18 February 2009 (UTC)
Very well then, my apologies are in order. — ᚹᚩᛞᛖᚾᚻᛖᛚᛗ ( talk) 18:38, 18 February 2009 (UTC)

Broken translation requests

I've just developed a suite of templates that make requesting expansion from other language wikipedias much easier. See {{ Expand Spanish}} for example. One feature of the templates may be problematic, though. When you tag an article, there is an optional parameter for the name of the article in the other language. If no title is specified, it defaults to assuming that the article in the other language has the same name as the English article. This is generally fine because most translation requests are geographic places and biographies that have the same name in both languages. If the article names are different, this causes a problem. Can someone make a bot that goes through all the articles that are tagged with a template that is generated by {{ Expand language}}, then sees if the corresponding interwiki article exists, and if it doesn't, either notifies the tagger, or puts a notice on the article talk page, or adds it to a list so translation project people can fix them manually? Thanks! Calliopejen1 ( talk) 17:53, 18 February 2009 (UTC)

Oh wait, an even better idea! I'll change it so it defaults to no link, and a bot could come by and automatically add the parameter if an article exists! Calliopejen1 ( talk) 18:07, 18 February 2009 (UTC) by looking at the interwiki according to the language of the tag. That is, if the bot sees {{ Expand Spanish}} it looks for [[es:XXXX]] and then changes it so it's {{Expand Spanish|XXXXX}}. Calliopejen1 ( talk) 18:42, 18 February 2009 (UTC)
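A rough sketch of that idea, with a hypothetical tag-to-language map (a real bot would cover every {{ Expand language}} wrapper, not just the two shown):

```python
import re

# Tag name -> interwiki prefix; only two shown, purely for illustration.
TAG_LANGS = {"Expand Spanish": "es", "Expand French": "fr"}

def fill_expand_tag(wikitext: str) -> str:
    """Turn a bare {{Expand <Language>}} tag into {{Expand <Language>|Title}}
    by copying the title from the matching [[xx:Title]] interwiki link."""
    for tag, prefix in TAG_LANGS.items():
        bare = "{{" + tag + "}}"
        if bare not in wikitext:
            continue
        m = re.search(r"\[\[" + prefix + r":([^\]|]+)\]\]", wikitext)
        if m:
            wikitext = wikitext.replace(bare, "{{" + tag + "|" + m.group(1) + "}}")
    return wikitext
```

Pages with the tag but no matching interwiki link would be left alone, and could be listed for the translation project to review by hand.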

Replacing calendar templates

Templates of the style {{ FebruaryCalendar2008Source}} are correctly tagged T3 speedy deletion candidates because they are redundant to templates of the style {{ FebruaryCalendar|year=2008}}. I'd like to request a bot (should be easy to do) that replaces all instances of all templates of the first kind with those of the second kind, i.e. {{ MayCalendar2007Source}} with {{ MayCalendar|year=2007}} etc.

It should also be able to replace constructs like {{{{CURRENTMONTH}}Calendar{{CURRENTYEAR}}Source}} with {{{{CURRENTMONTH}}Calendar|year={{CURRENTYEAR}}}}.

And finally, it should tag all those former templates that are duplicates (i.e. of the style {{MonthCalendarYearSource}}) with {{ db-t3}} (don't forget the <noinclude>-tags for that) and list them on a subpage in my userspace so I can delete them after the waiting period is over. So who wants to code me that little thing? Regards So Why 11:37, 18 February 2009 (UTC)
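The straightforward replacements could be done with a single regex; a sketch, ignoring the {{CURRENTMONTH}}-based construct, which would need a separate literal substitution:

```python
import re

# {{MayCalendar2007Source}} -> {{MayCalendar|year=2007}}, and so on.
CAL_RE = re.compile(r"\{\{\s*([A-Z][a-z]+Calendar)(\d{4})Source\s*\}\}")

def replace_calendar(wikitext: str) -> str:
    """Rewrite {{<Month>Calendar<Year>Source}} transclusions to the
    parameterized form {{<Month>Calendar|year=<Year>}}."""
    return CAL_RE.sub(r"{{\1|year=\2}}", wikitext)
```

As noted below, any such run should wait until the job queue has caught up, so "What Links Here" actually reflects where the old templates are transcluded.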

I'll do it, but I'm wary of bot-marking a bunch of templates for WP:CSD. Has a consensus for this been reached elsewhere? Anomie 12:21, 18 February 2009 (UTC)
I don't see the problem. After all, T3 is a very simple criterion that only applies if the template is redundant and unused. But if you are wary about that, I'm happy if it just generates the tag-worthy list. I'll take care of it then. But thanks for your offer, I knew your bot would be perfect for such things :-) So Why 12:27, 18 February 2009 (UTC)
I'll have to think about this T3 thing. What about the {{MonthCalendarYear}} templates (without "Source")? Also, are the new templates really completely compatible with the old? I note that the old ones have endnote and note as aliases for EndNote, for example, that seems to be lacking in the new.
Hmmm... Why not just make one template that takes both the month and year as parameters? Make the parameter names sane ("1a" is really a horrible name) and AnomieBOT can translate those at the same time too. Anomie 02:15, 19 February 2009 (UTC)
I'm sorry, but I lack the insight as to why they are in this style, I just stumbled across the old ones on CSD duty. I invited Zzyzx11 ( talk · contribs) who tagged them to contribute here. Regards So Why 11:18, 19 February 2009 (UTC)

Ok, User:SoWhy asked me to provide some background: Before parserfunctions such as #time: and #if: were created in 2006 or 2007 (I can't remember), we had to create separate templates for each year. Thus the existence of {{ MayCalendar2004Source}}, {{ MayCalendar2005Source}}, {{ MayCalendar2006Source}}, etc. So with the existence of the parserfunctions, we could make general calendar templates that are more self-maintaining. So there have been discussions such as Wikipedia talk:WikiProject Days of the year/Archive 7#The calendar on the date pages and Template talk:JanuaryCalendar to have those kinds of templates.

Well, finally we had the time to merge all the parameters into a few templates such as {{ MayCalendar}}. I know it might look like spaghetti code, but it will have to do for now so it is backward compatible with all the templates whose functionality was merged.

I believe I have already done most of the replacements. The problem now is that since these templates were on so many pages, transcluded and cascading on multiple pages at a time, I am currently waiting for the job queue to fully update all the backlinks so the "What Links Here" lists are fully accurate. I mean, if you look at Special:WhatLinksHere/Template:MayCalendar2008Source, it lists a bunch of subpages of Portal:Music/DateOfBirth, but the template was actually only directly on the transcluded page Portal:Music/DateOfBirth/May.

Thus, any bot here is premature for the next month or two (last time I heard, the job queue takes about 40 days to fully complete a round of all the pages on Wikipedia). Cheers. Zzyzx11 (Talk) 16:37, 19 February 2009 (UTC)

Update table of bot edits

This page may have some use, but only if it's updated (it's currently a year out of date). Would someone who's got a minute look at it? - Jarry1250 ( t, c) 14:29, 19 February 2009 (UTC)

 Done from the API. Happymelon 18:13, 19 February 2009 (UTC)

Foundation and similar dates, in infoboxes

[I'm relisting this August 2008 request (including subsequent revisions), as the editor who said he would make the edits has not done so, nor replied to many enquiries as to progress (due at least in part to understandable family matters).]

To add "founded", "founded date" or "foundation", "released", "first broadcast" and similar dates to an infobox' hCard microformat, existing dates need to be converted to use {{ Start date}}.

I've compiled a list of relevant infoboxes at User:Pigsonthewing/to-do#Date conversions.

Thank you.

Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 21:20, 2 February 2009 (UTC)

It should be pointed out that conversion of these dates into a form that needlessly imposes restrictions on editors is a controversial proposal for which there is no consensus. Microformat dates can be supported without the cumbersome {{ start date}} template. See discussion at the WikiProject Time talk page. - J JMesserly ( talk) 20:49, 3 February 2009 (UTC)
This request has already been agreed; it is simply that the editor who undertook, some months ago, to do it has been unavailable. The many manual replacements on a number of pages have been utterly uncontroversial. Nothing here "needlessly imposes restrictions on editors"; and {{ start date}} is successfully used in an even greater number of articles (well over 10K). There is currently no viable alternative. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 21:26, 3 February 2009 (UTC)

It is a controversial proposal. Perhaps it would be prudent to await consensus on this matter after the Time wikiproject has time to properly consider the desirability of needlessly encoding dates in an arcane format. - J JMesserly ( talk) 21:39, 3 February 2009 (UTC)

Please provide evidence of this supposed "controversy". Note that " I don't like it" does not constitute such evidence, and, as I said above that this change has already been agreed, with consensus. In referring to debate on the Time Wikiproject, you appear to be promoting a "rival" (sic) template which you yourself created only a couple of days ago, and which has no demonstrable community support. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 23:05, 3 February 2009 (UTC)
No evidence having been provided; I suggest we proceed. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 22:01, 6 February 2009 (UTC)
Anyone looking into this matter will know this isn't the case. For example, consider the following post on the microformats talk page:

I support the proposal of J JMesserly and favor the {{ start-date}}: Before all, Wikitext must remain human readable. (BTW: There's in fact currently no chance - even for programmers - to enter a date like "7 December 1941 8AM HST" using {{ Start date}}: I vainly tried {{Start date|1941|12|7|18|||Z}}, {{Start date|1941|12|7|18||Z}}, {{Start date|1941|12|7|18|Z}}). -- Geonick ( talk) 00:05, 5 February 2009 (UTC) (UTC) source

Pigsonthewing's assertion of no controversy is not accurate. There is no pressing need to encode dates in a wonky format that makes it more difficult for contributors to understand. Other contributors agree on this point. - J JMesserly ( talk) 02:41, 10 February 2009 (UTC)
You have been asked to provide evidence of this supposed "controversy". You have not done so. You have provided an out-of-context citation of just one editor liking one aspect of your experimental template, but not objecting to the above proposal. As I have already said twice before, the example of {{ Start date}} you give above is not one of the supported formats for that template; GIGO applies. Your pejorative use of the epithet "wonky" is fallacious. If you object to the use of {{ Start date}}, then nominate it for deletion. As you say: "It is true [{{ Start-date}}; your suggested alternative] is a new untested template and there may be bugs to fix with it"; as indeed there are. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 13:14, 10 February 2009 (UTC)

I propose consideration of this proposal be suspended until

  1. The documentation for {{ Start date}}, {{ End date}}, and the bot is improved to explain how the bot and the templates will deal with dates that are in the Julian calendar or the Roman Republican calendar.
  2. Andy Mabbett states that he has read the ISO 8601 standard. One should never state or imply that one complies with a standard unless one has read it. -- Gerry Ashton ( talk) 20:07, 10 February 2009 (UTC)
Please provide a reference for the Wikipedia policy which you imagine requires me to satisfy this bizarre demand. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 20:08, 13 February 2009 (UTC)

Other discussion

For those interested in the nature of the controversy, please see Manual of style- dates discussion on the unnecessary obscurity and error prone nature of the {{ start date}} template compared to alternatives that achieve the same goal. - J JMesserly ( talk) 15:47, 12 February 2009 (UTC)

There is no controversy. Please avoid unnecessary drama. There is no obscurity and the template is not "error prone"; unlike the supposed alternative. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 20:06, 13 February 2009 (UTC)

Controversy shown

It has been demonstrated by the thread above and at the Manual of style- dates page that bot runs employing {{ start date}} are controversial as evidenced by the responses from multiple other contributors. - J JMesserly ( talk) 19:07, 18 February 2009 (UTC)

No such controversy has been demonstrated, as anyone can see. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 18:23, 19 February 2009 (UTC)

Italicizing foreign words - such as kata (as in karate kata)

I wonder if it would be possible to get a bot to italicize all instances of the word kata. I tried doing it in AWB but couldn't figure out how to get the program to ignore it if it was already italicized. Thus, a command like "change all instances of kata to ''kata''" would look only at the word inside the double single quotes, find it, add more double single quotes, and end up changing every instance of ''kata'' to ''''kata'''', which is no good.

There is a potential for false positives, but I think that as long as the bot is confined to Category:Martial arts and its subcategories, it shouldn't be a problem.

Can anyone help out with this? Thanks. LordAmeth ( talk) 20:13, 19 February 2009 (UTC)

Did you try telling AWB to skip any pages containing ''kata'' (those are single apostrophes)? It seems like this would cover it for you, since by induction, if a page has one instance of the word formatted correctly, then all instances on said page are formatted correctly. That of course assumes that you're willing to accept that induction, and it might leave you some pages with the non-formatted word still there. There is also probably a way to do exactly what you want using RegEx, but I'm really bad with that, so you may have better luck either from someone else here or at another page like WP:VPT. As for automation, if you think that the above-mentioned skip parameter is acceptable, I would be happy to file a BRFA and do this with my bot. Robert Skyhawk So sue me! ( You'll lose) 04:18, 20 February 2009 (UTC)
Totally untested, but a Perl-style replacement regex should be s/([^'][^'])(kata)([^'][^'])/${1}''${2}''${3}/i. No idea how to translate it into an AWB regex. Two important points about this: it won't italicize already-bolded instances, and it will make a hash of wikilinks. -- Carnildo ( talk) 10:41, 20 February 2009 (UTC)
AWB uses the same regex as Perl. However, I don't think you want to use that regex as it will also replace occurrences of kata within words (ex. 11kata22). I don't know, but I'd assume that might cause false positives. Something like s/\b(?:'')?kata(?:'')?\b/''kata''/i; would probably be better. -- JLaTondre ( talk) 13:14, 20 February 2009 (UTC)
I don't believe this is a job for a bot as the risk of false positives seems high. AWB would be more suitable. Its find & replace supports regex. -- JLaTondre ( talk) 13:14, 20 February 2009 (UTC)
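A Python rendering of the regex idea above, adjusted with lookarounds so already-italicized instances are skipped (an assumption about the desired behaviour; wikilinks such as [[kata]] are not excluded here and would still need handling in a real run):

```python
import re

def italicize(word: str, text: str) -> str:
    """Wrap bare instances of word in ''...'', skipping instances already
    adjacent to a quote mark (so existing italics/bold are left alone)."""
    pattern = re.compile(r"(?<!')\b(" + re.escape(word) + r")\b(?!')", re.IGNORECASE)
    return pattern.sub(r"''\1''", text)
```

The \b word boundaries also avoid matching the word inside longer strings like 11kata22, which was the concern with the simpler regex.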

Detection of articles that are better in other wikipedias

Following up on my previous request, I would like to be able to find which articles could benefit from translation. A bot could compare en.wiki articles (probably only stubs at this stage) with articles they are linked to via interwikis. Where the linked article is significantly long (a rudimentary measure of article quality), the bot could slap an {{ Expand Spanish}} (or another language) tag on the en.wiki. Or the bot could just output a list of these articles so they could be reviewed manually. Calliopejen1 ( talk) 18:02, 18 February 2009 (UTC)
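The "significantly long" test could be as simple as a length ratio; the thresholds here are illustrative guesses, not values from the request:

```python
def needs_translation_tag(en_bytes: int, other_bytes: int,
                          min_ratio: float = 3.0, min_other: int = 10000) -> bool:
    """Flag a stub whose interwiki counterpart is substantially longer.
    Both thresholds are illustrative guesses: require the foreign article
    to be reasonably large AND several times the size of the en.wiki stub."""
    return other_bytes >= min_other and other_bytes >= min_ratio * max(en_bytes, 1)
```

Byte length is a crude proxy for quality, so outputting a list for manual review (rather than tagging directly) would be the safer first step.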

This is more about the actual template usage, but stub articles should not also have an expand tag; that is overkill. Expand tags are already excessively used on Wikipedia, without also having a bot place them on articles.... Garion96 (talk) 13:51, 19 February 2009 (UTC)
But the expand tag actually points to more information, and where the article is better in another wikipedia the auto-translate link lets people read it in imperfect but often acceptable English. I think stub tags are worthless anyways, so I think the redundancy should be resolved in the template's favor.... (I also think the ordinary {{ expand}} tag is pointless, but I suppose that is beside the point.) Calliopejen1 ( talk) 03:15, 20 February 2009 (UTC)
Stub tags at least are small. Imagine, you have a perfectly ok little article with a template about a third of the article size pointing editors to the Spanish Wikipedia where they might find information to expand the article..... (Or it points readers to a computer translation. I am not sure which is worse) That kind of information belongs on the article's talk page or at a list page at Wikiproject Spain. Garion96 (talk) 20:24, 20 February 2009 (UTC)
Except the point is that it isn't perfectly ok! Stubs generally are awful articles for people who actually want to learn about the subject matter at hand. Which is more useful for the reader: this stub article, or this machine translation from es.wiki? Obviously the English isn't perfect, but I really don't think there's any comparison as to which contains more useful information. Calliopejen1 ( talk) 21:40, 20 February 2009 (UTC)
It is a small article, therefore a stub. It will grow in time. No need for a huge template for that (the talk page would be fine, of course). The same goes for República Cromagnon nightclub fire or Hospital de Sant Pau, two other articles I saw using this template. The more tags like these are being made, the more I understand User:Shanes/Why tags are evil. Garion96 (talk) 10:54, 21 February 2009 (UTC)

What you are not seeing, Garion, is that these tags are markedly ***different*** from other tags on Wikipedia. It is precisely because editors like yourself dismiss these articles as OK that very little has been done about it. Dismissing these tags with "all tags are evil" is typical of the kind of narrow-minded attitude many have on here about our potential and the ways in which we can improve. They are not administered for cleanup etc.; they are administered to produce a net expansion by direct translation, which will eventually start to produce massive results in the content of English Wikipedia. They serve as a direct gateway between English and the other language, keep track of progress, and make others aware that the article is being improved in correspondence. The fact is that listing articles for translation in the project workspace and on talk pages at people's request failed miserably for years. It never brought to anybody's urgent attention that an article could easily be expanded in minutes with the link provided, so the articles would just lie about in some barely used log, gathering dust, while people visited the article and moved on with no results. I don't think you quite understand the purpose of this proposal. Yes, tags are ugly, which is partly why they are so useful: they prompt people to try to quickly sort out whatever perceived problem the article is experiencing so the tag can be removed asap. As for size, I don't see a huge template at all; it looks no bigger than most of the templates we have. In my view it is essential that we dramatically increase the coordination of translation on Wikipedia, root out the articles which have far superior versions on other Wikipedias, and begin to draw people's attention to doing something about it. Dr. Blofeld White cat 19:43, 22 February 2009 (UTC)

Oh, I think I am seeing it perfectly, I just don't agree with it. And if you read my earlier comments, my objections cannot simply be summarized as "tags are evil", so stop using that "kind of narrow minded attitude that many have on here" line towards opinions you don't agree with. For a project like this one could use WikiProjects to expand these articles. A bot could certainly help with creating a list to work with. I just think of readers; I don't think readers benefit from a huge tag on an article stating the article could be expanded. I also don't think readers benefit from a computer translation. Garion96 (talk) 20:06, 22 February 2009 (UTC)

Not always, but then the tag isn't always intended to say to use Google (which is far superior to most computer translation packages online). It is there as background, as are language groups and learning about translation. More often than not the editor is likely intelligent enough either to spot mistranslations or to proofread the foreign article themselves and translate manually. Dr. Blofeld White cat 21:33, 22 February 2009 (UTC)

A perfect example of its purpose is Westerstetten for instance. Dr. Blofeld White cat 21:45, 22 February 2009 (UTC)

Asteroids

If you have a look at Category:Wikipedia pages with broken references you'll see hundreds and sometimes even thousands of asteroid stubs. I personally hold that they should be deleted no slower than they are created, but at the very least they should include {{reflist}} to keep them from cluttering up this category. The category was down to almost 2000 and improving, and I was about to do some serious work on it, but then these thousands of asteroids came along. Perhaps a bot, or a small remark to the right person, could help us out.

Please keep me posted (I mean, please tell me how you propose to delete all of them in one day, joking). Debresser ( talk) 22:37, 19 February 2009 (UTC)

Doing... Lego Kontribs TalkM 01:54, 20 February 2009 (UTC)

You're a hero. What do you do with them? Add reflist, or delete? Debresser ( talk) 09:09, 20 February 2009 (UTC)

If you just add reflist there is a request I'd like to ask from you. Could you teach me how to write a bot that adds "prod" to all of them? Debresser ( talk) 11:29, 20 February 2009 (UTC)

Y Done I finished adding {{ reflist}} to all of them. I'm not sure if it is a good idea to add prod tags to over 500 articles. Lego Kontribs TalkM 02:04, 21 February 2009 (UTC)

That's just great. Perhaps you would know how to go about recommending all of them for deletion? Debresser ( talk) 17:03, 21 February 2009 (UTC)

You're right. We have a discussion now at Wikipedia_talk:WikiProject_Astronomical_objects#main_belt_asteroids. I am arguing that there is consensus for turning all those stubs into redirects to a big list. You'd like to comment? Debresser ( talk) 22:27, 23 February 2009 (UTC)

Archive search box adder

Some articles' talk pages have huge archives. Adding an auto search box to each one of them is an excellent duty for a bot, or could be an additional task for an existing bot. It would just add {{Archive box|auto=yes|search=yes}} at the proper line of the talk page. Logos5557 ( talk) 22:44, 22 February 2009 (UTC)

This would definitely make searching a number of talk pages easier. Smallman12q ( talk) 22:47, 24 February 2009 (UTC)

Bot to update project statistics

I'd like to request the creation of a bot to update these two project pages: Wikipedia:WikiProject Video games/Traffic statistics, Wikipedia:WikiProject Video games/Article statistics. It's not necessary for the bot to update the charts; but if it can then it's an added bonus. Thanks! SharkD ( talk) 02:22, 25 February 2009 (UTC)

i want a bot for theninja-rpg

sir, i want a bot for theninja-rpg.com. it is a text-based online game. i want the bot to create ryo (in-game currency) and to train my character. please help me sir —Preceding unsigned comment added by Rajansh mamoria ( talkcontribs) 15:55, 26 February 2009 (UTC)

Impossible This page is for requesting bots that do tasks on the English Wikipedia, not for requesting cheats for multiplayer games. -- Nn123645 ( talk) 17:27, 26 February 2009 (UTC)

Different name on Commons and Superseded Image

Is there a bot that fixes links to files that were uploaded to Commons under a different name (regular links too, not just image links)? I could have sworn that there was, but I have not seen anything at File:FlagTrujillo.JPG for three days. If not, could there be? The same for Template:Superseded-Image. ~ JohnnyMrNinja 09:01, 27 February 2009 (UTC)

Needs X Infobox

To reduce clutter on talk pages, and to make sure pages are categorized properly, since most of the Needs X Infobox templates just place pages in the generic Requested Templates category when most WikiProjects have specialized categories that make things easier: I suggest the following templates be replaced on article talk pages with the appropriate WikiProject banner carrying the needs-infobox switch, or, if the WikiProject banner already exists, that the template be removed and the banner updated with the switch.

Template: {{ Needs television infobox}}
WikiProject(s): WikiProject Television
Replace with: {{WikiProject Television|needs-infobox=yes}}

Template: {{ Needs football biography infobox}}
WikiProject(s): WikiProject Football and WikiProject Biography
Replace with: {{WPBiography|sports-work-group=yes|needs-infobox=yes}} and {{Football|needs-infobox=yes}}

It might also be nice if the bot could check whether {{Infobox....}} already exists in the article, and list those separately so they can be manually checked, but that isn't really needed. Peachey88 ( Talk Page | Contribs) 10:53, 23 February 2009 (UTC)

Coding... Are there any other templates, or is it just these two? [[Sam Korn]] (smoddy) 11:31, 23 February 2009 (UTC)
Just those two at the moment that I'm aware of (and could find (well except for the general needs infobox one, but that one shouldn't be done)). Peachey88 ( Talk Page | Contribs) 11:52, 23 February 2009 (UTC)
BRFA filed. Wikipedia:Bots/Requests for approval/Sambot 2. [[Sam Korn]] (smoddy) 16:03, 23 February 2009 (UTC)
Y Done. List of pages that need attention at User:Sambot/Tasks/Football infoboxes. {{ Needs television infobox}} is now orphaned. [[Sam Korn]] (smoddy) 17:43, 27 February 2009 (UTC)

Change links related to recent page move

I need a bot to accomplish one fairly simple task:

  1. Change links to Fukuoka, Fukuoka into links to Fukuoka

This is across all namespaces, if possible. Thanks! ··· 日本穣 ? · Talk to Nihonjoe 02:51, 23 February 2009 (UTC)

Sorry, WP:R2D overrides here unless "Fukuoka, Fukuoka" will eventually be made into a separate article and not always be a redirect. §hep Talk 03:31, 23 February 2009 (UTC)
If you are going to change it, you should change it to "[[Fukuoka]]" or "[[Fukuoka]], [[Fukuoka Prefecture]]", etc., not to "[[Fukuoka|Fukuoka, Fukuoka]]" as that would be stupid, cumbersome, and redundant. — CharlotteWebb 12:45, 28 February 2009 (UTC)

IoE links - changed string in URL

I don't know whether it will be possible for a bot to take this on, but it affects thousands of articles and would take months or years by hand. The web site Images of England (IoE) lists all of the listed buildings in England and is frequently used as a reference, including in many FA- and GA-class articles. They have recently changed the format of the URLs returned by their database: each unique building number is the same, but any URL containing the string "search/details" will only work if the reader is already logged in to IoE; for anyone else it presents a blank screen. If this section of the URL is replaced with "Details/Default" it works for everyone, with no need to log in. As an example, compare this with this one: both target information about St Andrews Church in Chew Stoke with item number 32965, but the first one fails and the second one works. If a bot were able to do this replacement that would be great. If I've not explained it properly or you need further information, please don't hesitate to contact me.— Rod talk 18:16, 28 February 2009 (UTC)
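The substitution itself is a one-line string replacement; a sketch (the example URL is illustrative, not the real IoE address):

```python
def fix_ioe_url(url: str) -> str:
    """Swap the login-only path segment for the public one."""
    return url.replace("search/details", "Details/Default")

fix_ioe_url("http://www.example.org/search/details.aspx?id=32965")
# → "http://www.example.org/Details/Default.aspx?id=32965"
```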

Coding... -- JLaTondre ( talk) 18:59, 28 February 2009 (UTC)
BRFA filed at Wikipedia:Bots/Requests for approval/JL-Bot 4. -- JLaTondre ( talk) 19:30, 28 February 2009 (UTC)

WP Bosnia and Herzegovina

I need a bot to modify talk pages that have {{WikiProject Europe|BiH=yes}} in them, changing it to {{WikiProject Bosnia and Herzegovina}}. PRODUCER ( talk) 19:34, 28 February 2009 (UTC)

I'll put in a BRFA. What (if any) category does that template and parameter combination put the talk page in?-- Rockfang ( talk) 20:48, 28 February 2009 (UTC)
From the template's doc it looks like Category:WikiProject Bosnia and Herzegovina articles. §hep Talk 20:52, 28 February 2009 (UTC)
Gracias.-- Rockfang ( talk) 21:20, 28 February 2009 (UTC)
Producer, do you want all types of talk pages, or only certain namespaces?-- Rockfang ( talk) 21:20, 28 February 2009 (UTC)
All of them I suppose PRODUCER ( talk) 22:10, 28 February 2009 (UTC)
Ok. I filed a BRFA. Now, we wait.-- Rockfang ( talk) 03:10, 1 March 2009 (UTC)
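The core replacement is a regex substitution along these lines (a sketch only; a real run would also need to preserve any other parameters set on the Europe banner, such as class= or importance=):

```python
import re

def swap_banner(wikitext: str) -> str:
    """Replace the Europe banner's BiH flag with the dedicated project banner."""
    return re.sub(r"\{\{\s*WikiProject Europe\s*\|\s*BiH\s*=\s*yes\s*\}\}",
                  "{{WikiProject Bosnia and Herzegovina}}", wikitext)

swap_banner("{{WikiProject Europe|BiH=yes}}")
# → "{{WikiProject Bosnia and Herzegovina}}"
```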

Creative Commons Flickr Bot

Proposal 1

Make a bot to compile a list of images on Flickr that are licensed under the Creative Commons attribution licence and could replace image:replace this image male.svg and image:replace this image female.svg. These would then be sighted, to check they are not blatant copyvios, then uploaded to Commons.

Proposal 2

Make a bot that transfers all images on Flickr that are licensed under the Creative Commons attribution licence (but crucially not images of people) to Commons.

See related discussion here

-- DFS454 ( talk) 14:04, 28 February 2009 (UTC)

As this is a request for bot work that will occur on Wikimedia Commons, this needs to done via their bot process. They have their own bot request page. You should make your request there. -- JLaTondre ( talk) 15:08, 28 February 2009 (UTC)
There is a long established mechanism on Commons for transferring images from flickr. For a technical place to jump in, try here: Commons:User:Flickr upload bot/step-by-step - J JMesserly ( talk) 16:10, 28 February 2009 (UTC)
I am aware of Flickr upload bot; what I meant was an automated process which scans media by itself and is only sighted (Flagged revisions) by users. Technically, how hard is it to code something like this? -- DFS454 ( talk) 21:48, 28 February 2009 (UTC)
I dunno, how hard is it to understand something like this?
r'\[(?P<url>https?://[^\|\] ]+?(\.pdf|\.html|\.htm|\.php|\.asp|\.aspx|\.jsp)) *\| *(?P<label>[^\|\]]+?)\]'
Actually, that one is pretty easy, as they go. You have to fetch html pages and scan them for what you want, then execute more page fetches depending on what those pages tell you, and then of course all your code is broken the following week when someone decides to change their html page, breaking one of your routines, and you no longer recall how it works so you have to rewrite it. Other than that, it is a piece of cake. - J JMesserly ( talk) 08:06, 1 March 2009 (UTC)
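For what it's worth, the sample pattern above works as-is under Python's re module (the input string is hypothetical):

```python
import re

# The pattern quoted in the comment above, split for readability.
LINK = re.compile(
    r'\[(?P<url>https?://[^\|\] ]+?(\.pdf|\.html|\.htm|\.php|\.asp|\.aspx|\.jsp))'
    r' *\| *(?P<label>[^\|\]]+?)\]')

m = LINK.search("see [http://example.com/paper.pdf | the paper] for details")
m.group("url")    # → "http://example.com/paper.pdf"
m.group("label")  # → "the paper"
```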

This is an AfD from earlier this year that resulted in the deletion of a few disambig pages from an old scheme of organizing that list. There are still quite a few links to it, but nobody followed up on the author's suggestion to have a bot change them. It'd probably take 10 minutes at most in AWB, but I no longer have Windows, so I'm asking here. From a quick count it's only about 200 links in total. Changing the links to List of Latin phrases or List of Latin phrases (full) would probably be fine, though if you're feeling really ambitious you could actually look at each link and send it to whichever of the 6 pages the list is now broken down into is appropriate. Thanks, -- Rory096 16:01, 1 March 2009 (UTC)

Not sure why anyone would have directly linked to these pages anyway. It would make more sense to link to the actual Latin phrase, then let the phrase redirect to whatever subdivision of the list currently contains that phrase. This is a case where pre-emptively bypassing redirects is actively harmful. Also I'm concerned about GFDL attribution issues if material was originally added to "A–E" but then cut and pasted to A–B when the former page grew too large. All of these obsolete segment titles should at least be undeleted and redirected to the main list page. If there is some way to find a list of Latin phrases which are currently a deleted redirect (because some bot discovered that they pointed to a deleted section of the list), these should be undeleted too. I'd estimate that the cleanup process will be more complicated than you think. — CharlotteWebb 16:54, 1 March 2009 (UTC)

IMDb links

Would it be possible to have a bot check through all articles using the |imdb_id= parameter in {{ Infobox Film}} to see which of these do not otherwise contain a link to IMDb, i.e. through the use of {{ imdb title}} (or any of its redirects), and present this data in the form of a numbered list? Such information would be useful in an ongoing debate over the use of such parameters in the infobox. Thanks in advance for any help! :) PC78 ( talk) 15:34, 2 February 2009 (UTC)
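A first-pass check could scan each transclusion's wikitext like this (a sketch; the real job would page through the template's transclusions via the API, and these regexes ignore redirects of {{ imdb title}}):

```python
import re

def lacks_imdb_external_link(wikitext: str) -> bool:
    """True when |imdb_id= is set but no {{imdb title}} template appears."""
    has_param = re.search(r"\|\s*imdb_id\s*=\s*\S", wikitext) is not None
    has_link = re.search(r"\{\{\s*imdb title", wikitext, re.IGNORECASE) is not None
    return has_param and not has_link

lacks_imdb_external_link("{{Infobox Film|imdb_id=0123456}}")  # → True
lacks_imdb_external_link("{{Infobox Film|imdb_id=0123456}} {{imdb title|0123456}}")  # → False
```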

Bump. Is this request feasible or not? PC78 ( talk) 16:24, 9 February 2009 (UTC)
This request is certainly feasible, I'll get coding and see what I can come up with. As the bot isn't actually going to be editing - only reading - pages it won't need approval, so I should be able to run through the transclusions later today. I'll put the data in a subpage of my bot's userspace (or anywhere else sensible if you'd prefer). Richard 0612 10:11, 13 February 2009 (UTC)
I'm having a few issues with non-Latin characters in page titles, but I haven't given up, it'll just take a bit longer! Richard 0612 21:05, 13 February 2009 (UTC)
Perhaps someone else could take a look at this. Pywikipedia seems not to like Unicode characters... Richard 0612 22:22, 17 February 2009 (UTC)
I'll take a look. Anomie 03:29, 18 February 2009 (UTC)
Y Done Unless I screwed something up in my coding, this should be the list. There are 6112 article pages, and a handful of others. Feel free to copy it elsewhere if a permanent link into my sandbox isn't sufficient. Anomie 12:05, 18 February 2009 (UTC)
Many thanks to you both! PC78 ( talk) 15:58, 18 February 2009 (UTC)

How feasible would it be for a bot to remove the link from the infobox and add it to the relevant "External links" section of the article? PC78 ( talk) 17:04, 21 February 2009 (UTC)

Coding... It would be quite feasible, but I suggest having the bot process the other obsolete external link parameters (website and amg_id) fields at the same time, to get all three in one edit. I'd also have the bot generate a list of pages that need manual fixing or extra attention. Anomie 16:43, 22 February 2009 (UTC)
Yes, of course. There seems to be consensus at Template talk:Infobox Film for removing these parameters from the infobox, so all three will need to be processed. Whatever you think best. PC78 ( talk) 17:24, 22 February 2009 (UTC)
BRFA filed Wikipedia:Bots/Requests for approval/AnomieBOT 24. Anomie 03:41, 23 February 2009 (UTC)
Thanks to PC78 for getting this started. Lugnuts ( talk) 14:00, 1 March 2009 (UTC)
Why is it preferable to have them in external links rather than in the infobox? Шизомби ( talk) 15:40, 1 March 2009 (UTC)
We are so not going to get into another debate here. You want Template_talk:Infobox_Film#External_links_.28imdb.2C_amg.2C_etc.29 for all the info. - Jarry1250 ( t, c) 15:55, 1 March 2009 (UTC)

You are doing one of the most controversial things I've ever seen a bot do. Please bot revert and wait for a real discussion on the matter. Very poor form. - Peregrine Fisher ( talk) ( contribs) 07:53, 2 March 2009 (UTC)

That looks like consensus to me. §hep Talk 09:01, 2 March 2009 (UTC)
This is a big change. Not quite as big as saying "no fair use images in wikipedia", but similar. One that should not be enforced by a bot. - Peregrine Fisher ( talk) ( contribs) 09:04, 2 March 2009 (UTC)
Removing external links from a single infobox is not the "big change" you seem to think it is. This was discussed at Template talk:Infobox Film, the discussion was advertised at the relevant WikiProject, and a consensus was reached; it is not so important or controversial that it requires a centralised discussion for the whole of Wikipedia, nor is this the venue for reopening the discussion. Please direct your comments to the template talk page. Thank you. PC78 ( talk) 12:25, 2 March 2009 (UTC)

WikiProject/Taskforce Spammer.

Sometimes it would be incredibly useful to be able to contact all WikiProjects and taskforces at once. I've looked for bots that can do this, and I haven't found any currently able to contact all projects and taskforces in one fell swoop. Anyone willing to code this? Headbomb { ταλκ κοντριβς –  WP Physics} 04:35, 21 February 2009 (UTC)

If you can give me a specific list of categories/member lists to hit, I think I should be able to do it with AWB pretty efficiently (10 epm). Robert Skyhawk So sue me! ( You'll lose) 05:35, 21 February 2009 (UTC)
Sure, see (last column on the right gives the link to the wikiprojects/taskforces)
There are a lot of inactive projects, but if they are inactive no one should care that they are contacted. Some projects and taskforces probably aren't listed, but that's the best list I know of. Headbomb { ταλκ κοντριβς –  WP Physics} 05:49, 21 February 2009 (UTC)
Wow...you really mean all of them don't you? But yes, if you have a message to distribute to all of these people, then I think AWB should be able to get it done in a reasonable amount of time...we could even organize multiple bots to split the workload. There is one potential issue though...you'll definitely need approval for this task, and I can already see that it will be hard to convince the Bot Approvals Group that you have a message that needs to go to this many people. Keep in mind that almost every user on the project is a member of at least one WikiProject. May I inquire as to exactly what kind of message you are needing to broadcast to all of these people? Perhaps WikiProject talk pages are a better way to go... - Robert Skyhawk So sue me! ( You'll lose) 06:00, 21 February 2009 (UTC) Retracted, see below.
Yes I really do mean all. :P Obviously I'm not delusional enough to think that one could spam all the WikiProjects and Taskforces without approval of some form, but there should at least be a bot that could deliver one-time messages when required. As for the message, it would be to let projects know about a new feature called WP:Article Alerts (which is basically a way for WikiProjects to know about the AFDs, PRODs, WP:FACs, etc., relevant to them), so I'm not really worried about the BAG being against this message being spread out (and the ~1500 or so resulting edits). There are currently about 75 or so subscribers to Article Alerts right now, and feedback has been uniformly positive. User:ArticleAlertbot has been thoroughly tested and we've recently overhauled the page to get ready for a massive influx of projects subscribing at once, as well as an increase in bug reports and feature requests. All that is left to be done is to make the projects aware that this exists. Headbomb { ταλκ κοντριβς –  WP Physics} 06:12, 21 February 2009 (UTC)
I have a lot of sympathy - I spent a good hour on AWB manually posting my alert about my project, which is quite similar to yours (not in competition though, don't worry) to ~200 WikiProjects. It's long (at 4 epm), boring, but it does get the job done without the need for a BRFA, so if all else fails, you could consider it. - Jarry1250 ( t, c) 11:54, 21 February 2009 (UTC)
That does make sense. I am thoroughly satisfied with the job that ArticleAlertBot does. However, I think it would be much more efficient to simply post this message on every active WikiProject's talk page. That would reduce the amount of edits that need to be made dramatically, and would make BRFA approval much easier, while still ensuring that active project members (who presumably watch their projects' talk pages) will be notified of this bot's services. If you do in fact decide that this is what you want to do, then we can go from there. If you still think you want to go with member lists, then I hope we can make that work too. Robert Skyhawk So sue me! ( You'll lose) 15:42, 21 February 2009 (UTC)
If you just want a bot to get a message out to people's talk pages quickly, I have a bot that is already coded to do this and is very efficient. If you want it to go out to WikiProject talk pages it should be easy to modify. It would need a list, category, or page of links or something similar to run from, but I think this would be quicker and more efficient than AWB. (Already approved for posting on talk pages) ·Add§hore· Talk To Me! 18:59, 21 February 2009 (UTC)
  • Robert Skyhawk: I'm not quite sure I understand what the difference is between "post[ing] this message on every active wikiprojects" and what I'm proposing.
  • Addshore: The lists are already given above. Last column gives the Wikiproject links (the bot would obviously post to their talk pages). Headbomb { ταλκ κοντριβς –  WP Physics} 21:41, 21 February 2009 (UTC)
Oh...I understand now. I don't think I quite saw what you were trying to do, but now that I realize that just using the WikiProjects' talk page is what you were trying to do all along, this seems much more reasonable. Robert Skyhawk So sue me! ( You'll lose) 23:33, 21 February 2009 (UTC)
Now, I'm kinda curious about what you thought it was I was asking for. Headbomb { ταλκ κοντριβς –  WP Physics} 05:09, 22 February 2009 (UTC)
It would be nice if the bot skipped the projects that already use AAB. Shouldn't be that hard to remove them from the list as there's only a handful. §hep Talk 05:14, 22 February 2009 (UTC)
Yes, they are all in Category:ArticleAlertbot subscriptions. However, it's been four months since the first subscribers, and quite a lot has changed at WP:AAlerts during that time, so they might not be aware of the recent changes. I think I'd still push for all wikiprojects and taskforces regardless of subscription, if only to give the links to the newly created bug report and feature request pages, but I'll leave this up to the BAG's judgement once they see the actual message. Headbomb { ταλκ κοντριβς –  WP Physics} 05:48, 22 February 2009 (UTC)
If you must know, HeadBomb, for a while I actually thought you wanted to take the member lists of the projects and spam every member's talk page...gross misunderstanding on my part. Robert Skyhawk So sue me! ( You'll lose) 05:44, 22 February 2009 (UTC)
Ugh, that'd be horrible and way out of line. I can see why you'd be concerned. Headbomb { ταλκ κοντριβς –  WP Physics} 05:48, 22 February 2009 (UTC)
So Headbomb, you want a message on all the WikiProject talk pages listed there? If so, yes, my bot can do it and I will throw up a BRFA as soon as you say yes :P ·Add§hore· Talk To Me! 08:20, 22 February 2009 (UTC)
Yuppers. If the BAG wants to wait for the message before it approves, I can have it ready by the end of the day. Headbomb { ταλκ κοντριβς –  WP Physics} 08:25, 22 February 2009 (UTC)
Please see Wikipedia:Bots/Requests_for_approval/Addbot_19. ·Add§hore· Talk To Me! 08:30, 22 February 2009 (UTC)
I have 20 trial edits; if someone would care to give me the message, I will send it to the first 20 pages. ·Add§hore· Talk To Me! 17:11, 24 February 2009 (UTC)
Alright, here goes.

This is a notice to let you know about Article alerts, a fully-automated subscription-based news delivery system designed to notify WikiProjects and Taskforces when articles tagged by their banner enter a workflow such as Articles for deletion, Requests for comment, and Peer review ( full list). The reports are updated on a daily basis, and provide brief summaries of what happened, with relevant links to discussion or results when possible. A certain degree of customization is available; WikiProjects and Taskforces can choose which workflows to include, have individual reports generated for each workflow, have deletion discussions transcluded on the reports, and so on. An example of a customized report can be found here.

If you are already subscribed to Article Alerts, it is now easier to report bugs and request new features. The developers also note that some subscribing WikiProjects and Taskforces use the display=none parameter, but forget to give a link to their alert page. Your alert page should be located at "Wikipedia:PROJECT-OR-TASKFORCE-HOMEPAGE/Article alerts".

Headbomb { ταλκ κοντριβς –  WP Physics} 20:49, 24 February 2009 (UTC)

Any updates? Headbomb { ταλκ κοντριβς –  WP Physics} 01:37, 28 February 2009 (UTC)

Sorry, was away for a few days; I now have a small trial and should use it up tonight. ·Add§hore· Talk To Me! 20:40, 1 March 2009 (UTC)
See this link for the trial edits.
Sorry I meant here ·Add§hore· Talk To Me! 21:04, 1 March 2009 (UTC)
Could someone from the BAG approve or decline this? The sooner this rolls out the better. Headbomb { ταλκ κοντριβς –  WP Physics} 22:07, 3 March 2009 (UTC)
Request approved. When do you want me to go through this list and add this message? Feel free to add your own sig to the message and to change it in any way before I run the bot. Send me a message on my talk page to confirm that you want the message to go out. Thanks. ·Add§hore· Talk To Me! 09:09, 7 March 2009 (UTC)

Lossless Image Optimization and Compression Bot

I have looked around Wikipedia and noticed that most images are uncompressed (including the actual logo, file:wiki.png). I believe that a bot that compresses images would help save bandwidth and reduce page download time. While some people may argue that the savings would be nominal, they would indeed help. Reduced bandwidth would save the Wikimedia Foundation money (remember, your donations pay for that bandwidth) and the reduced page load time would make people with slower connections happier.

On average, I have been able to compress some images by ~25%; some more (5kb for the Wikipedia text logo on www.wikipedia.org), some less (23 bytes for file:wiki.png). Compression can be accomplished in several ways. First, changing the color scale (such as from RGB to greyscale) can save kilobytes. Second is the choice of file type, such as JPEG, PNG, or GIF; in some cases JPEG is better, PNG in others. Lastly, there is the actual compression, through tools such as pngcrush, PNGGauntlet, and PNGOUTWin. The only downside is that, like all compression, it is computationally expensive. Together, these can compress an image by a quarter or more.

I suggest that the bot begin with the standard MediaWiki images, followed by the top 1000 most viewed images. After that, it would simply work in order of "most bandwidth used". While I don't have the actual image download statistics (if someone could put them up, it would be nice), there are savings to be had. As I haven't programmed in years, I don't think I can write an adequate bot, but I can help. If you support this idea or would like to comment, please do so. Smallman12q ( talk) 18:06, 22 February 2009 (UTC)

Edit 1: I would like to clarify that what I have in mind is lossless compression. There is also a discussion at Wikipedia:Village_pump_(technical)#Smaller Wikipedia Logo files Smallman12q ( talk) 22:22, 23 February 2009 (UTC)
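For the "lossless" part, the key invariant is that decompressing the optimized file yields byte-identical data. A stdlib-only sketch of that idea using raw DEFLATE streams (the PNG tools named above apply the same principle to a PNG's IDAT chunk; the sample payload is made up):

```python
import zlib

def recompress(stream: bytes, level: int = 9) -> bytes:
    """Re-deflate at maximum effort; the decoded bytes are unchanged."""
    return zlib.compress(zlib.decompress(stream), level)

original = zlib.compress(b"wiki.png scanline data " * 500, 1)  # quick, loose packing
optimized = recompress(original)
assert zlib.decompress(optimized) == zlib.decompress(original)  # lossless
```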

Here is a site that offers online optimization: http://tools.dynamicdrive.com/imageoptimizer/ 72.90.135.45 ( talk) 18:54, 22 February 2009 (UTC)

Wow, thanks, I didn't know they had an online image "optimizer." That link is very useful! Smallman12q ( talk) 19:39, 22 February 2009 (UTC)

This won't do much good, if any. I think the minimal benefit of compressing the originals would be lost in the thumbnailing process. Plus you seem to ignore the possibility that anyone would want to download the uncompressed originals. The images that we upload are usually not the images that are displayed in articles. For example the Image:Felix Pedro.jpg I uploaded was 483×620px, 79,054 bytes:

http://upload.wikimedia.org/wikipedia/commons/1/1e/Felix_Pedro.jpg

You might argue that this is poorly compressed, with a 3.788 pixel–byte ratio, but most readers won't see this. The thumbnail you see on this page is rendered at 100×128px and uses 3,575 bytes:

http://upload.wikimedia.org/wikipedia/commons/thumb/1/1e/Felix_Pedro.jpg/100px-Felix_Pedro.jpg

Here the compression ratio is actually lower at 3.580. So even if we compressed the hell out of the full-size image (punishing anyone who wanted to print the original photo), the server would likely still generate thumbnails at the same file size as before (probably because it's intended to be fast, rather than efficient—your thumbnails have to be ready instantly when you hit the preview button to ask yourself "how does it look at this size") but probably be of measurably poorer quality. What would be the point of that?

If image loading times are a concern it would be better to use more aggressive compression (different software, or different settings within the same software) for the thumbnailing process rather than adulterating the originals, which shouldn't need to be compressed anyway. — CharlotteWebb 20:16, 22 February 2009 (UTC)
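CharlotteWebb's pixel-per-byte figures above are easy to reproduce; a quick sketch using her numbers:

```python
def pixel_byte_ratio(width, height, file_bytes):
    """Pixels stored per byte of file; a higher value means the file
    is more tightly compressed relative to its dimensions."""
    return width * height / file_bytes

# original upload: 483x620 px at 79,054 bytes
original = pixel_byte_ratio(483, 620, 79054)   # ~3.788
# 100px thumbnail: 100x128 px at 3,575 bytes
thumb = pixel_byte_ratio(100, 128, 3575)       # ~3.580
```

which illustrates the point being made: the thumbnailer's output is no denser than the original, so squeezing the original buys nothing downstream.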

Charlotte is right, the images people actually see when looking at articles are generated by ImageMagick (the image processing software MediaWiki uses) with predefined compression settings. The software does provide access to the original image by clicking on the image on its File: page. This original should be left unadulterated. Chillum 20:18, 22 February 2009 (UTC)

Declined Not a good task for a bot. There is nothing wrong with recompressing a PNG, and nothing in particular wrong with changing to palettized or greyscale if it results in no change to the image (note that some programs don't "like" palettized images with an alpha channel), although as noted above it would not do a whole lot to reduce the bandwidth used in articles. But converting to greyscale when the image uses non-grey colors would be a bad idea, as would reducing the number of colors used while palettizing. Recompressing JPEGs (or converting PNG to or from JPEG) would be a bad idea to do automatically (and not a very good idea to do in general unless you know what you're doing), as JPEG normally uses lossy compression. Anomie 21:57, 22 February 2009 (UTC)

Yes, I doubt a bot could reliably determine whether or not the colors in an image are "close enough" to grey that the viewer wouldn't notice a difference (especially if compression artifacts—often phantom shapes of false color—are present). The thumbnails will render at more or less the same file size regardless of whether the originals are compressed/corrupted. Forget the baby and bathwater, this would be about like pouring half the vodka down the sink (storing it in a smaller bottle), then adding water because you are still serving it in the same size glass. — CharlotteWebb 03:16, 23 February 2009 (UTC)

I believe I forgot to mention that the compression would be lossless. A bot wouldn't need to recognize if the colors "were close enough", only the number of colors present. For example, for about a year, the file:wiki.png file was uploaded as RGB rather than greyscale. And the actual "compression" would only be for PNGs, so it would be lossless. Please assume that the compression is lossless. Also, file conversion such as PNG to GIF could save additional bytes without any quality loss. Please let me know what you think of lossless compression. Smallman12q ( talk) 22:21, 23 February 2009 (UTC)
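For reference, the kind of lossless recompression being proposed is usually delegated to an external optimizer rather than written from scratch. A sketch, assuming the real `optipng` command-line tool is installed (PNGOUT or pngcrush would be comparable):

```python
import subprocess

def optipng_command(path, level=7):
    # -o7 is optipng's most aggressive (and slowest) lossless preset;
    # the pixel data is unchanged, only the PNG encoding shrinks
    return ["optipng", "-o%d" % level, path]

def recompress(path):
    """Recompress a PNG in place, losslessly."""
    subprocess.run(optipng_command(path), check=True)
```

A bot would call `recompress()` on a downloaded copy, compare file sizes, and only re-upload when the result is actually smaller.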

This isn't a good idea for a bot. WP:PERFORMANCE should be added to the above. Also, PNG to GIF is a bad idea...that's why we have WP:PIFU. §hep Talk 22:24, 23 February 2009 (UTC)
WP:PIFU is wrong regarding file types (especially GIFs). Small GIFs can be significantly smaller than PNGs of similar size and quality. And while SVGs can scale better than PNGs, PNGs are smaller. And as for WP:PERFORMANCE, pictures do cost a lot of bandwidth, so minor improvements can be multiplied. Please see several examples at Wikipedia:Village_pump_(technical)#Smaller Wikipedia Logo files. You will notice that with the appropriate compression, a number of pictures (including the Wikipedia logo) can be compressed without sacrificing quality. This in turn will reduce bandwidth usage (which your donations pay for). Smallman12q ( talk) 21:23, 24 February 2009 (UTC)
The savings are usually trivial. For example, if the numbers on the VP are correct, the reduced size of the logo will save the Foundation roughly 3 cents a year in bandwidth costs. -- Carnildo ( talk) 01:19, 25 February 2009 (UTC)
Comment I would like to say that I don't know what the actual statistics are. Perhaps if someone could provide a link or request them from the Wikimedia Foundation, then that can be argued. But no statistics (and hence no empirical evidence) means that we just believe what we want to believe (and that is very subjective). Smallman12q ( talk) 23:26, 25 February 2009 (UTC)
Even if we uploaded smaller lossless copies and deleted the originals, deleted images are not actually deleted. They are still there for an admin to view or undelete. This is as intended as we always want the original version because many of the licenses we use require it. No space would be saved. Chillum 01:28, 25 February 2009 (UTC)
Comment I don't believe I said the originals would be deleted. I'm not here to save space. (The entire wikipedia is still less than 1TB so there isn't really much space to save). Smallman12q ( talk) 23:26, 25 February 2009 (UTC)
Also it would not save bandwidth, as all images shown on pages are created by MediaWiki using ImageMagick; the originals aren't sent unless you go and download them. If there was a need to save bandwidth, the compression settings could be changed there. Also, since when are PNGs smaller than SVGs? I suppose if it was a very complex drawing it could be, but SVGs are normally pretty small. Chillum 01:31, 25 February 2009 (UTC)
Comment, then perhaps the MediaWiki software should be modified. Small images stored as optimized PNG are generally smaller than SVG. This can be seen at the Village pump thread. Smallman12q ( talk) 23:26, 25 February 2009 (UTC)

←Not sure what Smallman is trying to say, but maybe he meant that the SVG→PNG thumbnails created by ImageMagick (or rsvg or whatever) have a larger file size than a visually similar PNG that was created manually. I can believe that, but that doesn't mean we should scrap the image-conversion software and leave a small man inside the server in charge of creating thumbnails. It… wouldn't scale.

Comment small man...clever ^.^. I don't mean manually created PNGs, I mean automated ones. It's not hard...create a PNG from an SVG and then compress the PNG. It will (generally) be smaller than the SVG. (I'm referring to small PNGs...the larger the PNG, the less likely it is to be smaller than its SVG counterpart...not to mention the quality degradation.)

Seriously something that actually would save bandwidth would be to tell the server to embed SVGs directly when the file size is smaller than that of the thumbnail that would otherwise be shown for the selected dimensions. But I suspect the outcry against this would be horrific. — CharlotteWebb 02:52, 25 February 2009 (UTC)

That's basically what I want...convert the SVGs to PNGs when the file size is smaller...why there would be an outcry...I don't understand. Smallman12q ( talk) 23:26, 25 February 2009 (UTC)
SVGs are text-based and much easier to correct if there are errors; you can also translate them for sister projects without any problems. §hep Talk 23:43, 25 February 2009 (UTC)
No, what I mean is this is already being done in all cases, even when the PNG is much larger than the SVG. It would be difficult to get this changed, because certain users/browsers cannot or do not want to directly view the SVG files. — CharlotteWebb 12:33, 28 February 2009 (UTC)
Perhaps there was such a bot before? I found User:Pngbot on File:Pinguim Crystal 2000.png. Does anyone remember a PNG-optimizing bot? Smallman12q ( talk) 02:11, 2 March 2009 (UTC)
That bot wasn't approved here, it was approved on Commons over 2 years ago. And is no longer active there. §hep Talk 17:06, 2 March 2009 (UTC)
Any reason why it was shut down? It seemed to have helped a bit. And if it was approved on Commons, why wouldn't it be approved here? Smallman12q ( talk) 20:15, 2 March 2009 (UTC)
The BRFA process there is a lot quieter and I see it as less strict; 2 years ago it wasn't a real force like BRFA here. The bottom line is multiple bot ops have said it's a bad idea for a bot to do; it probably wouldn't get approved by BAG even if someone said they'd take on the task. There are some things that humans have to monitor. As a side thought, since all versions of an image are stored on the servers (even if deleted), wouldn't adding another image to that list just increase the amount of content we have to hold? §hep Talk 20:19, 2 March 2009 (UTC)
Well the idea is to reduce bandwidth and page loading time...storage these days is...well...very, very low cost. Now what I don't understand is why it would need to be monitored by a person (for lossless compression). Why would a person need to monitor lossless compression? What is there to monitor? Smallman12q ( talk) 02:06, 3 March 2009 (UTC)

As noted on the VP, even your lossless changes weren't lossless. §hep Talk 04:21, 3 March 2009 (UTC)

That was a mistake on my part when I tried to optimize the images manually by reducing the color palette size.

Above is a simple example of lossless PNG compression that can be recreated with a simple 10-trial run on PNGOUTWin or PNGGauntlet. Smallman12q ( talk) 12:45, 3 March 2009 (UTC)

Here is an excellent example in which compression could save some notable bandwidth... This image appeared on March 3, 2009 on the front page. [2]

The front page gets an average of 5 million views a day. Every 175 thousand times the image is viewed, 1GB would be saved. If the image is viewed 1 million times, then 5.5GB of bandwidth would be saved. Smallman12q ( talk) 00:57, 4 March 2009 (UTC)
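Checking the arithmetic on those round numbers (no official statistics were provided, so these are the figures quoted above, not measurements): 1 GB per 175,000 views works out to roughly 5.7 KB saved per view, so a million views saves on the order of 5-6 GB, in the same ballpark as the figure given.

```python
BYTES_PER_GB = 10 ** 9

saved_per_view = BYTES_PER_GB / 175_000                      # ~5,714 bytes per view
saved_1m_views = saved_per_view * 1_000_000 / BYTES_PER_GB   # ~5.7 GB
```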

As has been stated several times before in this discussion, usually the original images are not downloaded when the front page is viewed, only the thumbnail is. You pointed out a valid case above in which this is not true. According to this active discussion at Commons, the thumbnailer for GIFs was disabled some time ago for server performance reasons. Apparently they cannot be converted to PNGs using an automatic process because animated GIFs cannot be distinguished from single-frame GIFs in software. Wronkiew ( talk) 01:45, 4 March 2009 (UTC)
Yes...it's best to give an example, as I've seen. I would like to point out that your logic behind distinguishing single frames and multiple frames is flawed: software can distinguish between single frames and multiple frames. Perhaps a software upgrade is in order? Smallman12q ( talk) 13:17, 4 March 2009 (UTC)
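On the detectability point: single-frame and animated GIFs are indeed distinguishable in software, since each frame begins with its own image descriptor byte (0x2C). For illustration only (a real bot would lean on an image library such as PIL or ImageMagick rather than hand-rolled parsing), a frame counter that walks the block structure of a well-formed GIF:

```python
def gif_frame_count(data):
    """Count image frames in a GIF by walking its block structure."""
    if data[:6] not in (b"GIF87a", b"GIF89a"):
        raise ValueError("not a GIF")
    flags = data[10]                      # logical screen descriptor flags
    pos = 13
    if flags & 0x80:                      # skip the global color table
        pos += 3 * 2 ** ((flags & 0x07) + 1)
    frames = 0
    while data[pos] != 0x3B:              # 0x3B terminates the file
        block = data[pos]
        pos += 1
        if block == 0x21:                 # extension: skip its label byte
            pos += 1
        elif block == 0x2C:               # image descriptor: one per frame
            frames += 1
            lflags = data[pos + 8]
            pos += 9
            if lflags & 0x80:             # skip any local color table
                pos += 3 * 2 ** ((lflags & 0x07) + 1)
            pos += 1                      # LZW minimum code size byte
        else:
            raise ValueError("corrupt GIF")
        while data[pos]:                  # skip length-prefixed sub-blocks
            pos += 1 + data[pos]
        pos += 1                          # their zero-length terminator
    return frames
```

A count greater than 1 means animated; whether the thumbnailer should act on that is the separate server-performance question discussed above.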
I'd just like to point out the overhead from HTTP headers is ½ KB. Your efforts would be far better spent improving the crappy thumbnailer. — Dispenser 15:18, 4 March 2009 (UTC)
Since the thumbnailer creates images from the actual images, both need work (but you are right, the thumbnailer used here is inefficient, and doesn't properly compress the PNGs when converting from SVGs). Smallman12q ( talk) 22:38, 4 March 2009 (UTC)

Redirect tagger

Since Article alerts has launched, its scope now includes workflows such as WP:RfD. However, redirects are very rarely tagged, which makes this feature less useful than it could be. So how about having a bot browse articles, check the "what links here", then tag the redirects with the same banners as the target article?

For example, quark has one redirect, quarks. The redirect tagger would copy the banners from talk:quark, and assess talk:quarks as redirect-class / NA-importance. It could run on a per-project basis, or continuously, whichever makes more sense to the BAG. I know WP:PHYS would be interested, and I'm sure other projects will show interest as well. Headbomb { ταλκ κοντριβς –  WP Physics} 03:29, 26 February 2009 (UTC)

WikiProject tagging of redirects is apparently controversial, see some of the discussion at Wikipedia:Bots/Requests for approval/MelonBot 11 for example. Anomie 03:47, 26 February 2009 (UTC)
I've placed a notice on that BRFA to make sure bots aren't undoing or preventing each other's work. Headbomb { ταλκ κοντριβς –  WP Physics} 03:58, 26 February 2009 (UTC)
On a per-project basis...I'm pretty sure almost any of the auto-assessor bots categorized above could do this. Most projects just don't use redirect class, and some even discourage the tagging of redirects. §hep Talk 03:49, 26 February 2009 (UTC)
On a per-project basis then. Headbomb { ταλκ κοντριβς –  WP Physics} 03:58, 26 February 2009 (UTC)
The tool should be changed so it automatically checks all the redirects (this should be easy enough to do); tagging them is unproductive. — Dispenser 06:58, 4 March 2009 (UTC)
What tool? ArticleAlertbot? It depends on categories (populated by banner tagging) and runs on a daily basis. Making it check all redirects for a project like WP:BIO is just not feasible nor desirable, and would introduce an unnecessary strain on the servers (and considerably slow down the bot). Headbomb { ταλκ κοντριβς –  WP Physics} 07:42, 4 March 2009 (UTC)
Yet this is exactly what you're asking for; it takes a considerable amount of resources for an edit. This is to say nothing of MediaWiki maintaining the category, compared to a simple JOIN statement done on the toolserver or with the API. I would seriously consider putting the redirect class up for deletion, just to prevent your line of thought. — Dispenser 14:08, 4 March 2009 (UTC)
The difference is AABot would do it everyday with your proposal, vs. once in a blue moon with the redirect tagger, only when a project feels like it's worth doing. Headbomb { ταλκ κοντριβς –  WP Physics} 20:40, 4 March 2009 (UTC)
Also, I'm not asking for a redirect fixer, I'm asking for a redirect tagger, so that ArticleAlertbot can pick them up when they enter workflows such as WP:RfD. Headbomb { ταλκ κοντριβς –  WP Physics} 01:50, 5 March 2009 (UTC)
I don't think you want to try to prove your point by nomming all of these for deletion. I suggest that Headbomb just ask any WikiProject tagger to do the run for him; but since it's on a per-project basis, almost every project that wants their redirects tagged already has them tagged. §hep Talk 22:35, 4 March 2009 (UTC)
WikiProject taggers can't do it, because for taggers to work, the pages need to be put in categories. Those categories you've just pointed to are those that are already tagged, so it's pointless to do anything with them. What I want is a bot that places redirects in categories, according to what WikiProject their targets are part of. Headbomb { ταλκ κοντριβς –  WP Physics} 01:57, 5 March 2009 (UTC)
They most certainly can. They would work off of Category:X articles by quality or similar. Load all of the articles in the cat, get the redirects of each article (What Links Here or similar), and tag their talk pages with {{ WikiProject X|class=Redirect|importance=NA}}. It's that simple. §hep Talk 02:12, 5 March 2009 (UTC)
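The run described above reduces to very little logic once the article list and each article's redirects are in hand. A sketch (`WikiProject X` is a placeholder banner name; fetching the lists would go through AWB or the API):

```python
def redirect_talk_tag(banner):
    # the tag placed on each redirect's talk page
    return "{{%s|class=Redirect|importance=NA}}" % banner

def tags_for(redirects, banner):
    """Map each redirect title to the banner its talk page should get."""
    return {title: redirect_talk_tag(banner) for title in redirects}
```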
I wasn't aware that existing taggers had that feature in them. Which of them can do it? Headbomb { ταλκ κοντριβς –  WP Physics} 02:17, 5 March 2009 (UTC)

(←)Not sure, I'm sure some do though. I'd do it for you, but AWB has been on the fritz for me recently. If you have AWB it's a simple matter of 3 steps or so to get a complete list of all redirects for a project. §hep Talk 02:20, 5 March 2009 (UTC)


You seem to have missed the point. The cost of the categories is nearly equal to the JOINs. It would take about 2 years of using the JOIN method every day for it to equal the cost of the same tagging run. In addition, page regeneration will happen as the template updates, continuing to increase the cost. So the tagging method has higher initial and running costs. — Dispenser 23:26, 4 March 2009 (UTC)
JOIN? Headbomb { ταλκ κοντριβς –  WP Physics} 01:51, 5 March 2009 (UTC)
Bots tag talk pages every day. If a project wants their redirects categorized we're not ones to stop them. §hep Talk 01:53, 5 March 2009 (UTC)
Many bots also do a poor job of what they're supposed to do. Redirect tagging falls under "just because you can do something doesn't mean that you should". — Dispenser 05:33, 5 March 2009 (UTC)
JOIN is the command to (temporarily) merge two database tables. This is done on an almost constant basis with page titles in MediaWiki. — Dispenser 05:33, 5 March 2009 (UTC)

And of course, there is the small matter of updating every redirect talk page whenever a banner is added to or removed from the target page. The whole point of redirects and templates is to avoid duplication, not perpetuate it. The MediaWiki architecture is specifically designed to be as quick and efficient as possible in outputting data, with corresponding sacrifices on inputting it. Almost never will an argument that "editing page X once is better than reading data Y times" prove genuinely valid. Happymelon 15:22, 5 March 2009 (UTC)

Need a bot to change a template link

Hello,

I need a bot to make a change to a template location on about 250 or so portal pages. The templates that these portals use were created in the wrong namespace, "Portal:". As part of some housecleaning, I moved the templates to the proper namespace and need a bot to update the links on all of the pages so that they avoid the redirect.

The templates are:

Thank you, -- Jeremy ( blah blah 08:34, 27 February 2009 (UTC)

If it's purely to avoid the redirect - and you don't need the name for some other purpose - then it fails under WP:R#NOTBROKEN. - Jarry1250 ( t, c) 08:54, 27 February 2009 (UTC)

Actually I was more concerned with the problem covered in the next section: Aliases for templates can cause confusion and make migrations of template calls more complicated. For example, assume calls to T1 are to be changed ("migrated") to some new template TN1. To catch all calls, articles must be searched for {{T1}} and all aliases of T1 (T2 in this case). -- Jeremy ( blah blah 09:11, 27 February 2009 (UTC)

There is some sense to the idea of removing the cross-namespace redirects that have been created here. [[Sam Korn]] (smoddy) 18:16, 27 February 2009 (UTC)
I still would like to have this done, per the reasons I and Sam Korn have put forth. Thanks again, -- Jeremy ( blah blah 08:20, 1 March 2009 (UTC)
BRFA filed at Wikipedia:Bots/Requests for approval/Erik9bot 4. Erik9 ( talk) 23:35, 1 March 2009 (UTC)
 Done [3]. Erik9 ( talk) 01:18, 2 March 2009 (UTC)

Updating bot

A lot of the to-do templates for the task forces of WP:MILHIST have blue links in the requested articles section, meaning that those articles are no longer requested. Could a bot go through and remove these as they are created? Or be run every 24 hours to remove them as they are created? Just a question; I have a little bot programming experience, but not enough for it to help. TARTARUS talk 01:23, 28 February 2009 (UTC)

Coding... Lego Kontribs TalkM 01:36, 28 February 2009 (UTC)
This is such a great help, thank you very much! TARTARUS talk 01:39, 28 February 2009 (UTC)
BRFA filed WP:Bots/Requests for approval/Legobot 11 Lego Kontribs TalkM 01:32, 5 March 2009 (UTC)

Deleted template removal

Resolved

Template:Infobox movie certificates was recently deleted at WP:TFD, but the template link is still present in many articles - too many to be easily removed individually. [4] I tried to remove them myself using AutoWikiBrowser, but because of the parameter within the template, it could not be done using the program, and therefore a bot would do much good here. – Dream out loud ( talk) 18:39, 1 March 2009 (UTC)

Why couldn't it be done with AWB? AWB does regex, so "\{\{Template:Infobox movie certificates[^}]*\}\}" with the ignore case box checked should work just as well as any bot could. - Jarry1250 ( t, c) 18:57, 1 March 2009 (UTC)
 Done. I did it with AWB; as Jarry1250 said you can do things like this easy enough with it if you know how to construct the regex.-- Dycedarg ж 21:04, 1 March 2009 (UTC)
Ok, thanks guys. I don't know much about constructing regex through AWB, otherwise I would have done it myself. – Dream out loud ( talk) 21:12, 1 March 2009 (UTC)
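For the record, the same removal works outside AWB in any regex-capable language. A Python sketch of Jarry1250's pattern, with the same caveat it carries in AWB: `[^}]*` breaks if a parameter value itself contains a nested template.

```python
import re

# optional "Template:" prefix and leading newline; case-insensitive
CERT = re.compile(r"\n?\{\{(?:Template:)?Infobox movie certificates[^}]*\}\}",
                  re.IGNORECASE)

def strip_certificates(wikitext):
    return CERT.sub("", wikitext)
```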

WikiProject Indiana

Hello! I was about to do something that is going to take me a week probably, and I thought... maybe a bot can help! I would like to have a bot look at the talk page of every article in Category:Indiana and all subcategories and make sure there is a {{WikiProject Indiana}} tag on the talk page. If there is not, I want it to add one without any parameters. This will put them all into the unassessed Indiana articles category. Then the project's members (probably all me) will be able to go through and assess them more quickly, without having to hunt for them first! I have a hunch that there are a couple hundred in there that are not tagged... maybe more. Is this something that can be done by a bot? Charles Edward ( Talk) 03:02, 2 March 2009 (UTC)

Not like that. You have to specify specific categories to check. Subcats is too vague and always leads to chaos. Someone could generate a complete category tree for you to filter, but just going straight through the category isn't allowed anymore. §hep Talk 03:06, 2 March 2009 (UTC)
Ok. I will go the old fashioned way. Thanks for your help! Charles Edward ( Talk) 03:14, 2 March 2009 (UTC)
We can still do it if you give us a solid list of categories that we run through. Lego Kontribs TalkM 04:58, 2 March 2009 (UTC)

Spacing around <ref>s

IIRC the recommended style for spacing and punctuation around footnotes is:

word<ref> not word <ref> (no space before the <ref>) and
word,<ref> and word.<ref> not word<ref>, and word<ref>. (punctuation before the <ref> not after). Many articles have the spacing wrong; would a bot be the right way to fix this? Shreevatsa ( talk) 22:54, 2 March 2009 (UTC)
I think WP:AWB would probably be the best way to fix this. — Nn123645 ( talk) 01:52, 3 March 2009 (UTC)
You do not recall correctly; see WP:REFPUNC. Anomie 02:06, 3 March 2009 (UTC)
Thanks for the link, but I don't see how it's different from what I said. (No spaces before the footnotes, and punctuation—other than dashes—occurs before the footnote. I wasn't thinking about dashes; maybe that's what you meant? In which case thanks again for pointing it out.) Shreevatsa ( talk) 03:14, 3 March 2009 (UTC)
I was referring to this:

Some editors prefer the in-house style of journals such as Nature, which place references before punctuation. If an article has evolved using predominantly one style of ref tag placement, the whole article should conform to that style unless there is a consensus to change it.

It's unlikely that a bot could make that determination, and errors would result in excessive controversy. Personally, I like the "refs after punctuation" style, but... Anomie 03:25, 3 March 2009 (UTC)
Oh, thanks! I don't know how I missed that. :) Shreevatsa ( talk) 04:16, 3 March 2009 (UTC)
commonfixes.py gets around that requirement by only operating if "after the reference" is the majority style (>50%). But the routine is very tricky to implement if you want to take newlines into consideration. I still haven't gotten it fixed up and have about 20 diffs sitting on my desktop of issues. It's used in PDFbot and my tools, but I wouldn't recommend reusing it until I get the newline issues resolved. — Dispenser 04:32, 4 March 2009 (UTC)
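The majority-style guard Dispenser describes can be sketched like this (a simplification, not commonfixes.py itself: named refs, self-closing `<ref/>`, templates like {{sfn}}, and the newline cases he mentions are all ignored here):

```python
import re

PUNCT_REF = re.compile(r"[.,;:]<ref")     # "ref after punctuation" style
REF_PUNCT = re.compile(r"</ref>[.,;:]")   # "ref before punctuation" style

def fix_ref_placement(text):
    # always safe per WP:REFPUNC: no space between a word and its <ref>
    text = re.sub(r"(\w) +(<ref)", r"\1\2", text)
    # only normalize punctuation placement when "after" is already the
    # article's majority style
    if len(PUNCT_REF.findall(text)) > len(REF_PUNCT.findall(text)):
        text = re.sub(r"(<ref[^>]*>(?:(?!</ref>).)*</ref>)([.,;:])",
                      r"\2\1", text)
    return text
```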

Broken ESPN links

A recent redesign of http://espn.go.com has broken many of the links. The only information I have seen from ESPN itself is an unhelpful message at broken links, for example http://espn.go.com/nba/news/1999/1012/110905.html. See Wikipedia:Help desk#Missing footnote links for a discussion. Special:Linksearch currently displays 1795 links to http://espn.go.com in this search. Manual experimentation on a limited number of cases shows that many links to http://espn.go.com still work but if they are broken then it works to insert "static." or "assets." before espn.go.com, for example http://static.espn.go.com/nba/news/1999/1012/110905.html or http://assets.espn.go.com/nba/news/1999/1012/110905.html. Could a bot go through the links to test them and if they are broken then test whether a replacement works? Both "static." and "assets." worked in the cases I tried but I don't know whether it will always work. PrimeHunter ( talk) 01:35, 3 March 2009 (UTC)
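The repair logic sketches out as follows (hypothetical helper names; the liveness test is injected so the actual HTTP request, and detection of ESPN's soft-404 message pages, can be handled separately):

```python
FALLBACK_HOSTS = ("static.espn.go.com", "assets.espn.go.com")

def candidate_urls(url):
    """Yield the original URL first, then the same path on each fallback host."""
    yield url
    prefix = "http://espn.go.com/"
    if url.startswith(prefix):
        path = url[len(prefix):]
        for host in FALLBACK_HOSTS:
            yield "http://%s/%s" % (host, path)

def first_working(url, is_alive):
    """Return the first candidate accepted by is_alive(), else None."""
    for candidate in candidate_urls(url):
        if is_alive(candidate):
            return candidate
    return None
```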

Coding... X clamation point 01:36, 3 March 2009 (UTC)
BRFA filed X clamation point
Thanks. I guess you meant http://static.espn.go.com and not http://static.go.com. PrimeHunter ( talk) 05:40, 3 March 2009 (UTC)

"Year" removal

It has been proposed at Wikipedia talk:WikiProject Years#"Year" that the word "Year", apparently added to all year articles at the outset, be eliminated, as specified in Wikipedia:WikiProject Years#Intro Section. As far as I can tell, "Year" was never in a proposed template in that project. Project approval is expected, but has not yet reached consensus.

The detailed proposal would be: for each year article, replace, at most once, at the beginning of a line,

Year '''(article name)'''
Year ''(article name)''

or

Year (article name)

by

'''(article name)'''

As this will hit approximately 3700 articles (I manually changed 1921–1923 and 1963–2059, using WP:AWB and other test edits.), I wanted to give the bot programmer a chance to code it efficiently. This is a run-once, so it may not be necessary to code it efficiently. — Arthur Rubin (talk) 03:07, 3 March 2009 (UTC)
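The replacement as specified can be sketched like so (assuming, per the proposal above, that bold markup is the desired output in all three cases):

```python
import re

def strip_year_prefix(article_name, wikitext):
    """At the start of a line, replace "Year '''NAME'''", "Year ''NAME''"
    or "Year NAME" with "'''NAME'''", at most once."""
    pattern = re.compile(r"^Year '{0,3}%s'{0,3}" % re.escape(article_name),
                         re.MULTILINE)
    return pattern.sub("'''%s'''" % article_name, wikitext, count=1)
```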

If you do get consensus and a clear set of rules for the bot to follow, I can do this with Sambot. [[Sam Korn]] (smoddy) 17:42, 5 March 2009 (UTC)
OK, it's still being discussed, and morphing into a general discussion of reformatting the opening line. Still, if a consensus develops, I'll repost. This request will probably be archived before a consensus is reached, anyway. — Arthur Rubin (talk) 22:45, 6 March 2009 (UTC)

Photo required bot

Just as an experiment, I'm wondering if anyone would be kind enough to devote some time to putting together a bot to do the following, to create a useful page indicating which parts of the county require photos. Something like:

  • Parse all articles in category:Northumberland and child categories
  • For each article which uses {{ coord}} (i.e. has a geo-coordinate), evaluate whether there is a file: or image: tag on the page (i.e. is there a photo)
  • If no file or image, list the article name and the coordinate on a single results page (optionally grouped by category) such that {{ GeoGroupTemplate}} can be used to visualise all of the locations on a map.
  • Note that a couple of pieces of coordinate mangling might be required:
    • in the output page, coord should be display=inline ... in the source page it will probably be display=title
    • an additional parameter |name= should be added, with article_name being used as the argument
    • if the coordinate in the article is in an infobox and not in a regular {{ coord}} then you're on your own...

By way of explanation, I /think/ that Northumberland articles are fairly well geo-coded, and so looking for coord gives us all places & things capable of being photographed. That said, a variant of the same thing which simply looks for northumberland articles without images might be just as interesting.

As is the way of these things, a) were such a thing on the toolserver or b) capable of being run as a bot for any project, it might be a useful thing. Right now I'm interested to see if it yields useful results for me in my neck of the woods. thanks -- Tagishsimon (talk) 03:42, 3 March 2009 (UTC)
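The per-article test sketches out as below. This is a rough heuristic with the limitation already noted above: coordinates or images supplied only through infobox parameters will be missed.

```python
import re

HAS_COORD = re.compile(r"\{\{\s*coord\s*\|", re.IGNORECASE)
HAS_IMAGE = re.compile(r"\[\[\s*(?:File|Image)\s*:", re.IGNORECASE)

def needs_photo(wikitext):
    """Geo-coded but apparently photo-less."""
    return bool(HAS_COORD.search(wikitext)) and not HAS_IMAGE.search(wikitext)

def listing_line(title, coord_args):
    # results-page entry: inline display plus |name= so that
    # {{GeoGroupTemplate}} can label the marker
    return "* [[%s]] {{coord|%s|display=inline|name=%s}}" % (title, coord_args, title)
```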

If you are interested in lakes, you can browse Category:Wikipedia infobox lake articles without image with Google maps, e.g. [5]. -- User:Docu
That's just exactly what I'm looking for. (Only for Northumberland.) How can production of maps like these for, say, every country, state & county, be a bad thing? -- Tagishsimon (talk) 00:36, 6 March 2009 (UTC)
It also works with Category:Wikipedia requested photographs in Northumberland, but that requires the articles to be tagged first.
Another solution would be http://toolserver.org/~magnus/fist.php which can list all articles w/o images. One could match that with a list of coordinates from Northumberland, e.g. [6]. -- User:Docu

Need a Bot to Help School

My school needs a new field really bad and Kellogg's is having a contest thing where you need the most supporter. Every time you click and put in a code that is shown it counts as a supporter. I really need a bot that clicks on the button then puts in the code. I don't know but maybe it might require a password searcher? Thank you for your time. —Preceding unsigned comment added by 76.126.18.79 ( talk) 19:36, 5 March 2009 (UTC)

This is the place to request bots to improve Wikipedia - that is both irrelevant and illegal. We cannot help you. Dendodge Talk Contribs 19:51, 5 March 2009 (UTC)
