This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
In the past there was a bot that was taking care of tagging these images Wikipedia:Database reports/Unused non-free files as orphaned. I've been trying to keep up when it wasn't working, but given the number there, it's getting to be overwhelming. If someone can remember which bot it was, I'd be happy (with a bit of assistance in getting it started) to take over running the bot. Skier Dude ( talk) 03:13, 2 May 2012 (UTC)
There is an existing category: Stub-Class Women's History articles. Could someone please run a bot that puts the template {{ Women's-History-stub}} at the bottom of the articles within that category? Thank you for your help. Maile66 ( talk) 12:43, 5 May 2012 (UTC)
Actually, this has all been a big mistake. I went to bot requests, because someone suggested I run an AWB on my own. I didn't want to do that, so I put the request here. But it's a longer story than that. Refer to the Stub types for deletion link above to see. Rcsprinter123, I apologize for putting this here. Aren't you all glad I didn't just run my own AWB? Maile66 ( talk) 14:58, 7 May 2012 (UTC)
Hi, similar to Wikipedia:Bot requests/Archive 47#ndash --> spaced ndash, I'm looking for a bot to switch over {{ mdash}} to {{ spaced mdash}} per the outcome of Template talk:Spaced mdash#Requested move. Jenks24 ( talk) 15:11, 7 May 2012 (UTC)
Would it be better if we had a bot which can change dates in sortable tables to the {{ dts}} template, so they can be sorted correctly? Many people, even if they know how to make sortable tables, do not know how the JavaScript sorting algorithm actually works, and when sortable tables are created with raw dates they are sorted alphabetically. Such a bot may be able to read the date format and use the same format in the template. ( Delhi Daredevils is one example of an article with raw dates in sortable tables). jfd34 ( talk) 15:22, 8 May 2012 (UTC)
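If it helps anyone thinking about taking this on, here is a minimal sketch of the conversion step (hypothetical Python; the "day month year" format and the function name are my assumptions, and a real bot would need to recognise other date formats and skip dates already inside templates):

```python
import re

# Matches raw dates like "8 May 2012" in "day month year" format.
DATE_RE = re.compile(
    r"\b(\d{1,2}) (January|February|March|April|May|June|July|August|"
    r"September|October|November|December) (\d{4})\b"
)

def wrap_dates_in_dts(cell_text):
    """Wrap raw 'day month year' dates in {{dts}} so tables sort correctly."""
    return DATE_RE.sub(r"{{dts|\1 \2 \3}}", cell_text)
```

Applied to a table row, `wrap_dates_in_dts("| 8 May 2012")` would yield `| {{dts|8 May 2012}}`.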
See [1]. <poem></poem> tags are more efficient and less distracting in the editing window, IMO, than <br> or similar tags on every line. So could a bot go through Shakespearean sonnet pages, remove all instances of <br>, </br>, or <br/>, and place <poem></poem> around the text, inside the {{ sonnet}} template? It Is Me Here t / c 17:47, 28 April 2012 (UTC)
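For anyone picking this up, a rough sketch of the text transformation itself (hypothetical Python; a real bot would apply this only to the text inside the {{sonnet}} template, not the whole page):

```python
import re

# Matches <br>, <br/>, <br />, and the invalid </br> variant.
BR_RE = re.compile(r"<br\s*/?>|</br>", re.IGNORECASE)

def brs_to_poem(text):
    """Strip <br>-style tags and wrap the block in <poem> tags,
    relying on <poem> to preserve the existing line breaks instead."""
    stripped = BR_RE.sub("", text)
    return "<poem>\n" + stripped.strip() + "\n</poem>"
```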
OK, so do we now have consensus? It Is Me Here t / c 19:34, 4 May 2012 (UTC)
Sounds good. Done (Anyone can feel free to change if they feel the defaults shouldn't be so large, but now there aren't big red error messages at Template:Sonnet.) — madman 16:21, 11 May 2012 (UTC)
The links at Special:LinkSearch/*.blog.taragana.com are now all redirects to subdomains at gaeatimes.com. Not sure if anyone is able to work out a means to convert these to direct links to the articles. — billinghurst sDrewth 11:20, 6 May 2012 (UTC)
Is it possible to get all links/signatures to my former usernames, user:N, and User:Nardman1, and their respective talk pages, updated? There is a new user who wishes to use the User:N moniker and I would like to avoid any confusion in the future. Please also include subpages, most notably User:N/modern Jesus - Nard 14:33, 11 May 2012 (UTC)
BRFA filed. I also went ahead and advertised the discussion at WP:VPR and WT:R. Nard, if this gets approval for trial, would you like the bot to log pages it can't handle to a particular subpage, or should it just spam your main talk page? Anomie ⚔ 02:51, 12 May 2012 (UTC)
Is there a way to make a bot that can notify my alternate account talk page when I have a message on my main account, as my alternate account redirects. CTJF83 23:57, 11 May 2012 (UTC)
Template:Category TOC, which is used if there are more than 400 pages in a category, is transcluded into over 48,000 articles. I have had a look at some of the categories where it is used and many of them actually had fewer than 200 pages in them. Having the template in these categories is unneeded and confusing clutter. Can we get a bot to remove it if there are fewer than 400 pages in a category? Alternatively, slip me a list of the pages and I will do it with AWB and take FULL responsibility for the task (which goes without saying). -- Alan Liefting ( talk - contribs) 04:37, 14 May 2012 (UTC)
I need a bot that will update the league table and results of the 2012 Kenyan Premier League, the 2012 FKF Division One, the 2012 Kenyan Women's Premier League and their future seasons on a regular basis (every 4 days). Manually updating them gets extremely tedious as I normally have other tasks to perform in or outside Wikipedia. Also, I, and any other person who edits the articles, will not always be available to update the articles. I know coming up with code for this can take extremely long, so if you can, which I hope, can it be ready in the next 2-3 months? Davykamanzi → talk · contribs 17:18, 8 May 2012 (UTC)
Following the recent ARBCOM drama, we'll need some people to pick up many tasks done by Smackbot/Helpful Pixie Bot. Specifically, converting {{ cite xxx|isbn=0123456789}} to {{ cite xxx}}{{ please check ISBN}}. Some other tasks can be found at User:Helpful Pixie Bot#Tasks and authorisations, but the above seem to be the most important. Headbomb { talk / contribs / physics / books} 19:05, 15 May 2012 (UTC)
May I draw your attention to RFC: Deploying 'Start date' template in infoboxes on this page's talk page? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:37, 17 May 2012 (UTC)
I know I already brought this up in the past, but here we go again. Is there a bot that can read some specific data from a website and automatically update an article with that data? I am thinking of articles such as Rosetta@home, where the infobox stats such as Active users, Total users, Active hosts and Total hosts could be auto-updated with info from http://boincstats.com/en/stats/14/project/detail or Wieferich@Home, where the infobox stats also could be auto updated. I guess there are other types of articles (about other things than dc projects) where this could be useful. This might also be useful for example for company articles to keep things like the revenue / profit up-to-date. -- Toshio Yamaguchi ( tlk− ctb) 17:14, 19 May 2012 (UTC)
We have an article List of ISO 639-3 codes with a search button. However, it relies on stubs having been created for each ISO code, and many of these are missing or link to the wrong page.
Could we create the missing stubs, and redirect the outdated/misdirected ones, so that our ISO search interface works properly?
The full list of ISO codes (as of February), with their ISO names, is at Wikipedia:WikiProject Languages/Articles by code, in the piped format [[ISO name|ISO code]]. My request is as follows: for each code, check that the linked article has an infobox with |iso3= (or, for articles covering several codes, one of |lc1= through |lc20=) set to the ISO 639-3 code in question.
For example, the very first ISO code in the list at Wikipedia:WikiProject Languages/Articles by code is aaa. This is a blue link, piped to Ghotuo language. The ISO redirect, ISO 639:aaa, also links to Ghotuo language. These match, and the article has an infobox with iso3 set equal to aaa, which also matches. The link is therefore correct, so our ISO search will find the proper article, and nothing needs to be done by the bot.
The second code in the list, aab, has no corresponding ISO 639:aab stub, so one needs to be created. In the code list, aab is piped to Alumu-Tesu language, which redirects to Alumu language, which has an infobox with iso3=aab. That matches, so ISO 639:aab should be directed to Alumu language. (Note the format of the previous ISO redirect, which has {{R from ISO 639|AAA}} in it. All of our ISO redirects have this, and this should be replicated by the bot. If there are any ISO redirects which are missing that note (which triggers a hidden category), they should be fixed by the bot.)
aai is a red link, so the bot would do nothing.
The stub for aaq, ISO 639:aaq, already exists, and links to Abnaki language. The code in the list is piped to Eastern Abnaki language, which redirects to Abnaki language. That article has an infobox with two ISO codes. The first is ours: lc1=aaq (ignoring white space), so again everything matches and the bot does nothing.
The stub for abo, ISO 639:abo, also exists, but it links to Tivoid languages. On the code page, the abo link is instead piped to Abon language. Now, Abon language has an infobox with iso3=abo, so that's where the ISO stub should direct. The bot should therefore edit ISO 639:abo so that it points to Abon language.
For abs, the ISO redirect and the link on the ISO list both ultimately link to Malay trade and creole languages#Ambonese Malay, though the piped link goes through a redirect at Ambonese Malay. The linked section of that article is the correct destination for both, so the bot does nothing. There are several infoboxes on that page, but I don't expect the bot to be sophisticated enough to verify it's linking to the correct section. (If you can do that, or at least verify that they both link to the same section, that would be great, but probably not necessary.)
For adg, the ISO redirect ISO 639:adg links to Aranda language. This has the proper ISO code in the infobox. However, the piped link in the list, Andegerebinha language, redirects to a different article, Andegerebinha dialect, which also has the proper ISO code in the infobox. This is presumably too complex for the bot to work out, so it would edit the ISO rd stub to link instead to Wikipedia:WikiProject Languages/Articles by code, and add the comment <!--ISO 639-3 code for Andegerebinha language-->.
For zaa, ISO 639:zaa links to Zapotec languages. The code list links to Sierra de Juárez Zapotec language, which also redirects to Zapotec languages. However, the infobox in that article does not have an iso3 or lc# field set equal to zaa, so ISO 639:zaa should be redirected to Wikipedia:WikiProject Languages/Articles by code, with a comment containing the ISO name.
If the ISO redirect exists, but the corresponding article does not, that would be a mismatch and so also be relinked to Wikipedia:WikiProject Languages/Articles by code. If the ISO redirect does not exist, and the language article does, but does not have the code in the info box, then again there is a mismatch; the ISO redirect would be created and linked to Wikipedia:WikiProject Languages/Articles by code. Etc.
This is something that might be repeated every year, as articles are created, merged, split, etc. — kwami ( talk) 06:06, 20 May 2012 (UTC)
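For a prospective operator, the per-code decision logic described above could be sketched roughly like this (hypothetical Python; the names are mine, `infobox_codes` means the codes found in the list-target article's infobox, and the adg-style ambiguity where two different articles both carry the code is not modelled separately here):

```python
FALLBACK = "Wikipedia:WikiProject Languages/Articles by code"

def decide(code, list_target, redirect_target, infobox_codes):
    """Return (action, destination) for one ISO 639-3 code.

    list_target     -- article the code list points to (redirects resolved),
                       or None if the list entry is a red link
    redirect_target -- current target of [[ISO 639:xxx]], or None if missing
    infobox_codes   -- ISO codes found in the target article's infobox
    """
    if list_target is None:
        return ("skip", None)              # red link: do nothing
    if code not in infobox_codes:
        return ("retarget", FALLBACK)      # mismatch: send to the list page
    if redirect_target is None:
        return ("create", list_target)     # missing stub: create it
    if redirect_target != list_target:
        return ("retarget", list_target)   # stale stub: fix it
    return ("skip", None)                  # everything matches
```

The aaa/aab/abo/zaa/aai examples above map onto the four branches in order.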
Can a bot be please tasked to remove {{ WikiProject Natural phenols and polyphenols}} from all the talk pages it currently resides on? The associated WikiProject was deleted/userified, so the template should no longer be used. See discussions at Wikipedia:Miscellany for deletion/Category:WikiProject Natural phenols and polyphenols articles, Wikipedia:Miscellany for deletion/Category:WikiProject Natural phenols and polyphenols articles and User_talk:Nihonjoe#MFD (who suggested this bot task). Thank you. ChemNerd ( talk) 13:30, 16 May 2012 (UTC)
Given my still-fresh-in-the-memory history of bot tasks that have ruffled some feathers in the past, I don't feel it's appropriate for me to do this personally, but here is some simple code for an AWB module that should do most of what you are asking, if someone wants to use it. It may not be 100%, so the usual AWB warning disclaimers apply.
public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Deprecate WPNPP and Replace WPNPP with WPChemistry";
    // Replace WPNPP with WPChemistry
    ArticleText = Regex.Replace(ArticleText,
        @"{{\s*(WikiProject[ _]+Natural[ _]+phenols[ _]+and[ _]+polyphenols|Wikiproject[ _]+NPP|WikiProject[ _]+NPP)\s*([\|}{<\n])",
        "{{WikiProject Chemistry$2", RegexOptions.IgnoreCase);
    return ArticleText;
}
I hope this helps. Kumioko ( talk) 15:41, 16 May 2012 (UTC)
Task is now Done. Thank you to all that helped. ChemNerd ( talk) 20:53, 21 May 2012 (UTC)
The WP Hawaii Recent changes page is linked to This Page, which in turn looks like it's updated daily by User:Femto Bot. This information definitely is not accessible by just clicking "Recent Changes" over in the left-hand Toolbox - all that does is pull up anything and everything linked to the project page.
I think this is a valuable page, and want to know what it would take to get a Recent Changes page set up for two WikiProjects: Texas and Women's History. I believe both projects could benefit from being able to access this information. Maile66 ( talk) 19:13, 21 May 2012 (UTC)
According to MOS:FLAGBIO, flags should never be used to indicate a person's place of birth, residence, or death, as flags imply citizenship and/or nationality. Is it possible for a bot to remove {{ Flag}}, {{ Flagcountry}}, and {{ Flagicon}} automatically from the {{ Infobox person}} template? -- Luke (Talk) 23:02, 14 May 2012 (UTC)
"|official language=", but it can't read the article to find out whether the language is official. At any rate, if you can show me consensus for a specific list of infoboxes, I should be able to add it. Anomie ⚔ 11:28, 15 May 2012 (UTC)
Can a bot be programmed to help with {{ cleanup}}? It needs to leave a message for anybody who has just tagged an article using this template without filling in the |reason= parameter. Something similar to what DPL bot ( talk · contribs) does with DABs. The message is:
"You recently added a cleanup tag to Article name without specifying a reason. To help other editors identify the problems, please add a reason to the tag using the |reason= parameter or replace it with a more specific tag. Cleanup tags without reasons will be automatically removed after a week."
If a message has been left and the {{ cleanup}} template still has no reason supplied a week later, the bot can then remove the template. Only templates added after the notification system has been implemented should be removed.
The relevant discussions and RfCs can be found at Template talk:Cleanup. AIRcorn (talk) 09:54, 6 May 2012 (UTC)
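A sketch of the detection step such a bot would need (hypothetical Python; template redirects to {{cleanup}} and templates nested inside the parameters are ignored here for brevity):

```python
import re

# Matches {{cleanup}} or {{Cleanup|...}} with simple (non-nested) parameters.
CLEANUP_RE = re.compile(r"\{\{\s*[Cc]leanup\s*(\|[^{}]*)?\}\}")

def needs_reason(wikitext):
    """Return True if the page has a {{cleanup}} tag with no |reason= filled in."""
    for m in CLEANUP_RE.finditer(wikitext):
        params = m.group(1) or ""
        reason = re.search(r"\|\s*reason\s*=\s*([^|}]*)", params)
        if reason is None or not reason.group(1).strip():
            return True
    return False
```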
Retrieved from archive. — Martin ( MSGJ · talk)
Not sure if this already exists, but could a bot be made that checks Category:Wikipedia pages with incorrect protection templates and then tries to fix them. Not sure exactly how much of the necessary fixes could be automated, but here's a list of possible tasks in order of how easy I'm guessing they would be to implement:
— Preceding unsigned comment added by Millermk90 ( talk • contribs)
I'm requesting a bot to warn WikiProjects about images that are at risk for deletion. Many images, particularly older non-free images, get deleted for things like inadequate fair use rationale, and nobody notices until they're deleted. User:CommonsNotificationBot currently places notifications on article talk pages, but these don't always get viewed in the week before deletion. Ideally, this bot would maintain a page at each WikiProject like Wikipedia:WikiProject Example/File alerts (similar to many article deletion pages currently maintained). This bot would update the page when a new image is at risk for deletion, and automatically remove the images, or mark them in some way, when the image is either deleted or no longer at risk. This would not need to run more than once per day (I don't think it could or probably should keep up with CSD). The WikiProjects to notify would be those on the article's talk page (if the image itself isn't project-tagged). ▫ JohnnyMrNinja 11:49, 24 May 2012 (UTC)
All articles in the categories Western Football League seasons and Southern Football League seasons need their category keys tweaked so they list correctly. They currently have (for example):
[[Category:Southern Football League seasons]]
[[Category:1928–29 domestic association football leagues]]
[[Category:1928 in England]]
[[Category:1929 in England]]
and what's needed is:
{{DEFAULTSORT:Southern Football League, 1928–29}}
[[Category:Southern Football League seasons|1928–29]]
[[Category:1928 in England]]
[[Category:1929 in England]]
Could a bot do this? There are a couple of hundred of them - rather too many to easily do by hand. Colonies Chris ( talk) 09:19, 25 May 2012 (UTC)
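For whoever takes this on, a sketch of deriving the new lines from a season article's title (hypothetical Python; assumes titles of the form "1928–29 Southern Football League", which matches the example above):

```python
import re

# "1928–29 Southern Football League" -> season prefix + league name.
SEASON_RE = re.compile(r"^(\d{4}(?:–\d{2})?) (.+)$")

def season_sort_fixes(title, league_category):
    """Return the DEFAULTSORT line and the piped category line
    for a season article, or None if the title doesn't match."""
    m = SEASON_RE.match(title)
    if not m:
        return None
    season, league = m.group(1), m.group(2)
    defaultsort = "{{DEFAULTSORT:%s, %s}}" % (league, season)
    category = "[[Category:%s|%s]]" % (league_category, season)
    return defaultsort, category
```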
Is there a bot available that could take on the ongoing task of counting the number of requests on the various Wikipedia:Requested articles pages? Counting the number of * characters on each page should be enough to get a semi-accurate number. A bonus would be if the bot could also do a count per each ==Section==. Thanks -- Eclipsed (talk) (COI Declaration) 12:51, 25 May 2012 (UTC)
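A sketch of the counting step (hypothetical Python; it counts lines starting with "*" under each heading, so it will slightly overcount if a page uses "*" for anything other than requests):

```python
import re

def count_requests(wikitext):
    """Count '*' request lines per ==Section== heading.
    Returns (total, {section: count}); lines before the first
    heading are counted under ''."""
    section, per_section, total = "", {}, 0
    for line in wikitext.splitlines():
        heading = re.match(r"==+\s*(.*?)\s*=+\s*$", line)
        if heading:
            section = heading.group(1)
            per_section.setdefault(section, 0)
        elif line.lstrip().startswith("*"):
            per_section[section] = per_section.get(section, 0) + 1
            total += 1
    return total, per_section
```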
A lot of year-based topic categories have been moved on Wikimedia Commons. The category "Years in sports" has become "sports by year". Could a bot move them all? J 1982 ( talk) 09:59, 26 May 2012 (UTC)
I am not sure about Sinebot ( I have seen posts that have no signature ), but can anyone please make a bot that signs all unsigned posts? Thank you. — Preceding unsigned comment added by 24.87.148.68 ( talk) 02:55, 27 May 2012 (UTC)
I would like to see this bot available for other wikis as well. I would particularly be interested to see this implemented for the Bengali wiki. So the basic requirement would be to have another flag to denote the language of the Wikipedia in which it would search the data. http://toolserver.org/~enwp10/bin/list2.fcgi?run=yes&projecta=Olympics&importance=Top-Class&quality=FA-Class would perhaps take another parameter. Please help. - Pasaban ( talk) 18:19, 20 May 2012 (UTC)
Every Ohio township article is entitled "___ Township, ___ County, Ohio", but each one has either a corresponding "___ Township, Ohio" redirect (when the township name is unique) or a "___ Township, Ohio" disambiguation page when there are multiple townships with the same name. This is only partially true for Indiana townships — all of the disambiguation pages have been created, but "___ Township, Indiana" redirects to uniquely named townships are nonexistent. I'd like it if someone could write a bot to go through List of Indiana townships (which has a "___ Township, ___ County, Indiana" link for each township) and create each "___ Township, Indiana" as a redirect if it doesn't already exist. Since there might be an ambiguous name or two that hasn't been created, it would be good for the bot to log all its page creations so that I could go through and repair errors of this sort. The code for the redirects could simply be #REDIRECT [[pagename]]; no categorisation of redirects is needed. Nyttend ( talk) 01:37, 23 May 2012 (UTC)
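A sketch of how the redirect targets and an ambiguity log could be derived from the list page (hypothetical Python; the link format is taken from the request above, and names shared by more than one county are dropped for manual review):

```python
import re

# Matches links like [[Adams Township, Allen County, Indiana]] (piped or not).
LINK_RE = re.compile(r"\[\[([^|\]]+) Township, ([^|\]]+) County, Indiana[^\]]*\]\]")

def redirect_pages(list_wikitext):
    """Return {short title: redirect wikitext}; names used by more than
    one county are treated as ambiguous and excluded."""
    targets, ambiguous = {}, set()
    for m in LINK_RE.finditer(list_wikitext):
        short = "%s Township, Indiana" % m.group(1)
        full = "%s Township, %s County, Indiana" % (m.group(1), m.group(2))
        if short in targets and targets[short] != full:
            ambiguous.add(short)
        targets[short] = full
    return {s: "#REDIRECT [[%s]]" % t
            for s, t in targets.items() if s not in ambiguous}
```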
It seems there is an increase in the number of user namespace sandbox pages appearing in content categories. Probably related to the new sandbox link. I would like to see a bot monitor all user sandbox pages that are created or modified and strip out all the categories and interwiki links. Any stub templates should be removed as well. -- Alan Liefting ( talk - contribs) 02:16, 28 May 2012 (UTC)
<!--
header
several external links
nav templates
DEFAULTSORT
[[Category:Foo]]
[[Category:Bar]]
[[Category:Baz]]
language links
-->
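A sketch of the stripping step for the sandbox request above (hypothetical Python; the interwiki pattern is a rough guess, and a real bot would check prefixes against the actual interwiki map, or comment the lines out as in the example above rather than delete them):

```python
import re

STRIP_RES = [
    re.compile(r"\[\[Category:[^\]]*\]\]\n?"),                # content categories
    re.compile(r"\[\[[a-z]{2,3}(-[a-z-]+)?:[^\]]*\]\]\n?"),   # likely interwiki links
    re.compile(r"\{\{[^{}]*-stub\}\}\n?", re.IGNORECASE),     # stub templates
]

def clean_sandbox(wikitext):
    """Remove categories, interwiki links and stub templates
    from a user-space sandbox copy of an article."""
    for rx in STRIP_RES:
        wikitext = rx.sub("", wikitext)
    return wikitext
```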
Hello! I'd like to request a bot to tag articles that are within the scope of the recently created Wikipedia:WikiProject Hungary/Sports and games task force with {{WikiProject Hungary|class=|importance=|sports=yes}}. I don't know how it goes exactly, but from what I managed to figure out, I should give a category whose articles would get the banner. If so, then this should be Category:Sport in Hungary and all of its sub-categories. I don't know if it's possible, but if yes, then it would also be great to auto-assess these articles according to the ratings of other banners. Since this is my very first time, I may well have misunderstood something, so please be forbearing. :) Thanks for your answer and help! — Thehoboclown ( talk) 15:18, 3 June 2012 (UTC)
The regular bot that handles Wikipedia:Categories for discussion/Working has had several stalls in the last week. At least one additional bot that can process category tasks is needed to ensure full coverage continues. Thanks in advance. Timrollpickering ( talk) 02:27, 27 May 2012 (UTC)
Cydebot is operating again. I agree that additional bots would be helpful, but I am curious about something: What steps need to be taken to prevent the bots from competing (i.e., attempting the same task at the same time)? Thank you, -- Black Falcon ( talk) 21:35, 27 May 2012 (UTC)
Note: It is often (not always) better to replace the old category with a {{ Category redirect}}. Rich Farmbrough, 15:40, 5 June 2012 (UTC).
Can someone help to put a pin on the map for all 700 Welsh peaks, please? The articles have been created without a map. There has been some discussion on this - and a solution to the problem here. All I need now is a bot to follow the instructions and copy the geotags. Many thanks. Llywelyn2000 ( talk) 07:52, 27 May 2012 (UTC)
MediaWiki has a neat feature built into file pages that automatically links to certain pages in the metadata. For instance, if you go to File:Cervus elaphus Luc Viatour 3.jpg here on WP you will see the links to NIKON CORPORATION and NIKON D3S (common links, both blue) and Bibble Pro 5 5.2.2 (currently red). On Commons, all of these links are blue, because they are interwiki links, so finding that the page is not present is unexpected. I'm hoping for a bot to do two things: 1) compile a list of all "wanted pages" linked to from metadata, so that appropriate redirects can be created; 2) maintain redirects with a template like {{R for file meta-data}} (similar to {{ R from move}} and the like), so that these redirects aren't deleted as unneeded (as Bibble Pro was). ▫ JohnnyMrNinja 01:54, 1 June 2012 (UTC)
I recently noticed that Template:Infobox_musical_artist has a category of associated articles with deprecated parameters, currently numbering nearly 4,000. The parameters that need to be fixed are "Born =" (replaced by Template:Birth date and age for the living and Template:Birth date for the deceased) and "Died =" (replaced by Template:Death date and age). I have begun plugging away at them but I am slowed by my obsession with actually reading the articles and making whatever improvements I can. Is the template cleanup described here something that a bot could handle? - UnbelievableError ( talk) 03:00, 1 June 2012 (UTC)
|Born = DATE, PLACE
|Born = DATE<br>PLACE
|Born = DATE in PLACE
|Born = DATE PLACE
(I vaguely remember that User:Yobot used to work on this sometimes, but it's currently blocked.) 1ForTheMoney ( talk) 13:48, 1 June 2012 (UTC)
Hi, I listed redirect pages that have interwikis here. I got this list with the following query:
SELECT /* SLOW OK */ page_title
FROM page
JOIN langlinks ON page_id = ll_from
WHERE page_namespace = 0 AND page_is_redirect = 1
GROUP BY page_title
ORDER BY count(ll_from) DESC;
Please remove these interwikis by bot. Reza1615 ( talk) 19:28, 4 June 2012 (UTC)
I would like to get an article alerts bot for the WikiProject Katy Perry please. teman13 TALK CONTRIBUTIONS 05:02, 5 June 2012 (UTC)
WP:DRV could use a bot creating the daily review pages. They are currently hand-created each day by the first person to click a handy "create" link on the DRV main page, but a bot would be more convenient and consistent. T. Canens ( talk) 10:30, 5 June 2012 (UTC)
I have been working to clean up the alumni sections of university articles. I commonly find many non-notable names listed in violation of Wikipedia's alumni notability guideline. WP:ALUMNI requires that the person listed either (a) already have a Wikipedia article, or (b) have a reference showing notability. I have written an essay, "Your alma mater is not your ticket to Wikipedia", about namechecking in university articles. I am requesting a bot to flag names on university pages which do not meet either of the two stipulations (e.g., existing Wikipedia page or reference), which then could be removed. I suggest that the name-removal feature not be fully automated because there are occasional cases of misspellings or disambiguations where a person has their own Wikipedia article, but it doesn't link properly. NJ Wine ( talk) 22:28, 30 May 2012 (UTC)
Some charts cannot be indexed (example: Record Report), so it would be very useful to have a bot that auto-archives the chart page each time it is refreshed (example: Record Report is refreshed every Saturday), so as to have a complete chart archive to use on singles and albums pages on Wikipedia. -- Hahc21 [ TALK][ CONTRIBS 20:21, 1 June 2012 (UTC)
http://webcitation.org/archive?url=http://recordreport.com.ve/publico/?i=top100&email=nowyouknowmymail@hotmail.com) and return the archive url (which in this case is http://www.webcitation.org/686PeUioN). The bot will do this for several urls, provided a day for each of the chart parameters. Is that possible? -- Hahc21 [ TALK][ CONTRIBS 20:45, 1 June 2012 (UTC)
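If anyone attempts this, the archiving request itself is just a URL fetch; a sketch of building the request URL with proper encoding (hypothetical Python; the url and email form parameters are inferred from the example link above, and submitting the request and parsing out the returned archive link are left to the bot):

```python
from urllib.parse import urlencode

ARCHIVE_ENDPOINT = "http://webcitation.org/archive"

def archive_request_url(chart_url, email):
    """Build the WebCite on-demand archiving URL for a chart page."""
    return ARCHIVE_ENDPOINT + "?" + urlencode({"url": chart_url, "email": email})
```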
If I understand this correctly, it is to force an external site to archive another external site and record the archive location. There have been previous projects along these lines, but on a much larger scale (for which reason they were stopped), although I do not know the details. Digging or consulting community memory might save some work. Rich Farmbrough, 01:11, 5 June 2012 (UTC).
Can I get a bot to remove the |parking= field from {{ Infobox shopping mall}}? The field was removed due to misuse and irrelevance. Ten Pound Hammer • ( What did I screw up now?) 20:24, 4 June 2012 (UTC)
|parking= ? Thanks! GoingBatty ( talk) 03:29, 6 June 2012 (UTC)
My post here about having two projects (Texas and Women's History) added to Femto6 was archived; maybe whoever read it saw it as more appropriately listed at Wikipedia:Bot owners' noticeboard#Rich_Farmbrough.27s_bots, which is where I posted it. But there is so much of Rich's work being sorted out over there that I can't tell from what is being posted whether Femto6 (update recent changes pages for projects) was taken on by anyone. Is there anyone reading this who can clarify that for me? Since Special:RecentChangesLinked/Wikipedia:WikiProject_Hawaii/Recent_changes continues to be updated, the bot must still be working. What do I do to get these projects added? Maile66 ( talk) 17:51, 3 June 2012 (UTC)
The Drug Enforcement Agency, a part of the government of Liberia, is linked by over 100 pages, but nearly all of the links are mistakes and meant to go to the US government's Drug Enforcement Administration. I understand that there would be some false positives if a bot went around and replaced all of them, so I've created a userspace page with all of the current links; I'm looking at each page myself and removing entries from the userspace page if there's a link that really should go to the Liberian entity. Would it be possible for a bot to change all of the links from pages linked here once I've checked all of them and removed the irrelevant ones? I doubt that there would be many errors, and what errors exist would purely be my fault; you could insert a note into the edit summary specifying that errors should be reported at my talk page. If this idea prove workable, I can leave a note here when I'm done checking all of the links. Nyttend ( talk) 18:26, 6 June 2012 (UTC)
Could someone append {{tfd|{{subst:PAGENAME}}}} to the templates listed here? They should all be orphaned, so it shouldn't make a difference whether the tag is included in a <noinclude>...</noinclude> or not. Frietjes ( talk) 20:56, 7 June 2012 (UTC)
Hello. Per the requested move at Talk:Air21 Express (2011-present) could all current wikilinks to Air 21 Express please be updated to link to Barako Bull Energy? This would be a one time run that I assume could be done with AWB, so I don't think a BRFA would be necessary. Jenks24 ( talk) 09:23, 12 June 2012 (UTC)
[[Air 21 Express]] was changed to [[Barako Bull Energy|Air 21 Express]]. Thanks in advance, Jenks24 ( talk) 09:26, 12 June 2012 (UTC)
Ah, I see my mistake now, there shouldn't be a space between "Air" and "21". Air21 Express has many more incoming links. So, could [[Air21 Express]] please be changed to [[Barako Bull Energy|Air21 Express]] in preparation for the move? Cheers, Jenks24 ( talk) 21:36, 13 June 2012 (UTC)
During a discussion on a proposed category page MOS at Wikipedia_talk:Manual_of_Style/Category_pages#Cat_main, the issue was raised of missing main articles in categories. As an example, science should be in Category:Science with a category sort key of a space. Due either to forgetting, in the case of new categories, or to removal through vandalism, this important link is missing. It seems to me that the task of checking and correcting this is ideal for a bot. It would only be used in cases where there is a direct correlation. Cases where singular and plural category names both exist (eg Category:Murder and Category:Murders) may have to be left to an actual human. -- Alan Liefting ( talk - contribs) 06:16, 13 June 2012 (UTC)
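A sketch of the per-category check (hypothetical Python; it assumes the bot already has each category's member list with sort keys, handles only the exact-match case, and leaves singular/plural pairs to a human as suggested):

```python
def eponymous_fix(category_name, category_members):
    """For Category:Foo, report whether the article Foo is present
    and sorted first (sort key ' '). category_members maps page
    title -> sort key. Returns the action a bot would take."""
    main = category_name.split(":", 1)[1]
    key = category_members.get(main)
    if key is None:
        return "add [[%s]] with sort key ' '" % main
    if key != " ":
        return "resort [[%s]] with sort key ' '" % main
    return "ok"
```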
Generating a report along the lines of this one, User:AnomieBOT/Afd-mergefrom report, would be cool. -- Alan Liefting ( talk - contribs) 05:56, 14 June 2012 (UTC)
The documentation of some of the cite templates (e.g., Template:Cite web, Template:Cite news) uses {{ #time}} in the example |accessdate= parameter. When I create a cite, I usually copy the example string. Before doing that, I usually have to purge the template page to get the accessdate to be the current date. At this moment, the documentation for Template:Cite news shows accessdate=May 31, 2012, whereas today's date is June 1, 2012. I think there are a lot of pages that use {{#time}} ( Template:Time?) that could use an automatic, once-a-day purge when the time changes from 23:59 to 00:00. Template:Time notes, "Most Wikipedia pages display a cached version of the page to reduce server load, so the template will only display the current time as of when the page was last parsed." Would you please develop a time template purge bot that, at about 00:01 UTC each day, purges pages that use time templates such as {{ #time: j F Y}} ( Template:Time?) and/or pages that transclude those pages (such as template pages that transclude documentation pages using, for example, {{ #time: j F Y}})? Maybe just limit the bot to template namespace if server load is a problem. Thanks! -- Uzma Gamal ( talk) 13:35, 1 June 2012 (UTC)
I'm updating bare links using Reflinks (http://toolserver.org/~dispenser/view/Reflinks/NoTitle), but it occurred to me that because it's very procedural it could be done with a bot? Sfan00 IMG ( talk) 12:51, 15 June 2012 (UTC)
|publisher= instead of |work=. Therefore, editors should run Reflinks manually and check/fix each reference before saving. GoingBatty ( talk) 15:01, 16 June 2012 (UTC)
See commons:Commons:Bots/Work requests#3 million null edits. — Dispenser 13:46, 16 June 2012 (UTC)
Is it possible to have a bot to clean up deleted duplicate info tags? http://en.wikipedia.org/wiki/Special:WhatLinksHere/Template:Duplicate_file_info has over 1,000 entries, and I'm finding most of the duplicates are already deleted. Sfan00 IMG ( talk) 23:00, 16 June 2012 (UTC)
As I have seen so far, most articles have single sources for statistical info on sports and records. I was wondering if a bot could be created which would update the relevant page with the relevant information. I don't have any programming experience and haven't studied programming at all, but I'd like to contribute in any way I can! -- Harsh Mujhse baat kijiye(Talk)( Contribs) 21:09, 13 June 2012 (UTC)
This could potentially save a lot of time and brings into its scope a lot of articles that are primarily concerned with detailing the records of the game. -- Harsh Mujhse baat kijiye(Talk)( Contribs) 05:41, 14 June 2012 (UTC)
A bot is needed to answer the following question: how many of the articles on this list, Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new, have infoboxes?
An example of what is meant here by "infobox" can be seen in the article 25001 Pacheco. Please have a look at the infobox along the right side, including statistics such as "orbital period" and "inclination". Here it is again, the example of an article that has an infobox: 25001 Pacheco.
How many of the Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new articles have such infoboxes containing orbit info? It cannot be counted by hand as there are far too many articles on the list. This is why a bot is requested to complete this task.
Also, in the process of doing this, could the bot keep track of which articles on the list have such orbit infoboxes so that, if need be, they could be sorted out? This would result in two lists, perfect subsets of the above list, which added together would include all articles on the list. One subset would contain only articles which have infoboxes, and the other those which are on the above list but do not have any orbit infobox. Both sub-lists would be given appropriate descriptive titles.
Thank you for your kind attention to this matter. Chrisrus ( talk) 17:51, 18 June 2012 (UTC)
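The counting itself is a simple partition once each article's wikitext is in hand. A sketch, assuming the orbit infoboxes in question are instances of {{Infobox planet}} (the actual template name would need to be confirmed against the articles; fetching wikitext from the API is omitted):

```python
import re

# Assumed template name; adjust to whatever the minor-planet articles use.
ORBIT_BOX = re.compile(r"\{\{\s*Infobox[ _]planet\b", re.IGNORECASE)

def partition_by_infobox(pages):
    """Split a {title: wikitext} mapping into (with_infobox, without_infobox)."""
    with_box, without_box = [], []
    for title, text in pages.items():
        (with_box if ORBIT_BOX.search(text) else without_box).append(title)
    return with_box, without_box
```

The two returned lists correspond directly to the two requested sub-lists.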
To understand this request better, see the above section.
We need to convert the unfortunately named Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new, into new lists.
All those which have already been redirected should be removed and stored under a title like "Minor planet articles converted to redirects on (date) by (person)". These are "done". (Terribly sorry, but if I may just interject here: HOORAY for Wikipedia.) This should probably have an explanatory intro at the top.
The rest would form the new "Candidates for Redirection" list, but it definitely should not be named that. It should be named maybe "Minor planet article candidates for redirection as of (date)" or some such, as "new" is not going to be true forever, obviously. It should include the entire "list history" that Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new has, but that obviously should be updated so that the history of the new list is up-to-date. Chrisrus ( talk) 17:42, 20 June 2012 (UTC)
Enquiring about the possibility/feasibility of using a bot to put a dent in the backlog at Category:Military history articles with incomplete B-Class checklists. If the talkpage had a completed B-class checklist for WP:SHIPS ({{ WikiProject Ships}}), it would copy the B1-B5 fields into the WP:MILHIST template ({{ WPMILHIST}}).
I first suggested the idea at Wikipedia talk:WikiProject Military history/Archive 112#Category:Military history articles with incomplete B-Class checklists where it seemed to have support, primarily because of the almost-identical assessment standards for warships the two projects have. Although that particular discussion was archived without action (in the most recent archive), the checklist backlog is a recurring subject for comment (twice more in the archive, and currently on the main talkpage).
Thoughts? -- saberwyn 02:43, 19 June 2012 (UTC)
It seems there's still some low-hanging fruit in the orchard of interlanguage linking. Yesterday I hit "Random article" and found the need for this edit to link two articles entitled en:Tapah Road and ms:Tapah Road, both of long standing. Where wikis in two languages have articles with the identical title and much content in common (e.g. geocoding, dates, incoming wikilinks) or have a history of overlapping human editors, there is a high probability they have the same subject. It strikes me that a tool to identify these would start with a sorted union list of article titles, subset those seen in multiple wikis, then subset those without interlanguage links. Depending how long the list is, either auto- or semi-automatic replacement would need a closer look at content. LeadSongDog come howl! 19:59, 20 June 2012 (UTC)
Could a bot be created that lists "Backlog priorities". I am thinking of something based on page view statistics. For example Firefighting averages around 250 views a day (usually more) and it has 3 tags on it. I'm sure there are similar articles with even more average daily views. Would it be possible for a bot to take page view information for a 30 day period divide that number by 30 and add and remove articles with maintenance tags to a page like Wikipedia:Articles with maintenance tags receiving over 1000 views a day, Wikipedia:Articles with maintenance tags receiving over 500 views a day, and Wikipedia:Articles with maintenance tags receiving over 100 views a day? It wouldn't be necessary for the bot to remove the pages if it was done manually. Ryan Vesey Review me! 03:09, 21 June 2012 (UTC)
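The bucketing step this request describes can be sketched as follows. Fetching the 30-day view totals and the set of articles carrying maintenance tags is left out, and the thresholds mirror the page names suggested above:

```python
def backlog_buckets(views_30d, tagged):
    """Bucket tagged articles by average daily views over 30 days.

    views_30d: {title: total views over the last 30 days}
    tagged:    collection of titles carrying maintenance tags
    Returns {threshold: [titles]}; each article lands only in the
    highest bucket it qualifies for.
    """
    buckets = {1000: [], 500: [], 100: []}
    for title in tagged:
        avg = views_30d.get(title, 0) / 30
        for threshold in (1000, 500, 100):
            if avg >= threshold:
                buckets[threshold].append(title)
                break
    return buckets
```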
I've requested this about four times now, but since Rich has been banned it doesn't seem that there's anyone to take care of maintenance. So this is just one piece of the earlier request.
Can we remove |date'= from the language infoboxes? Any non-breaking space should be replaced with a normal space. Most of the entries are notes that the date is for a census. There are 200–300 of these. — kwami ( talk) 06:52, 18 June 2012 (UTC)
My special search&replace changes, all within templates, are:
\s*\| *date' *= * census → census (with a leading space)
\s*\| *date' *= * * → (delete)
\s*\| *date' *= *(-|–) → – (en dash)
\s*\| *date' *= * → (delete)
The article list I have from an old pre-parse is:
Nogai language Soddo language Waray-Waray language Maranao language Inupiat language Tzotzil language Keresan languages Kumyk language Karachay-Balkar language Dakota language Yokutsan languages Bussa language Albay Bikol language Aluku Nganasan language Kumam language Sidamo language Valley Yokuts Huastec language Ajië language Ngangikurrunggurr language Adhola language Xamtanga language Harari language Selti language Pintupi language Drehu language Achang language Sungor language Mentawai language Aklan language Libido language Busa language (Papuan) Godwari Chontal Maya language Tojolab'al language Huichol language Meriam language Mailu language Maisin language Isthmus Zapotec Bahing language Woleaian language Wolaytta language Gwere language Arop-Lokep language Otomi language Naxi language Central Bikol language P'urhépecha language Aghul language Kaugel language Pilagá language Mocho’ language Nyanga language Hamer language Lampung language Banjar language Malasanga language Maleu-Kilenge language Mapoyo language Macaguán language Guahibo language Cuiba language Mursi language Rutul language Yerukala language Aringa language Kangean language Abom language Upper Chinook language Iaai language Mezquital Otomi Chamula Tzotzil Dhuwal language Gnau language Qimant language Ayi language (Papua New Guinea) Gataq language Taos dialect Picuris dialect Southern Tiwa language Arbore language Daasanach language Dime language Karo language Chara language Barein language Basketo language Maale language Shinasha language Tsamai language Oyda language Sheko language Dizin language Gumuz language Chepang language Wayu language Tumak language Cua language (Mon–Khmer) Orok language Nayi language Alamblak language Touo language Ndrumbea language Anuak language Kachama-Ganjule language Kafa language Totontepec Mixe Hodï language Nyangatom language Kabalai language Yem language Luwo language Oroch language Hidatsa language Konjo Meadow Mari language Eastern Oromo language Kaikadi 
language Daju Mongo language Embu language Numanggang language Laha language Mamanwa language Kwama language Kwegu language Shekkacho language Zayse-Zergulla language Koore language Dargwa language Nepalese Sign Language Guhu-Samane language Fas language Baibai language Nobonob language Pal language Maia language Anamgura language Mudbura language Mountain Koiali language Dedua language Yopno language Yipma language Vanimo language Siane language Kamano language Gadsup language Agarabi language Kopar language Yerakai language Tuwari language Heyo language Juwal language Yil language Yangum language Mekmek language Zimba language Simbari language Kunyi language Adjora language Ebrié language Werni language Barambu language Bwa languages Raji language Khiamniungan language Sema language Central Nicobarese languages Tày language Caolan language Tai Ya language Tai Hongjin language Vaiphei language Gangte language Kom language (India) Sangtam language Yimchungrü language Angor language Xaracuu language Yessan language Sanio language Kwasengen language Central Banda language Gulay language Sar language Markweta language Sabiny language Gungu language Samia dialect (Luhya) Kwang language Budza language Mesme language Ngbundu language Koi language Jagoi language Bukar Sadong language Yakan language Chiapas Zoque Chimalapa Zoque Komering language Irish language in Britain Burum language Mesem language Ngaing language Borong language Bamu language Morawa language Keoru language Orokaiva language Kewa language Narak language Sepik Iwam language Baramu language Davawenyo language Numee language Yuaga language Babalia Creole Arabic Maramba language Foia Foia language Hoia Hoia language Kobol language Rembarunga language Binumarien language Bitur language Pei language Yawiyo language Pahi language Pasi language Bisis language Berinomo language Koiwat language Edolo language Dibiyaso language Safeyoka language Doghoro language Seta language Ningil language Amol language Bauwaki 
language Binahari language Kein language Coyotepec Popoloca language North Bolivian Quechua Quapaw language Hrê language Setaman language Suganga language Pochuri Naga language Dobase language Tai Mène language Tlaxcala–Puebla Nahuatl Michoacán Nahuatl Ometepec Nahuatl Temascaltepec Nahuatl
— kwami ( talk) 16:32, 22 June 2012 (UTC)
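A Python rendering of the substitutions listed above, hedged because the exact whitespace and &nbsp; content of the original patterns was mangled in rendering: it normalises non-breaking spaces first and then handles the census, dash, and empty cases, which seems to be the intent.

```python
import re

def clean_date_param(wikitext):
    """Drop stray |date'= parameters per the patterns above (a sketch)."""
    # Normalise non-breaking spaces, as the request asks.
    text = wikitext.replace("\u00a0", " ").replace("&nbsp;", " ")
    # |date'= census  ->  " census" appended to the preceding value
    text = re.sub(r"\s*\| *date' *= *census", " census", text)
    # |date'= - or en dash  ->  " –"
    text = re.sub(r"\s*\| *date' *= *(-|–)", " –", text)
    # empty |date'=  ->  delete (lookahead keeps the next | or }} intact)
    text = re.sub(r"\s*\| *date' *= *(?=[|}])", "", text)
    return text
```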
Done Thanks for your help! Kevin Rutherford ( talk) 05:52, 25 June 2012 (UTC)
Hello. I need help creating a bot for another wiki site that I work on (www.imfdb.org). I am wondering if there is anyone willing to help me out? The bot would help to locate broken redirects.
#REDIRECT [[PAGENAME#SECTION]] I want to find the redirects that don't work because the SECTION part is wrong. I know that there is the "Special:BrokenRedirects" page, but this will only tell you if the PAGENAME part of the redirect is wrong. I want to find the redirects that don't work because the SECTION part is wrong or doesn't exist. Anyone willing to help me understand how to do this would have my undying gratitude! No but seriously, I would love some help...
-- Zackmann08 ( talk) 18:52, 25 June 2012 (UTC)
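The check described above splits into two pure steps: parse the redirect, then test whether the anchor matches a heading on the target page. A sketch (fetching page text from the wiki's API is omitted, and anchors generated by templates or {{anchor}} tags would be missed):

```python
import re

REDIRECT_RE = re.compile(r"#REDIRECT\s*\[\[([^\]#|]+)#([^\]|]+)\]\]", re.IGNORECASE)
HEADING_RE = re.compile(r"^=+\s*(.*?)\s*=+\s*$", re.MULTILINE)

def parse_section_redirect(wikitext):
    """Return (target page, section) for '#REDIRECT [[Page#Section]]', else None."""
    m = REDIRECT_RE.search(wikitext)
    return (m.group(1).strip(), m.group(2).strip()) if m else None

def section_exists(target_wikitext, section):
    """Check whether a ==Heading== matching the anchor exists on the target."""
    headings = {h.strip() for h in HEADING_RE.findall(target_wikitext)}
    return section in headings
```

A redirect whose PAGENAME resolves but whose SECTION fails `section_exists` is exactly the broken case Special:BrokenRedirects misses.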
The new project Wikipedia:WikiProject_Globalisation needs a bot to tag articles in Category:Globalization down to 3-4 levels. Bot help appreciated. Meclee ( talk) 22:41, 25 June 2012 (UTC)
Hello! I've an idea for a bot but I don't know where to start. The idea would be building a list of images from a certain category on Commons that could be inserted into articles without images. We have lots of images of, for example, animals; those articles could have an image in the infobox but don't, even though you can find something on Commons. Maybe this is not very common on en:wp, but it happens in other languages. Could it be possible to make something like that? - Theklan ( talk) 11:55, 25 June 2012 (UTC)
RFC: Deploying 'Start date' template in infoboxes was closed, with unanimous support to implement the following:
A great many infoboxes already emit microformats, and have for months, or even years. However, in some articles, these are incomplete, because the dates which form part of them do not use an appropriate sub-template, in order to emit the date in the correct metadata format. A bot (or bots - this task could be subdivided) is required, to complete the task of converting opening-, release-, first shown-, incident- and such dates from plain text to use {{ Start date}}, as seen in this example edit for a year, and this one for a full date and as described in the various infoboxes' documentation. Note that {{ start date}} allows for YYYY, YYYY-MM, YYYY-MM-DD and in a few cases YYY-M-DD:HH:MM formats. Note that Smackbot was approved to do this, and started, but failed to complete the task. A list of affected templates is available.
Can someone assist, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:11, 30 June 2012 (UTC)
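The core text-to-template conversion can be sketched as below. It handles only the plain "D Month YYYY" and bare-year forms and returns anything else unchanged for human review, which matters since infobox dates come in many formats:

```python
import re

MONTHS = {m: i for i, m in enumerate(
    ["January", "February", "March", "April", "May", "June", "July",
     "August", "September", "October", "November", "December"], start=1)}

def to_start_date(value):
    """Convert a plain infobox date to {{Start date}}, else pass through."""
    v = value.strip()
    m = re.fullmatch(r"(\d{1,2}) (\w+) (\d{4})", v)
    if m and m.group(2) in MONTHS:
        return "{{{{Start date|{}|{}|{}}}}}".format(m.group(3), MONTHS[m.group(2)], m.group(1))
    if re.fullmatch(r"\d{4}", v):
        return "{{{{Start date|{}}}}}".format(v)
    return value
```

A full bot would additionally restrict itself to the listed infobox templates and the date-bearing parameters named in their documentation.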
Another piece of my former request, which got sidetracked during Rich's ban.
Could someone reorder the parameters in transclusions of {{ infobox language}} to match the order on the documentation page?
The problem with having them mixed up is that sometimes they end up getting duplicated, which causes problems: a blank entry will override an earlier, filled, entry in the expected position, for example, and there is some bizarre stuff hidden in such duplicates.
It would be nice to have a separate line for each parameter (which, for the most part, we already have), and also for the closing "}}". The two exceptions would be latitude/longitude and for multiple ISO codes, which are stacked like this in most articles and are expected this way:
|lc1=abc |ld1=name a |lc2=def |ld2=name b |lc3=ghi |ld3=name c
(lc comes before ld because it's always the same length, and so lines up better this way.)
If there are duplicate parameters, could they be tagged with a fix-it category for manual review? (Even if the 2nd is empty, because we'd need to review whatever it is hiding in the 1st instance before we display that.)
Any unsupported params (not included in the documentation) should be ordered at the end and also tagged for review. (Unless they're empty, in which case they can be deleted.)
— kwami ( talk) 21:48, 29 June 2012 (UTC)
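The reordering itself is straightforward once a template has been parsed into (name, value) pairs. A sketch, leaving aside the wikitext parsing (nested templates, the lc/ld stacking exception) and assuming one pair per parameter:

```python
def reorder_params(params, canonical):
    """Reorder (name, value) pairs to match a canonical parameter order.

    Unknown names sort to the end for review; duplicate names are
    returned separately so the article can be tagged with a fix-it
    category rather than silently losing a value.
    """
    rank = {name: i for i, name in enumerate(canonical)}
    seen, ordered, duplicates = set(), [], []
    for name, value in params:
        (duplicates if name in seen else ordered).append((name, value))
        seen.add(name)
    ordered.sort(key=lambda p: rank.get(p[0], len(rank)))
    return ordered, duplicates
```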
Get a bot to delete user pages that meet WP:U1. Admins have better stuff to do.-- Otterathome ( talk) 16:31, 1 July 2012 (UTC)
What about attack/test pages? Or does edit filter already stop this? (section renamed)-- Otterathome ( talk) 19:36, 5 July 2012 (UTC)
Hello-
Can I get a bot to replace all !scope="col" tags from the table at User:Albacore/Sandbox to !scope="row" tags? This is advisable since double bolding in tables is discouraged, and it's easy enough to change the !scope="row" tags back to !scope="col" tags for the columns (only five). Tony Award for Best Featured Actor in a Play and Tony Award for Best Featured Actress in a Play need this as well. Thanks. Albacore ( talk) 04:16, 5 July 2012 (UTC)
Technical details about this error:
Last attempted database query: (SQL query hidden)
Function: SqlBagOStuff::set
MySQL error: 1114: The table 'pc000' is full (10.0.6.50)
Albacore (
talk) 13:37, 5 July 2012 (UTC)
I am forever cleaning up polluted categories (see Wikipedia:Database reports/Polluted categories) by removing pages from the wrong namespace out of content categories. A big culprit is user sandboxes, especially now that they are more easily used. I would like to get a bot to keep an eye on it and remove any categories (and interwiki links if possible). BattyBot can do it but apparently it is only semi-automated. It should be an easy bot task especially if it is only done for user sandboxes. Any takers? -- Alan Liefting ( talk - contribs) 04:41, 29 June 2012 (UTC)
While many such issues can be resolved by commenting out the category from the sandbox page, some categories are embedded inside templates. Is it possible to add namespace detection in such templates so that the categories are only included on article pages? See categories with the hidden template {{ polluted category}} for many examples. Thanks! GoingBatty ( talk) 19:51, 30 June 2012 (UTC)
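For the user-sandbox half of the request, the edit itself is small. A sketch that comments the categories out rather than deleting them, so the draft keeps them for when it moves to mainspace (interwiki links could be handled the same way; categories embedded in templates, as noted above, need the template itself fixed):

```python
import re

CATEGORY_RE = re.compile(r"\[\[(Category:[^\]]+)\]\]")

def comment_out_categories(wikitext):
    """Comment out [[Category:...]] links on a user sandbox page."""
    return CATEGORY_RE.sub(r"<!-- [[\1]] -->", wikitext)
```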
All old links to discovery.co.uk now redirect to dsc.discovery.com. That means that 112 links to discovery.co.uk, most of them deep links in refs, need fixing through archive.org or such. I'm not sure how best to sort this out; one possibility might be to simply add {{ wayback}} to every link to discovery.co.uk. From the few tests I made it seems many, but not all, of these pages are archived at archive.org. Just adding wayback without a date isn't ideal, but it should be pretty straightforward? Finn Rindahl ( talk) 22:13, 7 July 2012 (UTC)
Recently, some concerns were raised on the External links noticeboard about links to Wikimapia:- [Wikipedia:External_links/Noticeboard#Wikimapia]
The response from there, based on the criteria, was that links to Wikimapia weren't eligible.
I then checked here: http://en.wikipedia.org/?title=Special:LinkSearch&limit=5000&offset=0&target=http%3A%2F%2F*.wikimapia.org
And found there were quite a few pages using them, sometimes as references, sometimes as external links.
I was told that a bot might be able to handle removals. Sfan00 IMG ( talk) 17:54, 8 July 2012 (UTC)
It appears that the bot used to generate the pages Wikipedia:WikiProject Anime and manga/Assessment/Cleanup listing and Wikipedia:WikiProject Anime and manga/Cleanup task force/Cleanup listing stopped working in March 2010 and has been down ever since. Is there a bot that can replace the old one and keep these pages auto-updated each month? - Knowledgekid87 ( talk) 22:17, 9 July 2012 (UTC)
Is there a bot that could create a table with a couple of cells from a list of users at What links here from {{ retired}}? The table would need cells for "last contact attempt" and "notes". Ryan Vesey Review me! 19:45, 6 July 2012 (UTC)
Here you go, I hope this is close to what you want. I added an extra column for the talk page because I noticed some didn't have a main page and thought it might be helpful. Here are a few other things I noticed:
Please let me know if you need anything else. Kumioko ( talk) 22:22, 6 July 2012 (UTC)
I'm sure this won't pass, so I'm not holding my breath. Not meaning to be sarcastic per se, just cognizant that every effort to enforce NFCC is shot down these days with nauseating regularity. But anyway...
WP:MOSLOGO notes that non-free images are "nearly always prohibited" as icons. Yet, I routinely see non-free images being used as icons.
Case example; File:Hezbollah Flag.jpg came up at Wikipedia:Non-free_content_review#File:Hezbollah_Flag.jpg recently. I found this interesting because I have, in the past, removed the image from many articles for failing WP:NFCC #10c and WP:MOSLOGO. It keeps getting restored anyway, especially in uses as icons. It is in fact used 21 times as an icon in various articles.
I propose, therefore, that a bot be created that patrols mainspace looking for icon uses of non-free images such that the rendered image is 30 pixels or less. For example, use in conjunction with {{ flagicon}}. In operation, the bot would remove the use. If applicable, it would remove the template where the non-free image is used as a parameter to that template. Further, a notification be placed on the talk page of the article in question explaining why the image was removed.
I hope you prove my first paragraph wrong. -- Hammersoft ( talk) 23:08, 11 July 2012 (UTC)
See [1]. <poem></poem> tags are more efficient and less distracting in the editing window IMO than <br> / similar tags on every line. So could a bot go through Shakespearean sonnet pages, remove all instances of <br>, </br>, or <br/>, and place <poem></poem> around the text, inside the {{ sonnet}} template? It Is Me Here t / c 17:47, 28 April 2012 (UTC)
OK, so do we now have consensus? It Is Me Here t / c 19:34, 4 May 2012 (UTC)
Sounds good. Done (Anyone can feel free to change if they feel the defaults shouldn't be so large, but now there aren't big red error messages at Template:Sonnet.) — madman 16:21, 11 May 2012 (UTC)
The links at Special:LinkSearch/*.blog.taragana.com are now all redirects to subdomains at gaeatimes.com. Not sure if anyone is able to work out a means to convert these to direct links to the articles. — billinghurst sDrewth 11:20, 6 May 2012 (UTC)
Is it possible to get all links/signatures to my former usernames, user:N, and User:Nardman1, and their respective talk pages, updated? There is a new user who wishes to use the User:N moniker and I would like to avoid any confusion in the future. Please also include subpages, most notably User:N/modern Jesus - Nard 14:33, 11 May 2012 (UTC)
BRFA filed. I also went ahead and advertised the discussion at WP:VPR and WT:R. Nard, if this gets approval for trial, would you like the bot to log pages it can't handle to a particular subpage, or should it just spam your main talk page? Anomie ⚔ 02:51, 12 May 2012 (UTC)
Is there a way to make a bot that can notify my alternate account talk page when I have a message on my main account, as my alternate account redirects. CTJF83 23:57, 11 May 2012 (UTC)
Template:Category TOC, which is used if there are more than 400 pages in a category, is transcluded into over 48,000 articles. I have had a look at some of the categories where it is used and many of them actually had fewer than 200 pages in them. Having the template in these categories is unneeded and confusing clutter. Can we get a bot to remove them if there are fewer than 400 pages in a category? Alternatively, slip me a list of the pages and I will do it with AWB and take FULL responsibility for the task (which goes without saying). -- Alan Liefting ( talk - contribs) 04:37, 14 May 2012 (UTC)
I need a bot that will update the league table and results of the 2012 Kenyan Premier League, the 2012 FKF Division One, the 2012 Kenyan Women's Premier League and their future seasons on a regular basis (every 4 days). Manually updating them gets extremely tedious as I normally have other tasks to perform in or outside Wikipedia. Also, I, and any other person who edits the articles, will not always be available to update the articles. I know coming up with code for this can take extremely long, so if you can, which I hope, can it be ready in the next 2-3 months? Davykamanzi → talk · contribs 17:18, 8 May 2012 (UTC)
Following the recent ARBCOM drama, we'll need some people to pick up many tasks done by Smackbot/Helpful Pixie Bot. Specifically
{{ cite xxx|isbn=0123456789}} to {{ cite xxx}} {{ please check ISBN}}
Some other tasks can be found at User:Helpful Pixie Bot#Tasks and authorisations, but the above seem to be the most important. Headbomb { talk / contribs / physics / books} 19:05, 15 May 2012 (UTC)
May I draw your attention to RFC: Deploying 'Start date' template in infoboxes on this page's talk page? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:37, 17 May 2012 (UTC)
I know I already brought this up in the past, but here we go again. Is there a bot that can read some specific data from a website and automatically update an article with that data? I am thinking of articles such as Rosetta@home, where the infobox stats such as Active users, Total users, Active hosts and Total hosts could be auto-updated with info from http://boincstats.com/en/stats/14/project/detail or Wieferich@Home, where the infobox stats also could be auto updated. I guess there are other types of articles (about other things than dc projects) where this could be useful. This might also be useful for example for company articles to keep things like the revenue / profit up-to-date. -- Toshio Yamaguchi ( tlk− ctb) 17:14, 19 May 2012 (UTC)
We have an article List of ISO 639-3 codes with a search button. However, it relies on stubs having been created for each ISO code, and many of these are missing or link to the wrong page.
Could we create the missing stubs, and redirect the outdated/misdirected ones, so that our ISO search interface works properly?
The full list of ISO codes (as of February), with their ISO names, is at Wikipedia:WikiProject Languages/Articles by code, in the piped format [[ISO name|ISO code]].
My request is as follows: for each code, check whether the corresponding article has |iso3= or |lc(#)= (that is, |lc1= through |lc20=) set to the ISO 639-3 code in question.
For example, the very first ISO code in the list at Wikipedia:WikiProject Languages/Articles by code is aaa. This is a blue link, piped to Ghotuo language. The ISO redirect, ISO 639:aaa, also links to Ghotuo language. These match, and the article has an infobox with iso3 set equal to aaa, which also matches. The link is therefore correct, so our ISO search will find the proper article, and nothing needs to be done by the bot.
The second code in the list, aab, has no corresponding ISO 639:aab stub, so one needs to be created. In the code list, aab is piped to Alumu-Tesu language, which redirects to Alumu language, which has an infobox with iso3=aab. That matches, so ISO 639:aab should be directed to Alumu language. (Note the format of the previous ISO redirect, which has {{R from ISO 639|AAA}} in it. All of our ISO redirects have this, and this should be replicated by the bot. If there are any ISO redirects which are missing that note (which triggers a hidden category), they should be fixed by the bot.)
aai is a red link, so the bot would do nothing.
The stub for aaq, ISO 639:aaq, already exists, and links to Abnaki language. The code in the list is piped to Eastern Abnaki language, which redirects to Abnaki language. That article has an info box with two ISO codes. The first is ours: lc1=aaq (ignoring white space), so again everything matches and the bot does nothing.
The stub for abo, ISO 639:abo, also exists, but it links to Tivoid languages. On the code page, the abo link is instead piped to Abon language. Now, Abon language has an info box with iso3=abo, so that's where the ISO stub should direct. The bot should therefore edit ISO 639:abo so that it points to Abon language.
For abs, the ISO redirect and the link on the ISO list both ultimately link to Malay trade and creole languages#Ambonese Malay, though the piped link goes through a redirect at Ambonese Malay. The linked section of that article is the correct destination for both, so the bot does nothing. There are several info boxes on that page, but I don't expect the bot to be sophisticated enough to verify it's linking to the correct section. (If you can do that, or at least verify that they both link to the same section, that would be great, but probably not necessary.)
For adg, the ISO redirect ISO 639:adg links to Aranda language. This has the proper ISO code in the info box. However, the piped link in the list, Andegerebinha language, redirects to a different article, Andegerebinha dialect, which also has the proper ISO code in the info box. This is presumably too complex for the bot to work out, so it would edit the ISO rd stub to link instead to Wikipedia:WikiProject Languages/Articles by code, and add the comment <!--ISO 639-3 code for Andegerebinha language-->.
For zaa, ISO 639:zaa links to Zapotec languages. The code list links to Sierra de Juárez Zapotec language, which also redirects to Zapotec languages. However, the infobox in that article does not have an iso3 or lc# field set equal to zaa, so ISO 639:zaa should be redirected to Wikipedia:WikiProject Languages/Articles by code, with a comment containing the ISO name.
If the ISO redirect exists, but the corresponding article does not, that would be a mismatch and so also be relinked to Wikipedia:WikiProject Languages/Articles by code. If the ISO redirect does not exist, and the language article does, but does not have the code in the info box, then again there is a mismatch; the ISO redirect would be created and linked to Wikipedia:WikiProject Languages/Articles by code. Etc.
This is something that might be repeated every year, as articles are created, merged, split, etc. — kwami ( talk) 06:06, 20 May 2012 (UTC)
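The decision logic in the examples above (aaa, aab, abo, zaa) condenses into one rule: follow the list link to its final article, check the infobox code, and fall back to the list page on any mismatch. A simplified sketch that deliberately ignores the harder adg-style conflict between list target and redirect target:

```python
FALLBACK = "Wikipedia:WikiProject Languages/Articles by code"

def iso_redirect_target(list_target, redirect_target, infobox_codes, code):
    """Decide where ISO 639:<code> should point.

    list_target:     article the code list pipes to, after following
                     redirects, or None for a red link
    redirect_target: where ISO 639:<code> currently points, or None
    infobox_codes:   ISO codes found in the article's iso3/lc# fields
    Returns the title the redirect should point at, or None for no action.
    """
    if list_target is None:           # red link in the list: do nothing
        return None
    if code not in infobox_codes:     # mismatch: send to the list for review
        return FALLBACK
    if redirect_target == list_target:
        return None                   # everything already matches
    return list_target                # create or retarget the stub
```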
Can a bot be please tasked to remove {{ WikiProject Natural phenols and polyphenols}} from all the talk pages it currently resides on? The associated WikiProject was deleted/userified, so the template should no longer be used. See discussions at Wikipedia:Miscellany for deletion/Category:WikiProject Natural phenols and polyphenols articles, Wikipedia:Miscellany for deletion/Category:WikiProject Natural phenols and polyphenols articles and User_talk:Nihonjoe#MFD (who suggested this bot task). Thank you. ChemNerd ( talk) 13:30, 16 May 2012 (UTC)
Given my history of bot tasks that have ruffled some feathers in the past, still fresh in the memory, I don't feel it's appropriate for me to do this personally, but here is some simple code for an AWB module that should do most of what you are asking if someone wants to use it. It may not be 100%, so the usual AWB warning disclaimers apply.
public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Deprecate WPNPP and Replace WPNPP with WPChemistry";
    // Replace WPNPP with WPChemistry
    ArticleText = Regex.Replace(ArticleText, @"{{\s*(WikiProject[ _]+Natural[ _]+phenols[ _]+and[ _]+polyphenols|Wikiproject[ _]+NPP|WikiProject[ _]+NPP)\s*([\|}{<\n])", "{{WikiProject Chemistry$2", RegexOptions.IgnoreCase);
    return ArticleText;
}
I hope this helps. Kumioko ( talk) 15:41, 16 May 2012 (UTC)
Task is now Done. Thank you to all that helped. ChemNerd ( talk) 20:53, 21 May 2012 (UTC)
This WP Hawaii Recent changes is linked to This Page, which in turn looks like it's being updated daily by User:Femto Bot. This information definitely is not accessible by just clicking "Recent Changes" over in the left-hand Toolbox - all that does is pull up anything and everything linked to the project page.
I think this is a valuable page, and want to know what it would take to get a Recent Changes page set up for two WikiProjects: Texas and Women's History. I believe both projects could benefit from being able to access this information. Maile66 ( talk) 19:13, 21 May 2012 (UTC)
According to MOS:FLAGBIO, flags should never be used to indicate a person's place of birth, residence, or death, as flags imply citizenship and/or nationality. Is it possible for a bot to remove {{ Flag}}, {{ Flagcountry}}, and {{ Flagicon}} automatically from the {{ Infobox person}} template? -- Luke (Talk) 23:02, 14 May 2012 (UTC)
|official language=, but it can't read the article to find out whether the language is official. At any rate, if you can show me consensus for a specific list of infoboxes, I should be able to add it. Anomie⚔ 11:28, 15 May 2012 (UTC)
Can a bot be programmed to help with {{ cleanup}}? It needs to leave a message for anybody that has just tagged an article using this template without filling in the |reason= parameter. Something similar to what DPL bot ( talk · contribs) does with DABs. The message is:
"You recently added a cleanup tag to Article name without specifying a reason. To help other editors identify the problems please add a reason to the tag using the |reason= parameter or replace it with a more specific tag. Cleanup tags without reasons will be automatically removed after a week."
If a message has been left and the {{ cleanup}} template still has no reason supplied a week later, the bot can then remove the template. Only templates added after the notification system has been implemented should be removed.
The relevant discussions and RfCs can be found at Template talk:Cleanup. AIRcorn (talk) 09:54, 6 May 2012 (UTC)
Retrieved from archive. — Martin ( MSGJ · talk)
Not sure if this already exists, but could a bot be made that checks Category:Wikipedia pages with incorrect protection templates and then tries to fix them. Not sure exactly how much of the necessary fixes could be automated, but here's a list of possible tasks in order of how easy I'm guessing they would be to implement:
— Preceding unsigned comment added by Millermk90 ( talk • contribs)
I'm requesting a bot to warn WikiProjects about images that are at risk for deletion. Many images, particularly older non-free images, get deleted for things like inadequate fair use rationale, and nobody notices until they're deleted. User:CommonsNotificationBot currently places notifications on article talk pages, but these don't always get viewed in the week before deletion. Ideally, this bot would maintain a page at each WikiProject like Wikipedia:WikiProject Example/File alerts (similar to many article deletion pages currently maintained). This bot would update the page when a new image is at risk for deletion, and automatically remove the images, or mark them in some way, when the image is either deleted or no longer at risk. This would not need to run more than once per day (I don't think it could or probably should keep up with CSD). The WikiProjects to notify would be those on the article's talk page (if the image itself isn't project-tagged). ▫ JohnnyMrNinja 11:49, 24 May 2012 (UTC)
All articles in the categories Western Football League seasons and Southern Football League seasons need their category keys tweaked so they list correctly. They currently have (for example):
[[Category:Southern Football League seasons]]
[[Category:1928–29 domestic association football leagues]]
[[Category:1928 in England]]
[[Category:1929 in England]]
and what's needed is:
{{DEFAULTSORT:Southern Football League, 1928–29}}
[[Category:Southern Football League seasons|1928–29]]
[[Category:1928 in England]]
[[Category:1929 in England]]
Could a bot do this? There are a couple of hundred of them - rather too many to easily do by hand. Colonies Chris ( talk) 09:19, 25 May 2012 (UTC)
Is there a bot available that could take on the ongoing task of counting the number of requests on the various Wikipedia:Requested articles pages? Counting the number of * characters on each page should be enough to get a semi-accurate number. A bonus would be if the bot could also do a count per each ==Section==. Thanks -- Eclipsed (talk) (COI Declaration) 12:51, 25 May 2012 (UTC)
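A counting pass along these lines is simple to sketch. The conventions assumed below (==Section== headings, request lines starting with *) are just the usual Requested articles formatting, and count_requests is a hypothetical helper, not an existing bot:

```python
import re

def count_requests(wikitext):
    """Count bullet-point requests per ==Section== of a Requested
    articles page. Returns (total, per_section), where per_section
    maps each heading to its request count. Lines before the first
    heading are counted under "(lead)"."""
    per_section = {}
    section = "(lead)"
    total = 0
    for line in wikitext.splitlines():
        # A heading is == Title == (any level from 2 up).
        m = re.match(r"(={2,})\s*(.*?)\s*\1\s*$", line)
        if m:
            section = m.group(2)
            continue
        # Each * bullet is treated as one request.
        if line.lstrip().startswith("*"):
            total += 1
            per_section[section] = per_section.get(section, 0) + 1
    return total, per_section
```

As the request says, this is only semi-accurate (nested bullets and non-request bullets are all counted alike), but it gives both the page total and the per-section bonus figure in one pass.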
A lot of years-based topics-categories have been moved on Wikimedia Commons. The category "Years in sports" has become "sports by year". Use a bot to move them all. J 1982 ( talk) 09:59, 26 May 2012 (UTC)
I am not sure about Sinebot (I have seen posts that have no signature), but can anyone please make a bot that signs all unsigned posts? Thank you. — Preceding unsigned comment added by 24.87.148.68 ( talk) 02:55, 27 May 2012 (UTC)
I would like to see this bot available for other wikis as well. I would particularly be interested to see it implemented for the Bengali Wikipedia. So the basic requirement would be another flag to denote the language of the Wikipedia for which it would search the data. http://toolserver.org/~enwp10/bin/list2.fcgi?run=yes&projecta=Olympics&importance=Top-Class&quality=FA-Class 2 would have another param, maybe. Please help. - Pasaban ( talk) 18:19, 20 May 2012 (UTC)
Every Ohio township article is entitled "___ Township, ___ County, Ohio", but each one has either a corresponding "___ Township, Ohio" redirect (when the township name is unique) or a "___ Township, Ohio" disambiguation page when there are multiple townships with the same name. This is only partially true for Indiana townships — all of the disambiguation pages have been created, but "___ Township, Indiana" redirects to uniquely named townships are nonexistent. I'd like it if someone could write a bot to go through List of Indiana townships (which has a "___ Township, ___ County, Indiana" link for each township) and create each "___ Township, Indiana" as a redirect if it doesn't already exist. Since there might be an ambiguous name or two that hasn't been created, it would be good for the bot to log all its page creations so that I could go through and repair errors of this sort. The code for the redirects could simply be REDIRECT#pagename; no categorisation of redirects is needed. Nyttend ( talk) 01:37, 23 May 2012 (UTC)
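The title derivation here is mechanical enough to sketch. The two helpers below are hypothetical (the real bot would also check existence and log creations through the API), but they show the transformation the request describes:

```python
import re

def statewide_redirect(full_title):
    """Derive the short '<Name> Township, Indiana' title from a full
    '<Name> Township, <X> County, Indiana' title, or None if the
    title doesn't follow that pattern."""
    m = re.match(r"^(.+ Township), .+ County, (Indiana)$", full_title)
    if m:
        return "%s, %s" % (m.group(1), m.group(2))
    return None

def redirect_wikitext(target):
    # Per the request, a bare redirect with no categorisation.
    return "#REDIRECT [[%s]]" % target
```

The bot would create statewide_redirect(t) pointing at t for each township t in the list, skip titles where the short form already exists (those are the disambiguation pages), and log every creation for manual review of ambiguous names.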
It seems there is an increase in the number of user namespace sandbox pages appearing in content categories. Probably related to the new sandbox link. I would like to see a bot monitor all user sandbox pages that are created or modified and strip out all the categories and interwiki links. Any stub templates should be removed as well. -- Alan Liefting ( talk - contribs) 02:16, 28 May 2012 (UTC)
<!--
header
several external links
nav templates
DEFAULTSORT
[[Category:Foo]]
[[Category:Bar]]
[[Category:Baz]]
language links
-->
Hello! I'd like to request a bot to tag articles that are within the scope of the recently created Wikipedia:WikiProject Hungary/Sports and games task force with {{WikiProject Hungary|class=|importance=|sports=yes}}. I don't know how it goes exactly, but from what I managed to figure out I should give a category whose articles would get the banner. If so, then this should be Category:Sport in Hungary, and all of its sub-categories. I don't know if it's possible, but if yes, then it would also be great to auto-assess these articles according to the ratings of other banners. Since this is my very first time, I may well have misunderstood something, so please be forbearing. :) Thanks for your answer and help! — Thehoboclown ( talk) 15:18, 3 June 2012 (UTC)
The regular bot that handles Wikipedia:Categories for discussion/Working has had several stalls in the last week. At least one additional bot that can process category tasks is needed to ensure full coverage continues. Thanks in advance. Timrollpickering ( talk) 02:27, 27 May 2012 (UTC)
Cydebot is operating again. I agree that additional bots would be helpful, but I am curious about something: What steps need to be taken to prevent the bots from competing (i.e., attempting the same task at the same time)? Thank you, -- Black Falcon ( talk) 21:35, 27 May 2012 (UTC)
Note: It is often (not always) better to replace the old category with a {{ Category redirect}}. Rich Farmbrough, 15:40, 5 June 2012 (UTC).
Can someone help to put a pin on the map for all 700 Welsh peaks, please? The articles have been created without a map. There has been some discussion on this - and a solution to the problem here. All I now need is a bot to follow the instructions and copy the geotags. Many thanks. Llywelyn2000 ( talk) 07:52, 27 May 2012 (UTC)
MediaWiki has a neat feature built into file pages that automatically links to certain pages in the meta data. For instance, if you go to File:Cervus elaphus Luc Viatour 3.jpg here on WP you will see the link to NIKON CORPORATION and NIKON D3S (common links, both blue) and Bibble Pro 5 5.2.2 (currently red). On Commons, all of these links are blue, because they are interwiki links, so finding that the page is not present is unexpected. I'm hoping for a bot to do two things. 1) to compile a list of all "wanted pages" linked to from meta-data, so that appropriate redirects can be created. 2) maintain redirects with a template like {{R for file meta-data}} (similar to {{ R from move}} and the like), so that these redirects aren't deleted as unneeded (as Bibble Pro was). ▫ JohnnyMrNinja 01:54, 1 June 2012 (UTC)
I recently noticed that Template:Infobox_musical_artist has a category of associated articles with deprecated parameters that currently numbers nearly 4,000. The parameters that need to be fixed are "Born =" (replaced by Template:Birth date and age for the living and Template:Birth date for the deceased) and "Died =" (replaced by Template:Death date and age). I have begun plugging away at them, but I am slowed by my obsession with actually reading the articles and making whatever improvements I can. Is the template cleanup described here something that a bot could handle? - UnbelievableError ( talk) 03:00, 1 June 2012 (UTC)
|Born = DATE, PLACE
|Born = DATE<br>PLACE
|Born = DATE in PLACE
|Born = DATE PLACE
(I vaguely remember that User:Yobot used to work on this sometimes, but it's currently blocked.) 1ForTheMoney ( talk) 13:48, 1 June 2012 (UTC)
Hi, I listed redirect pages that have interwikis here. I got this list with the query
SELECT /*SLOW OK */ page_title FROM page JOIN langlinks ON page_id = ll_from WHERE page_namespace = 0 AND page_is_redirect = 1 GROUP BY page_title ORDER BY count(ll_from) DESC;
Please remove these interwikis by bot. Reza1615 ( talk) 19:28, 4 June 2012 (UTC)
I would like to get an article alerts bot for the WikiProject Katy Perry please. teman13 TALK CONTRIBUTIONS 05:02, 5 June 2012 (UTC)
WP:DRV could use a bot creating the daily review pages. They are currently hand-created each day by the first person to click a handy "create" link on the DRV main page, but a bot would be more convenient and consistent. T. Canens ( talk) 10:30, 5 June 2012 (UTC)
I have been working to clean up the alumni sections of university articles. I commonly find many non-notable names listed in violation of Wikipedia's alumni notability guideline. WP:ALUMNI requires that the person listed either (a) already have a Wikipedia article, or (b) have a reference showing notability. I have written an essay, "Your alma mater is not your ticket to Wikipedia", about namechecking in university articles. I am requesting a bot to flag names on university pages which do not meet either of the two stipulations (i.e., an existing Wikipedia page or a reference), which could then be removed. I suggest that the name-removal feature not be fully automated, because there are occasional cases of misspellings or disambiguations where a person has their own Wikipedia article but it doesn't link properly. NJ Wine ( talk) 22:28, 30 May 2012 (UTC)
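The flagging step (not the removal, which should stay manual per the request) could look something like the sketch below. Everything here is hypothetical: existing_titles stands in for an API check of whether a linked page exists, and only list entries starting with * are considered:

```python
import re

def flag_alumni_entries(lines, existing_titles):
    """Return the list-entry lines that neither link to an existing
    article nor carry a <ref>, i.e. the entries failing both WP:ALUMNI
    stipulations. `lines` are raw wikitext lines of an alumni section;
    `existing_titles` is the set of article titles known to exist."""
    flagged = []
    for line in lines:
        if not line.lstrip().startswith("*"):
            continue
        # Collect wikilink targets, ignoring pipes and section anchors.
        links = re.findall(r"\[\[([^\]|#]+)", line)
        has_article = any(t.strip() in existing_titles for t in links)
        has_ref = "<ref" in line
        if not (has_article or has_ref):
            flagged.append(line)
    return flagged
```

A redlinked entry is flagged too, which matches the misspelling/disambiguation caveat: a human still has to look before anything is removed.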
Some charts cannot be indexed (example: Record Report), so it would be very useful to have a bot that auto-archives the chart page each time it is refreshed (example: Record Report is refreshed every Saturday) so as to have a complete chart archive to use on singles and albums pages on Wikipedia. -- Hahc21 [ TALK][ CONTRIBS] 20:21, 1 June 2012 (UTC)
http://webcitation.org/archive?url=http://recordreport.com.ve/publico/?i=top100&email=nowyouknowmymail@hotmail.com) and return the archive url (which in this case is http://www.webcitation.org/686PeUioN). The bot will do this for several urls, provided a day for each of the chart parameters. Is that possible? -- Hahc21 [ TALK][ CONTRIBS] 20:45, 1 June 2012 (UTC)
If I understand this correctly, it is to force an external site to archive another external site and record the archive location. There have been previous projects along these lines, but on a much larger scale (for which reason they were stopped), although I do not know the details. Digging or consulting community memory might save some work. Rich Farmbrough, 01:11, 5 June 2012 (UTC).
Can I get a bot to remove the |parking= field from {{ Infobox shopping mall}}? The field was removed due to misuse and irrelevance. Ten Pound Hammer • ( What did I screw up now?) 20:24, 4 June 2012 (UTC)
|parking=? Thanks! GoingBatty ( talk) 03:29, 6 June 2012 (UTC)
My post here about having two projects (Texas, and Women's History) added to Femto6 was archived, so maybe whoever reads this sees it as more appropriately listed here: Wikipedia:Bot owners' noticeboard#Rich_Farmbrough.27s_bots, which I did. But there is so much of Rich's work they're sorting out over there that I can't tell from what is being posted whether Femto6 (update recent changes pages for projects) was taken on by anyone. Is there anyone reading this who can clarify that for me? Since Special:RecentChangesLinked/Wikipedia:WikiProject_Hawaii/Recent_changes continues to be updated, the bot must still be working. What do I do to get these projects added? Maile66 ( talk) 17:51, 3 June 2012 (UTC)
The Drug Enforcement Agency, a part of the government of Liberia, is linked by over 100 pages, but nearly all of the links are mistakes and meant to go to the US government's Drug Enforcement Administration. I understand that there would be some false positives if a bot went around and replaced all of them, so I've created a userspace page with all of the current links; I'm looking at each page myself and removing entries from the userspace page if there's a link that really should go to the Liberian entity. Would it be possible for a bot to change all of the links from pages linked here once I've checked all of them and removed the irrelevant ones? I doubt that there would be many errors, and what errors exist would purely be my fault; you could insert a note into the edit summary specifying that errors should be reported at my talk page. If this idea proves workable, I can leave a note here when I'm done checking all of the links. Nyttend ( talk) 18:26, 6 June 2012 (UTC)
could someone append {{tfd|{{subst:PAGENAME}}}} to the templates listed here? they should all be orphaned, so it shouldn't make a difference if the tag is included in a <noinclude>...</noinclude> or not. Frietjes ( talk) 20:56, 7 June 2012 (UTC)
Hello. Per the requested move at Talk:Air21 Express (2011-present) could all current wikilinks to Air 21 Express please be updated to link to Barako Bull Energy? This would be a one time run that I assume could be done with AWB, so I don't think a BRFA would be necessary. Jenks24 ( talk) 09:23, 12 June 2012 (UTC)
[[Air 21 Express]] was changed to [[Barako Bull Energy|Air 21 Express]]. Thanks in advance, Jenks24 ( talk) 09:26, 12 June 2012 (UTC)
Ah, I see my mistake now, there shouldn't be a space between "Air" and "21". Air21 Express has many more incoming links. So, could [[Air21 Express]] please be changed to [[Barako Bull Energy|Air21 Express]] in preparation for the move? Cheers, Jenks24 ( talk) 21:36, 13 June 2012 (UTC)
During a discussion on a proposed category page MOS at Wikipedia_talk:Manual_of_Style/Category_pages#Cat_main the issue was raised of missing main articles in categories. As an example, science should be in Category:Science with a category sort key of a space. Due to either forgetting in the case of new categories, or removal due to vandalism, this important link is missing. It seems to me that the task of checking and correcting this is ideal for a bot. It would only be used in cases where there is a direct correlation. The cases where both singular and plural category names exist (eg Category:Murder and Category:Murders) may have to be left to an actual human. -- Alan Liefting ( talk - contribs) 06:16, 13 June 2012 (UTC)
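The check itself is a one-line pattern match on the article's wikitext. The helpers below are a hypothetical sketch of the exact-match case only (the singular/plural cases are left to humans, as the request says):

```python
import re

def has_eponymous_category(article_title, article_wikitext):
    """True if the article carries [[Category:<title>| ]] with a
    space sort key, so it sorts first in its eponymous category."""
    pattern = r"\[\[\s*Category\s*:\s*%s\s*\|\s\]\]" % re.escape(article_title)
    return re.search(pattern, article_wikitext) is not None

def fixed_wikitext(article_title, article_wikitext):
    """Append the eponymous category with a space sort key if it is
    missing; otherwise return the text unchanged. The bot would only
    run this when Category:<title> actually exists."""
    if has_eponymous_category(article_title, article_wikitext):
        return article_wikitext
    return article_wikitext.rstrip() + "\n[[Category:%s| ]]\n" % article_title
```

Note this deliberately misses the case where the article is in the category with the wrong sort key; that variant would need a second pattern without the pipe, and probably human review before editing.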
Generating a report along the lines of this one User:AnomieBOT/Afd-mergefrom report would be cool. -- Alan Liefting ( talk - contribs) 05:56, 14 June 2012 (UTC)
The documentation of some of the cite templates (e.g., Template:Cite web, Template:Cite news) uses {{ #time}} in the example |accessdate= parameter. When I create a cite, I usually copy the example string. Before doing that, I usually have to purge the template page to get the accessdate to be the current date. At this moment, the documentation for Template:Cite news shows accessdate=May 31, 2012, whereas today's date is June 1, 2012. I think there are a lot of pages that use {{#time}} ( Template:Time?) that could use an automatic, once-a-day purge when the time changes from 23:59 to 00:00. Template:Time notes, "Most Wikipedia pages display a cached version of the page to reduce server load, so the template will only display the current time as of when the page was last parsed." Would you please develop a time template purge bot that, at about 00:01 UTC each day, purges pages that use time templates such as {{ #time: j F Y}} ( Template:Time?) and/or pages that transclude those pages (such as template pages that transclude documentation pages using, for example, {{ #time: j F Y}} ( Template:Time?))? Maybe just limit the bot to template namespace if server load is a problem. Thanks! -- Uzma Gamal ( talk) 13:35, 1 June 2012 (UTC)
I'm updating bare links using Reflinks (http://toolserver.org/~dispenser/view/Reflinks/NoTitle), but it occurred to me that because it's very procedural it could be done with a bot? Sfan00 IMG ( talk) 12:51, 15 June 2012 (UTC)
|publisher= instead of |work=. Therefore, editors should run Reflinks manually and check/fix each reference before saving. GoingBatty ( talk) 15:01, 16 June 2012 (UTC)
See commons:Commons:Bots/Work requests#3 million null edits. — Dispenser 13:46, 16 June 2012 (UTC)
Is it possible to have a bot to clean up deleted duplicate info tags? http://en.wikipedia.org/wiki/Special:WhatLinksHere/Template:Duplicate_file_info has over 1000 entries, and I'm finding most of the duplicates are already deleted. Sfan00 IMG ( talk) 23:00, 16 June 2012 (UTC)
As far as I have seen, most articles have single sources for statistical info on sports and records. I was wondering if a bot could be created which would update the relevant page with the relevant information. I don't have any programming experience and haven't studied programming at all, but I'd like to contribute in any way I can! -- Harsh Mujhse baat kijiye(Talk)( Contribs) 21:09, 13 June 2012 (UTC)
This could potentially save a lot of time and brings into its scope a lot of articles that are primarily concerned with detailing the records of the game. -- Harsh Mujhse baat kijiye(Talk)( Contribs) 05:41, 14 June 2012 (UTC)
A bot is needed to answer the following question: how many of the articles on this list, Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new, have infoboxes?
An example of what is meant here by "infobox" can be seen in the article 25001 Pacheco: please have a look at the infobox along the right side, including statistics such as "orbital period" and "inclination". How many of the articles on the list have such infoboxes containing orbit info? It cannot be counted by hand, as there are far too many articles on the list, which is why a bot is requested to complete this task.
Also, in the process of doing this, could the bot keep track of which articles on the list have such orbit infoboxes so that, if need be, they could be sorted out? This would result in two lists, perfect subsets of the above list, which added together would include all articles on the list. One subset would contain only articles which have infoboxes, and the other only those which do not. Both sub-lists would be given appropriate descriptive titles.
Thank you for your kind attention to this matter. Chrisrus ( talk) 17:51, 18 June 2012 (UTC)
To understand this request better, see the above section.
We need to convert the unfortunately named Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new into new lists.
All those which have already been redirected should be removed and stored under a title like "Minor planet articles converted to redirects on (date) by (person)". These are "done". (Terribly sorry, but if I may just interject here: HOORAY for Wikipedia.) This should probably have an explanatory intro at the top.
The rest would form the new "Candidates for redirection" list, but it definitely should not be named that. It should be named maybe "Minor planet article candidates for redirection as of (date)" or some such, as "new" is not going to be true forever, obviously. It should include the entire "list history" that Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new has, but that obviously should be updated so that the history of the new list is up to date. Chrisrus ( talk) 17:42, 20 June 2012 (UTC)
Enquiring about the possibility/feasibility of using a bot to put a dent in the backlog at Category:Military history articles with incomplete B-Class checklists. If the talkpage had a completed B-class checklist for WP:SHIPS ({{ WikiProject Ships}}), it would copy the B1-B5 fields into the WP:MILHIST template ({{ WPMILHIST}}).
I first suggested the idea at Wikipedia talk:WikiProject Military history/Archive 112#Category:Military history articles with incomplete B-Class checklists where it seemed to have support, primarily because of the almost-identical assessment standards for warships the two projects have. Although that particular discussion was archived without action (in the most recent archive), the checklist backlog is a recurring subject for comment (twice more in the archive, and currently on the main talkpage).
Thoughts? -- saberwyn 02:43, 19 June 2012 (UTC)
It seems there's still some low-hanging fruit in the orchard of interlanguage linking. Yesterday I hit "Random article" and found the need for this edit to link two articles entitled en:Tapah Road and ms:Tapah Road, both of long standing. Where wikis in two languages have articles with the identical title and much content in common (e.g. geocoding, dates, inwikilinks) or have a history of overlapping human editors there is a high probability they have the same subject. It strikes me that a tool to identify these would start with a sorted union list of article titles, subset those seen in multiple wikis, then subset those without interlanguage links. Depending how long the list is, either auto- or semi-automatic replacement would need a closer look at content. LeadSongDog come howl! 19:59, 20 June 2012 (UTC)
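The first filter described here (identical titles on multiple wikis, no interlanguage link yet) reduces to set operations once the title lists are in hand. This is a hypothetical sketch of that filter only; the content-overlap scoring (geocoding, dates, shared editors) and any actual linking would come after:

```python
from collections import defaultdict

def interlink_candidates(titles_by_wiki, already_linked):
    """Given {wiki_code: set_of_titles} and a set of (wiki, title)
    pairs that already carry interlanguage links, return titles that
    appear verbatim on two or more wikis with at least one copy still
    unlinked. A shared title is only the first filter; a closer look
    at content is still needed before (semi-)automatic replacement."""
    seen = defaultdict(set)
    for wiki, titles in titles_by_wiki.items():
        for t in titles:
            seen[t].add(wiki)
    candidates = []
    for title, wikis in seen.items():
        if len(wikis) < 2:
            continue
        if any((w, title) not in already_linked for w in wikis):
            candidates.append(title)
    return sorted(candidates)
```

On the Tapah Road example: the title exists on en and ms, neither had a langlink, so it would surface as a candidate.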
Could a bot be created that lists "backlog priorities"? I am thinking of something based on page view statistics. For example, Firefighting averages around 250 views a day (usually more) and it has 3 tags on it. I'm sure there are similar articles with even more average daily views. Would it be possible for a bot to take page view information for a 30-day period, divide that number by 30, and add articles with maintenance tags to (and remove them from) pages like Wikipedia:Articles with maintenance tags receiving over 1000 views a day, Wikipedia:Articles with maintenance tags receiving over 500 views a day, and Wikipedia:Articles with maintenance tags receiving over 100 views a day? It wouldn't be necessary for the bot to remove the pages if that was done manually. Ryan Vesey Review me! 03:09, 21 June 2012 (UTC)
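The bucketing step is straightforward once the view counts are available. A hypothetical sketch, with the data sources (30-day view totals and the set of tagged articles) standing in for whatever stats feed the bot would actually use:

```python
def backlog_buckets(views_30d, tagged_articles, thresholds=(1000, 500, 100)):
    """Bucket maintenance-tagged articles by average daily views over
    a 30-day window. Returns {threshold: [titles]}, each article
    landing only in the highest threshold it clears, so the 1000+
    page doesn't duplicate the 500+ and 100+ pages."""
    buckets = {t: [] for t in thresholds}
    for title in sorted(tagged_articles):
        avg = views_30d.get(title, 0) / 30.0
        for t in sorted(thresholds, reverse=True):
            if avg >= t:
                buckets[t].append(title)
                break
    return buckets
```

Running this daily and diffing against the previous run would give the additions and removals for each priority page.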
I've requested this about four times now, but since Rich has been banned it doesn't seem that there's anyone to take care of maintenance. So this is just one piece of the earlier request.
Can we remove date'= from the language infoboxes? Any non-breaking space should be replaced with a normal space. Most of the entries are notes that the date is for a census. There are 200–300 of these. — kwami ( talk) 06:52, 18 June 2012 (UTC)
My special search-and-replace changes, all within templates, are:
\s*\| *date' *= * census → census (with a leading space)
\s*\| *date' *= * * → (delete)
\s*\| *date' *= *(-|–) → – (en dash)
\s*\| *date' *= * → (delete)
The article list I have from an old pre-parse is:
Nogai language Soddo language Waray-Waray language Maranao language Inupiat language Tzotzil language Keresan languages Kumyk language Karachay-Balkar language Dakota language Yokutsan languages Bussa language Albay Bikol language Aluku Nganasan language Kumam language Sidamo language Valley Yokuts Huastec language Ajië language Ngangikurrunggurr language Adhola language Xamtanga language Harari language Selti language Pintupi language Drehu language Achang language Sungor language Mentawai language Aklan language Libido language Busa language (Papuan) Godwari Chontal Maya language Tojolab'al language Huichol language Meriam language Mailu language Maisin language Isthmus Zapotec Bahing language Woleaian language Wolaytta language Gwere language Arop-Lokep language Otomi language Naxi language Central Bikol language P'urhépecha language Aghul language Kaugel language Pilagá language Mocho’ language Nyanga language Hamer language Lampung language Banjar language Malasanga language Maleu-Kilenge language Mapoyo language Macaguán language Guahibo language Cuiba language Mursi language Rutul language Yerukala language Aringa language Kangean language Abom language Upper Chinook language Iaai language Mezquital Otomi Chamula Tzotzil Dhuwal language Gnau language Qimant language Ayi language (Papua New Guinea) Gataq language Taos dialect Picuris dialect Southern Tiwa language Arbore language Daasanach language Dime language Karo language Chara language Barein language Basketo language Maale language Shinasha language Tsamai language Oyda language Sheko language Dizin language Gumuz language Chepang language Wayu language Tumak language Cua language (Mon–Khmer) Orok language Nayi language Alamblak language Touo language Ndrumbea language Anuak language Kachama-Ganjule language Kafa language Totontepec Mixe Hodï language Nyangatom language Kabalai language Yem language Luwo language Oroch language Hidatsa language Konjo Meadow Mari language Eastern Oromo language Kaikadi 
language Daju Mongo language Embu language Numanggang language Laha language Mamanwa language Kwama language Kwegu language Shekkacho language Zayse-Zergulla language Koore language Dargwa language Nepalese Sign Language Guhu-Samane language Fas language Baibai language Nobonob language Pal language Maia language Anamgura language Mudbura language Mountain Koiali language Dedua language Yopno language Yipma language Vanimo language Siane language Kamano language Gadsup language Agarabi language Kopar language Yerakai language Tuwari language Heyo language Juwal language Yil language Yangum language Mekmek language Zimba language Simbari language Kunyi language Adjora language Ebrié language Werni language Barambu language Bwa languages Raji language Khiamniungan language Sema language Central Nicobarese languages Tày language Caolan language Tai Ya language Tai Hongjin language Vaiphei language Gangte language Kom language (India) Sangtam language Yimchungrü language Angor language Xaracuu language Yessan language Sanio language Kwasengen language Central Banda language Gulay language Sar language Markweta language Sabiny language Gungu language Samia dialect (Luhya) Kwang language Budza language Mesme language Ngbundu language Koi language Jagoi language Bukar Sadong language Yakan language Chiapas Zoque Chimalapa Zoque Komering language Irish language in Britain Burum language Mesem language Ngaing language Borong language Bamu language Morawa language Keoru language Orokaiva language Kewa language Narak language Sepik Iwam language Baramu language Davawenyo language Numee language Yuaga language Babalia Creole Arabic Maramba language Foia Foia language Hoia Hoia language Kobol language Rembarunga language Binumarien language Bitur language Pei language Yawiyo language Pahi language Pasi language Bisis language Berinomo language Koiwat language Edolo language Dibiyaso language Safeyoka language Doghoro language Seta language Ningil language Amol language Bauwaki 
language Binahari language Kein language Coyotepec Popoloca language North Bolivian Quechua Quapaw language Hrê language Setaman language Suganga language Pochuri Naga language Dobase language Tai Mène language Tlaxcala–Puebla Nahuatl Michoacán Nahuatl Ometepec Nahuatl Temascaltepec Nahuatl
— kwami ( talk) 16:32, 22 June 2012 (UTC)
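For anyone picking this up outside AWB, kwami's rules can be applied as an ordinary ordered regex pass. This is a rough sketch only: the second and fourth original rules collapse to the same delete, so three passes suffice, and it ignores the "within templates" restriction that an AWB template-internal replacement would enforce:

```python
import re

# kwami's replacements, in order: keep the census note (merged onto the
# preceding text with a leading space), fix a bare dash to an en dash,
# then delete whatever date'= remains.
RULES = [
    (r"\s*\| *date' *= * census", " census"),
    (r"\s*\| *date' *= *(-|–)", "–"),
    (r"\s*\| *date' *= *", ""),
]

def strip_date_prime(text):
    """Apply the replacement rules in order to a chunk of wikitext."""
    for pattern, repl in RULES:
        text = re.sub(pattern, repl, text)
    return text
```

Any article where the result still contains date' would be worth a manual look rather than a blind save.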
Done Thanks for your help! Kevin Rutherford ( talk) 05:52, 25 June 2012 (UTC)
Hello. I need help creating a bot for another wiki site that I work on (www.imfdb.org). I am wondering if there is anyone willing to help me out? The bot would help to locate broken redirects.
#REDIRECT [[PAGENAME#SECTION]] — I know that there is the "Special:BrokenRedirects" page, but that will only tell you if the PAGENAME part of the redirect is wrong. I want to find the redirects that don't work because the SECTION part is wrong or doesn't exist. Anyone willing to help me understand how to do this would have my undying gratitude! No but seriously, I would love some help...
-- Zackmann08 ( talk) 18:52, 25 June 2012 (UTC)
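The core check is: parse the redirect target, then test whether the named section exists on the target page. A hypothetical sketch (in real use the redirect texts and page headings would come from the wiki's API or a dump, and MediaWiki's anchor normalization of spaces, underscores and entities is ignored here):

```python
import re

def broken_section_redirects(redirects, page_sections):
    """Find #REDIRECT [[Page#Section]] entries whose Section heading
    doesn't exist on the target page. `redirects` maps redirect title
    -> its wikitext; `page_sections` maps page title -> set of section
    headings on that page. Redirects without a #Section are skipped,
    since Special:BrokenRedirects already covers bad page names."""
    broken = []
    for title, text in redirects.items():
        m = re.match(r"#REDIRECT\s*\[\[([^\]#|]+)#([^\]|]+)\]\]",
                     text, re.IGNORECASE)
        if not m:
            continue
        page, section = m.group(1).strip(), m.group(2).strip()
        if section not in page_sections.get(page, set()):
            broken.append(title)
    return broken
```

Renamed sections are the usual culprit, so the report would mostly surface redirects whose anchor went stale after a page reorganisation.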
The new project Wikipedia:WikiProject_Globalisation needs a bot to tag articles in Category:Globalization down to 3-4 levels. Bot help appreciated. Meclee ( talk) 22:41, 25 June 2012 (UTC)
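Collecting "Category:Globalization down to 3-4 levels" is a depth-limited breadth-first walk of the category tree. A hypothetical sketch, with subcats standing in for the API's subcategory listing; the tagging bot would then banner the member articles of every category returned:

```python
from collections import deque

def categories_to_depth(root, subcats, max_depth=3):
    """Breadth-first walk of the category tree from `root`, following
    `subcats` (category -> list of child categories) down to
    `max_depth` levels. Returns every category visited. The `seen`
    set guards against category cycles, which do occur on-wiki."""
    seen = {root}
    queue = deque([(root, 0)])
    result = [root]
    while queue:
        cat, depth = queue.popleft()
        if depth == max_depth:
            continue
        for child in subcats.get(cat, []):
            if child not in seen:
                seen.add(child)
                result.append(child)
                queue.append((child, depth + 1))
    return result
```

The depth cap matters: Wikipedia's category graph fans out quickly, and without it a tagging run drifts far outside the project's scope.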
Hello! I have an idea for a bot but I don't know where to start. The idea would be building a list of images from a certain category on Commons that could be inserted in articles without images. We have lots of images of, for example, animals, where an article could have an image in the infobox but doesn't, even though you can find something on Commons. Maybe this is not very common on en:wp, but it happens in other languages. Could it be possible to make something like that? - Theklan ( talk) 11:55, 25 June 2012 (UTC)
The RFC: Deploying 'Start date' template in infoboxes was closed, with unanimous support to implement the following:
A great many infoboxes already emit microformats, and have for months, or even years. However, in some articles, these are incomplete, because the dates which form part of them do not use an appropriate sub-template, in order to emit the date in the correct metadata format. A bot (or bots - this task could be subdivided) is required, to complete the task of converting opening-, release-, first shown-, incident- and such dates from plain text to use {{ Start date}}, as seen in this example edit for a year, and this one for a full date and as described in the various infoboxes' documentation. Note that {{ start date}} allows for YYYY, YYYY-MM, YYYY-MM-DD and in a few cases YYY-M-DD:HH:MM formats. Note that Smackbot was approved to do this, and started, but failed to complete the task. A list of affected templates is available.
Can someone assist, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:11, 30 June 2012 (UTC)
Another piece of my former request, which got sidetracked during Rich's ban.
Could someone reorder the parameters in transclusions of {{ infobox language}} to match the order on the documentation page?
The problem with having them mixed up is that sometimes they end up getting duplicated, which causes problems: a blank entry will override an earlier, filled, entry in the expected position, for example, and there is some bizarre stuff hidden in such duplicates.
It would be nice to have a separate line for each parameter (which, for the most part, we already have), and also for the closing "}}". The two exceptions would be latitude/longitude and for multiple ISO codes, which are stacked like this in most articles and are expected this way:
|lc1=abc |ld1=name a
|lc2=def |ld2=name b
|lc3=ghi |ld3=name c
(lc comes before ld because it's always the same length, and so lines up better this way.)
If there are duplicate parameters, could they be tagged with a fix-it category for manual review? (Even if the 2nd is empty, because we'd need to review whatever it is hiding in the 1st instance before we display that.)
Any unsupported params (not included in the documentation) should be ordered at the end and also tagged for review. (Unless they're empty, in which case they can be deleted.)
— kwami ( talk) 21:48, 29 June 2012 (UTC)
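The reordering logic above could be sketched as follows in Python, assuming the template's fields have already been parsed into (name, value) pairs. The canonical order below is a made-up subset; the real one would be taken from the documentation page. Duplicates keep their first value and are flagged, empty unknown parameters are dropped, and non-empty unknown ones go to the end for review:

```python
# Illustrative subset; the real list comes from the
# {{Infobox language}} documentation page.
CANONICAL = ["name", "nativename", "states", "speakers", "iso3"]

def reorder_params(fields):
    """Reorder a flat list of (name, value) pairs into canonical order.
    Returns (ordered_lines, duplicates, unknown) so duplicate or
    undocumented parameters can be tagged for manual review."""
    seen = {}
    duplicates, unknown = [], []
    for name, value in fields:
        if name in seen:
            duplicates.append(name)   # keep first value, flag for review
            continue
        seen[name] = value
        if name not in CANONICAL:
            unknown.append(name)
    ordered = ["|%s=%s" % (n, seen[n]) for n in CANONICAL if n in seen]
    # non-empty unsupported params go last; empty ones are dropped
    ordered += ["|%s=%s" % (n, seen[n]) for n in unknown if seen[n]]
    return ordered, duplicates, unknown
```

Parsing real transclusions into pairs is the hard part (nested templates, the stacked lc/ld lines), so this only shows the ordering and flagging step.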
Get a bot to delete user pages that meet WP:U1. Admins have better stuff to do.-- Otterathome ( talk) 16:31, 1 July 2012 (UTC)
What about attack/test pages? Or does edit filter already stop this? (section renamed)-- Otterathome ( talk) 19:36, 5 July 2012 (UTC)
Hello-
Can I get a bot to replace all !scope="col" tags in the table at User:Albacore/Sandbox with !scope="row" tags? This is advisable since double bolding in tables is discouraged, and it's easy enough to change the !scope="row" tags back to !scope="col" for the column headers (only five). Tony Award for Best Featured Actor in a Play and Tony Award for Best Featured Actress in a Play need this as well. Thanks. Albacore ( talk) 04:16, 5 July 2012 (UTC)
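The blanket swap requested is a one-line string replacement; as suggested above, the handful of true column headers would then be switched back by hand:

```python
def col_to_row_scope(table_wikitext):
    # Blanket swap; the five genuine column headers are then manually
    # changed back to scope="col" afterwards, per the request above.
    return table_wikitext.replace('scope="col"', 'scope="row"')
```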
Technical details about this error:
Last attempted database query: (SQL query hidden)
Function: SqlBagOStuff::set
MySQL error: 1114: The table 'pc000' is full (10.0.6.50)
Albacore ( talk) 13:37, 5 July 2012 (UTC)
I am forever cleaning up polluted categories (see Wikipedia:Database reports/Polluted categories) by removing pages from the wrong namespace from content categories. A big culprit is user sandboxes, especially now that they are more easily used. I would like a bot to keep an eye on this and remove the offending categories (and interwiki links, if possible) from such pages. BattyBot can do it, but apparently only semi-automatically. It should be an easy bot task, especially if it is only done for user sandboxes. Any takers? -- Alan Liefting ( talk - contribs) 04:41, 29 June 2012 (UTC)
While many such issues can be resolved by commenting out the category from the sandbox page, some categories are embedded inside templates. Is it possible to add namespace detection in such templates so that the categories are only included on article pages? See categories with the hidden template {{ polluted category}} for many examples. Thanks! GoingBatty ( talk) 19:51, 30 June 2012 (UTC)
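For the sandbox side of this, the usual fix is the "colon trick": turning [[Category:Foo]] into the harmless visible link [[:Category:Foo]] on non-article pages, which is what a minimal Python sketch of the bot step might do (already-defused links are left alone). For the template side, the standard approach is to wrap the category in a check on the {{NAMESPACE}} magic word so it is only emitted in mainspace:

```python
import re

def defuse_categories(page_title, wikitext):
    """If the page is in the User namespace (e.g. a sandbox), convert
    categorising links [[Category:Foo]] into plain visible links
    [[:Category:Foo]] so the page no longer pollutes the category."""
    if not page_title.startswith("User:"):
        return wikitext  # leave article-space pages alone
    # the pattern won't match links already starting with [[:
    return re.sub(r"\[\[\s*(Category\s*:)", r"[[:\1", wikitext)
```

A production bot would also need to skip categories transcluded via templates, which is exactly the case the {{NAMESPACE}} check in the template itself handles.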
All old links to discovery.co.uk now redirect to dsc.discovery.com. That means that 112 links to discovery.co.uk, most of them deep links in refs, need fixing through archive.org or similar. I'm not sure how best to sort this out; one possibility might be simply to add {{ wayback}} to every link to discovery.co.uk. From the few tests I made, it seems many, but not all, of these pages are archived at archive.org. Just adding wayback without a date isn't ideal, but it should be pretty straightforward? Finn Rindahl ( talk) 22:13, 7 July 2012 (UTC)
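The date could be filled in from the Wayback Machine's availability API (http://archive.org/wayback/available?url=...), which returns JSON describing the closest archived snapshot. A sketch of turning one such response into a {{ wayback}} call follows; the network fetch itself is omitted, and links with no snapshot are returned as None so they can be listed for manual handling:

```python
def wayback_template(url, api_response):
    """Turn a parsed JSON response from the Wayback availability API
    into a {{wayback}} transclusion, or None if nothing is archived."""
    snap = api_response.get("archived_snapshots", {}).get("closest")
    if not snap or not snap.get("available"):
        return None
    # 'timestamp' is a 14-digit YYYYMMDDhhmmss string, which is the
    # format {{wayback}}'s |date= parameter expects
    return "{{wayback|url=%s|date=%s}}" % (url, snap["timestamp"])
```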
Recently, some concerns were raised on the External links noticeboard about links to Wikimapia: [[Wikipedia:External links/Noticeboard#Wikimapia]]
The response there was that, based on the criteria, links to Wikimapia weren't eligible.
I then checked here: http://en.wikipedia.org/?title=Special:LinkSearch&limit=5000&offset=0&target=http%3A%2F%2F*.wikimapia.org
And found there were quite a few pages using them, sometimes as references, sometimes as external links.
I was told that a bot might be able to handle removals. Sfan00 IMG ( talk) 17:54, 8 July 2012 (UTC)
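A first pass could simply inventory the links rather than remove them, flagging which ones sit inside <ref>...</ref> tags, since removing a reference needs more care than removing an external-links entry. A rough Python sketch (the <ref> counting is crude and assumes well-formed, non-nested tags):

```python
import re

def wikimapia_links(wikitext):
    """List Wikimapia external links in a page, with a flag for whether
    each sits inside a <ref>...</ref> pair (reference uses need manual
    review before any removal)."""
    hits = []
    for m in re.finditer(r"https?://(?:[\w.-]+\.)?wikimapia\.org[^\s<\]]*", wikitext):
        before = wikitext[:m.start()]
        # inside a ref if more <ref openings than </ref> closings so far
        in_ref = before.count("<ref") > before.count("</ref>")
        hits.append((m.group(0), in_ref))
    return hits
```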
It appears that the bot used to generate the pages Wikipedia:WikiProject Anime and manga/Assessment/Cleanup listing and Wikipedia:WikiProject Anime and manga/Cleanup task force/Cleanup listing stopped working in March 2010 and has been down ever since. Is there a bot that could replace the old one and keep these pages auto-updated each month? - Knowledgekid87 ( talk) 22:17, 9 July 2012 (UTC)
Is there a bot that could create a table, with a couple of extra columns, from the list of users at What links here for {{ retired}}? The table would need columns for "last contact attempt" and "notes". Ryan Vesey Review me! 19:45, 6 July 2012 (UTC)
Here you go, I hope this is close to what you want. I added an extra column for the talk page because I noticed some didn't have a main page and thought it might be helpful. Here are a few other things I noticed:
Please let me know if you need anything else. Kumioko ( talk) 22:22, 6 July 2012 (UTC)
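For the record, generating such a table is straightforward once the username list is in hand (e.g. from the What-links-here query). A sketch that emits a sortable wikitable with the requested empty columns, plus the talk-page column Kumioko added:

```python
def retired_table(usernames):
    """Build a sortable wikitable with a user link, a talk-page link,
    and empty 'last contact attempt' and 'notes' columns."""
    rows = ['{| class="wikitable sortable"',
            '! User !! Talk !! Last contact attempt !! Notes']
    for name in usernames:
        rows.append('|-')
        rows.append('| [[User:%s|%s]] || [[User talk:%s|talk]] || ||'
                    % (name, name, name))
    rows.append('|}')
    return "\n".join(rows)
```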
I'm sure this won't pass, so I'm not holding my breath. Not meaning to be sarcastic per se, just cognizant that every effort to enforce NFCC is shot down these days with nauseating regularity. But anyway...
WP:MOSLOGO notes that non-free images are "nearly always prohibited" as icons. Yet, I routinely see non-free images being used as icons.
Case example: File:Hezbollah Flag.jpg came up at Wikipedia:Non-free_content_review#File:Hezbollah_Flag.jpg recently. I found this interesting because I have, in the past, removed the image from many articles for failing WP:NFCC #10c and WP:MOSLOGO. It keeps getting restored anyway, especially in uses as icons. It is in fact used 21 times as an icon in various articles.
I propose, therefore, that a bot be created that patrols mainspace looking for icon uses of non-free images such that the rendered image is 30 pixels or less. For example, use in conjunction with {{ flagicon}}. In operation, the bot would remove the use. If applicable, it would remove the template where the non-free image is used as a parameter to that template. Further, a notification be placed on the talk page of the article in question explaining why the image was removed.
I hope you prove my first paragraph wrong. -- Hammersoft ( talk) 23:08, 11 July 2012 (UTC)
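Detecting the icon-sized uses is the mechanical part; whether each file is actually non-free would have to be checked separately against its description page or licensing categories. A rough Python sketch of the size check, which only catches explicit NNpx sizes; template-mediated uses such as {{ flagicon}} would need per-template handling:

```python
import re

def small_image_uses(wikitext, max_px=30):
    """Find [[File:...|NNpx|...]] uses rendered at max_px or smaller,
    i.e. candidate icon uses per WP:MOSLOGO. Non-free status of each
    hit must be checked separately before removal."""
    hits = []
    for m in re.finditer(r"\[\[(?:File|Image):([^|\]]+)[^\]]*?\|\s*(\d+)px",
                         wikitext):
        if int(m.group(2)) <= max_px:
            hits.append((m.group(1).strip(), int(m.group(2))))
    return hits
```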