This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Currently, png flag images are being replaced by superior svg versions. Replacing all of the instances personally will be rather tedious, and I think a bot would be much faster and more effective; alas, I don't know anything about bot programming, so I'd like to request assistance with this endeavour. ナイトスタリオン ✉ 07:55, 24 November 2005 (UTC)
Constantly edited articles, especially current events, are severely susceptible to linking to the same subjects more than once. Subjects linked in an article should be linked only once, usually at the first instance of the subject keyword. Is there a bot that performs duplicate link removal? I imagine the bot would function first by indexing the links on a page, then identifying links that occur more than once, and removing the additional links from the bottom of the article upward. Adraeus 19:16, 25 November 2005 (UTC)
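The removal step described above can be sketched with a regular expression over the wikitext. This is a minimal illustration, not a production bot: it ignores links inside templates, references, and tables, which a real bot would have to handle before making live edits.

```python
import re

def remove_duplicate_links(wikitext):
    """Keep only the first occurrence of each [[target]] link;
    later occurrences are unlinked (replaced by their display text)."""
    seen = set()

    def repl(match):
        target = match.group(1).strip()
        label = match.group(2) or target
        # Wiki titles are case-insensitive in their first letter.
        key = target[:1].upper() + target[1:]
        if key in seen:
            return label  # drop the brackets, keep the visible text
        seen.add(key)
        return match.group(0)

    return re.sub(r"\[\[([^|\]]+)(?:\|([^\]]+))?\]\]", repl, wikitext)
```

Working bottom-up, as the request suggests, is not actually necessary here: a single forward pass that remembers which targets it has already seen produces the same result.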
Looking over Category:Wikipedia backlog, I notice several perennial backlogs for which a bot may be of assistance. I would like to hear the opinion of people more versed in botting than I am. In particular, I think the Move To Wiktionary/Wikibooks/Meta backlogs may be bottable (it probably requires a human double-check to see if things are properly tagged, but even then transwiking is still a lot of work, and a bot could automatically fill in the transwiki log)
Wikipedia:Duplicated sections is about the old bug that doubled parts of pages. This sounds like something a bot could plausibly fix.
Would it be feasible to help out Special:Uncategorizedpages and/or Wikipedia:List of lists/uncategorized by using bots to categorize them using some keywords found on the page?
I don't suppose Category:Articles that need to be wikified could be meaningfully aided by a bot?
My first thought about Wikipedia:Templates with red links would be to employ a bot to strip the redlinks from the templates, but that's probably not what people want.
And finally, Category:NowCommons sounds mechanical enough that a bot might work but I'm not sure about that.
Comments please? R adiant _>|< 23:53, 28 November 2005 (UTC)
This is probably a stupid point (and a bot may be doing it already), but I noticed on a visit to a random US city that an anon had corrected the infobox to account for an additional hour of difference with UTC after Daylight Saving Time ended. Could a bot, or other automation, do this sort of thing en masse on the appropriate dates instead? Xoloz 19:22, 29 November 2005 (UTC)
There are many album articles that use an old infobox, see here. I'm sure a well-programmed bot could do these conversions quickly and easily. It would save a lot of work for us (members of the project). Thanks for listening, I hope someone's up to the task. Gflores Talk 06:36, 3 December 2005 (UTC)
Hi, a bot is needed to clear the never-ending backlog at NowCommons by doing the following tasks:
If the bot can do task 1, that alone would considerably speed up the whole process. Thanks in advance. -- Pamri • Talk 07:00, 5 December 2005 (UTC)
Could a bot be made to check commonly misspelled words? It sounds obvious, but a lot of articles are hard to find because of spelling errors in the titles. Veritas Liberum 22:29 6 December 2005 (GMT)
— Max Talk (add) • Contribs • 06:16, 5 February 2006 (UTC)
There is a common error of entering a parenthetical remark and then not putting a space between the closing paren and the rest of the text. A bot could look for these fairly easily. Sample: George Bush (the current President)doesn't like brocolli. SchmuckyTheCat 22:39, 6 December 2005 (UTC)
For Wikibooks:Castle of the Winds, I need to upload dozens of 32x32x4bpp icons to use in the monster and item tables. Could a bot be written to do this as a batch operation and convert them to PNG? The most convenient way, for this use, would be to extract all the icons from the Windows executable (which I could upload). Seahen 16:19, 11 December 2005 (UTC)
Currently template:Infobox Film has parameters like this:
{{Infobox Film
| name = 
| image = 
| caption = 
| director = 
| producer = 
| writer = 
| starring = 
| music = 
| cinematography = 
| editing = 
| distributor = 
| release_date = 
| runtime = 
| language = 
| budget = 
| imdb_id = 
}}
However, many old movie pages used different titles for the same thing. For example, some use "movie_language" instead of the (now) correct "language". Could someone write a bot to change the incorrectly titled fields in all the movie pages to the correct titles for these fields? Here's a list of what we would need (and this would be for all pages that use the Infobox Film template only):
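The full rename list is not reproduced in this archive, but the mechanics are straightforward. Below is a sketch under the assumption that deprecated fields map one-to-one to current ones, with a hypothetical mapping (only the movie_language → language pair is confirmed by the request above):

```python
import re

# Hypothetical mapping of old field names to current ones; the real
# list would come from the template's talk page discussion.
FIELD_RENAMES = {
    "movie_language": "language",
    "movie_music": "music",
}

def rename_infobox_fields(wikitext):
    """Rename deprecated Infobox Film parameters in place."""
    for old, new in FIELD_RENAMES.items():
        wikitext = re.sub(r"(\|\s*)%s(\s*=)" % re.escape(old),
                          r"\g<1>%s\g<2>" % new, wikitext)
    return wikitext
```

Anchoring on the surrounding `| ... =` syntax keeps the substitution from touching prose that happens to contain the same word.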
This bot was first mentioned by Bobet 17:22, 10 December 2005 (UTC). This would be very helpful, then we could finally get all the movie page templates to be the same.
I know this is a very tough task but it would help greatly. Going to see King Kong Steve-O 13:47, 15 December 2005 (UTC)
Whoops, for some reason you can't see the template here. I suck. You can see it at the talk page for the template.... —preceding unsigned comment by Steve Eifert ( talk • contribs) 13:48, 15 December 2005 (UTC)
I am handling this request with User:NetBot. -- Netoholic @ 18:22, 18 December 2005 (UTC)
There should be a bot that generates nice information for Supreme Court cases, possibly also including links to Oyez, etc. wdaher 07:51, 17 December 2005 (UTC)
Could a bot go over the articles and tag uncategorized ones with {{uncategorized}}? Maintenance categories (like stubs and clean-up) should be ignored, as should categories added by templates like {{1911}} and Category:xxxx births/deaths. This would create a real-time, editable store of uncategorized pages. Currently, Special:Uncategorized is not editable, is limited to 1000 pages, is often not refreshed, etc. It would really foster categorization efforts. Renata3 20:52, 19 December 2005 (UTC)
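The per-article check could look like the sketch below. The maintenance-category patterns here are placeholders; the real exclusion list would need to be agreed on before any bot run.

```python
import re

# Placeholder patterns for maintenance-only categories (an assumption,
# not an agreed list).
MAINTENANCE = re.compile(r"stub|cleanup|articles|\d{4} (births|deaths)", re.I)

def needs_uncategorized_tag(wikitext):
    """True if the page has no category other than maintenance ones."""
    cats = re.findall(r"\[\[Category:([^|\]]+)", wikitext, re.I)
    real = [c for c in cats if not MAINTENANCE.search(c)]
    return not real
```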
Could a bot be used to notify users when they get a reply to their question on the Reference desk? Bart133 21:13, 23 December 2005 (UTC)
I'm part of Wikipedia:Typo and I've been fixing typos in this way: Using this google search, I extract the names of the pages unto a text file using a simple program I wrote. From there I load it up on AutoWikiBot and fill in the appropriate search and replace fields and make the necessary changes. I hit save for each one and very rarely is there a false positive. However, I know that this could easily be done with a bot, which would save time doing this manually. Any takers? Gflores Talk 03:04, 25 December 2005 (UTC)
Can someone run a one-time bot to subst all uses of the ll template, i.e. replace all {{ll|Foo}} with {{subst:ll|Foo}}?
You can use this page to feed the bot. 67.165.96.26 23:30, 28 December 2005 (UTC)
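The substitution itself is a one-line rewrite of the opening of each transclusion; a minimal sketch:

```python
import re

def subst_ll(wikitext):
    # Rewrite {{ll|...}} so the next save substitutes the template.
    return re.sub(r"\{\{\s*ll\s*\|", "{{subst:ll|", wikitext)
```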
See WP:FAF, which documents how featured articles have changed so they can be continually evaluated with WP:FAR. FAF lists articles with a link to the version that was originally approved and a diff showing the changes since then. It is quite tedious to crawl through the history of the talk page, find the date the featured template was added, compare it to the history of the article and add the revision id to FAF. Could this be automated? All or virtually all of the talk pages had the featured template added in an edit with the summary "{{featured}}" or "{{msg:featured}}". Using the date of that revision, a bot could presumably get the id of the version that was current at that time, and add it to WP:FAF. Tuf-Kat 06:05, 30 December 2005 (UTC)
I want to know about bot commands:
It would be really awesome if someone could write a bot to automatically fix single redirects (that is, when A links to B, and B is a redirect to C, the bot would insert C into A, piping B, [[C|B]], that is.) -- maru (talk) Contribs 02:03, 2 January 2006 (UTC)
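The rewrite step can be sketched as below, given a mapping of redirect titles to their targets (in practice the mapping would come from the wiki's redirect table or an API query, and the bot would need community approval since bypassing redirects wholesale is often discouraged):

```python
import re

def bypass_redirects(wikitext, redirects):
    """Rewrite [[B]] as [[C|B]] when B redirects to C, preserving
    the text the reader sees. `redirects` maps redirect titles to targets."""
    def repl(match):
        target = match.group(1).strip()
        label = match.group(2)
        if target in redirects:
            final = redirects[target]
            return "[[%s|%s]]" % (final, label if label else target)
        return match.group(0)

    return re.sub(r"\[\[([^|\]]+)(?:\|([^\]]+))?\]\]", repl, wikitext)
```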
I wrote a java program to generate a list of all articles which link to {{ Numismaticnotice}} and are thus part of Wikipedia:WikiProject Numismatics. Actually, I manually use "What links here" and cut and paste into a file. The program then takes that file and formats it so I can cut and paste back to Wikipedia:WikiProject Numismatics/Articles. I try to do this about once a week. I also ran it for the folks at Wikipedia:WikiProject Hawaii, and someone asked if I could make it a bot. I am relatively new here and know nothing about bots. Is this the sort of job that could be done by a bot? I'm sure other projects would appreciate it too. Ingrid 18:24, 3 January 2006 (UTC)
To make matters more complicated, the list that I get from "what links here" suddenly got a lot smaller. It appears to be because some of the template references include "template". So, instead of having {{Numismaticnotice}}, lots of pages have {{template:Numismaticnotice }} or sometimes {{Template:Numismaticnotic}}, and unfortunately there may be other slight variations I don't know about. I haven't updated the list, so the one at articles (link above) is still the full list. Is there a bot that can go through those articles and fix them? Or is it a bug? The template looks right on the page. An example is at Talk:History of the English penny (1066-1154). Ingrid 05:15, 8 January 2006 (UTC)
I don't know if I should post here or start a new entry at the bottom of the page. I guess I'll start here, and see if I get any replies. I have ported my java code to python, and have downloaded the pywikipediabot. Would someone mind helping me get started? Pywikipediabot intimidates me, and I just don't know where to begin. Ingrid 02:11, 24 February 2006 (UTC)
Hi, I need a bot which I can use to gather information of the same kind from the net; I will also edit part of it by hand. More specifically, I would like a bot so that I can organize the world universities: basic information, their history, famous people who taught or were educated at these institutions, and so on. Is there a bot I can use for this purpose? If not, can anybody create one for me? I would appreciate it if somebody could help. I am pretty new to the bot business, so any guidance should assume little prior knowledge. Thanks,
Resid Gulerdem 03:48, 5 January 2006 (UTC)
Wikipedia:Deletion log and Wikipedia:Upload_log are both repositories for the old logs from before 2005 that are not recorded in the current logs. However, since they are now in the Wikipedia namespace, all the wiki syntax in the deletion reason fields and upload summaries becomes live. As a result, dozens of templates named in those fields have become real transclusions, like {{stub}}s in deletion reasons and {{GFDL}}s in upload logs, and the pages are now inappropriately in categories. Could someone create a one-time script that will go through all those log archives and <nowiki> any templates contained in them (there should be none)? Dmcdevit· t 07:28, 7 January 2006 (UTC)
A bot would be really helpful to add Template:Time Person of the Year to all of the different pages that the template links to. Thanks for your help. P-unit 00:04, 10 January 2006 (UTC)
I'm looking for a bot that can perform, or which I can modify to perform, a large series of simple moves on articles with a certain name format. Something that takes regular expressions would be ideal, performing a substitution on the name a la sed to form the new name. It should use the normal WP move operation so that redirects remain at the old names. I'd prefer something that runs under Unix, and preferably not Python unless it works out of the box with only configuration or arguments.
Purpose is for a move of area codes to be non-US-centric as per discussion in Talk:North American Numbering Plan. - Keith D. Tyler ¶ 22:37, 12 January 2006 (UTC)
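The sed-style renaming can be sketched in a few lines. The example pattern below is hypothetical (the actual naming scheme would come from the Talk:North American Numbering Plan discussion), and each planned pair would still be executed through the wiki's normal move operation so redirects are kept:

```python
import re

def plan_moves(titles, pattern, replacement):
    """Return (old, new) pairs for titles the pattern matches,
    sed-style; feed the pairs to the site's move operation."""
    moves = []
    for title in titles:
        new = re.sub(pattern, replacement, title)
        if new != title:
            moves.append((title, new))
    return moves
```

Generating the full move plan before touching the wiki also makes it easy to post the list for review first.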
Osprey Publishing is a company with a major specialization in military history and is frequently cited as a reference on Wikipedia military history pages. I recently created a stub article on it, and would like a bot to go through the several hundred pages that mention it and wikify the plain text to read [[Osprey Publishing]] instead of Osprey Publishing. Palm_Dogg 19:47, 13 January 2006 (UTC)
That shouldn't be too hard :) Xxpor 14:36, 12 February 2006 (UTC)
I started working on it with User:Xbot
I was wondering if it is possible to create a bot to handle music genre links that point to disambiguation pages, such as pop, rock, LP, heavy metal, folk, indie, punk, rap, emo, hip hop, album, EP, and hip-hop (which should point to hip hop music), as well as country, bluegrass, and probably a couple of others, and exchange them for their proper links. The bot would only search in articles with Template:Album infobox and Template:Infobox Band, so as to avoid incorrect edits. Any takers? Comments? Gflores Talk 03:59, 14 January 2006 (UTC)
I'm requesting a bot to change pages linking to the wrong capitalisation pages to the correct titles:
Will this be possible? -- Thorpe | talk 18:41, 15 January 2006 (UTC)
There are a large number of external links which are not functioning because the editor who added them mistakenly used the pipe symbol (|) as it is used in internal wikilinks rather than just a space, or sometimes a pipe and then a space.
E.g. Wrong [http://www.bbc.co.uk|BBC Website]
Can a bot be created to automatically put these external links in the correct format? -- Spondoolicks 16:29, 16 January 2006 (UTC)
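A minimal sketch of the transformation (it also absorbs the pipe-then-space case mentioned above; a real bot would have to skip the rare URLs where a pipe is a deliberate part of the address):

```python
import re

def fix_external_pipes(wikitext):
    # [http://example.com|Label] -> [http://example.com Label]
    return re.sub(r"\[(https?://[^\s|\]]+)\|\s*", r"[\1 ", wikitext)
```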
Is anyone working on bots for Wikipedia:Dead external links? Please reply on my talk page as well as below. Thanks. -- James S. 05:45, 18 January 2006 (UTC)
I would like a bot that could add a category tag (Category:Hit for the cycle) to all the baseball players who have hit for the cycle (there is a list at that page). zellin t / c 17:47, 21 January 2006 (UTC)
The {{cl}} and {{tl}} templates are widely used on Wikipedia; however, it would probably be beneficial to subst: most of their uses, if only because they are solely used as a quick shortcut. This would be best done on a semi-regular basis so that pages where the templates are actually useful, such as WP:WSS/ST or pages listing templates, can be reverted and excluded from later substing. This could cover several of the templates in Category:Internal link templates. Circeus 20:06, 21 January 2006 (UTC)
I'll give it some thought... But how would one go about this? Downloading each file individually and checking md5sums? Honestly, this seems like something that would be best performed by someone with access to the boxes, so as to conserve bandwidth and time. Jmax- 23:19, 28 January 2006 (UTC)
It seems there are many pages that change yearly (the logo of the Super Bowl winner, the picture of the Miss America winner, the names/cabinet/etc. of presidents) that could use one of two things: a) a {future} tag to highlight the fact that yearly changes need to be made, and/or b) a bot that automatically submits edit requests. Waarmstr 04:27, 30 January 2006 (UTC)
I have been asking around about whether it would be possible to write a bot that would produce a list of all images uploaded by indefinitely blocked users. Based on my own experience, it seems like at least 75% or so of such images are unencyclopedic, orphaned, copyvios, or similarly inappropriate. Thoughts? Thanks, Chick Bowen 18:19, 29 January 2006 (UTC)
There are thousands of music albums on Wikipedia that link to Album (music). However, the Album (music) page just redirects to the actual Album page. Can a bot be written to repoint these to the Album page instead of continuously going through a redirect? I'd write one myself and get approval, but I don't know how to write one to do this. I've been changing the links when I've come across them, but there are thousands of them (and thousands that link directly to Album, so just moving and redirecting the other way won't work). Ben W Bell 10:54, 31 January 2006 (UTC)
The Perverted-Justice.com article lists a running total of the number of convictions and the number of busts listed on the website the article is about. Since this number changes so frequently, and I think it wouldn't be difficult to retrieve automatically, I thought it might be a good idea if someone could write a bot to update it automatically. I was thinking that the bot could run once weekly, pull the recent conviction/bust count, and update the appropriate sections in the article. Possible? Reasonable? Fieari 16:21, 31 January 2006 (UTC)
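Once the weekly run has fetched the current count, the in-article update might look like the sketch below. The "Convictions:" label and number format here are assumptions for illustration, not the article's real wording:

```python
import re

def update_count(wikitext, label, new_count):
    """Replace the number following a known label, e.g.
    'Convictions: 123' -> 'Convictions: 150'."""
    return re.sub(r"(%s:\s*)\d+" % re.escape(label),
                  r"\g<1>%d" % new_count, wikitext)
```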