This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 5 | Archive 6 | Archive 7 | Archive 8 | Archive 9 | Archive 10 | → | Archive 15 |
Too much has been allowed to creep into this category and be reused without watching. The template and category should have been deleted a long time ago, but removing the images from articles now represents so much work that nobody would dare approach it. Can a bot deal with removing images from articles? Circeus 01:03, 25 June 2006 (UTC)
A bot (or perhaps a script, or some other tool) would be very useful to the growing number of people (myself included) who are interested in studying Wikipedia. I'd very much like to see and use a tool that would look at the history of any article (including a talk page!) and:
Even one or a few of those if implemented would be much, much appreciated! If we already have tools that can answer some of these questions, please let me know.-- Piotr Konieczny aka Prokonsul Piotrus Talk 00:40, 28 June 2006 (UTC)
I have been thinking over the last few days and, seeing that Category:Images with no copyright tag, Category:Images with unknown copyright status, Category:Orphaned fairuse images and Category:Images with no fair use rationale get back-logged quite a lot, I think a bot would be able to help us combat these backlogs. The bot should go through the sub-categories of the main categories and remove every image listed there from the article(s) that contain it. The bot would not delete the image, merely remove it from the article(s) that contain the image. This would make it a whole lot easier for admins to go around these categories and delete the images there, and not have to remove the images from the articles themselves. I am aware OrphanBot does some work linked to this, but it doesn't go through all of the categories. Iola k ana| (talk) 15:20, 25 June 2006 (UTC)
I'd very much like a bot that would fill in the blanks in incomplete {{ cite book}}s. A cite can be considered incomplete if it's missing any of the following parameters: Title, last, first, publisher, year, id.
I can see five cases in which this could be used (given in fall-through order):
If the bot is unable to fill in all of these basic parameters, it should insert ?s for the missing ones and/or a page comment, to show that it has tried.
If this type of bot responds quickly to new {{ cite book}}s, it will no doubt save editors a lot of tedious work looking up and typing in all the data fields. We can tell them to type just the ISBN, and suggest that they come back later and verify the bot's work. Seahen 03:42, 28 June 2006 (UTC)
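The fill-in-the-blanks behaviour described above (insert ?s for the missing parameters) could be sketched as follows. This assumes the ISBN metadata lookup is handled elsewhere and passed in as a plain dictionary; the function name and parameter list are illustrative, not an existing bot's API:

```python
def fill_cite_book(isbn, metadata):
    """Build a {{cite book}} from whatever metadata the lookup returned,
    inserting '?' for any parameter that could not be found, so editors
    can see the bot tried."""
    fields = ["title", "last", "first", "publisher", "year"]
    parts = ["{{cite book"]
    for f in fields:
        parts.append("|%s=%s" % (f, metadata.get(f, "?")))
    parts.append("|id=ISBN %s}}" % isbn)
    return " ".join(parts)
```

A cite with only a title and year would then come back with question marks in the author and publisher slots, ready for a human to verify.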
Bot's Name: SpyroBot Editing Spyro Articles
I noticed a lot of the requested-article pages are in quite a mess, not being properly formatted, i.e. lacking bullet points.
Would it be possible for someone to write a bot to place bullet points in front of the requests to make the pages look better?
I did do one page manually, and it took quite a while.
And would it be possible for a bot to automatically remove blue links?
I would happily run it if someone could write it!
Thanks!
Reedy Boy 10:38, 1 July 2006 (UTC)
I have already created a list of some cross-space redirects at User:Invitatious/cnr. I would like this bot to do the following:
==BE== * [[:Being a dick]] → [[:Wikipedia:Don't be a dick]]
Invitatious 19:07, 1 July 2006 (UTC)
I'd like to see if anyone's up for writing a bot to help those who monitor PROD. Specifically, I think a bot could:
Number 1 especially would be very helpful in patrolling. I don't have the skills to write the bot myself, or I would. Mango juice talk 14:50, 2 July 2006 (UTC)
I am requesting a bot that can detect & remove offensive language.-- StitchPedia 00:40, 4 July 2006 (UTC)
—Preceding unsigned comment added by OneWeirdDude ( talk • contribs)
Will a botbeard please go through the Whatlinkshere for Template:Redirect-acronym (recently moved from Template:Disambig-acronym) and snap the links to point to the new name? JesseW, the juggling janitor 03:10, 6 July 2006 (UTC)
I think a bot that creates an index for all 2- and 3-word articles according to their initials (instead of the first letters, as in Wikipedia:Quick index) would be very useful for browsing and figuring out abbreviations. I guess the current software can't do this dynamically or allow searching by initials. Any ideas?
I would like to work on such a bot, but I have no idea how to write one (even though I have some programming experience). If somebody sends me the code for a similar bot (one that looks at every article title and categorizes it according to a certain criterion), I can modify it.
-- þħɥʂıɕıʄʈʝɘɖı 22:10, 6 July 2006 (UTC)
A bot is needed to keep template:ISBN out of actual use, see Wikipedia:Templates for deletion/Log/2006 July 9#Template:ISBN. Circeus 12:59, 11 July 2006 (UTC)
We've got a non-Wikimedia MediaWiki site over at Wikible; there are several versions of the wiki in different languages, so we also have a Pool (like Wikimedia Commons). We've changed how things work and now have a bunch of images to transfer from wikible.org/en to wikible.org/pool. I don't think anyone has any bot experience in our group; would you mind helping us out? Read the discussion on our site for more background and relevant links. Thanks! -- J. J. 18:42, 11 July 2006 (UTC)
Any idea whom I should contact to get more answers there? -- Piotr Konieczny aka Prokonsul Piotrus Talk 16:11, 13 July 2006 (UTC)
I'd like a bot to go through small articles and label them stubs. Anything smaller than 1k is probably a stub, yeah? -- BradBeattie 05:54, 16 July 2006 (UTC)
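A hedged sketch of the size check suggested above (Python). The 1k threshold comes from the request itself; the redirect and already-tagged checks are my own assumptions, and a real run would also want to skip disambiguation pages and the like:

```python
def maybe_tag_stub(wikitext, threshold=1024):
    """Append a generic {{stub}} tag if the page is shorter than the
    threshold and not already tagged; redirects are left alone."""
    if wikitext.lstrip().lower().startswith("#redirect"):
        return wikitext
    if len(wikitext.encode("utf-8")) >= threshold:
        return wikitext
    # crude check for an existing stub tag, generic or specific
    if "{{stub}}" in wikitext or "-stub}}" in wikitext:
        return wikitext
    return wikitext.rstrip() + "\n\n{{stub}}\n"
```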
I need a bot to review a page I added. I haven't had the necessary time to review my writing, and I know there must be multiple grammar and spelling errors. The page is H. B. Hollins. -- LongIslander 15:31, 17 July 2006 (UTC)
I originally posted this here, but figured it would be good to post it here also.
I've noticed a recent issue with WikiProjects. I've noticed it in the one I work on, Wikipedia:WikiProject_Anime_and_manga, but it probably applies to all WikiProjects. When an article is declared a "Good Article" or any other article class, editors add the appropriate tag on the Discussion page, but they often forget to add the appropriate Wikiproject tag that says "this is a good article for Wikiproject whatever". This means that the Wikiproject statistics page that shows how many Good Articles and the like the Wikiproject has may be drastically off, and the categories sorting the Wikiproject's articles may show tons of "good articles" and the like in the "unassessed" section.
Would it be possible for a bot to regularly peruse the Wikiproject article discussion pages and find ones that have a GA tag but no Wikiproject GA tag, and the same for all article assessment tags? Also, if a Wikiproject has a system for article ratings that doesn't coincide with the main Wikipedia system (as warned by a commentator on my original post), the bot could simply not affect articles on that Wikiproject. Dark Shikari 15:57, 17 July 2006 (UTC)
It could change M/Y or M/D/Y dates for template defaults like {{cleanup}} and various "As of..." containing articles. It sounds like it wouldn't put too much strain on the servers and would prevent articles from becoming inaccurate datewise. -- Blackjack48 23:38, 18 July 2006 (UTC)
How about a bot that fixes links that redirect. Like, say a link points to A, which redirects to B; the bot could fix it that it points straight to B. OneWeirdDude 18:22, 20 July 2006 (UTC)
Given the recent apparent demises of both Crypticbot ( talk · contribs) and NekoDaemon ( talk · contribs), which did a bunch of housekeeping tasks related to WP:CFD, WP:TFD, and the WP:VPs (and others), I propose we designate some bot source as the "official" daily maintenance bot, create an account for it, post the source, solicit an owner, and put its daily tasklist (instructions) in a protected file. This way anyone could propose or even implement new tasks for it, and if the current owner goes missing, anyone else could take over as its owner as well. I've started a list of maintenance activities at wikipedia:maintenance/tasklist. IMO, having the normal operation of any of the *fD activities rely on any individual's closed source bot is not a sustainable model. Comments? -- Rick Block ( talk) 01:07, 21 July 2006 (UTC)
This comment is a little different from the others here, as it's not directly a bot request but rather a request for help clarifying whether a bot that can do the following is possible. Any thoughts before I embark on trying to make it are greatly appreciated. Please forgive me if this is the wrong place to raise this. I am in the early stages of designing and building a bot named the Prolificity Sentinel.
In short the bot will flag articles where:
Upon finding a suitable candidate for flagging, it will edit the article's page and give it 'Sentinel Alert Status'. This will be a category all Wikipedians can see and go through.
Anyway, that's a very brief overview; I've discussed it in much more detail here: Prolificity Sentinel. Thanks for your help, and sorry if this is not an appropriate place to ask this. -- WikipedianProlific (Talk) 22:43, 25 July 2006 (UTC)
I guess the three key things I'd need to know per page on the run are:
So per page analysed there'd be three requests for information if it was going to be flagged. I was thinking of setting the bot up with specific parameters, so it would only do a run of, say, 1,000 articles at a time (roughly one per minute over the course of a day?). I can make the bot less server-intensive by having it immediately stop requesting, say, the last talk page edit if it has already found that the main page edit was within the last 6 months. That way the majority of articles would only have one piece of information requested, making sure I don't accidentally DoS the server. Any thoughts on that? Is there a database query that can be made to ascertain the last page edit date? That's the key thing I need to know, and I can't find a script for it if one exists. ta -- WikipedianProlific (Talk) 00:33, 26 July 2006 (UTC)
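On the last-edit-date question: the MediaWiki API's prop=revisions can return just the latest revision's timestamp, which keeps it to one small request per page. A sketch (Python; the endpoint and response shape are those of a standard MediaWiki install, and the function names are illustrative):

```python
import urllib.parse

API = "https://en.wikipedia.org/w/api.php"  # standard MediaWiki endpoint

def last_edit_query_url(title):
    """Build the API query that returns only the latest revision's
    timestamp -- one small request per page."""
    params = {
        "action": "query", "prop": "revisions", "rvprop": "timestamp",
        "rvlimit": "1", "titles": title, "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)

def newest_timestamp(response):
    """Pull the timestamp out of the decoded JSON response."""
    pages = response["query"]["pages"]
    page = next(iter(pages.values()))
    return page["revisions"][0]["timestamp"]
```

Comparing that timestamp against a six-month cutoff would then decide whether the bot needs to make its second request for the talk page.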
I wanted to ask if someone could create a bot to automate the changing of links to the CIA's World Factbook web site.
For example, the current link for Malaysia's entry on the Wikipedia page for Putrajaya points to http://www.cia.gov/library/publications/the-world-factbook/geos/my.html . If you go to that address, the CIA says that the page has been moved. Even worse, it doesn't forward you to the correct page for Malaysia; instead, it redirects you to The World Factbook's front page and you have to navigate to the right page yourself.
I thought they re-did the directory structure or something, but found that the change is much more subtle. They've simply required The World Factbook to be accessed using the secure server method.
So, http://www.cia.gov/library/publications/the-world-factbook/geos/my.html can be successfully viewed at https://www.cia.gov/library/publications/the-world-factbook/geos/my.html .
Can someone create a wikibot to go through the wiki files and change the http://www.cia.gov.... to https://www.cia.gov ?
I figure it'd be more helpful since the CIA doesn't do the forwarding automatically... The wiki community would appreciate it. :-D
Thanks!
Brian -- Bsheppard 23:57, 26 July 2006 (UTC)
Okay, I put in the proposal on WP:BRFA. Feel free to comment, criticize, etc. As a little note about searching for the factbook text, some of those articles already have the http format in place. The bot is going to replace http://www.cia.gov with https://www.cia.gov. Can you guys give me some examples of the NIC thing, so I can add it to the proposal? alphaChimp laudare 22:11, 29 July 2006 (UTC)
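For what it's worth, the scheme swap itself is a one-line substitution, since only http is changing to https and the paths stay the same. A minimal sketch (Python; the function name is illustrative):

```python
import re

def secure_cia_links(wikitext):
    """Rewrite plain-http World Factbook links to https; the path is
    left untouched, since only the scheme changed."""
    return re.sub(r"http://www\.cia\.gov", "https://www.cia.gov", wikitext)
```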
Is there a way to make a tool that will help tell us if two users have ever interacted before? For example, it tells you (and provides diffs) of instances when user A has edited user B's talk page (or any user space page) and vice versa. NoSeptember 08:23, 30 July 2006 (UTC)
I don't know whether this requires a new bot or just an addition to a data file for an existing bot. Anyway, various people add links to various pages in http://www.websearchinfo.com/ e.g in Cloaking. All links to this site are spam e.g. http://www.websearchinfo.com/poker and http://www.websearchinfo.com/cloaking-techniques.
Could we create / amend a bot to revert all links to this site shortly after they are made? Nunquam Dormio 11:54, 30 July 2006 (UTC)
Is there a bot that will do font color changes? That is, change every instance of a six-digit hex string to another six-digit string within a single page. What I am planning to do is explained here, but the first change is still a month or so in the future. NoSeptember 18:18, 24 June 2006 (UTC)
I'm looking to write a bot that will go through all pages (or, for Wikipedia, preferably a dump) and look for references to uploaded .gifs; if found, convert them to .pngs and update the reference. I have some experience in programming, but not with anything of this sort. It seems like a fun project, but I would like some help making it. Martijn Hoekstra 17:33, 19 July 2006 (UTC)
Have you looked at gif2png, which also includes a python web2png, which may do what you want?
moved the following from the talk page to here: Martijn Hoekstra 17:49, 19 July 2006 (UTC)
Such a bot would go around, and whenever the string "ISBN" is followed by 10 or 13 digits, it would compute the special ISBN checksum; if the checksum is invalid, it would leave a comment or a template to the effect that someone needs to check the transcription of the ISBN or fix it. -- maru (talk) contribs 04:50, 26 April 2006 (UTC)
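For reference, the checksum test itself is straightforward. A sketch (Python, function name illustrative) covering both forms: ISBN-10 uses weights 10 down to 1 and must sum to 0 mod 11 (a trailing X counts as 10); ISBN-13 uses alternating 1/3 weights and must sum to 0 mod 10:

```python
def isbn_checksum_ok(isbn):
    """Validate an ISBN-10 (last digit may be X) or ISBN-13 checksum."""
    digits = isbn.replace("-", "").replace(" ", "").upper()
    if len(digits) == 10:
        total = 0
        for i, ch in enumerate(digits):
            value = 10 if ch == "X" and i == 9 else int(ch)
            total += (10 - i) * value
        return total % 11 == 0
    if len(digits) == 13:
        total = sum((3 if i % 2 else 1) * int(ch)
                    for i, ch in enumerate(digits))
        return total % 10 == 0
    return False
```

A transcription error in a single digit always breaks the mod-11 check, which is exactly why flagging rather than auto-correcting is the right move: the bot can prove an ISBN is wrong, but not what it should have been.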
Can someone design a bot that when a user posts an alert on the WP:AIV page, it places this template on the reported users' talk page:
This message is to alert you to the fact that you have been reported to the Administrator Intervention against Vandalism (AIV) page so that your case can be reviewed by an Administrator. They may then impose a block for a period of time on your IP to prevent you from editing in the future. If you wish to contest the merits of the report, please post it under the actual report on the AIV page. Do not remove the initial report, as this will probably not help you in trying to prove you are not a vandal.
The template is {{subst:User:Daniel.Bryant/AIV}}. I'll run it off my main user, or create a new bot account, whichever is easier. Thanks! Killfest2 (Critique my new user page design please) 05:13, 20 July 2006 (UTC)
There are hundreds of articles about holders of the Victoria Cross where the opening statement is a standard phrase:
the highest and most prestigious award for gallantry in the face of the enemy that can be awarded to [[United Kingdom|British]] and [[Commonwealth]] forces
which should be disambiguated to
the highest and most prestigious award for gallantry in the face of the enemy that can be awarded to [[United Kingdom|British]] and [[Commonwealth of Nations|Commonwealth]] forces
Simultaneously, most of these articles could have the reference title Monuments To Courage corrected to Monuments to Courage, and many of them need SCOTLAND'S FORgotten VALOUR (and other eccentric capitalisations) converted to Scotland's Forgotten Valour.
Colonies Chris 22:55, 17 August 2006 (UTC)
I was hoping to get a bot to change all instances of List of professional wrestling throws#Spinebuster slam and Professional wrestling throws#Spinebuster slam to List of professional wrestling throws#Spinebuster. Although it is currently linked as spinebuster slam, there is a nearly universal consensus that the move is always linked simply as spinebuster and never referred to as a spinebuster slam. The problem is that it's a very common link in wrestling profiles, and it would take an exceedingly long time of human interaction to find all instances of the link and remove the word slam from them. Could someone help me and WP:PW out? --- Lid 10:01, 27 July 2006 (UTC)
I would like to request a bot for me that can:
This would be good, since Curps block bot is inactive at the moment. -- TheM62Manchester 12:05, 30 July 2006 (UTC)
Will anyone be willing to create a bot to redirect ticker symbols to their respective companies? - Blackjack48 18:59, 30 July 2006 (UTC)
We put in a request recently for all the articles in Category:Architects and all its sub- and sub-sub-categories to have {{Architecture}} added to their talk pages. I think only the root category got done. Would it be possible to now do the sub- and sub-sub-categories? Many thanks. -- Mcginnly | Natter 23:33, 31 July 2006 (UTC)
A bunch of images are showing up on Category:License_tags. When I looked at the source it looks like some public domain template might have gotten subst'd in rather than being included using PD-whatever resulting in those pages having the License_tags category applied even though it's in a noinclude section. I forget if subst works that way, but that's my best guess.
I created Template:PD-Japan for some of those images and started applying, but I think it might be bottable. If so that would be much simpler than doing it by hand :-). At the least, I would think the noinclude sections could be removed from the image pages using a bot.
RainbowCrane 22:35, 1 August 2006 (UTC)
{{ Hospital-stub}} and Category:Hospital stubs were created today. Articles in Category:Medical organization stubs that have the word "Hospital" in their title need to be re-stubbed from {{ med-org-stub}} to {{ hospital-stub}}. I count over 200 such articles in Category:Medical organization stubs. If a bot could please do this it would save a lot of time. Kurieeto 18:25, 3 August 2006 (UTC)
I think there should be a bot that fixes spelling mistakes and typos.
There are still lots of such maps not moved to Commons. Is it possible for a bot to do it? If the bot can't find the maps which haven't been moved, I can find some of them. Paweł ze Szczecina 13:34, 6 August 2006 (UTC)
Hi! I would like to ask someone to add missing interwiki links to the Polish Wikipedia in Category:Asteroids. Those articles appeared on pl.wiki just a few days ago. I was asked to do this with my bot, but it doesn't run here yet. Asteroid articles are named identically on both wikis and we now have every article that you have, so it shouldn't be that hard a task. :) And sorry for my probably poor English. Thanks a lot for help, Jozef-k 10:33, 7 August 2006 (UTC)
Regarding this discussion, please someone move every article in Congenital genetic disorder category to Genetic disorder category. Thanks! NCurse work 19:36, 7 August 2006 (UTC)
We all know OrphanBot removes unsourced images. The thing is, sports logos are exempt, because it being a logo is a source in itself. Recently, OrphanBot removed a logo for the Pee Dee Cyclones, and it was in a teamtable and everything. Therefore, I request the installation of FreedomBot, a bot that will undo any damage OrphanBot may unintentionally do when someone feeds it a little too many cookies (restore sports logos OrphanBot may have removed-as long as they're sourced). Tom Danson 14:52, 8 August 2006 (UTC)
lol i love the name already :P
Anyway.. I was going through some random pages and I saw Turkey slap, and on the talk page I noticed it had survived an AfD, but all that was placed was this [8]. So I changed it into the right header [9], and it prompted me to think there must be heaps more of these, hundreds possibly, on Wikipedia. I think we need a bot to either a) put the notice on talk pages, or b) change the plain text into the proper format. I would be happy to run the bot. I welcome feedback. Thanks -- Deon555| talk| e| Review Me! :D 05:16, 9 August 2006 (UTC)
Eagle_101 has created this [10]. Thanks anyway :) -- Deon555| talk| e| Review Me! :D 03:14, 10 August 2006 (UTC)
I'd like to request an image-trawling bot to do some indexing for me. Now, I'll explain with the first one I'd like to go with. Start with linksearch. Go to each image page and log if that image is not in Category:NASA images. I want to try to generate a list of images that reference a NASA image page but don't have the correct image classification. Post the results somewhere (probably not on this page, of course). I'd prefer wikiformatting, like * [[:Image:blah]]\n for each image, to make it easier, but whatever. I just need to get a list of images that refer to that link and are not in that category, so I can analyse them for retagging. Note that Special:Linksearch needs to be screen-scraped; &action=raw and the RSS and Atom feeds don't work for it. Don't forget to restrict results to the Image: namespace. If this works, I may request a different URL and a different category to be compared in the same manner. Thanks! -- Kevin_b_er 05:22, 10 August 2006 (UTC)
I'm attempting to restart this project, and was wondering if a bot could regularly run on a list of its subpages (such as Wikipedia:WikiProject Deletion sorting/UK) to remove transcluded deletion discussions that have been closed. It's not ready for the bot to start yet, I just want to find out if it's feasible/easy, and if anyone is willing to do it. the wub "?!" 14:42, 10 August 2006 (UTC)
Quite simple really: a bot that checks all edits to User: and User_talk: pages and user space for swears/insults/racism etc., and ****'s them out. It would be optional and would only watch user pages that are listed in the bot's user space. I thought this would be useful for admins and for users that receive a lot of vandalism/attacks. Good idea? -- Andeh 18:21, 10 August 2006 (UTC)
I've found a bug in this template. I fixed it, but there are a ton of pages that used the broken template. Here's what I've done, and here's the list of sites that need to be fixed. Is this a thing a bot could fix, or does this need to be done by hand? Lawilkin 22:34, 14 August 2006 (UTC)
The content of Athletics was recently moved to Athletics (track and field), and a disambiguation page placed at Athletics, after a lengthy discussion. However, many pages on track and field still link to Athletics. We need a bot that would redirect appropriate links from Athletics to Athletics (track and field), using the pipe trick to keep the wording the same in the text. -- Mwalcoff 01:20, 15 August 2006 (UTC)
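The rewrite needs two patterns: bare [[Athletics]] links get a pipe added so the visible text stays the same, and already-piped links just get their target swapped. A sketch (Python; the function name is illustrative, and a real run would also need to leave links to other "Athletics ..." titles alone):

```python
import re

def retarget_athletics(wikitext):
    """Point [[Athletics]] links at the new title while keeping the
    visible text unchanged (the 'pipe trick' effect)."""
    # bare link: [[Athletics]] -> [[Athletics (track and field)|Athletics]]
    wikitext = re.sub(r"\[\[Athletics\]\]",
                      "[[Athletics (track and field)|Athletics]]", wikitext)
    # piped link: [[Athletics|shown text]] -> retarget, keep shown text
    wikitext = re.sub(r"\[\[Athletics\|",
                      "[[Athletics (track and field)|", wikitext)
    return wikitext
```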
There's probably already a generic bot that can do this, but I need a bot to add the {{ WikiProject Kentucky}} template (if it or {{ LouisvilleWikiProject}} isn't there already) into the talk pages of articles listed in specific categories: Cities in Kentucky, Kentucky counties, Towns in Kentucky, and Unincorporated communities in Kentucky. Doing this manually has become too much of a chore. Thanks! Stevie is the man! Talk • Work 17:41, 15 August 2006 (UTC)
Firefox has unfortunately taken a leave of absence. Does anyone else have the capability to run a bot in the manner I heretofore described? Thanks! Stevie is the man! Talk • Work 14:58, 18 August 2006 (UTC)
I can take over; just let me get it approved. Betacommand 15:54, 18 August 2006 (UTC)
A couple weeks ago, I noticed a significant disparity between a Wikipedia page and the Meta master page. I updated it, but I notice there are a lot of such pages, and many are not kept up-to-date. I suggest a bot to update page copies on a regular basis.
The pages in question are most—but not all—of the pages listed here and here.
Now it's worth noting that people do edit these pages, despite the prominently displayed message suggesting that they refrain from doing so. It may be worth the time to transfer significant edits back to the Meta copies beforehand, if they are obviously good additions to the article. I don't know if I'd want to do all of that myself, though, so someone else would have to be interested as well.
Any regular bot activity should be announced on the talk page, I think, to discourage anyone making changes that may be quickly overwritten. -- Tsuji 02:16, 17 August 2006 (UTC)
I've run into a number of people lately whose idea of fun is editing numbers on Wikipedia (and other information that most people won't recognize as vandalism, but numbers are the most obvious) just for the hell of it. Some do it every day as a hobby. They usually switch usernames often, or use anon/proxy.
It would be very useful to have a bot that downloads the Wikipedia database (to avoid undeserved load on the real one) and searches the entire edit history of Wikipedia for edits that:
These edits could then be checked out by Wikipedians to see if they're vandalistic or not. Dark Shikari talk/ contribs 00:32, 2 August 2006 (UTC)
Discussion of this proposed bot's function can also be found at Wikipedia talk:Reference desk.
Cleaning and maintenance of the Reference Desk was formerly performed on a daily basis by the now defunct Crypticbot. Due to the incredibly large size of reference desk archive pages, Crypticbot began to have problems archiving the questions. In order to make the reference desk easier to use, the way in which archive pages are managed was recently changed. Consequently, a new bot will need to be designed to handle daily maintenance tasks. Each day at approximately 00:00 UTC, the bot would have 18 tasks to complete: three pages must be modified (main page, monthly archive page, daily transclusion page) for each of the six reference desks (Humanities, Science, Mathematics, Computing, Language, Miscellaneous).
As such, the bot must be able to complete the following tasks for each of the six reference desks on a daily basis at approximately 00:00 UTC:
At the start of each month, the bot would have six additional tasks to perform to create a new monthly reference desk archive page with {{ Reference desk navigation}} for each of the six reference desks.
Detailed Example of Bot's Function:
At 00:00 UTC on August 16, 2006, for the Miscellaneous reference desk the bot would:
<noinclude> {{subst:Reference desk navigation |previous = Wikipedia:Reference desk archive/Miscellaneous/2006 August 13 |date1 = August 13 |next = Wikipedia:Reference desk archive/Miscellaneous/2006 August 15 |date2 = August 15 |type = Miscellaneous }} </noinclude>
<!--werdnabot-archive-->
= August 14 =
[[Wikipedia:Reference desk archive/Miscellaneous/2006 August 14]]
#A type of chair
#Male Orgasm
#World Trade Center Movie
#edits
#Maps from Nationalatlas.gov
#Clitoral Hood Piercing
#Guitar
#Alexander Graham Bell
#Cruise control on the 1998 ford windstar
#Gangster Chronicles TV Series
#Physics of a bullet
#T.E.A.M.
#Who would be richest?
#My surname is Bencko.
#Top Hats
#pounds to dollars
#The New York Pass
In addition to its normal daily tasks, at 00:00 UTC on the third of each month, the bot would need to create a new monthly reference archive page for each of the reference desks. After creating all six monthly pages, the bot would perform its normal daily duties.
For example, at 00:00 UTC on September 3, 2006 the bot would create Wikipedia:Reference desk archive/Miscellaneous/September 2006 as a new page containing the following text:
<noinclude> {{subst:Reference desk navigation |previous = Wikipedia:Reference desk archive/Miscellaneous/August 2006 |date1 = August |next = Wikipedia:Reference desk archive/Miscellaneous/October 2006 |date2 = October |type = Miscellaneous }} </noinclude>
-- C. S. Joiner ( talk) 23:07, 15 August 2006 (UTC)
<noinclude> {{subst:User:71-247-243-173/RDmonthly |previous = |date1 = |next = |date2 = |type = }} </noinclude>
is now being used for monthly archives. The changes were to decrease the work load so that it could be done by hand until a bot was made available-- VectorPotential71.247.243.173 21:16, 4 September 2006 (UTC)
There are a bunch of articles in this category: Category:Articles_with_invalid_ISBNs that have this reference in them:
''Naval wars in the Levant 1559-1853'' - R. C. Anderson [ISBN 0-87839-799-0]
The correct reference for the 2005 edition (the only one with an ISBN I could find) is:
Anderson, R. C. (2005), Naval wars in the Levant 1559-1853, Martino Pub, ISBN 1578985382
or, wiki-style:
{{Harvard reference|ISBN=1578985382|Title=Naval wars in the Levant 1559-1853|Given1=R.C.|Surname1=Anderson|Year=2005|Publisher=Martino Pub|Location=Mansfield Center, [[Connecticut]]}}
Is it possible for someone to do a mass search and replace on that reference in the bad-ISBN category? It looks like all of the bogus ISBNs were added by [[User:SpookyMulder]] on September 4, if that helps.
Examples:
Thanks RainbowCrane | Talk 02:29, 25 August 2006 (UTC)
Can someone write a bot that parses a page in the form (...)(...) and prints the results on another page? The format of (...) is (quality and importance and name). Ratings could be posted on another page. The rating of an article is the average of all the user ratings. There is a quality rating and an importance rating (quality and importance are numbers). The URL for the ratings is http://en.wikipedia.org/wiki/User:Eyu100/Bot_area, but there are no ratings yet. This bot will be used for the Wikisort project. Eyu100 18:08, 25 August 2006 (UTC)
Occasionally, when browsing categories, I find user pages that are tagged under some of the encyclopedic categories. These pages are usually user sandboxes or "Works in progress" of articles that they copied to their user space to thoroughly revise incrementally. Now, I'm still quite new at Wikipedia, so I may be missing something here... but I am surprised that the encyclopedic categories aren't hardcoded to skip User: Space pages.
In lieu of such a change, it seems like it would be simple to have a bot go through the categories, and when it finds an article linked to User: Space, it would go and tack <nowiki> tags around the [[Category:(.*)]] links. Of course, templates that add categories, like {{stub}} and such would make things more difficult, but I imagine the most common templates could be similarly coded into the bot.
Does such a bot exist? Is there a particular reason why it doesn't? Just a few thoughts. Matt B. 06:16, 31 August 2006 (UTC)
At Wikipedia:Categories for discussion we're entering a terrible logjam caused by my taking on the responsibility to ensure Wikipedian user categories all had "Wikipedian" in the title. These are many hundreds of categories, and some of the more traditional CfD bots (notably Cydebot) can't handle user categories. We're only about a week out from having most of these approved for renaming and deletion, but we have fewer resources for actually carrying out the renaming and deletion. So if anyone has a bot that can handle this task, I encourage you to go to Wikipedia:Categories for discussion/Working and help out. Thanks!-- Mike Selinker 20:40, 3 September 2006 (UTC)
PLEASE PLEASE PRETTY PLEASE! See my request for bureaucratship to see what I am all about. I just think that I could program a bot-above-all-bots bot that picks up all vandalism and nothing BUT vandalism! -- Hi its mina19_1919! 03:57, 5 September 2006 (UTC)
Is there a bot that can add the {{Wikipedia:WikiProject Sharks/SharksTalk}} template to the top of all of the talk pages of the articles in Category:Sharks? And if there is, is there also one that can replace {{portalpar|Sharks}} with {{Sharksportal}} for all of the articles in Category:Sharks? If an article doesn't have {{portalpar|Sharks}}, could it just add {{Sharksportal}} below the taxobox?
Is this easy to do? -- chris_huh 13:29, 5 September 2006 (UTC)
I have created WikiProject Pittsburgh, and for the organizational work I am trying to do, the Pittsburgh links would need to be changed to [[Pittsburgh, Pennsylvania|Pittsburgh]]. This is the only Pittsburgh in existence; all other "-burghs" in the United States dropped the last "h" at the turn of the 20th century, so anything marked "Pittsburgh" actually refers to Pittsburgh, Pennsylvania. -- Chris Griswold ( ☎ ☓) 21:50, 5 September 2006 (UTC)
Could someone design a bot to replace things like "User:Akrabbim/Asplode" (which redirects to User:UBX/Asplode) with "User:UBX/Asplode"? I've been working on it with AWB, but I don't have enough time, as there are hundreds of transclusions. I think it would be a relatively simple bot task, as it really only requires a few simple replacements. The only pages that need it are User:Akrabbim/Asplode, User:Akrabbim/Earthling2, User:Akrabbim/Emptybox, User:Akrabbim/No secondhand smoke, and User:Akrabbim/Towel. I'm just moving them into the User:Akrabbim/UBX subuserspace. It would be appreciated. — Akrabbim talk 16:05, 6 September 2006 (UTC)
I would like a bot to check links to all Connecticut cities & towns and create the standard redirects [[town, CT]] and [[town (CT)]] if they haven't yet been created. I have noticed that these have not all been finished for Connecticut. -- Schzmo 11:55, 31 August 2006 (UTC)
According to Wikipedia_talk:Redirect, redirects may now contain multiple lines (by design) and categories, but an unforeseen side-effect is that the developers did not know redirects contained templates such as {{ R from misspelling}} and the other contents of Category:Redirect templates (and they may break this functionality in the future). There are currently 52 templates, 34 of which may have no current transclusions, and ~15,000 instances that need to be substituted in - all they contain is the categories Category:Redirects from X and Category:Unprintworthy redirects, to my knowledge. -- nae' blis 19:23, 31 August 2006 (UTC)
Can someone please write a bot that could tag articles in Category:Rapid transit and its subcategories with {{TrainsWikiProject|Subway=yes}}? I'd do it myself, but the perl module used appears to have a steep learning curve (that, and the fact that I have not written a perl script for ages). Thanks! -- Selmo ( talk) 00:03, 2 September 2006 (UTC)
Does anyone know of a Bot that I could request to add a project tag {{ Project North Carolina}} to every article talk under a Category (including Sub-Categories) - (North Carolina), if the tag does not already exist on the article? Looks like Selmo has the same request. Thanks Morphh 04:04, 4 September 2006 (UTC)
I've also requested feedback on this idea from User:Beland:
"Hi,
I had an idea to sort Category:Pages needing expert attention according to expertise. That way, Wikiprojects could easily track articles needing expert attention in their field. I think expert attention will get more input from wikiprojects than general cleanup. I've already begun "Pages needing attention from expert in medicine" manually, but if a bot could detect categories, it could split up the expert-category entirely into subjects (if necessary, with a complete list of pages needing expert attention in place).
a) I'm not sure whether you already plan to implement such functions into Pearle. b) If you like the idea, I'd be interested in running a clone of Pearle for this task. Or maybe Pearle could be expanded.
Anyway I have no experience with bots. "
-- Steven Fruitsmaak ( Reply) 12:06, 4 September 2006 (UTC)
Hi everyone. I'm a member of wikiproject Writing Systems. A while back I created the template {{ wsproj}} with the intent of adding it to every article under the topic of writing systems, but after looking at Category:Writing systems, I realised it would take an extremely long time to do and misleadingly inflate my edit count. Is there a bot that could add this template to the talkpage of every article under Category:Writing systems and its sub-categories? The ikiroid ( talk· desk· Advise me) 23:04, 7 September 2006 (UTC)
There's been some talk at CFD about having a bot patrol Category:Protected deleted categories to make sure the categories stay empty. Does anybody have a bot that can do that? - EurekaLott 02:56, 6 September 2006 (UTC) (copied from Wikipedia_talk:Bots#Category:Protected_deleted_categories TimBentley (talk) 16:10, 8 September 2006 (UTC))
Per the MoS (dates and numbers), dates should read September 10 or January 1 instead of September 10th or January 1st, yet many pages still use the "th", "rd", and "st" suffixes for dates. Is it possible to get a bot to fix this? Either by going through the articles or by going to the "What links here" page for each of the incorrect date formats and changing the linked pages? Not only would September 10 have to be checked but also 10 September. Dismas| (talk) 07:46, 10 September 2006 (UTC)
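A first pass at this could be a set of regular expressions over the wikitext that strip ordinal suffixes from linked dates. A rough sketch (the function name is hypothetical, and real date links have more variants than these three patterns cover):

```python
import re

MONTHS = ("January|February|March|April|May|June|July|"
          "August|September|October|November|December")

# "[[September 10th]]" -> "[[September 10]]"
ORDINAL_IN_LINK = re.compile(
    r"\[\[((?:%s) \d{1,2})(?:st|nd|rd|th)\]\]" % MONTHS)
# "[[10th September]]" -> "[[10 September]]"
ORDINAL_IN_LINK_DMY = re.compile(
    r"\[\[(\d{1,2})(?:st|nd|rd|th)( (?:%s))\]\]" % MONTHS)
# "[[September 10]]th" -> "[[September 10]]" (suffix outside the link)
ORDINAL_AFTER_LINK = re.compile(
    r"(\[\[(?:%s) \d{1,2}\]\])(?:st|nd|rd|th)" % MONTHS)

def strip_date_ordinals(text):
    """Remove st/nd/rd/th suffixes from wikilinked dates."""
    text = ORDINAL_IN_LINK.sub(r"[[\1]]", text)
    text = ORDINAL_IN_LINK_DMY.sub(r"[[\1\2]]", text)
    return ORDINAL_AFTER_LINK.sub(r"\1", text)
```

A real run would still want a human-reviewed diff, since dates inside quotations or titles must not be touched.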
AWB request: http://en.wikipedia.org/?title=Special:Whatlinkshere/Category:Evolution_Wikipedians&limit=500&from=0 - remove this category from these places, it's been deleted by CfD CfD: http://en.wikipedia.org/wiki/Wikipedia:Categories_for_deletion/Log/2006_May_10#Category:Evolution_Wikipedians
Thanks! JesseW, the juggling janitor 21:27, 13 September 2006 (UTC)
Berria requested:
Adding 31 slightly varying interwikis seems like the perfect job for a bot, so I'm posting it here... JesseW, the juggling janitor 22:06, 31 August 2006 (UTC)
A bot could create a list of proposed edits, and an admin could approve them manually, but that makes it a lot harder. HighInBC 19:15, 15 September 2006 (UTC)
A while back Template:Cite journal was forked to create Template:Cite journal2, the only difference being that "In contrast to cite journal, cite journal2 omits the quotation marks around the article title." Since this fork, an option was coded into cite journal so that the quotation marks can be removed from individual usages of the template. Would a bot be able to change all usages of {{ cite journal2}} to {{ cite journal}}, copying over the existing information for each usage of the template and adding "|quotes=no" at the end? For example,
* {{cite journal2 | author=Könneke, M., Bernhard, A.E., de la Torre, J.R., Walker, C.B., Waterbury, J.B. and Stahl, D.A. | title=Isolation of an autotrophic ammonia-oxidizing marine archaeon | journal=Nature | year=2005 | volume=437 | pages=543-546}}
needs to be changed to:
* {{cite journal | author=Könneke, M., Bernhard, A.E., de la Torre, J.R., Walker, C.B., Waterbury, J.B. and Stahl, D.A. | title=Isolation of an autotrophic ammonia-oxidizing marine archaeon | journal=Nature | year=2005 | volume=437 | pages=543-546 | quotes=no}}
For comparison, the two templates render identically except that {{ cite journal2}} omits the quotation marks around the article title. {{ cite journal2}} can then be deprecated. I've contacted the original author of the fork, and they have agreed to the merge (see User_talk:Stemonitis#Template:Cite_journal2). About 120 pages would be affected by this change, with multiple instances of the template in use on each page. Thanks. Mike Peel 21:43, 14 September 2006 (UTC)
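The substitution described above is mechanical enough to sketch. Assuming the citation contains no nested templates (true of the example before/after pair given), a hedged Python sketch:

```python
import re

# Matches a whole {{cite journal2 ...}} transclusion; assumes no
# nested templates inside the citation parameters.
CITE2 = re.compile(r"\{\{\s*cite journal2\s*(\|[^{}]*)?\}\}", re.IGNORECASE)

def convert_cite_journal2(text):
    """Rewrite {{cite journal2|...}} as {{cite journal|...|quotes=no}}."""
    def repl(m):
        params = (m.group(1) or "").rstrip()
        body = "{{cite journal " + params if params else "{{cite journal"
        return body + " | quotes=no}}"
    return CITE2.sub(repl, text)
```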
Table requested:
Any help appreciated
I've moved a few userboxes onto my user space, and I need a bot to update them.
Thanks! Laur ə n whisper 18:24, 20 September 2006 (UTC)
I am not certain if this is the right place to post this question (if it is not, please move it to the right place and drop a note on my talk page).
The editor Sheynhertz-Unbayg has recently been banned and now his contributions (mostly weird "onomastics" pages that are just concatenations of several disambiguation pages, see Lust (onomastics) for a typical example) need to be cleaned up. To ensure that all pages he has edited do get checked, I would like to have a bot- or script-generated list of all pages he has touched. As he has more than 20,000 edits, manually created lists like the one here are probably incomplete. Also, a centralized list would help avoid duplicated efforts from the people who check the pages.
Please create a list of all pages touched by this editor and drop it somewhere, for example at User:Kusma/Sheynhertz/contribs. A good format would be a bulleted list with wikilinks to the pages, perhaps with a "redirect=no" or a mentioning of the redirect target for the numerous redirects created.
In addition, a bot could be used to check all of the interwikilinks created by Sheynhertz. I have already removed dozens of links to nonexistent articles on the Japanese Wikipedia, and I expect that many more of his interwikis are wrong.
Thank you for any help or insight you can offer, Kusma (討論) 08:28, 21 September 2006 (UTC)
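A list like that can be pulled straight from the MediaWiki API (list=usercontribs). A sketch follows; the function names are hypothetical, the colon-prefixed links keep category and file pages from being transcluded, and the requested redirect=no handling is left out for brevity:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def fetch_contrib_titles(username):
    """Page through list=usercontribs and return the distinct page
    titles this user has touched (the API returns at most 500 per
    request, hence the continue loop)."""
    titles, cont = [], {}
    while True:
        params = {"action": "query", "list": "usercontribs",
                  "ucuser": username, "uclimit": "500",
                  "ucprop": "title", "format": "json", **cont}
        url = API + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        titles += [c["title"] for c in data["query"]["usercontribs"]]
        cont = data.get("continue", {})
        if not cont:
            break
    return sorted(set(titles))

def format_contrib_list(titles):
    """Render the titles as the requested bulleted wikilink list."""
    return "\n".join("* [[:%s]]" % t for t in titles)
```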
There ought to be a bot that automatically removes common things used for testing, such as Bold text, Italic text, [[Link title]], [http://www.example.com link title], [[Media:Example.ogg]], Image:Example.jpg, #REDIRECT [[Insert text]] within other text, etc. from the article space. -- Gray Porpoise 10:48, 20 September 2006 (UTC)
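Such a bot could key off the literal placeholder strings the edit toolbar inserts. A minimal sketch (the exact debris list is an assumption, and removing a #REDIRECT line from a genuine redirect page would be wrong, so a real bot needs more context checks than this):

```python
import re

# Literal strings the edit-toolbar buttons insert; an edit that merely
# leaves these behind is almost certainly a test edit.
TOOLBAR_DEBRIS = [
    "'''Bold text'''",
    "''Italic text''",
    "[[Link title]]",
    "[http://www.example.com link title]",
    "[[Media:Example.ogg]]",
    "[[Image:Example.jpg]]",
    "#REDIRECT [[Insert text]]",
]

def strip_toolbar_debris(text):
    """Delete toolbar placeholder strings, collapsing leftover blank runs."""
    for junk in TOOLBAR_DEBRIS:
        text = text.replace(junk, "")
    return re.sub(r"\n{3,}", "\n\n", text).strip()
```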
Is there any way a bot can go through and look for all talk pages without an associated article/image and tag them with Template:db-talk? I've encountered a large number of these and I can foresee a bot finding a few thousand. VegaDark 20:18, 22 September 2006 (UTC)
This page may meet Wikipedia's speedy deletion criteria, as it is a talk page of a page which does not exist ( CSD G8). This notice was added by a bot. There are three reasons why a talk page may exist without an article. Only delete this page if it clearly does not meet any of those three criteria. |
I was wondering if a bot could do the same for Category:Articles to be merged as it did for Category:Category needed and Category:Articles that need to be wikified, i.e. sort them out per month. There is currently a backlog of close to 11,000 (!!) articles, and it would help a lot if you could quickly see how long the merge tag has already been on an article. Garion96 (talk) 00:13, 18 September 2006 (UTC)
A while ago, I made maps for all of the cities and towns in Indiana, and started semi-automatically adding cityboxes to each article. I got part way through the 'L's but I have not worked on it for a year now. It would be helpful for someone to finish the rest of the pages. The red-dot maps are located here [11]. - Marvin01 | talk 00:49, 21 September 2006 (UTC)
Hi, could someone please replace mammal-stub with bat-stub for the articles listed at http://en.wikipedia.org/wiki/User:Eug/Bat-Stubs ? Eug 13:43, 27 September 2006 (UTC)
Per the MoS (dates and numbers), dates should read September 10 or January 1 instead of September 10th or January 1st, yet many pages still use the "th", "rd", and "st" suffixes for dates. Is it possible to get a bot to fix this? Either by going through the articles or by going to the "What links here" page for each of the incorrect date formats and changing the linked pages? Not only would September 10 have to be checked but also 10 September. Dismas| (talk) 17:59, 21 September 2006 (UTC)
Following Wknight94's suggestion, I would like to request a bot whose function would be to create redirect pages for articles whose names contain diacritics. For example, the bot could verify if the article České Budějovice can be redirected from Ceske Budejovice. If not, the latter would be created.
This could prove useful because many articles lack these redirect pages and are hard to find for those who do not possess the diacritics on their keyboards. Recently I had to create redirect pages for nearly all Portuguese municipalities. A bot could perform these tasks much more efficiently. If there is someone interested in creating this bot, thank you in advance. Regards.-- Húsönd 22:44, 30 September 2006 (UTC)
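The plain-ASCII title can be derived mechanically: Unicode NFKD decomposition splits each letter from its diacritic, and dropping the combining marks yields the fallback form. A sketch of that step (letters such as ø or ß do not decompose this way and would need a separate mapping):

```python
import unicodedata

def ascii_fold(title):
    """Strip combining marks: 'České Budějovice' -> 'Ceske Budejovice'."""
    decomposed = unicodedata.normalize("NFKD", title)
    return "".join(c for c in decomposed if not unicodedata.combining(c))

def redirect_needed(title):
    """Return the plain-ASCII redirect title to create, or None if the
    title already folds to itself (no diacritics to lose)."""
    folded = ascii_fold(title)
    return folded if folded != title else None
```

A bot would then create the returned title as a redirect only after checking that no page already exists there.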
Hi. I wonder if there's a bot with the time and inclination to go around all the articles linking to Church of Jesus Christ of Latter-day Saints and change those links to point at The Church of Jesus Christ of Latter-day Saints. There are between 1500 and 1600 such pages, so it's a bit of a task to do by hand. Thanks! - GTBacchus( talk) 01:24, 28 September 2006 (UTC)
I would like permission to run Firefoxbot using AutoWikiBrowser in automatic mode. The Fox Man of Fire 14:37, 6 October 2006 (UTC)
We still need an archive bot; our current system is starting to break down, and the last bot we had was CrypticBot, so we've been doing it all by hand for quite a while now. Since our old request has long since been archived off this page (by an archiving bot, oh the irony o:), I decided to repost a less involved version of the same request here-- VectorPotentialRD NEEDS A BOT (-: 13:01, 1 October 2006 (UTC)
Here is a new REVISED bot request. I have created a demo reference desk that could be used to test implementation of an RD bot using a slightly updated layout. It will be proposed by a few of the RD editors (including me) only once a bot is working for it, because of the increased number of desks to manage manually. Please read User:Freshgavin/Sandbox/Reference_desk_bot_request for more details about the changes that will be proposed, and for a detailed summary of the requirements for the bot.
The reference desk now relies on a few diligent editors for manual archiving, and there are a lot of people who would really appreciate a bot to help us with this task. Any suggestions or ideas would be greatly appreciated. Questions and comments about the new layout should be posted on this talk page, and those about the current system should be posted here. freshofftheufo ΓΛĿЌ 07:03, 4 October 2006 (UTC)
Off-topic question: There's an MSN bot that lets you get information from Encarta. Are there Wikipedia IM or SMS bots? Such a bot, if it doesn't already exist, would be mighty useful. IM bots would let people get information from Wikipedia without opening a web browser. And an SMS bot would let people get info anytime, anywhere, for just the price of an SMS text message. Anybody know if these exist? If not, would it be appropriate for me to file a bug on MediaZilla, or would this request be off-topic there too? :-) Cheers, -- unforg e ttabl e id | talk to m e 00:54, 5 October 2006 (UTC)
The templates {{user ara}}, {{user Arab}}, {{user cyr-1}} etc. shall be replaced with the parameterized template:user iso15924. Parameters are given below. Some templates may be included via Template:Babel. AFAIK these cannot be replaced by a bot and will be done by hand. The request was developed at Wikipedia_talk:Userboxes/Writing_systems#Bot_request Tobias Conradi (Talk) 16:05, 25 September 2006 (UTC) The 15 replacements are as follows:
thanks a lot for your help! Tobias Conradi (Talk) 01:33, 10 October 2006 (UTC)
I'm proposing a bot that patrols articles for creation and starts new articles for unregistered users. I understand that this defeats the purpose of the restriction that only registered users can create articles, and that this bot can be easily abused. However, this bot may not be such a bad idea if the following measures were in place:
Anonymous editors would also be able to create pages by accessing the bot interface on an off-wiki site.
Any thoughts on this? -- Ixfd64 05:47, 9 October 2006 (UTC)
I'd like to ask a bot to help the Medical genetics project. We've finished the article rating mostly and we'd like a bot to tag every unassessed articles in Category:Medical genetics with {{MedGen|class=unassessed}}. Thanks in advance. NCurse work 18:30, 12 October 2006 (UTC)
At Wikipedia:WikiProject Bedfordshire/Infobox status we are trying to create a list of all pages and their infobox status. We have a template which is placed on the talk page of all these articles that contains an infobox status and automatically puts them in one of four categories. We would like a bot that automatically creates the list daily. If you want to help, please leave a note on the talk page of that page for more information. Thanks. Lcarsdata ( Talk) 14:54, 13 October 2006 (UTC)
Hello everyone, how are you doing? I'd like to request a bot for WikiProject Indonesia. With its current number of members, it is getting hard for me to post a message on each talk page. For now, I'd like the bot to run every week, to deliver weekly notices (but I can change it, right?). Thanks in advance -- I mo eng 14:11, 15 October 2006 (UTC)
I'm looking for some help converting all the article titles, links, and non-linked mentions of a great number of Japan-related articles which, when macrons (e.g. ō and ū) are taken into account, need to be respelled. The greatest congestion of these, I think, comes from the ships of the Imperial Japanese Navy. Right now, I have a very short list of names that need to be changed, but as I look into each individual ship's Japanese name and how it ought to be spelled, I'll be adding to the list of those that need renaming. The number of ships isn't too great - those that need renaming hopefully do not number more than 30-50. But if each of those is linked to by 10 articles, that's 300-500 right there. Please let me know what to do or who to talk to. Thanks for the help. LordAmeth 23:51, 15 October 2006 (UTC)
Would there be any way to create a bot which would be able to indicate which pages have been added to the listing of pages here or elsewhere, over, say, the last month? Maybe a two-section format listing all the pages in one column and another listing either those which existed (or were revised) before a given date or were created after a given date might be easiest. Also, the bot could potentially be used to determine which pages are "stable", which is to say, not modified over a given period. Thanks for your response, positive or negative. Badbilltucker 16:16, 16 October 2006 (UTC)
I'd like a bot to go through all the articles linked to in this template and add this template to the bottom with {{ fb start}} and {{ fb end}} around it as is standard with football templates. If {{ fb start}} and {{ fb end}} are already present, just add this template in front of {{ fb end}}. Not too hard to do I hope? - MTC 11:43, 16 October 2006 (UTC)
Just like repeatedly-recreated unwanted articles are tagged with {{ deletedpage}} and protected, some unwanted categories are protected with {{ deletedcategory}}. However, given the way the cat system works, this doesn't stop people from adding articles to the cat. Hence, a bot is requested to regularly (e.g. weekly) empty these deleted categories. >Radiant< 12:04, 18 October 2006 (UTC)
User:RobotG currently clears categories in Category:Protected deleted categories. Any Category tagged with {{ deletedpage}} (or the now-redirect {{ deletedcategory}}, is placed in this category. — Centrx→ talk • 23:58, 19 October 2006 (UTC)
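Emptying such a category reduces, per member page, to deleting its category tag from the wikitext. A hedged sketch of that one step (assumes a literal [[Category:...]] tag; categories added by templates would not be caught this way):

```python
import re

def remove_category(wikitext, category):
    """Strip [[Category:Name]] tags (with optional sort key) from a page."""
    pattern = re.compile(
        r"\[\[\s*Category\s*:\s*%s\s*(?:\|[^\]]*)?\]\]\n?" % re.escape(category),
        re.IGNORECASE)
    return pattern.sub("", wikitext)
```

A bot would run this over every member of each protected deleted category on its weekly pass and save only pages whose text actually changed.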
Do we have a welcome bot, that adds welcome templates to new user pages? Seems like a good idea to me, it was discussed on the mailing list somewhere. Mind, it seems such a good idea it's probably been discussed before. Hiding Talk 10:12, 20 October 2006 (UTC)
I've learned that the {{ IPA}} template is used to enable phonetic symbols to appear as they should, and not as little squares, in IE6. A bot to do a mass conversion of "hard" phonetic symbols to {{ IPA}} template-formatted phonetic symbols would be useful. Tawagoto 01:47, 16 October 2006 (UTC)
Can someone please write for me a bot that will pick up NPOV breaches and is shut-off compliant.
Thanks
Nathannoblet 04:29, 22 October 2006 (UTC)
Can someone write for me a bot that removes red links, red templates, and red categories from articles (except Template:Red link)? It would turn "{{ red link}}" into "this" for red links, and remove red templates and red categories straight from the article. -- AAA! ( talk • contribs) 11:45, 26 October 2006 (UTC)
Hi. At this page, you can see a list of links pointing at List of Ed, Edd 'n' Eddy episodes. That page, however, is a redirect to List of Ed, Edd n Eddy episodes, without the apostrophes. Could someone please sic a bot on that list and fix all the individual episode pages, which are currently double redirects? Thanks. - GTBacchus( talk) 19:34, 26 October 2006 (UTC)
Change all "Major Highways" titles to "Major highways", particularly in counties. -- MNAdam 03:41, 1 November 2006 (UTC)
You mean page titles or in text? Why? Is there a consensus/vote (I SAID THE V WORD ZOMG!) somewhere? ST47 Talk 11:11, 1 November 2006 (UTC)
I have noticed that perhaps six out of seven anonymous users who leave comments on talk pages do not sign their posts properly. I have usually added the {{ unsigned}} message after those posts when I have encountered them. However, this could be a job for a bot: scan the Recent changes list limited to the Talk space, and if a comment is made by an IP-address, check it for a signature and add one if necessary. Of course logged in users also forget the signature sometimes, and those could be checked too, if it doesn't take too much resources. Alternatively only check those users that have not created an user page yet, they are often new to Wikipedia and do not know about signing their posts. Is anyone with the skill/equipment up to this? -- ZeroOne ( talk | @) 20:49, 17 October 2006 (UTC)
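Detection could hinge on the one thing every standard signature ends with: a UTC timestamp. A sketch of that check (ensure_signed is a hypothetical name; {{subst:unsigned}} does take the username/IP and the timestamp as its two parameters):

```python
import re

# A standard signature ends with a UTC timestamp, e.g.
# "20:49, 17 October 2006 (UTC)".
SIGNED = re.compile(r"\d{2}:\d{2}, \d{1,2} \w+ \d{4} \(UTC\)\s*$")

def ensure_signed(comment, username, timestamp):
    """Append {{subst:unsigned}} if the comment lacks a timestamp signature."""
    if SIGNED.search(comment):
        return comment
    return "%s {{subst:unsigned|%s|%s}}" % (comment.rstrip(), username, timestamp)
```

The bot would apply this only to the text actually added in the diff, not the whole talk page, so existing unsigned comments elsewhere are left alone.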
Why not just write it into the program (an auto signature)? If it's not an option to not leave a signature, then a bot wouldn't be needed in the first place. -- MNAdam 23:19, 3 November 2006 (UTC)
How about a manually summoned bot that can suggest images for an article based on images from Commons that exist on copies of the page on other language wikis? -- InShaneee 02:21, 27 October 2006 (UTC)
What I'm after is either guidance on how to write a bot and what I need to run it, or perhaps someone to set up a bot that would run through the various comic stub categories and tag them as stubs for the 1.0 assessment. I get some webspace with my ISP that includes cgi space; I don't know if that's enough to host a bot, but I'd be interested in doing it if someone would hold my hand. Otherwise, if that's impractical or impossible, I'd appreciate someone taking it on. The categories are Category:Comics stubs and sub-categories, and the code that needs to be added or amended on an article talk page is that either {{Comicsproj|class=Stub}} needs to be added or, where it exists, {{Comicsproj needs to be amended to {{Comicsproj|class=Stub, leaving the close brackets in case other fields are active. Also, I guess a subpage, um, {{FULLPAGENAME}}/Comments needs to be created with a message, um, Assessed by comics-bot, which automatically tags articles in stub categories as stub class articles. Appreciate thoughts. Hiding Talk 18:23, 30 October 2006 (UTC)
I'm not sure if this would be a good idea or not, but perhaps it would be possible to build a robot to standardize Wikipedia pages (give them similar formatting).
Some points would be:
I'd be interested to see other people's points on this.
Yuser31415 07:43, 3 November 2006 (UTC)
I would like to request a bot that can touch about 80,000 pages on Wikispecies. I am one of the admins on Wikispecies, and we're going through some major changes. We do have one registered bot, but it stopped working, for an unknown reason. Perhaps a 'techy' is able and willing to do some standard changes. In principle, it would need to delete '::::' colons out of taxonavigation sections. Perhaps also a check on a certain layout; if a page does not fit the standard layout, add a category (or fix the issue if possible). Help would be highly appreciated. -- Kempmichel 10:21, 3 November 2006 (UTC) ( Wikispecies:User:Kempm)
I have some annoying hiccups sometimes, but that seems quite normal. So far I have received 20,000 e-mails from your edits :) Is that how many you did? -- Kempmichel 17:58, 6 November 2006 (UTC)
I've just come back from fixing about 16 double redirects. Could someone write up a bot for me that fixes double redirects (If it's possible)? -- AAA! ( talk • contribs) 08:40, 6 November 2006 (UTC)
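The core of a double-redirect fixer is graph-chasing: follow each redirect's target until it lands on a non-redirect, and bail out on circles. A sketch of that logic over an already-fetched redirect map (fetching the redirects and saving the retargeted pages is left out):

```python
def fix_double_redirects(targets):
    """targets maps each redirect page to the page it points at.
    Return {redirect: final_target} for every redirect that needs
    retargeting; circular chains are skipped for a human to untangle."""
    fixes = {}
    for page in targets:
        seen = {page}
        target = targets[page]
        while target in targets:
            if target in seen:   # circular redirect chain
                break
            seen.add(target)
            target = targets[target]
        else:
            if target != targets[page]:
                fixes[page] = target
    return fixes
```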
Is it possible to have a bot written that would patrol subcategories of Category:Comics and where an article has been tagged for deletion it could add that fact to the relevant section of the Wikipedia:WikiProject Comics/Notice Board? Hiding Talk 21:22, 7 November 2006 (UTC)
There are many pages dealing with subjects in ancient Greece and Rome that erroneously capitalize "ancient". WP guidelines and editorial consensus say that "ancient" should be lowercase. It's easy enough to move individual pages, but fixing the redirects is a pain. Is this the kind of task that a bot can help with? If not, are there other ways to (semi-)automate the process? Thanks. --Akhilleus ( talk) 16:32, 27 October 2006 (UTC)
Not in detail, because I mostly use OS X. But I have some access to a Windows machine, so I'll check it out. --Akhilleus ( talk) 19:05, 27 October 2006 (UTC)
Errr...this proposal speaks of moving pages. Article titles must begin with a capital letter, for technical reasons. Hence the link ancient Greece will always point to the article titled Ancient Greece. Robert A.West ( Talk) 19:47, 7 November 2006 (UTC)
Could someone write a bot that'll convert the old Template:PDFlink format to the new one, possibly while adding file size info? Some details on what needs to be done are located at Template talk:PDFlink#PDFbot - Dispenser 08:19, 29 October 2006 (UTC)
Is it possible that a bot could be created or used that would be able to patrol all images on the wiki, and either add or replace the category with Category:Memory Beta images, as we have hundreds of images and it would be a mammoth task to do by hand. If so that would be fantastic, address for the wiki is Memory Beta Main Page. -- The Doctor 11:32, 08 November 2006 (UTC)
A lot of userboxes are being moved per the WP:GUS to userspace. This is probably a good thing; but when a box is moved everyone who had it on their userpage is left with something like this:
This user tries to do the right thing. If they make a mistake, please let them know. |
Could someone create a bot to fix those automatically? ~ ONUnicorn ( Talk / Contribs) 16:28, 9 November 2006 (UTC)
Could someone run a bot through the Playboy Playmate articles to compile a list of the dead ones so that I can compare it to the list at Dead Playboy Playmates. I want to check the list to see if it has all of them or not. Dismas| (talk) 09:07, 10 November 2006 (UTC)
I mentioned this on the Village Pump, but then I realized that this page existed: I've noticed that tables are used an awful lot everywhere on Wikipedia, even when using <div> tags would work just as well. I looked up Wikipedia:When to use tables, and I thought it would make sense to have a bot to find unnecessary tables, i.e. single-cell tables, and turn them into an equivalent <div style="CSS"> combination. That doesn't exist already, does it? Phoenix-forgotten 20:00, 10 November 2006 (UTC)
One example is Template:TOCright. As long as the border is nonexistent, a div will still look just like a one-cell table does if you convert any cellspacing and cellpadding into an appropriate amount of CSS padding. If the table has a border though, I haven't been able to make an exact equivalent because the table always seems to have a one-pixel border for its cell, which messes up the border-style:outset the outer boundary has. Phoenix-forgotten 01:25, 15 November 2006 (UTC)
Hello, I need a very specific change to be made to a number of very specific articles. For a list, see User:lensovet/Rail. What I need is as follows: for each line that reads
{{rail line|previous=[[Metropark (NJT station)|Metropark]]|route=[[Northeast Corridor Line]]|next=[[Linden (NJT station)|Linden]]|col=FF2400}}
to be converted to
{{NJT line|previous=Metropark|line=Northeast Corridor|next=Linden}}
that is:
please let me know when you make the change. thanks! — lensovet– talk – 20:23, 12 November 2006 (UTC)
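Since the bullet list of rules did not survive here, the transformation below is inferred entirely from the single before/after pair above: rename the template, unlink previous/next keeping the display text, turn route into line minus the trailing " Line", and drop col. A hedged sketch:

```python
import re

LINK = re.compile(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]")

def unlink(value):
    """[[Target|Label]] -> Label, [[Target]] -> Target."""
    return LINK.sub(r"\1", value).strip()

def split_params(s):
    """Split template parameters on '|', ignoring pipes inside [[links]]."""
    parts, depth, buf = [], 0, ""
    for i, ch in enumerate(s):
        if s[i:i + 2] == "[[":
            depth += 1
        elif depth and s[i:i + 2] == "]]":
            depth -= 1
        if ch == "|" and depth == 0:
            parts.append(buf)
            buf = ""
        else:
            buf += ch
    parts.append(buf)
    return parts

RAIL = re.compile(r"\{\{rail line\|([^{}]*)\}\}")

def convert_rail_line(text):
    """Rewrite {{rail line|...}} as {{NJT line|...}} per the example."""
    def repl(m):
        params = dict(p.split("=", 1) for p in split_params(m.group(1)) if "=" in p)
        line = re.sub(r" Line$", "", unlink(params.get("route", "")))
        return "{{NJT line|previous=%s|line=%s|next=%s}}" % (
            unlink(params.get("previous", "")), line,
            unlink(params.get("next", "")))
    return RAIL.sub(repl, text)
```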
I was thinking last night that it would be fun to have a bot that randomly generated a page in certain areas for the editor to edit. Folks interested in botany or biology, for example, could get a random biology or botany page, then copyedit it. If Wikipedians did this for a year in all major areas, many of our crummiest articles, appearance-wise, would get cleaned up.
I think that there are numerous articles on Wikipedia that need copyedits. I attempted to do this in the [[Herat]] article and the [[Afghanistan]] articles, but got sucked into a vicious flame war--these articles need serious work. However, I moved on to using the Random Article generator to find articles that could use copyediting, leading me to copyedit obscure pages like [[Pre-dreadnought]]. About half of the articles that come up seem to have to do with anime or television shows, and some are in areas I know nothing about, but sometimes I find something interesting that needs work.
I can find articles on lists, fine, but adding a little fun to it and making it an all-Wikipedia project could seriously improve many Wikipedia articles. Editors would be encouraged to add citation needed tags and categories, and just do the rudimentary copyedit work that really makes Wikipedia viable. By allowing folks to get random articles in selected categories, people would work on articles in their areas.
One of the best things about Wikipedia is writing a good article, then coming back the next day and finding someone else has spit-shined it for you. There are a lot of articles that have some useful information but are rather sorry in appearance. Devoting some time to cleaning up these articles would, imo, greatly improve Wikipedia. Adding a little twist for those seeking something to do would make it a bit more interesting.
Please someone write this bot. Oh, I would call it the KPBot (for Kitchen police)!
KP Botany 20:55, 15 November 2006 (UTC)
As we know, images and templates on the main page are changed on a daily basis. To prevent vandalism to Wikipedia's most important page, these images and templates must be fully protected. I'm sure that many administrators will agree that this task can be pretty tedious. Also, it's always possible that something will be left unprotected by accident. After all, we're all humans! :)
Therefore, I'm proposing a bot that will automate the following tasks:
I do have one concern, though. If administrators become too dependent on the bot, some images or templates may be left unprotected if the bot suffers downtime. -- Ixfd64 09:44, 22 November 2006 (UTC)
It'd be nice to have a bot automatically add Template:Verylong to articles that are above the recommended size. Vicarious 09:24, 23 November 2006 (UTC)
"On Wikipedia, and other Wiki-based websites, broken external links still present a maintenance problem." ( linkrot)
I am hoping someone takes on the task of writing a WebCite-Bot, i.e. a bot which automatically feeds all cited URLs in Wikipedia to WebCite, which is a web archive similar to the Internet Archive, but with enhanced features such as on-demand archiving, and with a focus on scholarly material (as opposed to IA's shotgun-crawling-"archive-all-you-can" approach). WebCite creates a snapshot (cached copy) of the cited URL, thereby archiving/preserving the cited webpage or web document (if caching was successful, the link to the snapshot should be added to the original URL on Wikipedia). WebCite takes care of robot exclusion standards, no-cache tags etc. on the cited page. If caching was successful, WebCite hands back a WebCite ID, which should be added to the original link (for an example of how this could look, see the reference list at the bottom of the article http://www.jmir.org/2006/4/e27/ - all cited non-journal webpages also have a link to the WebCite snapshot). The benefit would be that all cited material will be automatically preserved, and link rot, 404s or changes in the cited webpages will no longer be a problem. WebCite has an XML-based webservice which allows a bot to communicate the caching request as well as to receive the WebCite ID if the caching was successful; see the technical guide http://www.webcitation.org/doc/WebCiteBestPracticesGuide.pdf User:Eysen 19:45, 01 Dec 2006 (UTC)
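The first half of such a bot is just harvesting the cited URLs from each article's wikitext; submitting them would then go through the WebCite XML webservice described in the linked guide, which is omitted here to avoid guessing at its request format. A sketch of the harvesting step:

```python
import re

# External URLs as they appear in bare links, [http://... label]
# brackets, and |url= template parameters.
URL = re.compile(r"https?://[^\s|\]}<>]+")

def cited_urls(wikitext):
    """Collect the distinct external URLs cited on a page; each would
    then be submitted to the archive's on-demand caching service."""
    return sorted(set(URL.findall(wikitext)))
```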
I wanted a bot to do the following:
Thanks,
Jmfayard 14:02, 18 November 2006 (UTC)
But I just noticed there is something else to take care of.
Before, we used {{ Translation request}} and {{ Translation request from German}}, which are now obsolete. Instead, one should use {{ Translation}}. I just made redirects of the first two templates to {{ Translation}}. This solves the case of future translation requests with the old templates.
But there is the problem of existing translation requests.
The bot should replace the obsolete templates in the talk pages Talk:XXX ( the full list is here) with
Is it doable?
Jmfayard 23:31, 18 November 2006 (UTC)
No, this is not what I want.
To make it simpler:
I want you to replace
The thing is that, because I made a redirect, the talk pages with {{Translation request}} are not listed in Special:Whatlinkshere/Template:Translation request as you would expect, but in Special:Whatlinkshere/Template:Translation
Jmfayard 00:16, 19 November 2006 (UTC)
the list of fiddlers is getting a little unwieldy. right now it duplicates itself completely, listing all the names first alphabetically and then by style - nice, convenient, but long. the plan is to split it into two articles. obviously there's a problem: how do we make sure people put their additions on both pages? already people aren't adding them to both lists.
seems like there are a couple ways this could be done with a bot, though I haven't read up on how they work and what they're capable of. conceptually most simply a bot could copy recent additions from the one page to the other - but it would have to check to see if the editor had edited both already. if bots can get around edit protection we could protect the list-by-style and have the bot check for changes to the list by name (take a look at the page - each name in the alphabetical list is followed by the styles they play... could a bot find and parse those parenthetical strings, and copy the name into the appropriate part(s) of a protected list-by-style article?) -- Eitch 19:21, 14 November 2006 (UTC)
Someone should write a bot to clean up Ganeshbot's bad grammar. Kaldari 06:56, 15 November 2006 (UTC)
Hello, would it be possible to have a bot created that would add the Law enforcement wikiproject header ({{Law enforcement}}) to articles that are in the Law Enforcement category? ( here). Many thanks.-- SGGH 14:37, 22 November 2006 (UTC)
Template:CopyrightedFreeUse has been officially deprecated since February. However, there are still about 7000 images using it. The equivalent Template:PD-release should be substituted for it (or Template:No rights reserved, which is also legally equivalent). Personally I prefer Template:PD-release as it is less confusing but means exactly the same thing legally, i.e. that all rights are renounced by the copyright holder. Kaldari 17:45, 24 November 2006 (UTC)
Could someone please create a bot which would automatically replace Template:SER with Template:SRB. Even though SRB is the official ISO 3166-1 3-letter country code and an abbreviation for Serbia (from Srbija), many people still think this code is SER, and having a wrong template doesn't help either. It would be nice to have an automatic bot to correct future mistakes. Avala 14:08, 27 November 2006 (UTC)
I'm tinkering with a proposal I've made which involves using categories to replace a list page (the proposal is Wikipedia:WikiProject The Simpsons/Proposal for managing song lists on Simpsons episodes) but I wonder if this is something that could be well managed by a bot.
The requirement would be to generate a page (or I guess edit a delimited section in the middle of a page) based on all the pages listed in a different category. Perhaps the extended requirement would be to find all the subcategories of a given category and use them to generate a list of lists. For instance, 'generate a page with sections named from subcategories of the category "songs on the simpsons", where each section has a list formed from the names of the pages in those subcategories (which would be the names of songs)'.
Or perhaps to take the text from a given section from each page of a list of pages (identified from being members of a category) and generate one page containing all those sections. For instance 'copy all the sections called "songs" from all pages in the category "simpsons episodes" to the "songs" section of the page "list of songs on the simpsons" page'.
This all sounds quite fiddly, but I can imagine someone may have made a generic bot that can be fed with parameters to do this sort of processing. Is there anything out there to do something like this, or anyone keen on developing one? Please feel free to chat on my talk page if you want to ask questions or suggest solutions -- Mortice 18:10, 27 November 2006 (UTC)
I'm currently developing this bot -- Mortice 12:21, 29 November 2006 (UTC)
Category:Main pages with misplaced talk page templates contains pages with a template that belongs on the page's talk page instead. Can we employ a bot to move these? ( Radiant) 13:28, 29 November 2006 (UTC)
And, Category:Articles actively undergoing construction and Category:Articles actively undergoing a major edit are supposed to be temporary categories, but have grown very large. Perhaps a bot could depopulate them weekly? ( Radiant) 13:44, 29 November 2006 (UTC)
I try to go through this and weed out the forgotten ones. It is a bit frustrating because some people demand the right to leave up the tag for long periods of time, and aren't shy about complaining. Also it's transcluded on a bunch of instructional pages, like Wikipedia:Edit lock so some care needs to be taken not to mess up those pages. But if someone wants to run a bot and deal with the occasional upset article owner, I think that's great. -- W.marsh 18:22, 29 November 2006 (UTC)
How do I make a bot to message WikiProject Gold Coast members? If possible, can someone make it for me? Thank you -- Nathannoblet 07:49, 1 December 2006 (UTC)
I have opened suggestions for creating a template for image pages that warns admins that a user has contacted the copyright owner requesting permission for the image's use on Wikipedia, and asks for it not to be deleted while a response is awaited. The discussion can be found here.
In order to work, it will need a bot that checks the tagging date and removes images that have been tagged for more than one week, and creates a relevant category of unlicensed images each day, the same way it is done in {{ Replaceable fair use}}.
I don't have any expertise in creating bots, so could someone please create a bot that can do this? ~ ► Wykebjs ◄ ( userpage | talk) 18:05, 1 December 2006 (UTC)
Also, Category:Pages needing an infobox conversion and Category:Needs album infobox conversion contain a template that should be switched to another template. Would a bot be feasible here? ( Radiant) 13:33, 29 November 2006 (UTC)
There are quite a few lists of new articles related to specific subjects used by various WikiProjects. They all work 'manually' - with dedicated users adding articles they find to the lists - but this could all easily be automated. We need a bot that would: 1) look at a specified forum (i.e. Portal:Poland/New article announcements); 2) look at a specified section to find the last reported article (they differ, as various WikiProjects and such have no unified structure, so some may have 'November', others 'November 1-15', and so on); 3) look at the 'what links here' of given article(s) - for example, Poland, Polish, Polish language for Portal:Poland - see what new articles have been added to the 'what links here', and generate a report in the above section in the format *[[article name]] created by [[User:Username]] on date. This bot would save much time now spent by dozens of dedicated editors who scour the 'what links here' lists instead of doing more constructive work. Issues to consider: the articles in question (countries) have many pages of 'what links here'; to speed up the process the bot may want to scan from the end to find the most recent article added, thus skipping 99% of the links - but this may skip checking redirects. I don't know how long it would take to analyze the entire page to find and analyze redirects, but once they are found they can be added to the main 'check' list and the bot wouldn't have to look through the main article for them again, so it would be useful to have two options: a normal scan (from the end to the last reported article) and a complete scan (from the end to the last reported article, and then to the beginning, generating only a list of redirects). Additional features which I doubt would be included (wishlist): add the length of the article, tags, lead; scan for new pictures, categories and stubs.-- Piotr Konieczny aka Prokonsul Piotrus | talk 18:03, 2 December 2006 (UTC)
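The report-building step - find the last reported article, then emit entries for everything newer - can be sketched as a pure function. The entry format is the one given in the request; fetching the (title, creator, date) triples from 'what links here' is the part left out:

```python
def new_report_lines(found, last_reported):
    """found: list of (title, creator, date) tuples, oldest first.
    Return formatted report lines for every article after the one
    already reported on the announcement page (all of them if the
    last reported article is not in the list)."""
    titles = [t for t, _, _ in found]
    start = titles.index(last_reported) + 1 if last_reported in titles else 0
    return ['* [[%s]] created by [[User:%s]] on %s' % entry
            for entry in found[start:]]
```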
Per here and on each affected template's talk page (thread name "Template name"), please dispatch a bot to rename all transclusions/links of:
Thanks! David Kernow (talk) 10:22, 3 December 2006 (UTC)
Templates {{ wc}}, {{ ec}} and {{ ec2}} need substing into what appears to be thousands of articles. Chris cheese whine 20:10, 5 December 2006 (UTC)
Hi, I'm trying to get some country and city Wikipedia pages for use on a Google map travel site. The code is written in C# and works OK for other sites. On Wikipedia I get a 403 error when I read in a page. Do I need to register my site process as a bot, or could I use an existing bot to get the pages?
please
Thanks I'll try those -- Seewhere.net 02:08, 7 December 2006 (UTC)
"On Wikipedia, and other wiki-based websites, broken external links still present a maintenance problem." linkrot
I am hoping someone takes on the task of writing a WebCite-Bot, i.e. a bot which automatically feeds all cited URLs in Wikipedia to WebCite, which is a web archive similar to the Internet Archive, but with enhanced features such as on-demand archiving, and with a focus on scholarly material (as opposed to the IA's shotgun-crawling "archive-all-you-can" approach). WebCite creates a snapshot (cached copy) of the cited URL, thereby archiving/preserving the cited webpage or web document (if caching was successful, the link to the snapshot should be added to the original URL on Wikipedia). WebCite takes care of robot exclusion standards, no-cache tags, etc. on the cited page. If caching was successful, WebCite hands back a WebCite ID, which should be added to the original link (for an example of how this could look, see the examples below and also the reference list at the bottom of the article http://www.jmir.org/2006/4/e27/ - all cited non-journal webpages also have a link to the WebCite snapshot). The benefit would be that all cited material will be automatically preserved, and link rot, 404s or changes in the cited webpages will no longer be a problem. WebCite has an XML-based webservice which allows one to communicate the caching request as well as to receive the WebCite ID if the caching was successful; see the technical guide http://www.webcitation.org/doc/WebCiteBestPracticesGuide.pdf User:Eysen 19:45, 04 December 2006 (UTC)
--snip--
It is proposed to develop a bot which - using the WebCite webservice - changes a reference (or even "naked" URLs) as follows:
Replace a reference like:
with a reference like this
or this (in addition to the WebCite URL, the original URL might be given):
Alternatively, the cited URL can also be retained as part of the link to webcitation, to keep the cited URL explicit and to allow easy reverting to the original URL should this be desired:
--snap--
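Two of the text-level pieces such a bot would need can be sketched as follows - harvesting cited URLs to submit, and appending a snapshot link once WebCite returns an ID. The XML webservice call itself is omitted, and the snapshot URL shape follows the examples in the proposal:

```python
import re

def cited_urls(wikitext):
    """Collect the external URLs cited on a page, for submission
    to the WebCite archiving webservice."""
    return re.findall(r'https?://[^\s\]\}<|]+', wikitext)

def add_snapshot_link(reference, snapshot_id):
    """Append a WebCite snapshot link to an existing reference,
    keeping the original URL intact (the 'in addition' variant)."""
    return '%s ([http://www.webcitation.org/%s WebCite snapshot])' % (
        reference, snapshot_id)
```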
Hello again, I was wondering if one of you lovely people could help us out again - we've set up an Assessment department and would like {{Architecture|class=stub}} added to all of the stubs currently listed at Wikipedia:WikiProject Architecture/Stub categories, starting with those articles with {{architecture-stub}} and {{architect-stub}} tags. Cheers. -- Mcginnly | Natter 12:06, 7 December 2006 (UTC)
(This isn't a bot request per se but an "is this worth doing" post - I can write the bot myself if people think it's worth doing.)
I notice that one of the items permanently on the maintenance list is the Wikipedia:Cleanup list and that the number of articles needing cleanup seems to be increasing rather than decreasing. I had an idea for a bot that might help with this problem. I could write this bot myself (already have one bot on trial) but wanted to get people's ideas for whether it was worthwhile, comments etc etc, before I started on it.
Brief scope as I see it now (amenable to change): the bot would be manually run, or automatically run on, say, a weekly basis. It would trawl Category:All_pages_needing_cleanup and find any new additions since its last trawl. It would visit each new addition and pull a list of contributors. It would then leave a message on the talk page of (every contributor) or (last 10 contributors) or (article starter) or (contributors with 10+ edits) or (whatever), notifying them that the article is in need of cleanup and listing tips for how they could help to achieve this etc.
What do you think? Worthwhile? Ideas? Comments? - PocklingtonDan 17:44, 7 December 2006 (UTC)
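The "(whatever)" selection step above could be a small function over the page history; a sketch, assuming the history is available as a chronological list of usernames (filtering out bots and IPs is left to the caller):

```python
from collections import Counter

def contributors_to_notify(history, min_edits=1, last_n=None):
    """history: usernames in chronological edit order.
    Return the distinct users to message, optionally restricted to
    those with at least min_edits edits, or to the last_n edits."""
    if last_n is not None:
        history = history[-last_n:]
    counts = Counter(history)
    return sorted(u for u, c in counts.items() if c >= min_edits)
```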
Hi there!
I was wondering if it would be possible to run a spacing bot, either automated or manual, that puts spaces or newlines between wiki syntax and removes spaces or newlines if they weren't needed? Obviously it would have to be pretty simple so as not to get spacing wrong. I could probably knock something up in Yabasic pretty quickly, using an external program like Wget to handle getting and sending data from the page.
If anyone's interested I'll give an account of different wiki syntax that I believe spacing makes easier to understand (like putting spaces after bullet points before the text).
Cheers,
Yuser31415 @ ? # & help! 02:44, 7 December 2006 (UTC)
"Avoid making insignificant minor edits such as only adding or removing some white space" - and although that's not official policy, it's generally accepted. Would it change the way pages are displayed, or only their wikicode? — Mets501 ( talk) 11:56, 7 December 2006 (UTC)
Aloha. WikiProject Hawaii assessment is just getting started and we really need help. To start, I need a bot to replace the current WikiProject tag with the new tag {{WikiProject Hawaii |class=NA |cat=yes}} on every category talk page contained within Category:WikiProject Hawaii articles (please exclude the subcats). Thank you for your assistance! — Viriditas | Talk 21:51, 7 December 2006 (UTC)
Great job. Now, on to stub assessment. I would like to add {{WikiProject Hawaii|class=Stub}} to all talk pages in Category:Hawaii stubs (including subcats). Please add class=Stub to unassessed or untagged articles only, skipping articles where class is already flagged. Thank you again. — Viriditas | Talk 01:32, 9 December 2006 (UTC)
User:WatchlistBot can help you with this. I'm a bit behind right now, but I'll add it to my to-do list and contact you when I can get to it, if you can't find anyone to do it sooner. Ingrid 01:48, 9 December 2006 (UTC)
Per discussions here, I need these 152 users informed of a change made recently to the common.css file so that they can restore their ability to view Persondata. The bot should leave the following notice on those users' talk pages:
"Per recent discussions, the way in which Persondata is viewed by Wikipedia editors has changed. In order to continue viewing Persondata in Wikipedia articles, please edit your user CSS file to display table.persondata rather than table.metadata. More specific instructions can be found on the Persondata page."
If there are any questions about this request, please ask me on my talk page rather than here. Thanks.
Kaldari 08:33, 24 December 2006 (UTC)
Simply, this bot would check for {{ lowercase}} tags on an article, then rename an article (let's say, "Test article") to "Thisisanarticleusedbylowercasebot". Then, the bot would rename "Thisisanarticleusedbylowercasebot" to "test article". It should be easy to create, and would only need to be run once. -Slash- 20:27, 10 December 2006 (UTC)
This is a request for a bot that counts the number of items at Wikipedia:Featured articles and puts that number into a template. Once the bot has proved reliable, it is envisaged that the bot will be flagged to edit the protected page Template:FA number, which is used as a counter within the Main Page FA box.
A bit of background: in recent discussion at Talk:Main Page, consensus was reached for an FA counter. When it was implemented as a template requiring manual updating, several FA regulars expressed their unhappiness with being asked to keep track of another page when pulling together the results of WP:FAC and WP:FARC. In the Talk:Main Page discussion, FA regulars suggested a bot solution, which seems to address everyone's concerns. In case anyone remembers the recent attempt to get approval for a bot to edit a protected page, Raul654 has stated that he would be willing to set its flag once it proves reliable.
It occurs to me (as opposed to me relaying the results of the discussion to date) that some sort of vandal-proofing feature would be useful. Wikipedia:Featured articles is already under semiprotection, but a particularly determined vandal might add or remove items to make the Main Page number jump. One idea is the use of a user whitelist (admins and selected non-admin FA regulars), in which the bot waits 15 minutes or so if anyone not on the whitelist adds or removes articles, to give time for vandalism to be reverted. (Yes, I'm paranoid.) Hopefully this description has made sense. Thanks! - Banyan Tree 13:43, 9 December 2006 (UTC)
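The counting half is easy to sketch if one assumes each featured article appears as a wikilinked list item on Wikipedia:Featured articles (an assumption - the page's exact layout would need checking before trusting the number on the Main Page); writing the result into Template:FA number is then a single bot edit:

```python
import re

def count_featured(page_text):
    """Count entries that look like featured-article links, assuming
    one wikilinked entry per list item on the FA page."""
    return len(re.findall(r'^\*+\s*\[\[', page_text, re.MULTILINE))
```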
Out of curiosity, is there a GA counter bot? b_cubed 21:00, 11 December 2006 (UTC)
Any chance that I can get a bot to go through Category:Unassessed biography articles on the Biography WikiProject, and have it label any that are stubs as such in WikiProject Biography's rating? I know there's a bot that's doing something similar, but that one is aimed more at finding unassessed articles than at assessing them. -- Wizardman 05:56, 10 December 2006 (UTC)
Anyone? If there's a way to do it on AWB, how would I go about doing that then? -- Wizardman 00:52, 12 December 2006 (UTC)
Is there any way that a bot capable of unwikilinking dates (specifically years) could be made? I read a lot of articles that contain such wikilinked dates, e.g. 1942 or 1784, which really add nothing to the article. I am aware that, as it stands now, the Wikipedia policy on dates is to have all dates wikilinked. However, in practice, there has been a growing trend in FA articles to unlink dates. Personally I think a bot capable of this would be very useful. I'm not sure how to do it, otherwise I'd try myself. The only concern is that you'd have to make sure it doesn't unlink the "fuller" dates, e.g. November 20 1983. (if you can respond on my talk page it would be helpful) b_cubed 19:52, 8 December 2006 (UTC)
::I think I saw some discussion of this on another page recently. Objections were raised that:
::*It would lead to overlinking - ie wikipedia doesn't want every year linked.
::*Four-digit numbers could be used as a number or a year. ie "In the year 2000, X did Y" or "X led 2000 troops into battle"
::*Numbers can refer to something more specific ie "2000 AD" or whatever that Judge Dredd comic is.
::*Numbers could be years, but as part of fuller strings, ie "Battle of Suessonia (1976)" should link to the whole battle article, not just the year.
::*It wouldn't be possible to have a sufficiently clever bot to get round these caveats, so it would probably have to be manually-assisted and would thus represent a massive amount of work.
::Note the above are my recollections of what I read of a discussion of this same idea elsewhere
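For what it's worth, the full-date caveat is mechanically checkable, while the number-vs-year ambiguity is not - which is why any such bot would need manual assistance. A sketch that unlinks bare years but leaves autoformatted dates like [[November 20]], [[1983]] alone:

```python
import re

MONTHS = ('January', 'February', 'March', 'April', 'May', 'June', 'July',
          'August', 'September', 'October', 'November', 'December')

def unlink_bare_years(text):
    """Turn [[1942]] into 1942, but keep year links that follow a
    linked day-month (part of an autoformatted full date). It cannot
    tell a year from a plain number such as '2000 troops' - that
    case still needs a human."""
    def repl(m):
        before = text[max(0, m.start() - 25):m.start()]
        # Is there a linked month-day (possibly with a comma) just before us?
        if re.search(r'\]\],?\s*$', before) and any(mo in before for mo in MONTHS):
            return m.group(0)
        return m.group(1)
    return re.sub(r'\[\[([12]\d{3})\]\]', repl, text)
```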
[Copied from above]: (removing indent) I applaud your wish to contribute your programming knowledge; it's just that I personally think re-arranging whitespace is a poor use of your bot's time and the server's resources. If there isn't already such a thing, a bot that flags near-empty articles as stubs would seem a much better idea. If it was able to flag the stubs as project stubs based on the category they are in or the article name (this would probably need to be manually assisted), then all the better. Why not start a new section on this page to discuss this and move these comments there? - PocklingtonDan 19:11, 8 December 2006 (UTC)
I recently created a new category, Cars of England. There are quite a few cars from England, and it's quite tedious to add them all. I do, however, have a list of categories that list cars from manufacturers in England. Is there a way to have a bot add the category to all the articles found in these sub-categories? Thanks, Riguy 07:13, 10 December 2006 (UTC)
Does anyone want to take over tagging orphan articles with {{ linkless}}? The actual tagging is simple with AWB: just load Special:Lonelypages (it refreshes Saturdays/Wednesdays as of mid-November) and run the bot; it will make about 800 edits each refresh. I can supply you with the regex I used to ignore an array of pages that should not be tagged (dab pages, pages to be transwikied, various other odd stuff). You would also want to maintain a list of orphaned articles; you can see what I mean at User:W.marsh/orphans articles/A-C.
The one hitch is that you will also want to de-orphan the ignored articles somehow or other, you could create a list of orphaned dab pages and so on from your userspace, or manually add them to Wikipedia:Links to disambiguation pages (which is what I did, hence the burnout after 6 months most likely). It takes maybe 2-3 hours a week (almost all on the manual stuff), maybe 5-10 minutes plus bot time if you just do the tagging and output the skipped articles to lists in your userspace.
The more automation you can add (e.g. automatically updated lists) the faster this would be; unfortunately I could never add much except automated tagging with AWB. It's not glamorous work (for every 3,000 or so edits my bot made, I got about one reply on my talk page) but I think it's helpful work; I did notice a whole lot of articles getting de-orphaned within a few days after the tag was added. -- W.marsh 15:37, 13 December 2006 (UTC)
linkless|geodis|copyvio|{{Disambig|this AfD|This page has been deleted|#redirect|dated prod|4LA|{{dab}}|{{hndis|{{disamb|numberdis|4CC|3CC|2CC|Schooldis|Shipindex|Tempdab|Wikipedia does not currently have an encyclopedia article for|{{wi}}|{{surname|4CC|TLAdisambig|{{deleted|{{move to|{{dicdef
We have a lot of articles about individual US towns, many of which refer to the "2000 census" in their introductions. As we have an article for the United States Census, 2000, maybe somebody could script a bot to wikify those references. Cribcage 07:43, 14 December 2006 (UTC)
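The substitution itself is one guarded regex - a sketch; a real run would also want to link the census only once per article:

```python
import re

def link_census(text):
    """Link bare mentions of 'the 2000 census' to the census article,
    skipping occurrences already inside a wikilink."""
    return re.sub(r'(?<!\[)(?<!\|)\b2000 census\b(?!\]\])',
                  '[[United States Census, 2000|2000 census]]', text)
```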
Hello, I've overhauled, together with others, Wikipedia:Translation into English, which is now here: Wikipedia:Translation.
One thing we decided is to migrate from one big static page listing all the available translators (which is heavy, hard to maintain, and never up to date) to two userbox templates ({{ Translator}} and {{ Proofreader}}), so that 1) all the work is done automatically by the categories and 2) each translator has a link to the translation page on his user page.
To migrate from the old system to the new one, since there are a lot of people, I need a bot which would go through every user listed on Wikipedia:Translators available and leave the following message on their talk page:
{{subst:Translation/Talkpage}}
Jmfayard 18:22, 14 December 2006 (UTC)
Can someone make a bot which places new user templates on someone's talk pages? If you can, I would seriously appreciate it! Bushc a rrot ( Talk· Desk) 23:20, 15 December 2006 (UTC)
Is it possible to make a bot that automatically removes users from WP:AIV once they have been blocked? Oftentimes admins forget to remove users that they block, or spend time blocking the user and putting the appropriate message(s) on the blocked user's page. Or, sometimes when there are a large number of notices on WP:AIV, they will go through blocking users first, waiting until a bit later to remove them. So, to help with efficiency (by cutting down on conflicts), a bot might be useful. -- tariqabjotu 00:37, 15 December 2006 (UTC)
As part of some work I am carrying out on Scottish Historic Railways, I have created a progress and reference page in my user space at User:Pencefn/Historical Scottish Railways. I would like to add the latest revision date/time into the table for the article and associated talk page on the second and fourth column respectively - which is updated to reflect the work in progress (covering updates to articles and the potential addition of more articles). Can anyone help me? Stewart 19:16, 21 December 2006 (UTC)
Whoever puts any comments on the Reference Desk without signing should be given a welcome template, so they will sign next time (only if they don't have the welcome template already).-- Judged 23:10, 22 December 2006 (UTC)
Happy Holidays and Happy New Year to everyone. I'm curious if anyone knows of any bots working the neutrality template categories. I would like to know what percentage of articles have neutrality-related tags by WikiProject and have a report generated, with a template updated on the project page (Pearle produced a similar report listing articles needing cleanup). After the report is generated, the template on the project page could be updated with a percentage linking to the category of WikiProject-related neutrality issues. Something like, "12% of articles require attention for neutrality-related issues." WikiProject departments would deal with this. The bot would only need to be run once a week. Thanks. — Viriditas | Talk 03:33, 25 December 2006 (UTC)
I was wondering if someone with a bot would be able to place banners for Wikipedia:WikiProject Massively multiplayer online games on all talk pages (including non-existent ones) in the category Category:Massively multiplayer online games, including its sub-categories. This need not be a recurring event, but it is necessary to get this WikiProject up and running. Any help is appreciated! Greeves 04:31, 19 December 2006 (UTC)
A bot would be needed to carry out this task following a modification to {{ Permprot}}:
Circeus 21:14, 23 December 2006 (UTC)
Given that my proposal for an additional step to the AfD process (found here) is meeting both opposition and the suggestion that the job could be better done by a bot, I've brought that proposal here. The suggestion is a reasonably simple one:
This would avoid the bureaucracy that is the major criticism of my original proposal, and (hopefully) significantly reduce the problems of biting that I raised there. Thanks! Daveydw ee b ( chat/ review!) 01:11, 27 November 2006 (UTC)
I'm using the Weblinkchecker.py bot and have a whole load of bad links. Is there a way to have a bot remove them from the articles? (I realize that this could be hard, since we have refs and [] links.) One output looks like:
ST47 Talk 22:13, 5 December 2006 (UTC)
I'm trying to find out how many articles and categories ultimately descend from Category:Dungeons & Dragons and how many of these are stubs (both by categorization and by byte/word count). Lists would be good if possible. For comparison's sake, I'm also seeking similar numbers for Category:Chess. This is for the following purposes:
Neon Merlin 23:56, 24 December 2006 (UTC)
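The traversal such a count needs is just a graph walk with a visited set (category loops do exist), sketched here over plain dicts; pulling the real subcategory and member lists from the API or a database dump is the part omitted:

```python
def descendant_articles(root, subcats, members):
    """subcats: category -> list of child categories;
    members: category -> list of article titles.
    Return the set of articles reachable from root, cycle-safe."""
    seen, stack, articles = set(), [root], set()
    while stack:
        cat = stack.pop()
        if cat in seen:
            continue
        seen.add(cat)
        articles.update(members.get(cat, ()))
        stack.extend(subcats.get(cat, ()))
    return articles
```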
I'd very much like a bot that would fill in the blanks in incomplete {{ cite book}}s. A cite can be considered incomplete if it's missing any of the following parameters: title, last, first, publisher, year, id.
I can see five cases in which this could be used (given in fall-through order):
If the bot is unable to fill in all of these basic parameters, it should insert ?s for the missing ones and/or a page comment, to show that it has tried.
If this type of bot responds quickly to new {{ cite book}}s, it will no doubt save editors a lot of tedious work looking up and typing in all the data fields. We can tell them to type just the ISBN, and suggest that they come back later and verify the bot's work. Seahen 03:42, 28 June 2006 (UTC)
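Spotting the incomplete cites is the easy half; a sketch of that detection (the ISBN lookup against an external catalogue is the part a real bot would add):

```python
import re

REQUIRED = ('title', 'last', 'first', 'publisher', 'year', 'id')

def missing_cite_params(template_text):
    """Given one {{cite book ...}} invocation, return the required
    parameters that are absent or left empty."""
    params = dict(re.findall(r'\|\s*(\w+)\s*=\s*([^|}]*)', template_text))
    return [p for p in REQUIRED if not params.get(p, '').strip()]
```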
Bot's name: SpyroBot, for editing Spyro articles
I noticed a lot of the requested article pages are in quite a mess, not being properly formatted, i.e. lacking bullet points.
Would it be possible for someone to write a bot to place bullet points in front of the requests to make the pages look better?
I did do one page manually, and it took quite a while.
Also, would it be possible for a bot to automatically remove blue links?
I would happily run it if someone could write it!
Thanks!
Reedy Boy 10:38, 1 July 2006 (UTC)
I have already created a list of some cross-space redirects at User:Invitatious/cnr. I would like this bot to do the following:
==BE==
* [[:Being a dick]] → [[:Wikipedia:Don't be a dick]]
Invitatious 19:07, 1 July 2006 (UTC)
I'd like to see if anyone's up for writing a bot to help those who monitor PROD. Specifically, I think a bot could:
Number 1 especially would be very helpful in patrolling. I don't have the skills to write the bot myself, or I would. Mango juice talk 14:50, 2 July 2006 (UTC)
I am requesting a bot that can detect & remove offensive language.-- StitchPedia 00:40, 4 July 2006 (UTC)
—Preceding unsigned comment added by OneWeirdDude ( talk • contribs)
Will a botbeard please go through the Whatlinkshere for Template:Redirect-acronym (recently moved from Template:Disambig-acronym) and snap the links to point to the new name? JesseW, the juggling janitor 03:10, 6 July 2006 (UTC)
I think a bot that creates an index of all two- and three-word articles according to their initials (instead of the first letters, as in Wikipedia:Quick index) would be very useful for browsing and for figuring out abbreviations. I guess the current software can't do this dynamically or search by initials. Any ideas?
I would like to work on such a bot, but I have no idea how to write one (even though I have some programming experience). If somebody sends me the code for a similar bot (one that looks at every article title and categorizes it by a certain criterion), I can modify it.
-- þħɥʂıɕıʄʈʝɘɖı 22:10, 6 July 2006 (UTC)
A bot is needed to keep template:ISBN out of actual use, see Wikipedia:Templates for deletion/Log/2006 July 9#Template:ISBN. Circeus 12:59, 11 July 2006 (UTC)
We've got a non-Wikimedia MediaWiki site over at Wikible; there are several versions of the wiki in different languages, so we also have a Pool (like Wikimedia Commons). We've changed how things work and now have a bunch of images to transfer from wikible.org/en to wikible.org/pool. I don't think anyone has any bot experience in our group; would you mind helping us out? Read the discussion on our site for more background and relevant links. Thanks! -- J. J. 18:42, 11 July 2006 (UTC)
Any idea whom should I contact to get more answers there?-- Piotr Konieczny aka Prokonsul Piotrus Talk 16:11, 13 July 2006 (UTC)
I'd like a bot to go through small articles and label them stubs. Anything smaller than 1k is probably a stub, yeah? -- BradBeattie 05:54, 16 July 2006 (UTC)
I need a bot to review a page I added. I haven't had the necessary time to review my writing, and I know there must be multiple grammar and spelling errors. The page is H. B. Hollins.-- LongIslander 15:31, 17 July 2006 (UTC)
I originally posted this here, but figured it would be good to post it here also.
I've noticed a recent issue with WikiProjects. I've noticed it in the one I work on, Wikipedia:WikiProject_Anime_and_manga, but it probably applies to all WikiProjects. When an article is declared a "Good Article" or any other article class, editors add the appropriate tag on the Discussion page, but they often forget to add the appropriate Wikiproject tag that says "this is a good article for Wikiproject whatever". This means that the Wikiproject statistics page that shows how many Good Articles and the like the Wikiproject has may be drastically off, and the categories sorting the Wikiproject's articles may show tons of "good articles" and the like in the "unassessed" section.
Would it be possible for a bot to regularly peruse the Wikiproject article discussion pages and find ones that have a GA tag but no Wikiproject GA tag, and the same for all article assessment tags? Also, if a Wikiproject has a system for article ratings that doesn't coincide with the main Wikipedia system (as warned by a commentator on my original post), the bot could simply not affect articles on that Wikiproject. Dark Shikari 15:57, 17 July 2006 (UTC)
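A check like the one requested could be sketched roughly as follows. The banner template name and the `class=GA` parameter are illustrative assumptions, since every WikiProject's banner and rating scheme differ, and the matching is deliberately naive:

```python
import re

def needs_project_ga_tag(talk_wikitext,
                         project_template='WikiProject Anime and manga'):
    """True if the talk page carries a {{GA}} tag but the project
    banner has no class=GA parameter. Template name is illustrative."""
    has_ga = re.search(r'\{\{\s*GA\b', talk_wikitext) is not None
    banner = re.search(r'\{\{\s*%s\b([^{}]*)\}\}' % re.escape(project_template),
                       talk_wikitext)
    banner_rated = banner is not None and 'class=GA' in banner.group(1)
    return has_ga and not banner_rated
```

A real bot would, per the suggestion above, skip projects whose rating system doesn't follow the main Wikipedia scheme.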
It could change M/Y or M/D/Y dates for template defaults like {{cleanup}} and various "As of..." containing articles. It sounds like it wouldn't put too much strain on the servers and would prevent articles from becoming inaccurate datewise. -- Blackjack48 23:38, 18 July 2006 (UTC)
How about a bot that fixes links that lead to redirects? Say a link points to A, which redirects to B; the bot could fix it so that it points straight to B. OneWeirdDude 18:22, 20 July 2006 (UTC)
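The text-rewriting half of such a bot is straightforward; the harder part is fetching which pages are redirects. A rough Python sketch, assuming the redirect targets have already been collected into a map:

```python
import re

def bypass_redirects(wikitext, redirect_map):
    """Rewrite [[A]] and [[A|label]] links so they point directly at
    the redirect target, preserving the displayed text."""
    def fix(match):
        target, _, label = match.group(1).partition('|')
        final = redirect_map.get(target.strip())
        if final is None:
            return match.group(0)  # not a redirect; leave untouched
        shown = label if label else target
        return '[[%s|%s]]' % (final, shown)
    return re.sub(r'\[\[([^\[\]]+)\]\]', fix, wikitext)

# Example: A redirects to B
print(bypass_redirects('See [[A]] and [[A|the article]].', {'A': 'B'}))
# -> See [[B|A]] and [[B|the article]].
```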
Given the recent apparent demises of both Crypticbot ( talk · contribs) and NekoDaemon ( talk · contribs), which did a bunch of housekeeping tasks related to WP:CFD, WP:TFD, and the WP:VPs (and others), I propose we designate some bot source as the "official" daily maintenance bot, create an account for it, post the source, solicit an owner, and put its daily tasklist (instructions) in a protected file. This way anyone could propose or even implement new tasks for it, and if the current owner goes missing, anyone else could take over as its owner as well. I've started a list of maintenance activities at wikipedia:maintenance/tasklist. IMO, having the normal operation of any of the *fD activities rely on any individual's closed source bot is not a sustainable model. Comments? -- Rick Block ( talk) 01:07, 21 July 2006 (UTC)
This comment is a little different from the others here, as it's not directly a bot request but rather a request for help clarifying whether a bot that can do the following is possible. Any thoughts before I embark on trying to make it would be greatly appreciated. Please forgive me if this is the wrong place to raise this. I am in the early stages of designing and building a bot named the Prolificity Sentinel.
In short the bot will flag articles where:
Upon finding a suitable candidate for flagging, it will edit the article's page and give it 'Sentinel Alert Status'. This will be a category all Wikipedians can see and go through.
Anyway, that's a very brief overview; I've discussed it in much more detail here: Prolificity Sentinel. Thanks for your help, and sorry if this is not an appropriate place to ask. -- WikipedianProlific (Talk) 22:43, 25 July 2006 (UTC)
I guess the three key things I'd need to know per page on the run are:
So per page analysed there'd be three requests for information if it was going to be flagged. I was thinking of setting the bot up with specific parameters, so it would only do a run of, say, 1,000 articles at a time (roughly one per minute over the course of a day?). I can make the bot less server-intensive by having it immediately stop requesting, say, the last talk page edit if it's already found that the main page edit was within the last 6 months. That way the majority of articles would only have one piece of information requested, making sure I don't accidentally DoS the server. Any thoughts on that? Is there a database query that can be made to ascertain the last page edit date? That's the key thing I need to know, and I can't find a script for it if one exists. ta -- WikipedianProlific (Talk) 00:33, 26 July 2006 (UTC)
I wanted to ask if someone could create a bot to automate the changing of links to the CIA's World Factbook web site.
For example, the current link for Malaysia's entry on the Wikipedia page for Putrajaya points to http://www.cia.gov/library/publications/the-world-factbook/geos/my.html . If you go to that address, the CIA says that the page has been moved. Even worse, it doesn't forward you to the correct page for Malaysia. Instead, it redirects you to The World Factbook's front page and you have to navigate yourself to the right page.
I thought they re-did the directory structure or something, but found that the change is much more subtle. They've simply required The World Factbook to be accessed using the secure server method.
So, http://www.cia.gov/library/publications/the-world-factbook/geos/my.html can be successfully viewed at https://www.cia.gov/library/publications/the-world-factbook/geos/my.html .
Can someone create a wikibot to go through the wiki files and change the http://www.cia.gov.... to https://www.cia.gov ?
I figure it'd be more helpful since the CIA doesn't do the forwarding automatically... The wiki community would appreciate it. :-D
Thanks!
Brian -- Bsheppard 23:57, 26 July 2006 (UTC)
Okay, I put in the proposal on WP:BRFA. Feel free to comment, criticize, etc. As a little note about searching for the factbook text, some of those articles already have the http format in place. The bot is going to replace http://www.cia.gov with https://www.cia.gov. Can you guys give me some examples of the NIC thing, so I can add it to the proposal? alphaChimp laudare 22:11, 29 July 2006 (UTC)
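The text replacement described above is about as simple as bot edits get. A Python sketch of the substitution, safe to run repeatedly since already-secure links contain no `http://` prefix to match:

```python
def secure_factbook_links(wikitext):
    """Switch World Factbook links on www.cia.gov to the secure
    server. Only this host is touched, as requested; other http
    links are left alone."""
    return wikitext.replace('http://www.cia.gov', 'https://www.cia.gov')

print(secure_factbook_links(
    'http://www.cia.gov/library/publications/the-world-factbook/geos/my.html'))
```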
Is there a way to make a tool that will help tell us if two users have ever interacted before? For example, it tells you (and provides diffs) of instances when user A has edited user B's talk page (or any user space page) and vice versa. NoSeptember 08:23, 30 July 2006 (UTC)
I don't know whether this requires a new bot or just an addition to a data file for an existing bot. Anyway, various people add links to various pages in http://www.websearchinfo.com/, e.g. in Cloaking. All links to this site are spam, e.g. http://www.websearchinfo.com/poker and http://www.websearchinfo.com/cloaking-techniques.
Could we create / amend a bot to revert all links to this site shortly after they are made? Nunquam Dormio 11:54, 30 July 2006 (UTC)
Is there a bot that will do font color changes? That is change every instance of a six digit hex string to another 6 digit string within a single page. What I am planning to do is explained here, but the first change is still a month or so in the future. NoSeptember 18:18, 24 June 2006 (UTC)
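The per-page substitution could be sketched as below (the colour mapping supplied with uppercase keys is an assumption of this sketch; matching is case-insensitive):

```python
import re

def swap_hex_colors(text, mapping):
    """Replace six-digit hex colour codes according to `mapping`
    (keys and values given without the leading '#')."""
    def repl(m):
        code = m.group(1)
        return '#' + mapping.get(code.upper(), code)
    return re.sub(r'#([0-9A-Fa-f]{6})\b', repl, text)

print(swap_hex_colors('style="background:#FFCC00"', {'FFCC00': '88AAFF'}))
```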
I'm looking to write a bot that will go through all pages (or, for Wikipedia, preferably a dump) and look for references to uploaded .gifs; if any are found, it would convert them to .pngs and update the reference. I have some experience in programming, but not with anything of this sort. It seems like a fun project, but I would like some help making it. Martijn Hoekstra 17:33, 19 July 2006 (UTC)
Have you looked at gif2png, which also includes a python web2png, which may do what you want?
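The wikitext side of this (separate from the actual image conversion, which gif2png would handle) could be sketched like so; the pattern only touches [[Image:...]] links, which is an assumption about where the references live:

```python
import re

def repoint_gif_references(wikitext):
    """Rewrite [[Image:Foo.gif...]] references to point at Foo.png.
    The pixel conversion itself is done separately."""
    return re.sub(r'(\[\[Image:[^\[\]|]+?)\.gif', r'\1.png',
                  wikitext, flags=re.IGNORECASE)
```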
moved the following from the talk page to here: Martijn Hoekstra 17:49, 19 July 2006 (UTC)
Such a bot would go around, and whenever the string "ISBN" is followed by 10 or 13 digits, it would compute the special ISBN checksum; if the checksum is invalid, it would leave a comment or a template to the effect that someone needs to check the transcription of the ISBN or fix it. -- maru (talk) contribs 04:50, 26 April 2006 (UTC)
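The checksum computations themselves are standard: ISBN-10 weights the digits 10 down to 1 and checks divisibility by 11 (with 'X' standing for 10 in the last position); ISBN-13 alternates weights 1 and 3 and checks divisibility by 10. A sketch:

```python
def isbn10_valid(digits):
    """ISBN-10: sum of digit[i] * (10 - i) must be divisible by 11."""
    if len(digits) != 10:
        return False
    total = 0
    for i, ch in enumerate(digits):
        if ch == 'X' and i == 9:
            v = 10  # 'X' is only legal as the check digit
        elif ch.isdigit():
            v = int(ch)
        else:
            return False
        total += v * (10 - i)
    return total % 11 == 0

def isbn13_valid(digits):
    """ISBN-13: alternating weights 1 and 3, total divisible by 10."""
    if len(digits) != 13 or not digits.isdigit():
        return False
    total = sum(int(ch) * (1 if i % 2 == 0 else 3)
                for i, ch in enumerate(digits))
    return total % 10 == 0
```

The bot would first strip hyphens and spaces from whatever follows "ISBN" before handing the bare digit string to these checks.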
Can someone design a bot so that when a user posts an alert on the WP:AIV page, it places this template on the reported user's talk page:
This message is to alert you to the fact that you have been reported to the Administrator Intervention against Vandalism (AIV) page so that your case can be reviewed by an Administrator. They may then impose a block for a period of time on your IP to prevent you from editing in the future. If you wish to contest the merits of the report, please post it under the actual report on the AIV page. Do not remove the initial report, as this will probably not help you in trying to prove you are not a vandal.
The template is {{subst:User:Daniel.Bryant/AIV}}. I'll run it off my main user, or create a new bot account, whichever is easier. Thanks! Killfest2 (Critique my new user page design please) 05:13, 20 July 2006 (UTC)
There are hundreds of articles about holders of the Victoria Cross where the opening statement is a standard phrase:
the highest and most prestigious award for gallantry in the face of the enemy that can be awarded to [[United Kingdom|British]] and [[Commonwealth]] forces
which should be disambiguated to
the highest and most prestigious award for gallantry in the face of the enemy that can be awarded to [[United Kingdom|British]] and [[Commonwealth of Nations|Commonwealth]] forces
Simultaneously, most of these articles could have the reference title Monuments To Courage corrected to Monuments to Courage, and many of them need SCOTLAND'S FORgotten VALOUR (and other eccentric capitalisations) converted to Scotland's Forgotten Valour.
Colonies Chris 22:55, 17 August 2006 (UTC)
I was hoping to get a bot to change all instances of List of professional wrestling throws#Spinebuster slam and Professional wrestling throws#Spinebuster slam to List of professional wrestling throws#Spinebuster. Although it is currently linked as spinebuster slam, there is a nearly universal consensus that it is always linked simply as spinebuster and never referred to as a spinebuster slam. The problem is that it's a very common link in wrestling profiles, and it would take an exceedingly long time of human interaction to find all instances of the link and remove the word slam from them. Could someone help me and WP:PW out? --- Lid 10:01, 27 July 2006 (UTC)
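The edit itself is a fixed string replacement over the two section-link forms mentioned above, e.g.:

```python
def fix_spinebuster_links(wikitext):
    """Drop the trailing ' slam' from both section-link forms,
    leaving any piped display text untouched."""
    for page in ('List of professional wrestling throws',
                 'Professional wrestling throws'):
        wikitext = wikitext.replace('[[%s#Spinebuster slam' % page,
                                    '[[%s#Spinebuster' % page)
    return wikitext
```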
I would like to request a bot for me that can:
This would be good, since Curps block bot is inactive at the moment. -- TheM62Manchester 12:05, 30 July 2006 (UTC)
Will anyone be willing to create a bot to redirect ticker symbols to their respective companies? - Blackjack48 18:59, 30 July 2006 (UTC)
We put a request in recently for all the articles in Category:Architects and all its sub- and sub-sub-categories to have {{Architecture}} added to their talk pages. I think only the root category got done. Would it be possible to now do the sub- and sub-sub-categories? Many thanks. -- Mcginnly | Natter 23:33, 31 July 2006 (UTC)
A bunch of images are showing up on Category:License_tags. When I looked at the source, it looks like some public domain template might have gotten subst'd in rather than being included using PD-whatever, resulting in those pages having the License_tags category applied even though it's in a noinclude section. I forget if subst works that way, but that's my best guess.
I created Template:PD-Japan for some of those images and started applying it, but I think it might be bottable. If so, that would be much simpler than doing it by hand :-). At the least, I would think the noinclude sections could be removed from the image pages using a bot.
RainbowCrane 22:35, 1 August 2006 (UTC)
{{ Hospital-stub}} and Category:Hospital stubs were created today. Articles in Category:Medical organization stubs that have the word "Hospital" in their title need to be re-stubbed from {{ med-org-stub}} to {{ hospital-stub}}. I count over 200 such articles in Category:Medical organization stubs. If a bot could please do this it would save a lot of time. Kurieeto 18:25, 3 August 2006 (UTC)
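The per-article edit could be sketched as below (finding the candidate articles via the category listing is the other half of the job; the title match is case-insensitive here for simplicity):

```python
def restub_hospital(title, wikitext):
    """If the article title contains 'Hospital', swap the stub tag.
    Template names as given in the request above."""
    if 'hospital' in title.lower():
        return wikitext.replace('{{med-org-stub}}', '{{hospital-stub}}')
    return wikitext
```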
I think there should be a bot that fixes spelling mistakes and typos.
There are still lots of such maps not moved to Commons. Is it possible for a bot to do it? If the bot can't find maps which haven't been moved, I can find some of them. Paweł ze Szczecina 13:34, 6 August 2006 (UTC)
Hi! I would like to ask someone to add missing interwiki links to Polish Wikipedia in Category:Asteroids. Those articles appeared on pl.wiki just few days ago. I was asked to do this with my bot, but it doesn't run here yet. Asteroids articles are named identically on both wikis and we now have every article that you have, so it shouldn't be that hard task. :) And sorry for my probably poor English. Thanks a lot for help, Jozef-k 10:33, 7 August 2006 (UTC)
Regarding this discussion, please someone move every article in Congenital genetic disorder category to Genetic disorder category. Thanks! NCurse work 19:36, 7 August 2006 (UTC)
We all know OrphanBot removes unsourced images. The thing is, sports logos are exempt, because it being a logo is a source in itself. Recently, OrphanBot removed a logo for the Pee Dee Cyclones, and it was in a teamtable and everything. Therefore, I request the installation of FreedomBot, a bot that will undo any damage OrphanBot may unintentionally do when someone feeds it a little too many cookies (restore sports logos OrphanBot may have removed-as long as they're sourced). Tom Danson 14:52, 8 August 2006 (UTC)
lol i love the name already :P
Anyway.. I was going through some random pages and I saw Turkey slap, and on the talk page I noticed it had survived an AfD, but all that was placed was this [8]. So I changed it into the right header [9], and it prompted me to think there must be heaps more of these, hundreds possibly, on Wikipedia. I think we need a bot to either a) put the notice on talk pages, or b) change plain text into the proper font. I would be happy to run the bot. I welcome feedback. Thanks -- Deon555| talk| e| Review Me! :D 05:16, 9 August 2006 (UTC)
Eagle_101 has created this [10]. Thanks anyway :) -- Deon555| talk| e| Review Me! :D 03:14, 10 August 2006 (UTC)
I'd like to request an image-trawling bot to do some indexing for me. I'll explain with the first one I'd like to go with. Start with linksearch. Go to each image page and log if that image is not in Category:NASA images. I want to generate a list of images that reference a NASA image page but don't have the correct image classification. Post results somewhere (probably not on this page, of course). I would prefer wikiformatting, like * [[:Image:blah]]\n for each image, to make it easier, but whatever. I just need a list of images that refer to that link and are not in that category, so I can analyse them for retagging. Note that Special:Linksearch needs to be screen-scraped; &action=raw and the RSS and Atom feeds don't work for it. Don't forget to restrict results to the Image: namespace. If this works, I may request a different URL and a different cat to be compared in the same manner. Thanks! -- Kevin_b_er 05:22, 10 August 2006 (UTC)
I'm attempting to restart this project, and was wondering if a bot could regularly run on a list of its subpages (such as Wikipedia:WikiProject Deletion sorting/UK) to remove transcluded deletion discussions that have been closed. It's not ready for the bot to start yet, I just want to find out if it's feasible/easy, and if anyone is willing to do it. the wub "?!" 14:42, 10 August 2006 (UTC)
Quite simple really: a bot that checks all edits made to User: and User_talk: pages and user space for swears/insults/racism etc., and ****s them. It would be optional, only watching user pages that are listed in the bot's user space. I thought this would be useful for admins and for users that receive a lot of vandalism/attacks. Good idea?-- Andeh 18:21, 10 August 2006 (UTC)
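The masking step might look like this; the word list here is a deliberately tame placeholder, since a real bot would load its list from a configuration page:

```python
import re

# Illustrative, obviously incomplete word list -- an assumption of
# this sketch, not a real bot's configuration.
BAD_WORDS = ['damn', 'crap']

def mask_profanity(text):
    """Replace each listed word with asterisks of the same length."""
    pattern = re.compile(r'\b(%s)\b' % '|'.join(BAD_WORDS), re.IGNORECASE)
    return pattern.sub(lambda m: '*' * len(m.group(0)), text)

print(mask_profanity('Well damn, that is some crap.'))
# -> Well ****, that is some ****.
```

The word boundaries matter: without them the bot would mangle innocent words that merely contain a listed string.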
I've found a bug in this template. I fixed it, but there are a ton of pages that used the broken template. Here's what I've done, and here's the list of sites that need to be fixed. Is this a thing a bot could fix, or does this need to be done by hand? Lawilkin 22:34, 14 August 2006 (UTC)
The content of Athletics was recently moved to Athletics (track and field), and a disambiguation page placed at Athletics, after a lengthy discussion. However, many pages on track and field still link to Athletics. We need a bot that would redirect appropriate links from Athletics to Athletics (track and field), using the pipe trick to keep the wording the same in the text. -- Mwalcoff 01:20, 15 August 2006 (UTC)
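The link rewrite could be sketched as below; note a human would still need to skip the (presumably rare) links that really do want the new disambiguation page:

```python
import re

def repoint_athletics(wikitext):
    """Point bare [[Athletics]] links (and piped ones) at
    Athletics (track and field) while keeping the displayed text."""
    def fix(m):
        label = m.group(1) or 'Athletics'
        return '[[Athletics (track and field)|%s]]' % label
    return re.sub(r'\[\[Athletics(?:\|([^\]]+))?\]\]', fix, wikitext)
```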
There's probably already a generic bot that can do this, but I need a bot to add the {{ WikiProject Kentucky}} template (if it or {{ LouisvilleWikiProject}} isn't there already) into the talk pages of articles listed in specific categories: Cities in Kentucky, Kentucky counties, Towns in Kentucky, and Unincorporated communities in Kentucky. Doing this manually has become too much of a chore. Thanks! Stevie is the man! Talk • Work 17:41, 15 August 2006 (UTC)
Firefox has unfortunately taken a leave of absence. Does anyone else have the capability to run a bot in the manner I heretofore described? Thanks! Stevie is the man! Talk • Work 14:58, 18 August 2006 (UTC)
I can take over; just let me get it approved. Betacommand 15:54, 18 August 2006 (UTC)
A couple weeks ago, I noticed a significant disparity between a Wikipedia page and the Meta master page. I updated it, but I notice there are a lot of such pages, and many are not kept up-to-date. I suggest a bot to update page copies on a regular basis.
The pages in question are most—but not all—of the pages listed here and here.
Now, it's worth noting that people do edit these pages, despite the prominently displayed message suggesting that they refrain from doing so. It may be worth the time to transfer significant edits back to the Meta copies beforehand, if they are obviously good additions to the article. I dunno if I'd want to do all that myself though, so someone else would have to be interested as well.
Any regular bot activity should be announced on the talk page, I think, to discourage anyone making changes that may be quickly overwritten. -- Tsuji 02:16, 17 August 2006 (UTC)
I've run into a number of people lately whose idea of fun is editing numbers on Wikipedia (and other information that most people won't recognize as vandalism, but numbers are the most obvious) just for the hell of it. Some do it every day as a hobby. They usually switch usernames often, or use anon/proxy.
It would be very useful to have a bot that downloads the Wikipedia database (to avoid undeserved load on the real one) and searches the entire edit history of Wikipedia for edits that:
These edits could then be checked out by Wikipedians to see if they're vandalistic or not. Dark Shikari talk/ contribs 00:32, 2 August 2006 (UTC)
Discussion of this proposed bot's function can also be found at Wikipedia talk:Reference desk.
Cleaning and maintenance of the Reference Desk was formerly performed on a daily basis by the now-defunct Crypticbot. Due to the incredibly large size of reference desk archive pages, Crypticbot began to have problems archiving the questions. In order to make the reference desk easier to use, the way in which archive pages are managed was recently changed. Consequently, a new bot will need to be designed to handle daily maintenance tasks. Each day at approximately 00:00 UTC, the bot would have 18 tasks to complete: three pages must be modified (main page, monthly archive page, daily transclusion page) for each of the six reference desks (Humanities, Science, Mathematics, Computing, Language, Miscellaneous).
As such, the bot must be able to complete the following tasks for each of the six reference desks on a daily basis at approximately 00:00 UTC:
At the start of each month, the bot would have six additional tasks to perform to create a new monthly reference desk archive page with {{ Reference desk navigation}} for each of the six reference desks.
Detailed Example of Bot's Function:
At 00:00 UTC on August 16, 2006, for the Miscellaneous reference desk the bot would:
<noinclude>
{{subst:Reference desk navigation
|previous = Wikipedia:Reference desk archive/Miscellaneous/2006 August 13
|date1 = August 13
|next = Wikipedia:Reference desk archive/Miscellaneous/2006 August 15
|date2 = August 15
|type = Miscellaneous
}}
</noinclude>
<!--werdnabot-archive-->
= August 14 =
[[Wikipedia:Reference desk archive/Miscellaneous/2006 August 14]]
#A type of chair
#Male Orgasm
#World Trade Center Movie
#edits
#Maps from Nationalatlas.gov
#Clitoral Hood Piercing
#Guitar
#Alexander Graham Bell
#Cruise control on the 1998 ford windstar
#Gangster Chronicles TV Series
#Physics of a bullet
#T.E.A.M.
#Who would be richest?
#My surname is Bencko.
#Top Hats
#pounds to dollars
#The New York Pass
In addition to its normal daily tasks, at 00:00 UTC on the third of each month, the bot would need to create a new monthly reference archive page for each of the reference desks. After creating all six monthly pages, the bot would perform its normal daily duties.
For example, at 00:00 UTC on September 3, 2006 the bot would create Wikipedia:Reference desk archive/Miscellaneous/September 2006 as a new page containing the following text:
<noinclude>
{{subst:Reference desk navigation
|previous = Wikipedia:Reference desk archive/Miscellaneous/August 2006
|date1 = August
|next = Wikipedia:Reference desk archive/Miscellaneous/October 2006
|date2 = October
|type = Miscellaneous
}}
</noinclude>
-- C. S. Joiner ( talk) 23:07, 15 August 2006 (UTC)
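The monthly-page task is mostly string assembly from the patterns quoted above; roughly:

```python
def monthly_archive_page(desk, year, month_name, prev_month, next_month):
    """Build the title and initial wikitext for a monthly reference
    desk archive page, following the pattern quoted above.
    prev_month/next_month are strings like 'August 2006'."""
    title = 'Wikipedia:Reference desk archive/%s/%s %d' % (desk, month_name, year)
    text = ('<noinclude>{{subst:Reference desk navigation\n'
            '|previous = Wikipedia:Reference desk archive/%s/%s\n'
            '|date1 = %s\n'
            '|next = Wikipedia:Reference desk archive/%s/%s\n'
            '|date2 = %s\n'
            '|type = %s\n'
            '}}</noinclude>' % (desk, prev_month, prev_month.split()[0],
                                desk, next_month, next_month.split()[0], desk))
    return title, text
```

Run once per desk on the third of the month, then proceed to the normal daily duties.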
<noinclude> {{subst:User:71-247-243-173/RDmonthly |previous = |date1 = |next = |date2 = |type = }} </noinclude>
is now being used for monthly archives. The changes were to decrease the work load so that it could be done by hand until a bot was made available-- VectorPotential71.247.243.173 21:16, 4 September 2006 (UTC)
There are a bunch of articles in this category: Category:Articles_with_invalid_ISBNs that have this reference in them:
''Naval wars in the Levant 1559-1853'' - R. C. Anderson [ISBN 0-87839-799-0]
The correct reference for the 2005 edition (the only one with an ISBN I could find) is:
Anderson, R. C. (2005), Naval wars in the Levant 1559-1853, Martino Pub, ISBN 1578985382
or, wiki-style:
{{Harvard reference|ISBN=1578985382|Title=Naval wars in the Levant 1559-1853|Given1=R.C.|Surname1=Anderson|Year=2005|Publisher=Martino Pub|Location=Mansfield Center, [[Connecticut]]}}
Is it possible for someone to do a mass search-and-replace on that reference in the bad-ISBN category? It looks like all of the bogus ISBNs were added by [[User:SpookyMulder]] on September 4, if that helps.
Examples:
Thanks RainbowCrane | Talk 02:29, 25 August 2006 (UTC)
Can someone write a bot that parses a page in the form (...)(...) and prints the results on another page? The format of (...) is (quality and importance and name). Ratings could be posted on another page. The rating of an article is the average of all the user ratings. There is a quality rating and an importance rating (quality and importance are numbers). The URL for the ratings is http://en.wikipedia.org/wiki/User:Eyu100/Bot_area, but there are no ratings yet. This bot will be used for the Wikisort project. Eyu100 18:08, 25 August 2006 (UTC)
Occasionally, when browsing categories, I find user pages that are tagged under some of the encyclopedic categories. These pages are usually user sandboxes or "Works in progress" of articles that they copied to their user space to thoroughly revise incrementally. Now, I'm still quite new at Wikipedia, so I may be missing something here... but I am surprised that the encyclopedic categories aren't hardcoded to skip User: Space pages.
In lieu of such a change, it seems like it would be simple to have a bot go through the categories, and when it finds an article linked to User: Space, it would go and tack <nowiki> tags around the [[Category:(.*)]] links. Of course, templates that add categories, like {{stub}} and such would make things more difficult, but I imagine the most common templates could be similarly coded into the bot.
Does such a bot exist? Is there a particular reason why it doesn't? Just a few thoughts. Matt B. 06:16, 31 August 2006 (UTC)
At Wikipedia:Categories for discussion we're entering a terrible logjam caused by my taking on the responsibility to ensure Wikipedian user categories all had "Wikipedian" in the title. These are many hundreds of categories, and some of the more traditional CfD bots (notably Cydebot) can't handle user categories. We're only about a week out from having most of these approved for renaming and deletion, but we have fewer resources for actually carrying out the renaming and deletion. So if anyone has a bot that can handle this task, I encourage you to go to Wikipedia:Categories for discussion/Working and help out. Thanks!-- Mike Selinker 20:40, 3 September 2006 (UTC)
PLEASE PLEASE PRETTY PLEASE! See my request for bureaucratship to see what I am all about. I just think that I could program a bot-above-all-bots bot that picks up all vandalism and nothing BUT vandalism!-- Hi its mina19_1919! 03:57, 5 September 2006 (UTC)
Is there a bot that can add the {{Wikipedia:WikiProject Sharks/SharksTalk}} template to the top of all of the talk pages of the articles in Category:Sharks? And if so, is there also one that can replace {{portalpar|Sharks}} with {{Sharksportal}} for all of the articles in Category:Sharks? If an article doesn't have {{portalpar|Sharks}}, could it just add {{Sharksportal}} below the taxobox?
Is this easy to do? -- chris_huh 13:29, 5 September 2006 (UTC)
I have created WikiProject Pittsburgh, and for the organizational work I am trying to do, the Pittsburgh links would have to be changed to [[Pittsburgh, Pennsylvania|Pittsburgh]]. This is the only Pittsburgh in existence; all other "-burghs" in the United States dropped the last "H" at the turn of the 20th century, so anything marked "Pittsburgh" actually refers to "Pittsburgh, Pennsylvania." -- Chris Griswold ( ☎ ☓) 21:50, 5 September 2006 (UTC)
Could someone design a bot to replace things like "User:Akrabbim/Asplode" (which redirects to User:UBX/Asplode) with "User:UBX/Asplode"? I've been working on it with AWB, but I don't have enough time, as there are hundreds of transclusions. I think it would be a relatively simple bot task, as it is really only a few simple replacements. The only pages that need it are User:Akrabbim/Asplode, User:Akrabbim/Earthling2, User:Akrabbim/Emptybox, User:Akrabbim/No secondhand smoke, and User:Akrabbim/Towel. I'm just moving them into the User:Akrabbim/UBX subuserspace. It would be appreciated. — Akrabbim talk 16:05, 6 September 2006 (UTC)
I would like a bot to check links to all Connecticut cities & towns and create the standard redirects [[town, CT]] and [[town (CT)]] if they haven't yet been created. I have noticed that these have not all been finished for Connecticut. -- Schzmo 11:55, 31 August 2006 (UTC)
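Generating the redirect text is trivial; the bot's real work is checking which of these titles already exist. A sketch, assuming the target article is named "Town, Connecticut":

```python
def ct_redirects(town):
    """Return the two redirect titles and their shared redirect text
    for a Connecticut town article named '<town>, Connecticut'."""
    target = '%s, Connecticut' % town
    body = '#REDIRECT [[%s]]' % target
    return {'%s, CT' % town: body, '%s (CT)' % town: body}

for title, text in sorted(ct_redirects('Mystic').items()):
    print(title, '->', text)
```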
According to Wikipedia_talk:Redirect, redirects may now contain multiple lines (by design) and categories, but an unforeseen side-effect is that the developers did not know redirects contained templates such as {{ R from misspelling}} and the other contents of Category:Redirect templates (and they may break this functionality in the future). There are currently 52 templates, 34 of which may have no current transclusions, and ~15,000 instances that need to be substituted in; all they contain is the categories Category:Redirects from X and Category:Unprintworthy redirects, to my knowledge. -- nae' blis 19:23, 31 August 2006 (UTC)
Can someone please write a bot that could tag articles in Category:Rapid transit and its subcategories with {{TrainsWikiProject|Subway=yes}}? I'd do it myself, but the Perl module used appears to have a steep learning curve (that, and the fact that I have not written a Perl script for ages). Thanks! -- Selmo ( talk) 00:03, 2 September 2006 (UTC)
Does anyone know of a bot that I could request to add a project tag, {{ Project North Carolina}}, to every article's talk page under the North Carolina category (including sub-categories), if the tag does not already exist on the article? Looks like Selmo has the same request. Thanks Morphh 04:04, 4 September 2006 (UTC)
I've also requested feedback on this idea from User:Beland:
"Hi,
I had an idea to sort Category:Pages needing expert attention according to expertise. That way, Wikiprojects could easily track articles needing expert attention in their field. I think expert attention will get more input from wikiprojects than general cleanup. I've already begun "Pages needing attention from expert in medicine" manually, but if a bot could detect categories, it could split up the expert-category entirely into subjects (if necessary, with a complete list of pages needing expert attention in place).
a) I'm not sure whether you already plan to implement such functions into Pearle. b) If you like the idea, I'd be interested in running a clone of Pearle for this task. Or maybe Pearle could be expanded.
Anyway I have no experience with bots. "
-- Steven Fruitsmaak ( Reply) 12:06, 4 September 2006 (UTC)
Hi everyone. I'm a member of wikiproject Writing Systems. A while back I created the template {{ wsproj}} with the intent of adding it to every article under the topic of writing systems, but after looking at Category:Writing systems, I realised it would take an extremely long time to do and misleadingly inflate my edit count. Is there a bot that could add this template to the talkpage of every article under Category:Writing systems and its sub-categories? The ikiroid ( talk· desk· Advise me) 23:04, 7 September 2006 (UTC)
There's been some talk at CFD about having a bot patrol Category:Protected deleted categories to make sure the categories stay empty. Does anybody have a bot that can do that? - EurekaLott 02:56, 6 September 2006 (UTC) (copied from Wikipedia_talk:Bots#Category:Protected_deleted_categories TimBentley (talk) 16:10, 8 September 2006 (UTC))
Per the MoS (dates and numbers), dates should read September 10 or January 1 instead of September 10th or January 1st. However, many pages still use the "th", "rd", and "st" suffixes in dates. Is it possible to get a bot to fix this? Either by going through the articles or by going to the "What links here" page for each of the incorrect date formats and changing the linked pages? Not only would September 10th have to be checked but also 10th September. Dismas| (talk) 07:46, 10 September 2006 (UTC)
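A naive version of the fix is a pair of regular expressions, covering both orderings; it would need human review in practice, since ordinals also occur in prose that isn't a date:

```python
import re

MONTHS = ('January|February|March|April|May|June|July|'
          'August|September|October|November|December')

def strip_date_ordinals(text):
    """Turn 'September 10th' into 'September 10' and
    '10th September' into '10 September'."""
    text = re.sub(r'\b(%s) (\d{1,2})(?:st|nd|rd|th)\b' % MONTHS,
                  r'\1 \2', text)
    text = re.sub(r'\b(\d{1,2})(?:st|nd|rd|th) (%s)\b' % MONTHS,
                  r'\1 \2', text)
    return text
```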
AWB request: http://en.wikipedia.org/?title=Special:Whatlinkshere/Category:Evolution_Wikipedians&limit=500&from=0 - remove this category from these places; it's been deleted by CfD: http://en.wikipedia.org/wiki/Wikipedia:Categories_for_deletion/Log/2006_May_10#Category:Evolution_Wikipedians
Thanks! JesseW, the juggling janitor 21:27, 13 September 2006 (UTC)
Berria requested:
Adding 31 slightly varying interwiki's seems like the perfect job for a bot, so I'm posting it here... JesseW, the juggling janitor 22:06, 31 August 2006 (UTC)
A bot could create a list of proposed edits, and an admin could approve them manually, but that makes it a lot harder. HighInBC 19:15, 15 September 2006 (UTC)
A while back Template:Cite journal was forked to create Template:Cite journal2, the only difference being that "In contrast to cite journal, cite journal2 omits the quotation marks around the article title." Since this fork, an option was coded into cite journal so that the quotation marks can be removed from individual usages of the template. Would a bot be able to change all usages of {{ cite journal2}} to {{ cite journal}}, copying over the existing information for each usage of the template and adding "|quotes=no" at the end? For example,
* {{cite journal2 | author=Könneke, M., Bernhard, A.E., de la Torre, J.R., Walker, C.B., Waterbury, J.B. and Stahl, D.A. | title=Isolation of an autotrophic ammonia-oxidizing marine archaeon | journal=Nature | year=2005 | volume=437 | pages=543-546}}
needs to be changed to:
* {{cite journal | author=Könneke, M., Bernhard, A.E., de la Torre, J.R., Walker, C.B., Waterbury, J.B. and Stahl, D.A. | title=Isolation of an autotrophic ammonia-oxidizing marine archaeon | journal=Nature | year=2005 | volume=437 | pages=543-546 | quotes=no}}
{{ cite journal2}} can then be deprecated. I've contacted the original author of the fork, and they have agreed to the merge (see User_talk:Stemonitis#Template:Cite_journal2). About 120 pages would be affected by this change, with multiple instances of the template in use on each page. Thanks. Mike Peel 21:43, 14 September 2006 (UTC)
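A sketch of the text transformation involved (Python; it assumes no nested templates inside the citation, which holds for typical usages of {{ cite journal2}}):

```python
import re

# Rewrites every {{cite journal2 ...}} call to {{cite journal ...}},
# appending |quotes=no before the closing braces so the rendering is
# unchanged. [^{}]* keeps the match inside one template call.
CJ2 = re.compile(r"\{\{\s*cite journal2\s*(\|[^{}]*)\}\}", re.IGNORECASE)

def merge_cite_journal2(wikitext):
    return CJ2.sub(
        lambda m: "{{cite journal " + m.group(1).strip() + " | quotes=no}}",
        wikitext)
```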
Table requested:
Any help appreciated
I've moved a few userboxes onto my user space, and I need a bot to update them.
Thanks! Laur ə n whisper 18:24, 20 September 2006 (UTC)
I am not certain if this is the right place to post this question (if it is not, please move it to the right place and drop a note on my talk page).
The editor Sheynhertz-Unbayg has recently been banned and now his contributions (mostly weird "onomastics" pages that are just concatenations of several disambiguation pages, see Lust (onomastics) for a typical example) need to be cleaned up. To ensure that all pages he has edited do get checked, I would like to have a bot- or script-generated list of all pages he has touched. As he has more than 20,000 edits, manually created lists like the one here are probably incomplete. Also, a centralized list would help avoid duplicated efforts from the people who check the pages.
Please create a list of all pages touched by this editor and drop it somewhere, for example at User:Kusma/Sheynhertz/contribs. A good format would be a bulleted list with wikilinks to the pages, perhaps with a "redirect=no" or a mention of the redirect target for the numerous redirects created.
In addition, a bot could be used to check all of the interwikilinks created by Sheynhertz. I have already removed dozens of links to nonexistent articles on the Japanese Wikipedia, and I expect that many more of his interwikis are wrong.
Thank you for any help or insight you can offer, Kusma (討論) 08:28, 21 September 2006 (UTC)
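A bot-writer picking this up could generate the requested list with a few lines of Python once the contributions have been fetched from the API; a sketch of the formatting step (the input shape, a sequence of (title, is_redirect) pairs, is an assumption):

```python
def contributions_report(edits):
    """Build a bulleted wikilink list from (title, is_redirect) pairs,
    one line per distinct page, as requested for the cleanup list.
    Redirects are linked with redirect=no so they can be inspected."""
    seen, lines = set(), []
    for title, is_redirect in edits:
        if title in seen:  # the editor touched many pages repeatedly
            continue
        seen.add(title)
        if is_redirect:
            lines.append("* [{{fullurl:%s|redirect=no}} %s] (redirect)"
                         % (title, title))
        else:
            lines.append("* [[%s]]" % title)
    return "\n".join(lines)
```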
There ought to be a bot that is automated to remove common things used for testing, such as Bold text, Italic text, [[Link title]], [http://www.example.com link title], [[Media:Example.ogg]], Image:Example.jpg, #REDIRECT [[Insert text]] within other text, etc. from the article space. -- Gray Porpoise 10:48, 20 September 2006 (UTC)
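The core of such a cleanup could be a simple literal-string pass over the article text; a sketch (the exact debris strings are the edit-toolbar defaults, and a real bot would need a tuned list plus safeguards against removing legitimate text):

```python
# Literal strings inserted by the edit-toolbar buttons; test edits
# often leave these verbatim in article text.
TOOLBAR_DEBRIS = [
    "'''Bold text'''",
    "''Italic text''",
    "[[Link title]]",
    "[http://www.example.com link title]",
    "[[Media:Example.ogg]]",
    "[[Image:Example.jpg]]",
]

def strip_toolbar_debris(text):
    for junk in TOOLBAR_DEBRIS:
        text = text.replace(junk, "")
    return text
```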
Is there any way a bot can go through and look for all talk pages without an associated article/image and tag them with Template:db-talk? I've encountered a large number of these and I can foresee a bot finding a few thousand. VegaDark 20:18, 22 September 2006 (UTC)
This page may meet Wikipedia's speedy deletion criteria, as it is a talk page of a page which does not exist (CSD G8). This notice was added by a bot. There are three reasons why a talk page may exist without an article, which are:
Only delete this page if it clearly does not meet any of those three criteria.
I was wondering if a bot could do the same for Category:Articles to be merged as it did for Category:Category needed and Category:Articles that need to be wikified, i.e. sort them out by month. There is currently a backlog of close to 11,000 (!!) articles, and it would help a lot if you could quickly see how long the merge tag has been on an article. Garion96 (talk) 00:13, 18 September 2006 (UTC)
A while ago, I made maps for all of the cities and towns in Indiana, and started semi-automatically adding cityboxes to each article. I got part way through the 'L's, but I have not worked on it for a year now. It would be helpful for someone to finish the rest of the pages. The red-dot maps are located here [11]. - Marvin01 | talk 00:49, 21 September 2006 (UTC)
Hi, could someone please replace mammal-stub with bat-stub for the articles listed at http://en.wikipedia.org/wiki/User:Eug/Bat-Stubs ? Eug 13:43, 27 September 2006 (UTC)
Per the MoS (dates and numbers), dates should read September 10 or January 1 instead of September 10th or January 1st, although many pages use the "th", "rd", and "st" suffixes for dates. Is it possible to get a bot to fix this? Either by going through the articles or by going to the "What links here" page for each of the incorrect date formats and changing the linked pages? Not only would September 10 have to be checked but also 10 September. Dismas| (talk) 17:59, 21 September 2006 (UTC)
Following Wknight94's suggestion, I would like to request a bot whose function would be to create redirect pages for articles whose names contain diacritics. For example, the bot could verify whether the article České Budějovice can be reached from Ceske Budejovice; if not, the latter would be created.
This could prove useful because many articles lack these redirect pages and are hard to find for those who do not possess the diacritics on their keyboards. Recently I had to create redirect pages for nearly all Portuguese municipalities. A bot could perform these tasks much more efficiently. If there is someone interested in creating this bot, thank you in advance. Regards.-- Húsönd 22:44, 30 September 2006 (UTC)
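If someone takes this on: the ASCII form of a title can be computed with Unicode decomposition, so the bot would not need a hand-made diacritics table. A sketch in Python (standard library only):

```python
import unicodedata

def ascii_title(title):
    """Strip diacritics by Unicode decomposition, e.g. 'České Budějovice'
    becomes 'Ceske Budejovice'. Letters that do not decompose (such as
    'ø') pass through unchanged, so a real bot should check the result
    for leftover non-ASCII before creating the redirect."""
    decomposed = unicodedata.normalize("NFKD", title)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))
```

The bot would then create the redirect only when `ascii_title(title)` differs from the original and does not already exist as a page.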
Hi. I wonder if there's a bot with the time and inclination to go around all the articles linking to Church of Jesus Christ of Latter-day Saints and change those links to point at The Church of Jesus Christ of Latter-day Saints. There are between 1500 and 1600 such pages, so it's a bit of a task to do by hand. Thanks! - GTBacchus( talk) 01:24, 28 September 2006 (UTC)
I would like permission to run Firefoxbot using AutoWikiBrowser in automatic mode. The Fox Man of Fire 14:37, 6 October 2006 (UTC)
We still need an archive bot; our current system is starting to break down, and the last bot we had was CrypticBot, so we've been doing it all by hand for quite a while now. Since our old request has long since been archived off this page (by an archiving BOT, oh the irony o:) I decided to repost a less involved version of the same request here-- VectorPotentialRD NEEDS A BOT (-: 13:01, 1 October 2006 (UTC)
Here is a new REVISED bot request. I have created a demo reference desk that could be used to test implementation of an RD bot using a slightly updated layout. It will be proposed by a few of the RD editors (including me) only once a bot is working for it, because of the increased number of desks to manage manually. Please read User:Freshgavin/Sandbox/Reference_desk_bot_request for more details about the changes that will be proposed, and for a detailed summary of the requirements for the bot.
The reference desk now relies on a few diligent editors for manual archiving, and there are a lot of people who would really appreciate a bot to help us with this task. Any suggestions or ideas would be greatly appreciated. Questions and comments about the new layout should be posted on this talk page, and those about the current system should be posted here. freshofftheufo ΓΛĿЌ 07:03, 4 October 2006 (UTC)
Off-topic question: There's an MSN bot that lets you get information from Encarta. Are there Wikipedia IM or SMS bots? Such a bot, if it doesn't already exist, would be mighty useful. IM bots would let people get information from Wikipedia without opening a web browser. And an SMS bot would let people get info anytime, anywhere, for just the price of an SMS text message. Anybody know if these exist? If not, would it be appropriate for me to file a bug on MediaZilla or would this request be off-topic there too? :-) Cheers, -- unforg e ttabl e id | talk to m e 00:54, 5 October 2006 (UTC)
The templates {{user ara}}, {{user Arab}}, {{user cyr-1}}, etc. shall be replaced with the parameterized template:user iso15924. Parameters are given below. Some templates may be included via Template:Babel; AFAIK these cannot be replaced by a bot and will be done by hand. The request was developed at Wikipedia_talk:Userboxes/Writing_systems#Bot_request Tobias Conradi (Talk) 16:05, 25 September 2006 (UTC) The 15 replacements are as follows:
thanks a lot for your help! Tobias Conradi (Talk) 01:33, 10 October 2006 (UTC)
I'm proposing a bot that patrols articles for creation and starts new articles for unregistered users. I understand that this defeats the purpose of the restriction that only registered users can create articles, and that this bot can be easily abused. However, this bot may not be such a bad idea if the following measures were in place:
Anonymous editors would also be able to create pages by accessing the bot interface on an off-wiki site.
Any thoughts on this? -- Ixfd64 05:47, 9 October 2006 (UTC)
I'd like to ask a bot to help the Medical genetics project. We've finished the article rating mostly and we'd like a bot to tag every unassessed articles in Category:Medical genetics with {{MedGen|class=unassessed}}. Thanks in advance. NCurse work 18:30, 12 October 2006 (UTC)
At Wikipedia:WikiProject Bedfordshire/Infobox status we are trying to create a list of all pages and their infobox status. We have a template which is placed on the talk page of all these articles that contains an infobox status and automatically puts them in one of four categories. We would like a bot that automatically creates the list daily. If you do want to help please contact the talk page of that article for more information. Thanks. Lcarsdata ( Talk) 14:54, 13 October 2006 (UTC)
Hello everyone, how are you going? I'd like to request a bot for WikiProject Indonesia. With its current number of members, it is getting hard for me to post a message on each talk page. For now, I'd like the bot to run every week, to deliver weekly notices (but I can change that, right?). Thanks in advance -- I mo eng 14:11, 15 October 2006 (UTC)
I'm looking for some help converting all the article titles, links, and non-linked mentions of a great number of Japan-related articles which, when macrons (e.g. ō and ū) are taken into account, need to be respelled. The greatest congestion of these, I think, comes from the ships of the Imperial Japanese Navy. Right now, I have a very short list of names that need to be changed, but as I look into each individual ship's Japanese name and how it ought to be spelled, I'll be adding to the list of those that need renaming. The number of ships isn't too great; those that need renaming hopefully do not number more than 30-50. But if each of those is linked to by 10 articles, that's 300-500 right there. Please let me know what to do or who to talk to. Thanks for the help. LordAmeth 23:51, 15 October 2006 (UTC)
Would there be any way to create a bot which would be able to indicate which pages have been added to the listing of pages here or elsewhere, over, say, the last month? Maybe a two-section format listing all the pages in one column and another listing either those which existed (or were revised) before a given date or were created after a given date might be easiest. Also, the bot could potentially be used to determine which pages are "stable", which is to say, not modified over a given period. Thanks for your response, positive or negative. Badbilltucker 16:16, 16 October 2006 (UTC)
I'd like a bot to go through all the articles linked to in this template and add this template to the bottom with {{ fb start}} and {{ fb end}} around it as is standard with football templates. If {{ fb start}} and {{ fb end}} are already present, just add this template in front of {{ fb end}}. Not too hard to do I hope? - MTC 11:43, 16 October 2006 (UTC)
Just like repeatedly-recreated unwanted articles are tagged with {{ deletedpage}} and protected, some unwanted categories are protected with {{ deletedcategory}}. However, given the way the cat system works, this doesn't stop people from adding articles to the cat. Hence, a bot is requested to regularly (e.g. weekly) empty these deleted categories. >Radiant< 12:04, 18 October 2006 (UTC)
User:RobotG currently clears categories in Category:Protected deleted categories. Any category tagged with {{ deletedpage}} (or the now-redirect {{ deletedcategory}}) is placed in this category. — Centrx→ talk • 23:58, 19 October 2006 (UTC)
Do we have a welcome bot, that adds welcome templates to new user pages? Seems like a good idea to me, it was discussed on the mailing list somewhere. Mind, it seems such a good idea it's probably been discussed before. Hiding Talk 10:12, 20 October 2006 (UTC)
I've learned that the {{ IPA}} template is used to enable phonetic symbols to appear as they should, and not as little squares, in IE6. A bot to do a mass conversion of "hard" phonetic symbols to {{ IPA}} template-formatted phonetic symbols would be useful. Tawagoto 01:47, 16 October 2006 (UTC)
Can someone please write a bot for me that will pick up NPOV breaches and is shut-off compliant.
Thanks
Nathannoblet 04:29, 22 October 2006 (UTC)
Can someone write for me a bot that removes red links, red templates, and red categories from articles (except Template:Red link)? What it does is it will turn "{{ red link}}" into "this" for red links, and remove red templates and red categories straight from the article. -- AAA! ( talk • contribs) 11:45, 26 October 2006 (UTC)
Hi. At this page, you can see a list of links pointing at List of Ed, Edd 'n' Eddy episodes. That page, however, is a redirect to List of Ed, Edd n Eddy episodes, without the apostrophes. Could someone please sic a bot on that list and fix all the individual episode pages, which are currently double redirects? Thanks. - GTBacchus( talk) 19:34, 26 October 2006 (UTC)
Change all "Major Highways" titles to "Major highways", particularly in counties. -- MNAdam 03:41, 1 November 2006 (UTC)
You mean page titles or in text? why? is there a consensus/vote(I SAID THE V WORD ZOMG!) somewhere? ST47 Talk 11:11, 1 November 2006 (UTC)
I have noticed that perhaps six out of seven anonymous users who leave comments on talk pages do not sign their posts properly. I have usually added the {{ unsigned}} message after those posts when I have encountered them. However, this could be a job for a bot: scan the Recent changes list limited to the Talk space, and if a comment is made by an IP-address, check it for a signature and add one if necessary. Of course logged in users also forget the signature sometimes, and those could be checked too, if it doesn't take too much resources. Alternatively only check those users that have not created an user page yet, they are often new to Wikipedia and do not know about signing their posts. Is anyone with the skill/equipment up to this? -- ZeroOne ( talk | @) 20:49, 17 October 2006 (UTC)
Why not just write it into the program (an auto signature)? If it's not an option to not leave a signature then a bot wouldnt be needed in the first place. -- MNAdam 23:19, 3 November 2006 (UTC)
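For reference, detecting an unsigned comment is mostly a matter of checking the added text for a UTC timestamp; a rough sketch (the tagging format follows {{ unsigned}}; everything else, including how the bot obtains the diff text, is assumed):

```python
import re

# A signed talk-page comment ends with a UTC timestamp such as
# "20:49, 17 October 2006 (UTC)". If a newly added comment lacks
# one, the bot would append {{unsigned|user|timestamp}}.
TIMESTAMP = re.compile(r"\d{2}:\d{2}, \d{1,2} \w+ \d{4} \(UTC\)")

def needs_unsigned_tag(added_text):
    return TIMESTAMP.search(added_text) is None

def unsigned_tag(username, timestamp):
    return "{{unsigned|%s|%s}}" % (username, timestamp)
```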
How about a manually summoned bot that can suggest images for an article based on images from Commons that exist on copies of the page on other language wikis? -- InShaneee 02:21, 27 October 2006 (UTC)
What I'm after is either guidance on how to write a bot and what I need to run it, or perhaps someone to set up a bot that would run through the various comics stub categories and tag them as stubs for the 1.0 assessment. I get some webspace with my ISP that includes cgi space; I don't know if that's enough to host a bot, but I'd be interested in doing it if someone would hold my hand. Otherwise, if that's impractical or impossible, I'd appreciate someone taking it on. The categories are Category:Comics stubs and sub-categories, and the code that needs to be added or amended on an article talk page is that either {{Comicsproj|class=Stub}} needs to be added or, where it exists, {{Comicsproj needs to be amended to {{Comicsproj|class=Stub, leaving the close brackets in case other fields are active. Also, I guess a subpage, um, {{FULLPAGENAME}}/Comments needs to be created with a message, um, Assessed by comics-bot, which automatically tags articles in stub categories as stub class articles. Appreciate thoughts. Hiding Talk 18:23, 30 October 2006 (UTC)
I'm not sure if this would be a good idea or not, but perhaps it would be possible to build a robot to standardize Wikipedia pages (make them of similar formatting).
Some points would be:
I'd be interested to see other people's points on this.
Yuser31415 07:43, 3 November 2006 (UTC)
I would like to request a bot that can touch about 80,000 pages on Wikispecies. I am one of the admins on Wikispecies, and we're going through some major changes. We do have one registered bot, but it stopped working for an unknown reason. Perhaps a 'techy' is able and willing to do some standard changes. In principle it would need to delete '::::' colons out of taxonavigation sections. Perhaps also a check on a certain layout; if a page does not fit the standard layout, add a category (or fix the issue if possible). Help would be highly appreciated. -- Kempmichel 10:21, 3 November 2006 (UTC) ( Wikispecies:User:Kempm)
I have some annoying hiccups sometimes, but that seems quite normal. So far I have received 20,000 e-mails from your edits :) Is that how many you did? -- Kempmichel 17:58, 6 November 2006 (UTC)
I've just come back from fixing about 16 double redirects. Could someone write up a bot for me that fixes double redirects (if it's possible)? -- AAA! ( talk • contribs) 08:40, 6 November 2006 (UTC)
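The core logic of a double-redirect fixer is just chain-following with cycle protection; a sketch, assuming the bot has already built a {page: target} map of all redirects:

```python
def fix_double_redirects(redirects):
    """Given a {page: target} map of every redirect on the wiki,
    return a map with each redirect pointing at the end of its chain.
    Redirect cycles are left pointing at their original target."""
    fixed = {}
    for page, target in redirects.items():
        seen = {page}
        t = target
        while t in redirects and t not in seen:
            seen.add(t)
            t = redirects[t]
        # t in seen means we walked into a cycle; keep the old target.
        fixed[page] = target if t in seen else t
    return fixed
```

The bot would then edit only the pages whose entry changed.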
Is it possible to have a bot written that would patrol subcategories of Category:Comics and where an article has been tagged for deletion it could add that fact to the relevant section of the Wikipedia:WikiProject Comics/Notice Board? Hiding Talk 21:22, 7 November 2006 (UTC)
There are many pages dealing with subjects in ancient Greece and Rome that erroneously capitalize "ancient". WP guidelines and editorial consensus say that "ancient" should be lowercase. It's easy enough to move individual pages, but fixing the redirects is a pain. Is this the kind of task that a bot can help with? If not, are there other ways to (semi-)automate the process? Thanks. --Akhilleus ( talk) 16:32, 27 October 2006 (UTC)
Not in detail, because I mostly use OS X. But I have some access to a Windows machine, so I'll check it out. --Akhilleus ( talk) 19:05, 27 October 2006 (UTC)
Errr...this proposal speaks of moving pages. Article titles must begin with a capital letter, for technical reasons. Hence the link ancient Greece will always point to the article titled Ancient Greece. Robert A.West ( Talk) 19:47, 7 November 2006 (UTC)
Could someone write a bot that'll convert the old Template:PDFlink format to the new one, while adding file size info possibly, some details one what needs to be done is located at Template talk:PDFlink#PDFbot - Dispenser 08:19, 29 October 2006 (UTC)
Is it possible that a bot could be created or used that would be able to patrol all images on the wiki, and either add or replace the category with Category:Memory Beta images, as we have hundreds of images and it would be a mammoth task to do by hand. If so that would be fantastic, address for the wiki is Memory Beta Main Page. -- The Doctor 11:32, 08 November 2006 (UTC)
A lot of userboxes are being moved per the WP:GUS to userspace. This is probably a good thing; but when a box is moved everyone who had it on their userpage is left with something like this:
This user tries to do the right thing. If they make a mistake, please let them know. |
Could someone create a bot to fix those automatically? ~ ONUnicorn ( Talk / Contribs) 16:28, 9 November 2006 (UTC)
Could someone run a bot through the Playboy Playmate articles to compile a list of the dead ones so that I can compare it to the list at Dead Playboy Playmates. I want to check the list to see if it has all of them or not. Dismas| (talk) 09:07, 10 November 2006 (UTC)
I mentioned this on the Village Pump, but then I realized that this page existed: I've noticed that tables are used an awful lot everywhere on Wikipedia, even when using <div> tags would work just as well. I looked up Wikipedia:When to use tables, and I thought it would make sense to have a bot to find unnecessary tables, i.e. single-cell tables, and turn them into an equivalent <div style="CSS"> combination. That doesn't exist already, does it? Phoenix-forgotten 20:00, 10 November 2006 (UTC)
One example is Template:TOCright. As long as the border is nonexistent, a div will still look just like a one-cell table does if you convert any cellspacing and cellpadding into an appropriate amount of CSS padding. If the table has a border though, I haven't been able to make an exact equivalent because the table always seems to have a one-pixel border for its cell, which messes up the border-style:outset the outer boundary has. Phoenix-forgotten 01:25, 15 November 2006 (UTC)
Hello, I need a very specific change to be made to a number of very specific articles. For a list, see User:lensovet/Rail. What I need is as follows: for each line that reads
{{rail line|previous=[[Metropark (NJT station)|Metropark]]|route=[[Northeast Corridor Line]]|next=[[Linden (NJT station)|Linden]]|col=FF2400}}
to be converted to
{{NJT line|previous=Metropark|line=Northeast Corridor|next=Linden}}
that is:
please let me know when you make the change. thanks! — lensovet– talk – 20:23, 12 November 2006 (UTC)
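A sketch of the requested conversion as a single regex pass (Python; it assumes the parameters always appear in the order shown above, which a real run would need to verify):

```python
import re

def unlink(value):
    # [[Target|Display]] -> Display, [[Target]] -> Target
    return re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", value)

RAIL = re.compile(
    r"\{\{rail line\|previous=(.*?)\|route=(.*?)\|next=(.*?)"
    r"\|col=[0-9A-Fa-f]{6}\}\}")

def convert_rail_line(call):
    m = RAIL.match(call)
    if not m:
        return call  # leave anything unexpected alone
    prev, route, nxt = (unlink(g) for g in m.groups())
    # [[Northeast Corridor Line]] -> Northeast Corridor
    route = re.sub(r" Line$", "", route)
    return "{{NJT line|previous=%s|line=%s|next=%s}}" % (prev, route, nxt)
```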
I was thinking last night that it would be fun to have a bot that randomly generated a page in certain areas for the editor to edit. Like folks interested in botany or biology could get a random biology or botany page, then copyedit it. If Wikipedians did this for a year in all major areas, many of our crummiest articles, appearance-wise, would get cleaned up.
I think that there are numerous articles on Wikipedia that need copyedits. I attempted to do this in the [Herat] article and the [Afghanistan] articles, but got sucked into a vicious flame war--these articles need serious work. However, I moved on to using the Random Article generator to find articles that could use copyediting, leading me to copyedit obscure pages like [Pre-dreadnought]. About half of the articles that come up have to do with Anime or television shows it seems, and some are in areas I know nothing about, but sometimes I find something interesting that needs work.
I can find articles on list, fine, but adding a little fun to it, and making it an all-Wikipedia project could seriously improve many Wikipedia articles. Editors would be encouraged to add citation needed tags, categories, and just do the rudimentary copyedit work that really makes Wikipedia viable. By allowing folks to get random articles in selected categories people would work on articles in their areas.
One of the best things about Wikipedia is writing a good article, then coming back the next day and finding someone else has spit-shined for you. There are a lot of articles that have some useful information but are rather sorry in appearance. Devoting some time to cleaning up these articles would, imo, greatly improve Wikipedia. Adding a little twist for those seeking something to do would make it a bit more interesting.
Please someone write this bot. Oh, I would call it the KPBot (for Kitchen police)!
KP Botany 20:55, 15 November 2006 (UTC)
As we know, images and templates on the main page are changed on a daily basis. To prevent vandalism to Wikipedia's most important page, these images and templates must be fully protected. I'm sure that many administrators will agree that this task can be pretty tedious. Also, it's always possible that something will be left unprotected by accident. After all, we're all humans! :)
Therefore, I'm proposing a bot that will automate the following tasks:
I do have one concern, though. If administrators become too dependent on the bot, some images or templates may be left unprotected if the bot suffers a downtime. -- Ixfd64 09:44, 22 November 2006 (UTC)
It'd be nice to have a bot automatically add Template:Verylong to articles that are above the recommended size. Vicarious 09:24, 23 November 2006 (UTC)
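A sketch of the check such a bot would run on each page (the 32 kB threshold is an assumption standing in for whatever the recommended size actually is):

```python
RECOMMENDED_BYTES = 32 * 1024  # assumed threshold; substitute the real guideline

def tag_if_verylong(wikitext):
    """Prepend {{verylong}} when the page exceeds the recommended
    size and is not already tagged."""
    too_long = len(wikitext.encode("utf-8")) > RECOMMENDED_BYTES
    if too_long and "{{verylong" not in wikitext.lower():
        return "{{verylong}}\n" + wikitext
    return wikitext
```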
"On Wikipedia, and other Wiki-based websites, broken external links still present a maintenance problem." (linkrot)
I am hoping someone takes on the task of writing a WebCite-Bot, i.e. a bot which automatically feeds all cited URLs in Wikipedia to WebCite, which is a web archive similar to the Internet Archive, but with enhanced features such as on-demand archiving, and with a focus on scholarly material (as opposed to IA's shotgun-crawling-"archive-all-you-can" approach). WebCite creates a snapshot (cached copy) of the cited URL, thereby archiving/preserving the cited webpage or webdocument; if caching was successful, the link to the snapshot should be added to the original URL on Wikipedia. WebCite takes care of robot exclusion standards, no-cache tags etc. on the cited page. If caching was successful, WebCite hands back a WebCite ID, which should be added to the original link (for an example of how this could look, see the reference list at the bottom of the article http://www.jmir.org/2006/4/e27/ ; all cited non-journal webpages also have a link to the WebCite snapshot). The benefit would be that all cited material will be automatically preserved, and link rot, 404s or changes in the cited webpages will no longer be a problem. WebCite has an XML-based webservice which allows a client to submit a caching request and to receive the WebCite ID if the caching was successful; see the technical guide at http://www.webcitation.org/doc/WebCiteBestPracticesGuide.pdf User:Eysen 19:45, 01 Dec 2006 (UTC)
I wanted a bot to do the following :
Thanks,
Jmfayard 14:02, 18 November 2006 (UTC)
But I just noticed, there is something else to care of.
Before, we used {{ Translation request}} and {{ Translation request from German}}, which are now obsolete. Instead, one should use {{ Translation}}. I just made redirects of the two former templates to {{ Translation}}. This solves the case of future translation requests with the old templates.
But there is the problem of existing translation requests.
The bot should replace the obsoleted templates in the talk pages Talk:XXX (the full list is here) with
Is it doable?
Jmfayard 23:31, 18 November 2006 (UTC)
No, this is not what I want.
To make it simpler:
I want you to replace
The thing is that, because I made a redirect, the talk pages with {{Translation request}} are not listed in Special:Whatlinkshere/Template:Translation request as you would expect, but in Special:Whatlinkshere/Template:Translation
Jmfayard 00:16, 19 November 2006 (UTC)
the list of fiddlers is getting a little unwieldy. right now it duplicates itself completely, listing all the names first alphabetically and then by style - nice, convenient, but long. the plan is to split it into two articles. obviously there's a problem: how do we make sure people put their additions on both pages? already people aren't adding them to both lists.
seems like there are a couple ways this could be done with a bot, though I haven't read up on how they work and what they're capable of. conceptually most simply a bot could copy recent additions from the one page to the other - but it would have to check to see if the editor had edited both already. if bots can get around edit protection we could protect the list-by-style and have the bot check for changes to the list by name (take a look at the page - each name in the alphabetical list is followed by the styles they play... could a bot find and parse those parenthetical strings, and copy the name into the appropriate part(s) of a protected list-by-style article?) -- Eitch 19:21, 14 November 2006 (UTC)
Someone should write a bot to clean up Ganeshbot's bad grammar. Kaldari 06:56, 15 November 2006 (UTC)
Hello, would it be possible to have a bot created that would add the Law enforcement wikiproject header ({{Law enforcement}}) to articles that are in the Law Enforcement catagory? ( here). Many thanks.-- SGGH 14:37, 22 November 2006 (UTC)
Template:CopyrightedFreeUse has been officially deprecated since February. However, there are still about 7000 images using it. The equivalent Template:PD-release should be substituted for it (or Template:No rights reserved, which is also legally equivalent). Personally I prefer Template:PD-release as it is less confusing but means exactly the same thing legally, i.e. that all rights are renounced by the copyright holder. Kaldari 17:45, 24 November 2006 (UTC)
Could someone please create a bot which would automatically replace Template:SER with Template:SRB? Even though SRB is the official ISO 3166-1 3-letter country code and an abbreviation for Serbia (from Srbija), many people still think this code is SER, and having a wrong template doesn't help either. It would be nice to have an automatic bot to correct future mistakes. Avala 14:08, 27 November 2006 (UTC)
I'm tinkering with a proposal I've made which involves using categories to replace a list page (the proposal is Wikipedia:WikiProject The Simpsons/Proposal for managing song lists on Simpsons episodes) but I wonder if this is something that could be well managed by a bot.
The requirement would be to generate a page (or I guess edit a delimited section in the middle of a page) based on all the pages listed in a different category. Perhaps the extended requirement would be to find all the subcategories of a given category and use them to generate a list of lists. For instance, 'generate a page with sections named from subcategories of the category "songs on the simpsons", where each section has a list formed from the names of the pages in those subcategories (which would be the names of songs)'.
Or perhaps to take the text from a given section from each page of a list of pages (identified from being members of a category) and generate one page containing all those sections. For instance 'copy all the sections called "songs" from all pages in the category "simpsons episodes" to the "songs" section of the page "list of songs on the simpsons" page'.
This all sounds quite fiddly, but I can imagine someone may have made a generic bot that can be fed with parameters to do this sort of processing. Is there anything out there to do something like this, or any keen on developing one? Please feel free to chat on my talk page if you want to ask questions or suggest solutions -- Mortice 18:10, 27 November 2006 (UTC)
I'm currently developing this bot -- Mortice 12:21, 29 November 2006 (UTC)
Category:Main pages with misplaced talk page templates contains pages with a template that belongs on the page's talk page instead. Can we employ a bot to move these? ( Radiant) 13:28, 29 November 2006 (UTC)
And, Category:Articles actively undergoing construction and Category:Articles actively undergoing a major edit are supposed to be temporary categories, but have grown very large. Perhaps a bot could depopulate them weekly? ( Radiant) 13:44, 29 November 2006 (UTC)
I try to go through this and weed out the forgotten ones. It is a bit frustrating because some people demand the right to leave up the tag for long periods of time, and aren't shy about complaining. Also it's transcluded on a bunch of instructional pages, like Wikipedia:Edit lock so some care needs to be taken not to mess up those pages. But if someone wants to run a bot and deal with the occasional upset article owner, I think that's great. -- W.marsh 18:22, 29 November 2006 (UTC)
How do I make a bot to message WikiProject Gold Coast members? If possible, can someone make it for me? Thank you -- Nathannoblet 07:49, 1 December 2006 (UTC)
I have opened suggestions for creating a template for image pages that warns admins that a user has contacted the copyright owner to request permission for the image's use on Wikipedia, and asks for the image not to be deleted while a response is awaited. The discussion can be found here.
In order to work, it will need a bot that checks the tagging date and removes images that have been tagged for more than one week, and creates a relevant category of unlicensed images each day, the same way it is done with {{ Replaceable fair use}}.
I don't have any expertise in creating bots, so could someone please create a bot that can do this? ~ ► Wykebjs ◄ ( userpage | talk) 18:05, 1 December 2006 (UTC)
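The one-week expiry check described above is simple to express in code. A sketch, assuming the template stores the tag date in ISO format (the date format is an assumption; the real template parameter would dictate it):

```python
from datetime import datetime, timedelta

def tagged_over_a_week(tag_date_str, today=None, fmt="%Y-%m-%d"):
    """True if the image's tag date (as stored by the template) is more
    than one week old, so the bot should list it for admin review.
    The "%Y-%m-%d" format is an assumption about the template."""
    today = today or datetime.utcnow()
    return today - datetime.strptime(tag_date_str, fmt) > timedelta(days=7)
```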
Also, Category:Pages needing an infobox conversion and Category:Needs album infobox conversion contain a template that should be switched to another template. Would a bot be feasible here? ( Radiant) 13:33, 29 November 2006 (UTC)
There are quite a few lists of new articles related to specific subjects used by various WikiProjects. They all work 'manually' - with dedicated users adding articles they find to the lists - but this could all easily be automated (botized). We need a bot that would:
1) look at a specified forum (i.e. Portal:Poland/New article announcements);
2) look at a specified section to find the last reported article (these differ, as the various WikiProjects have no unified structure, so some may have 'November', others 'November 1-15', and so on);
3) look at the 'what links here' of given article(s) - for example, Poland, Polish, Polish language for Portal:Poland - see what new articles have been added to the 'what links here', and generate a report in the above section in the format *[[article name]] created by [[User:Username]] on date.
This bot would save much time now spent by dozens of dedicated editors who scour the 'what links here' lists instead of doing more constructive work. Issues to consider: the articles in question (countries) have many pages of 'what links here'; to speed up the process the bot may want to look from the end to find the most recent article added, thus skipping 99% of the links - but this may skip checking redirects. I don't know how long it will take to analyze the entire page to find and analyze redirects, but once they are found they can be added to the main 'check' list and the bot wouldn't have to look through the main article for them again, so it would be useful to have two options: a normal scan (from the end to the last reported article) and a complete scan (from the end to the last reported article, and then to the beginning, generating only a list of redirects). Additional features which I doubt would be included (wishlist): add the length of the article, tags, lead; scan for new pictures, categories and stubs.-- Piotr Konieczny aka Prokonsul Piotrus | talk 18:03, 2 December 2006 (UTC)
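The report formatting requested above is mechanical; a sketch of the entry formatter, with the backlink scan indicated in comments (pywikibot method names are an assumption, not verified):

```python
def report_line(article, creator, date):
    """Format one new-article announcement in the requested style."""
    return "* [[%s]] created by [[User:%s]] on %s" % (article, creator, date)

# With pywikibot, new backlinks of e.g. [[Poland]] could be found with
# something like (names indicative only):
# target = pywikibot.Page(site, 'Poland')
# for page in target.backlinks(follow_redirects=True):
#     if page.oldest_revision.timestamp > last_run:
#         new_lines.append(report_line(page.title(), creator, date))
```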
Per here and on each affected template's talk page (thread name "Template name"), please dispatch a bot to rename all transclusions/links of:
Thanks! David Kernow (talk) 10:22, 3 December 2006 (UTC)
Templates {{ wc}}, {{ ec}} and {{ ec2}} need substing into what appears to be thousands of articles. Chris cheese whine 20:10, 5 December 2006 (UTC)
Hi, I'm trying to get some country and city Wikipedia pages for use on a Google map travel site. The code is written in C# and works OK for other sites. On Wikipedia I get a 403 error when I read in a page. Do I need to register my site process as a bot, or could I use an existing bot to get the pages?
please
Thanks I'll try those -- Seewhere.net 02:08, 7 December 2006 (UTC)
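A common cause of 403 responses from Wikipedia is a missing or generic User-Agent header; the Wikimedia servers reject such clients. A sketch in Python (the same fix applies in C# via the request's UserAgent property; the bot name and contact address below are placeholders):

```python
import urllib.request

def build_request(url, agent="SeeWhereBot/0.1 (contact: example@example.com)"):
    # Wikipedia returns 403 Forbidden to clients that send no (or a
    # default library) User-Agent, so always send a descriptive one.
    return urllib.request.Request(url, headers={"User-Agent": agent})

def fetch(url):
    """Fetch a page with a descriptive User-Agent set."""
    return urllib.request.urlopen(build_request(url)).read()
```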
"On Wikipedia, and other Wiki-based websites broken external links still present a maintenance problem." linkrot
I am hoping someone takes on the task of writing a WebCite-Bot, i.e. a bot which automatically feeds all cited URLs in Wikipedia to WebCite, which is a web archive similar to the Internet Archive, but with enhanced features such as on-demand archiving, and with a focus on scholarly material (as opposed to IA's shotgun-crawling-"archive-all-you-can" approach). WebCite creates a snapshot (cached copy) of the cited URL, thereby archiving/preserving the cited webpage or web document (if caching was successful, the link to the snapshot should be added to the original URL on Wikipedia). WebCite takes care of robot exclusion standards, no-cache tags etc. on the cited page. If caching was successful, WebCite hands back a WebCite ID, which should be added to the original link (for an example of how this could look, see the examples below and the reference list at the bottom of the article http://www.jmir.org/2006/4/e27/ - all cited non-journal webpages also have a link to the WebCite snapshot). The benefit would be that all cited material will be automatically preserved, and link rot, 404s or changes in the cited webpages will no longer be a problem. WebCite has an XML-based webservice which allows a client to submit the caching request and to receive the WebCite ID if the caching was successful; see the technical guide http://www.webcitation.org/doc/WebCiteBestPracticesGuide.pdf User:Eysen 19:45, 04 December 2006 (UTC)
--snip--
It is proposed to develop a bot which - using the WebCite webservice - changes a reference (or even "naked" URLs) as follows:
Replace a reference like:
with a reference like this
or this (in addition to the WebCite URL, the original URL might be given):
Alternatively, the cited URL can also be retained as part of the link to webcitation, to keep the cited URL explicit and to allow easy reverting to the original URL should this be desired:
--snap--
Hello again, I was wondering if one of you lovely people could help us out again - we've set up an Assessment department and would like {{Architecture|class=stub}} added to all of the stubs currently listed at Wikipedia:WikiProject Architecture/Stub categories, starting with those articles with {{architecture-stub}} and {{architect-stub}} tags. Cheers. -- Mcginnly | Natter 12:06, 7 December 2006 (UTC)
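The core of banner-tagging requests like this one is a small, idempotent text transformation; a sketch (the duplicate check here is a crude substring heuristic, and a real bot would fetch and save talk pages via pywikibot or AWB):

```python
def tag_talk_page(talk_text, banner="{{Architecture|class=stub}}"):
    """Prepend the project banner unless one is already present.
    The substring check is a crude heuristic for any {{Architecture...}}
    banner variant already on the page."""
    if "{{architecture" in talk_text.lower():
        return talk_text
    return banner + "\n" + talk_text
```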
(This isn't a bot request per se but an "is this worth doing" post - I can write the bot myself if people think its worth doing.)
I notice that one of the items permanently on the maintenance list is the Wikipedia:Cleanup list and that the number of articles needing cleanup seems to be increasing rather than decreasing. I had an idea for a bot that might help with this problem. I could write this bot myself (already have one bot on trial) but wanted to get people's ideas for whether it was worthwhile, comments etc etc, before I started on it.
Brief scope as I see it now (amenable to change): Bot would be manually run, or automatically run on, say, a weekly basis. Bot would trawl Category:All_pages_needing_cleanup and find any new additions since its last trawl. It would visit each new addition and pull a list of contributors. It would then leave a message on the talk page of (every contributor) or (last 10 contributors) or (article starter) or (contributors with 10+ edits) or (whatever) notifying them that the article is in need of cleanup and listing tips for how they could help to achieve this etc.
What do you think? Worthwhile? Ideas? Comments? - PocklingtonDan 17:44, 7 December 2006 (UTC)
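Whichever notification policy is chosen, the recipient selection is a small pure function that can be tested offline. A sketch of the "last 10 distinct contributors" variant (the IP and bot checks below are crude heuristics, labeled as such):

```python
def pick_recipients(contributors, limit=10):
    """From a newest-first contributor list, keep the first `limit`
    distinct names, skipping likely bots and likely IPs.
    (name ends with 'bot' / name starts with a digit are crude
    heuristics, not real account-type checks.)"""
    seen, picked = set(), []
    for name in contributors:
        if name in seen or name.lower().endswith("bot") or name[0].isdigit():
            continue
        seen.add(name)
        picked.append(name)
        if len(picked) == limit:
            break
    return picked
```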
Hi there!
I was wondering if it would be possible to run a spacing bot, either automated or manual, that puts spaces or newlines between wiki syntax and removes spaces or newlines if they weren't needed? Obviously it would have to be pretty simple so as not to get spacing wrong. I could probably knock something up in Yabasic pretty quickly, using an external program like Wget to handle getting and sending data from the page.
If anyone's interested I'll give an account of different wiki syntax that I believe spacing makes easier to understand (like putting spaces after bullet points before the text).
Cheers,
Yuser31415 @ ? # & help! 02:44, 7 December 2006 (UTC)
"Avoid making insignificant minor edits such as only adding or removing some white space" - and although that's not official policy, it's generally accepted. Would it change the way pages are displayed, or only their wikicode? — Mets501 ( talk) 11:56, 7 December 2006 (UTC)
Aloha. WikiProject Hawaii assessment is just getting started and we really need help. To start, I need a bot to replace the current WikiProject tag with the new tag {{WikiProject Hawaii |class=NA |cat=yes}} on every category talk page contained within Category:WikiProject Hawaii articles (please exclude the subcats). Thank you for your assistance! — Viriditas | Talk 21:51, 7 December 2006 (UTC)
Great job. Now, on to stub assessment. I would like to add {{WikiProject Hawaii|class=Stub}} to all talk pages in Category:Hawaii stubs (including subcats). Please add class=Stub to unassessed or untagged articles only, skipping articles where class is already flagged. Thank you again. — Viriditas | Talk 01:32, 9 December 2006 (UTC)
User:WatchlistBot can help you with this. I'm a bit behind right now, but I'll add it to my to-do list and contact you when I can get to it, if you can't find anyone to do it sooner. Ingrid 01:48, 9 December 2006 (UTC)
Per discussions here, I need these 152 users informed of a change made recently to the common.css file so that they can restore their ability to view Persondata. The bot should leave the following notice on those users' talk pages:
"Per recent discussions, the way in which Persondata is viewed by Wikipedia editors has changed. In order to continue viewing Persondata in Wikipedia articles, please edit your user CSS file to display table.persondata rather than table.metadata. More specific instructions can be found on the Persondata page."
If there are any questions about this request, please ask me on my talk page rather than here. Thanks.
Kaldari 08:33, 24 December 2006 (UTC)
Simply, this bot would check for {{ lowercase}} tags on an article, then rename an article (let's say, "Test article") to "Thisisanarticleusedbylowercasebot". Then, the bot would rename "Thisisanarticleusedbylowercasebot" to "test article". It should be easy to create, and would only need to be run once. -Slash- 20:27, 10 December 2006 (UTC)
This is a request for a bot that counts the number of items at Wikipedia:Featured articles and puts that number into a template. Once the bot has proved reliable, it is envisaged that the bot will be flagged to edit the protected page Template:FA number, which is used as a counter within the Main Page FA box.
A bit of background, in recent discussion at Talk:Main Page consensus was reached for an FA counter. Once it was implemented as template requiring manual updating, several FA regulars expressed their unhappiness with being asked to keep track of another page when pulling together the results of WP:FAC and WP:FARC. On the Talk:Main Page discussion, FA regulars suggested a bot solution, which seems to address everyone's concerns. In case anyone remembers the recent attempt to get approval for a bot to edit a protected page, Raul654 has stated that he would be willing to set its flag once it proves reliable.
It occurs to me (as opposed to me relaying the results of the discussion to date) that some sort of vandal-spoofing feature would be useful. Wikipedia:Featured articles is already under semiprotection, but a particularly determined vandal might add or remove items to make the Main Page number jump. One idea is the use of a user whitelist (admins and selected non-admin FA regulars), in which the bot waits 15 minutes or so if anyone not on the whitelist adds or removes articles, to give time for vandalism to be reverted. (Yes, I'm paranoid.) Hopefully this description has made sense. Thanks! - Banyan Tree 13:43, 9 December 2006 (UTC)
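The counting step itself is straightforward; a sketch that counts wikilinks in the wikitext. Note this deliberately oversimplifies: it assumes the bot first restricts itself to the list sections of Wikipedia:Featured articles, since counting every link on the full page (headers, see-also links) would overcount:

```python
import re

def count_featured(wikitext):
    """Count entries in a list section, assuming each featured article
    appears exactly once as a wikilink like [[Name]] or [[Name|label]].
    The caller must pass only the list sections, not the whole page."""
    return len(re.findall(r"\[\[[^\]|]+(?:\|[^\]]+)?\]\]", wikitext))

# The bot would then save str(count) to Template:FA number
# (with pywikibot: page.text = str(count); page.save(...)).
```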
Out of curiosity, is there a GA counter bot? b_cubed 21:00, 11 December 2006 (UTC)
Any chance that I can get a bot to go through Category:Unassessed biography articles on the Biography WikiProject, and have it label any that are stubs as a stub in WikiProject Biography's rating? I know there's a bot that's doing something similar, but that one is more focused on finding unassessed articles than on assessing them. -- Wizardman 05:56, 10 December 2006 (UTC)
Anyone? If there's a way to do it on AWB, how would I go about doing that then? -- Wizardman 00:52, 12 December 2006 (UTC)
Is there any way that a bot capable of unwikilinking dates (specifically years) could be made? I read a lot of articles that contain such wikilinked dates, e.g. 1942 or 1784, which really add nothing to the article. I am aware that, as it stands now, the Wikipedia policy on dates is to have all dates wikilinked. However, in practice, there has been a growing trend with FA articles to unlink the dates. Personally I think a bot capable of this would be very useful. I'm not sure how to do it, otherwise I'd try myself. The only concern is that you'd have to make sure it doesn't unlink the "fuller" dates, e.g. November 20 1983. (If you can respond on my talk page, it would be helpful.) b_cubed 19:52, 8 December 2006 (UTC)
::I think I saw some discussion of this on another page recently. Objections were raised that:
::*It would lead to overlinking - ie wikipedia doesn't want every year linked.
::*Four-digit numbers could be used as a number or a year. ie "In the year 2000, X did Y" or "X led 2000 troops into battle"
::*Numbers can refer to something more specific ie "2000 AD" or whatever that Judge Dredd comic is.
::*Numbers could be years, but as part of fuller strings, ie "Battle of Suessonia (1976)" should link to the whole battle article, not just the year.
::*It wouldn't be possible to have a sufficiently clever bot to get round these caveats, so it would probably have to be manually-assisted and would thus represent a massive amount of work.
::Note the above are my recollections of what I read of a discussion of this same idea elsewhere
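Given the caveats above, a fully automatic bot is indeed risky, but the mechanical part of a manually-assisted tool could look like this sketch. It shields full dates ([[November 20]] [[1983]]) before unlinking bare year links; it deliberately does nothing about the "2000 troops" / "2000 AD" ambiguities, which would still need human review:

```python
import re

def unlink_years(text):
    # Shield full dates such as [[November 20]] [[1983]] so their year
    # link survives; \x00/\x01 are temporary placeholder brackets.
    shield = re.compile(r"(\[\[[A-Z][a-z]+ \d{1,2}\]\],? )\[\[(\d{3,4})\]\]")
    text = shield.sub("\\1\x00\\2\x01", text)
    # Unlink the remaining bare year links, e.g. [[1942]] -> 1942
    text = re.sub(r"\[\[(\d{3,4})\]\]", r"\1", text)
    # Restore the shielded year links
    return text.replace("\x00", "[[").replace("\x01", "]]")
```

A human operator would still need to approve each edit, since the regex cannot tell a year from a quantity.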
[Copied from above]: (removing indent) I applaud your wish to contribute your programming knowledge; it's just that I personally think re-arranging whitespace is a poor use of your bot's time and the server's resources. If there isn't already such a thing, a bot that flags near-empty articles as stubs would seem a much better idea. If it was able to flag the stubs as project stubs based on the category they are in or the article name (this would probably need to be manually assisted), then all the better. Why not start a new section on this page to discuss this and move these comments there? - PocklingtonDan 19:11, 8 December 2006 (UTC)
I recently created a new category, Cars of England. There are quite a few cars from England, and it's quite tedious to add them all. I do, however, have a list of categories that list cars from manufacturers in England. Is there a way to have a bot add the category to all the articles found in these sub-categories? Thanks, Riguy 07:13, 10 December 2006 (UTC)
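The per-article edit here is a small idempotent text change; a sketch (a real bot would walk the listed manufacturer sub-categories with pywikibot's category tools and save each article, which is omitted here):

```python
def add_category(text, category):
    """Append a category link unless the article already carries it
    (case-insensitive check, since category links are case-tolerant)."""
    tag = "[[Category:%s]]" % category
    if tag.lower() in text.lower():
        return text
    return text.rstrip() + "\n" + tag + "\n"
```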
Does anyone want to take over tagging orphan articles with {{ linkless}}? The actual tagging is simple with AWB: just load Special:Lonelypages (it refreshes Saturdays/Wednesdays as of mid-November) and run the bot; it will make about 800 edits each refresh. I can supply you with the regex I used to ignore an array of pages that should not be tagged (dab pages, pages to be transwikied, various other odd stuff). You would also want to maintain a list of orphaned articles; you can see what I mean at User:W.marsh/orphans articles/A-C.
The one hitch is that you will also want to de-orphan the ignored articles somehow or other, you could create a list of orphaned dab pages and so on from your userspace, or manually add them to Wikipedia:Links to disambiguation pages (which is what I did, hence the burnout after 6 months most likely). It takes maybe 2-3 hours a week (almost all on the manual stuff), maybe 5-10 minutes plus bot time if you just do the tagging and output the skipped articles to lists in your userspace.
The more automation you can add (e.g. automatically updated lists) the faster this would be; unfortunately I could never add much except automated tagging with AWB. It's not glamorous work (for every 3,000 or so edits my bot made, I got about one reply on my talk page) but I think it's helpful work; I did notice a whole lot of articles getting de-orphaned within a few days after the tag was added. -- W.marsh 15:37, 13 December 2006 (UTC)
linkless|geodis|copyvio|{{Disambig|this AfD|This page has been deleted|#redirect|dated prod|4LA|{{dab}}|{{hndis|{{disamb|numberdis|4CC|3CC|2CC|Schooldis|Shipindex|Tempdab|Wikipedia does not currently have an encyclopedia article for|{{wi}}|{{surname|4CC|TLAdisambig|{{deleted|{{move to|{{dicdef
We have a lot of articles about individual US towns, many of which refer to the "2000 census" in their introductions. As we have an article for the United States Census, 2000, maybe somebody could script a bot to wikify those references. Cribcage 07:43, 14 December 2006 (UTC)
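A sketch of the wikification pass: link the first plain mention of "2000 census" and skip articles that already carry the link, so repeated runs are safe:

```python
import re

def link_census(text):
    """Link the first unlinked mention of "2000 census" to the
    United States Census, 2000 article."""
    # Skip pages that already link the census article
    if "[[United States Census, 2000" in text:
        return text
    # Link only the first plain mention; (?<!\[) avoids text already
    # inside a wikilink target
    return re.sub(r"(?<!\[)\b2000 census\b",
                  "[[United States Census, 2000|2000 census]]",
                  text, count=1)
```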
Hello, together with others I've upgraded Wikipedia:Translation into English, which is now here: Wikipedia:Translation.
One thing we decided is to migrate from one big static page with all the available translators (which is heavy, hard to maintain, never up-to-date) to two userbox templates ({{ Translator}} and {{ Proofreader}}) so 1) all the work is done automatically by the categories 2) each translator has a link to the translation page on his user page.
To migrate from the old system to the new one, since there are a lot of people involved, I need a bot which would go through every user listed on Wikipedia:Translators available and leave the following message on their talk page:
{{subst:Translation/Talkpage}}
Jmfayard 18:22, 14 December 2006 (UTC)
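The migration above needs the usernames extracted from the old static page; a sketch of that step (the message delivery via pywikibot is indicated in a comment, with method names as an assumption):

```python
import re

def listed_users(wikitext):
    """Pull distinct usernames out of [[User:Name]] / [[User:Name|...]]
    links on the old translators page, preserving order."""
    names, seen = [], set()
    for m in re.finditer(r"\[\[[Uu]ser:([^\]|/]+)", wikitext):
        name = m.group(1).strip()
        if name not in seen:
            seen.add(name)
            names.append(name)
    return names

# Each user would then get {{subst:Translation/Talkpage}} appended to
# [[User talk:<name>]] (with pywikibot, roughly:
# talk.text += "\n{{subst:Translation/Talkpage}}"; talk.save(...)).
```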
Can someone make a bot which places new user templates on someone's talk pages? If you can, I would seriously appreciate it! Bushc a rrot ( Talk· Desk) 23:20, 15 December 2006 (UTC)
Is it possible to make a bot that automatically removes blocked users from WP:AIV? Often, admins forget to remove users that they block, or spend time blocking the user and putting the appropriate message(s) on the blocked user's page. Or, sometimes when there are a large number of notices on WP:AIV, they will go through blocking users first, waiting until a bit later to remove the reports. So, to help with efficiency (by cutting down on conflicts) a bot might be useful. -- tariqabjotu 00:37, 15 December 2006 (UTC)
As part of some work I am carrying out on Scottish Historic Railways, I have created a progress and reference page in my user space at User:Pencefn/Historical Scottish Railways. I would like to add the latest revision date/time into the table for the article and associated talk page on the second and fourth column respectively - which is updated to reflect the work in progress (covering updates to articles and the potential addition of more articles). Can anyone help me? Stewart 19:16, 21 December 2006 (UTC)
Whoever puts any comments on the REF desk without signing should be given a welcome template, so they will sign next time (only if they don't have the welcome template already).-- Judged 23:10, 22 December 2006 (UTC)
Happy Holidays and Happy New Year to everyone. I'm curious if anyone knows of any bots working the neutrality template categories. I would like to know what percentage of articles have neutrality-related tags by WikiProject and have a report generated, with a template updated on the project page (Pearle produced a similar report listing articles needing cleanup). After the report is generated, the template on the project page could be updated with a percentage linking to the category of WikiProject-related neutrality issues. Something like, "12% of articles require attention for neutrality-related issues." WikiProject departments would deal with this. The bot would only need to be run once a week. Thanks. — Viriditas | Talk 03:33, 25 December 2006 (UTC)
I was wondering if someone with a bot would be able to place banners for Wikipedia:WikiProject Massively multiplayer online games on all talk pages (including non-existent ones) in the category Category:Massively multiplayer online games, including its sub-categories. This need not be a recurring event, but it is necessary to get this WikiProject up and running. Any help is appreciated! Greeves 04:31, 19 December 2006 (UTC)
A bot would be needed to carry this task following a modification to {{ Permprot}}:
Circeus 21:14, 23 December 2006 (UTC)
Given that my proposal for an additional step to the AfD process (found here) is meeting both opposition and the suggestion that the job could be better done by a bot, I've brought that proposal here. The suggestion is a reasonably simple one:
This would avoid the bureaucracy that is the major criticism of my original proposal, and (hopefully) significantly reduce the problems of biting that I raised there. Thanks! Daveydw ee b ( chat/ review!) 01:11, 27 November 2006 (UTC)
I'm using the Weblinkchecker.py bot and have a whole load of bad links. Is there a way to have a bot remove them from the articles? (I realize that this could be hard, since we have refs and [] links.) One output looks like:
ST47 Talk 22:13, 5 December 2006 (UTC)
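Given a dead URL from the Weblinkchecker.py output, the mechanical removal could look like this sketch. It is deliberately naive: bracketed external links are collapsed to their label, bare URLs deleted, and anything inside <ref> structure is left for human review rather than restructured:

```python
import re

def strip_dead_link(text, url):
    """Replace [url label] with just the label (or nothing if there is
    no label), then drop any remaining bare occurrences of the URL.
    Surrounding <ref> tags are intentionally left untouched."""
    text = re.sub(r"\[%s(?:\s+([^\]]+))?\]" % re.escape(url),
                  lambda m: m.group(1) or "", text)
    return text.replace(url, "")
```

Because removing references can destroy sourcing, this would be better run manually assisted, showing each diff before saving.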
I'm trying to find out how many articles and categories ultimately descend from Category:Dungeons & Dragons and how many of these are stubs (both by categorization and by byte/word count). Lists would be good if possible. For comparison's sake, I'm also seeking similar numbers for Category:Chess. This is for the following purposes:
Neon Merlin 23:56, 24 December 2006 (UTC)