This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Removing redlinks is a tedious part of keeping WP:DEAD current. Could this be done easily by a bot? Salad Days 23:20, 24 January 2007 (UTC)
I'm unable to create a bot; my programming skills are not in this area. I would like help in developing my RVBOT as an automated clone of AntiVandalBot to help the Wikipedia anti-vandal fleet. Retiono Virginian 16:38, 27 January 2007 (UTC)
Can someone create a bot called Obiwanbot for me? - Patricknoddy 10:49am, January 28, 2007
This has been moved to User:Betacommand/Bot Tasks 18:07, 29 January 2007 (UTC)
Can someone make a bot named Realbot? The bot would put {{SharedIP}} and {{SchoolIP}} on the IP's page, as well as note where the IP is editing from, in case of vandalism. Some administrators forget to check where the IP is editing from before blocking it. This is to lessen the amount of collateral damage. Whoever makes this bot will receive a barnstar and/or special template! Thanks. Real96 03:29, 29 January 2007 (UTC)
(reduce indent) Yes, the bot can put an education or shared IP template on the IP's page without revealing the IP's location. If someone would manipulate and copy AntiVandalBot's code to create Realbot, maybe the bot could exist! Real96 03:41, 30 January 2007 (UTC)
Can AIV helperbot add block templates to indef. blocked users or temporary blocked users if administrators fail to do so? Real96 05:39, 29 January 2007 (UTC)
I have a fast computer and internet connection, and I am on it most of the time. I have created this account, and would like to be able to use the source for an existing bot as I have no programming experience outside Flowcharts. Any ideas on a bot I could run? I already run AutoWikiBrowser and have been doing category renaming and deletion work recently. Maybe an automatic bot that did this would be a good start? Thanks everyone. Robertvan1Bot 23:17, 29 January 2007 (UTC)
How's this: a bot which, if you save a link to a disambiguation page, asks you to adjust the link to make it a bit more specific.
What would be ubercool is if the bot would list the links provided by the disambiguation page, and just allow you to click to select which one you meant.
Paul Murray 03:08, 30 January 2007 (UTC)
someone please do that so we can have more featured articles —The preceding unsigned comment was added by PWG99 ( talk • contribs) 21:07, 30 January 2007 (UTC).
Does anyone know of a bot that will crawl a category and create a list based on the "What links here" count? I'm looking for something to guide priority assessment in a Wikiproject (not that it would be the basis for priority - just a helpful tool). Morphh (talk) 02:24, 31 January 2007 (UTC)
Done -- SatyrTN ( talk | contribs) 21:25, 31 January 2007 (UTC)
A bot that does the trivial aspects of closing an AFD. A real person would close the AFD by placing a tag for the bot (perhaps {{close|keep}}), the bot would be watching the afd pages and see the tag being added. In the case of keep it would close the voting page, remove the afd template from the article and add the {{ oldafdfull}} tag with all the appropriate information to the talk page. The tag itself would probably just have some trivial message because it would only be on the page for a few seconds, something like, "A bot will come take care of this soon". The bot would be worthwhile if it only handled variations of keep (including "no consensus", "speedy keep" and "nomination withdrawn"), but could also be made to do deletes as well. For either just the delete or any use it could use a whitelist of users allowed to do the action, I'm not sure what the current policy is on who can close. Vicarious 05:01, 31 January 2007 (UTC)
This has been moved to User:Betacommand/Bot Tasks 15:54, 31 January 2007 (UTC)
I just had an idea for a bot that goes through articles and fixes section headers.
Example: Changing
==Section title==
to
== Section Title ==
and so on. Like changing "See Also" to "See also" and correctly sequencing "references", "see also", and "external links" at the end of pages.
A name: SecBot
Just an idea. ~ Flameviper (Who's a Peach?) 17:30, 31 January 2007 (UTC)
Can a bot please add {{ WikiProject Abkhazia}} to all articles in Category:Abkhazia? - Patricknoddy 4:57pm, January 31, 2007
I had an idea for a Copyvio bot (let's just call it, .V.iobot). It would search through articles and take segments from those articles and check them in Google, sans other Wiki links. If it finds matches to those segments, it adds the name of the article checked to the bot's userpage (or a subpage), entitled "Possible Copyvio Articles." Then I (or another editor) could go to that page and double-check they really are copyvios and then put them up for deletion.
Has this been done before, or is it currently being done? If not, is it a good idea? It sounds like it would be helpful. I'd really love to script it if the job is open. .V. Talk| Email 22:08, 31 January 2007 (UTC)
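A minimal sketch of the segment-checking idea described above, in Python. Since the actual search-engine API is an open question here, the search backend is injected as a callable; `extract_segments`, `check_article`, and the word-count threshold are illustrative assumptions, not an existing bot's code:

```python
import re

def extract_segments(text, min_words=12, max_segments=5):
    """Split article prose into sentence-sized segments long enough
    to be distinctive exact-phrase search queries."""
    sentences = re.split(r'(?<=[.!?])\s+', text)
    segments = [s for s in sentences if len(s.split()) >= min_words]
    return segments[:max_segments]

def check_article(title, text, search_hits):
    """Return True if any distinctive segment of the article appears
    verbatim elsewhere on the web. `search_hits` is an injected
    callable (e.g. wrapping a search-engine API) that returns the
    number of non-Wikipedia matches for an exact-phrase query."""
    for segment in extract_segments(text):
        if search_hits(segment) > 0:
            return True
    return False
```

Articles flagged this way would still need the human double-check the request describes, since mirrors and quotations produce false positives.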
.{{fact}} -> {{fact}}. and . {{fact}} -> .{{fact}} And I mean an actual running bot, not just AWB. AWB is a very sloooww thing. {Slash -|- Talk} 06:34, 1 February 2007 (UTC)
How about a bot that scans through All pages (Talk namespace), and checks each page to see if it isn't an orphaned talk page. If it is, tag it with {{ db-talk}}. Or does this exist in some form already? – Tivedshambo (talk) 20:19, 1 February 2007 (UTC)
A bot adding templates to new pages which fall under individual Wikiprojects. Real96 13:14, 2 February 2007 (UTC)
It would be great if a bot that blanks/cleans personal user-sandboxes were to be created...
-- TomasBat ( Talk)( Sign) 19:02, 2 February 2007 (UTC)
Could there be a bot that goes down the new users page and posts {{subst:Welcome123}} or some other welcome template? There are a lot of new users every minute so maybe the bot could relieve people on the Wikipedia:Welcoming Committee of greeting new users. Orin LincolnÆ 04:21, 3 February 2007 (UTC)
Can a bot please put {{ WikiProject Afghanistan}} on the talk pages of articles in Category:Afghanistan? - Patricknoddy ( talk · contribs) 5:37pm, February 1, 2007
Please be more careful while listing categories for tagging. Category:Balochistan, which is part of Geography of Afghanistan, is also part of Geography of Iran as well as Pakistan. All those articles have got tagged as well. The categories need to be chosen much more carefully — Lost (talk) 13:44, 4 February 2007 (UTC)
Hi. How about a bot that removes images that do not exist from articles? I have encountered many broken images in articles, and they are rarely removed. My request is for a bot that removes images and descriptions that link to nonexistent images and have been in an article for more than 24h. This way we can edit without having to remove nonexistent images ourselves. Thanks. Suggestions? AstroHurricane001 (Talk+Contribs+Ubx) 15:14, 3 February 2007 (UTC)
Hi. How about a bot that removes double redirects after a move? I have encountered many double redirects after an article containing redirects was moved, and the redirects became double. I have often had to fix them myself, because the people who moved the article forgot to follow instructions and did not fix the double redirects. I request a bot that automatically recognises double redirects after a move and fixes them, so we can edit articles without having to fix the double redirects ourselves. Thanks. Suggestions? AstroHurricane001 (Talk+Contribs+Ubx) 15:19, 3 February 2007 (UTC)
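For what it's worth, the retargeting logic itself is simple once a bot has the redirect table; here is a hedged Python sketch (the `fix_double_redirects` name and the plain-dict representation of redirects are illustrative assumptions, not how any existing bot stores them):

```python
def fix_double_redirects(redirects):
    """Given a mapping {redirect_page: target}, retarget any redirect
    whose target is itself a redirect, following chains to the final
    destination, with loop protection so A->B->A cannot recurse forever."""
    fixed = {}
    for page, target in redirects.items():
        seen = {page}
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        fixed[page] = target
    return fixed
```

A real bot would fetch the redirect table after each move and write the retargeted pages back; the hard part is the wiki plumbing, not the algorithm.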
Since we're on a roll with WikiProject-related requests, would it be possible for a bot to put {{ Visual arts}} on all the pages within Category:Visual Arts? Appreciate the help. Planetneutral 05:45, 4 February 2007 (UTC)
Could you use your magic to tag all articles related to Spain for the {{WikiProject Spain}}? ¡Muchas gracias! Espana Viva 22:18, 1 February 2007 (UTC)
ST47 - well, I'm puzzled why Aztec Triple Alliance and Florentine Codex wound up on the list to be tagged, since I went through their categories and parent categories, and they don't seem to have an obvious tie to any of the categories we listed. Let's give this some thought as to how to avoid problems the next time! Perhaps run it so that articles in sub-categories are not automatically tagged? Also, maybe run it in batches of smaller groups so that issues can be managed as they arise on a smaller scale? Espana Viva 16:32, 4 February 2007 (UTC)
After my previous AIV Warning Bot went nowhere, I got an idea while reading an article; then I found out it was a previous request that was not kept alive. I was thinking how convenient it would be for a bot to turn the large number of web citations in Wikipedia articles into the more formal reference tags such as {{cite web}}. As User:John_Broughton stated in the previous request, it would go to Special:Linksearch and get a list of all external links for a specific domain (e.g. www.msn.com). It should understand that when a URL appears twice in a page, there should not be two identical footnotes. So, the first reference/footnote gets a name="X" argument. It probably should avoid pages that have both a "Notes" and a "References" section. It probably shouldn't footnote a URL if the URL is in the "External links" section. And, obviously, it should put as much information as possible into the "cite" ({{cite web}}) template. If you are willing to create an excellent and much-needed bot, which I would be glad to run and operate, it would be much appreciated. Thank you. -- Extranet ( Talk | Contribs) 07:08, 3 February 2007 (UTC)
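The footnote-deduplication step described above (the first occurrence of a URL gets a name, repeats become self-closing named refs) could be sketched roughly like this in Python. The `ref1`, `ref2` naming scheme and the bare-URL `<ref>` pattern are simplifying assumptions for illustration; real citations would carry {{cite web}} templates rather than naked URLs:

```python
import re

def dedupe_url_refs(wikitext):
    """Replace repeated identical <ref>http://...</ref> footnotes with
    a named reference: the first occurrence gets name="refN"; later
    identical ones become self-closing <ref name="refN" />."""
    seen = {}        # url -> assigned name
    counter = [0]

    def repl(match):
        url = match.group(1).strip()
        if url in seen:
            return '<ref name="%s" />' % seen[url]
        counter[0] += 1
        name = "ref%d" % counter[0]
        seen[url] = name
        return '<ref name="%s">%s</ref>' % (name, url)

    # [^<\s]+ stops at the closing tag so one match can't swallow two refs
    return re.sub(r'<ref>\s*(https?://[^<\s]+)\s*</ref>', repl, wikitext)
```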
It's possible that there are some more standard (meta) fields than these (I didn't look through the entire source), or that one would have to write a bit of code for each major news source (PubMed is already standardized, and a converter already exists), but even if this bot only covered (say) the top 25 news sources, it could still make a major difference, if only to move editors from seeing articles with only embedded links to seeing what good footnotes look like. -- John Broughton (☎☎) 07:56, 3 February 2007 (UTC)
Giving this some more thought, I think that in addition to formatting the citations, it is also necessary to check the articles to make sure they actually support the cited statement. I think this is a very important job for the good of the wiki, and it almost certainly requires human input. Would anyone be interested in starting a fact-checking wikiproject to go around, check citations, and put references in the proper format? I would love to start such a thing if others are interested. -- Selket 20:47, 3 February 2007 (UTC)
Some people have used the {{PDFlink}} template inside the citation template as the format parameter. With the revised PDFlink usage, the first parameter is required. I would like somebody to run a bot which will convert {{PDFlink}} and {{PDF}} to PDF. -- Dispenser 23:10, 3 February 2007 (UTC)
I'm sure there must be a bot that could do this already, but I'm not sure which one. I'd like a bot to run through the sub-categories at Category:Proposed deletion. It would look for articles that had been prodded, had the tag removed and then replaced. If it finds any, they could be dumped into a file for a human to go through. I realise that the bot could remove the tag and notify the editor that had replaced it by mistake, but I don't think that would be a good idea. First of all, to have the bot re-remove the tag would give the appearance of the bot edit warring. Second, I've noticed that editors get upset when the prod tag is removed a second time, especially when it is for an article that has been justifiably prodded. I went through one of the sub-categories and removed 6 prod tags that had been restored improperly, so I do see the need for it. CambridgeBayWeather (Talk) 12:30, 4 February 2007 (UTC)
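The detection itself reduces to a small state machine over the article's revision history. A rough sketch, assuming the bot has already determined per revision whether a prod tag was present (the function name and list-of-booleans input are illustrative assumptions):

```python
def prod_was_restored(history):
    """`history` is an ordered list of booleans, oldest revision first,
    saying whether each revision carried a {{prod}} tag. Returns True
    if the tag was added, later removed, and then re-added -- the
    pattern this request wants dumped into a file for human review."""
    state = "never"
    for has_prod in history:
        if state == "never" and has_prod:
            state = "tagged"
        elif state == "tagged" and not has_prod:
            state = "removed"
        elif state == "removed" and has_prod:
            return True
    return False
```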
Could a bot please put {{ WikiProject Åland Islands}} on all the talk pages of the articles in Category:Åland? - Patricknoddy ( talk · contribs) 9:11am, February 4, 2007
May I request that you hold off on this project and other similar projects by User:Patricknoddy, pending the results of the discussion here. I have a concern that the creation of these large WikiProjects which may not have any active participants may create difficulties. Spamreporter1 15:32, 4 February 2007 (UTC)
Can a bot put {{ WikiProject Algeria}} on the talk pages of all the articles in Category:Algeria? - Patricknoddy ( talk · contribs) 9:36am, February 4, 2007
May I request that you hold off on this project and other similar projects by User:Patricknoddy, pending the results of the discussion here. I have a concern that the creation of these large WikiProjects which may not have any active participants may create difficulties. Spamreporter1 15:34, 4 February 2007 (UTC)
At present, when an article comes up for AfD, the Jayden54Bot inserts a warning onto the user's talk page. Excellent idea!
Last week I had the Lasagna cell article vanish without warning. I was stunned, since there was no Talkpage discussion. Also there were no watchlist notifications. It disappeared from "my contributions," and all my old watchlist entries vanished too. I know that it hadn't been AfD'd. Most frightening: if I hadn't been recently using its talkpage, I might never have noticed its disappearance. Lacking experience with this, and lacking evidence, I had no idea what to do next. (Unfortunately I didn't notice the deletion resources link on the error page.) I ended up yelling to admin for help. It turned out that the article had been Speedy Deleted by an admin under incorrect SD criteria, without tagging, and who also hadn't seen the talk page.
I found the whole experience very unsettling. It was like seeing "Extraordinary Rendition" in action! (grin.) I had no clue that SD existed, or how to find what had happened. And... if I found it confusing, I assume that lots of others must have the same plight. I seem to have guessed right. Someone attempted an ineffective cure in the form of a SD Patrol started in late 2005, even with comment about others' saving good articles from accidental deletion. (Scary: if some have been saved, how many have been lost?!!)
Here's a much better fix: just make the original author responsible for saving the SD'd article. Use a bot similar to Jayden54Bot to announce on the original authors' talkpage the speedy-deletion event, as well as providing some resources for that author. This way legit articles won't just vanish silently with nobody noticing. Sample:
-- Wjbeaty 07:01, 27 January 2007 (UTC)
However, usually the original creator of an article is not present. How about also notifying everyone who's made 4 edits or more, or if it's speedied on sight, how about 3? It should also check whether a user has made an edit within the last 2 weeks. {Slash -|- Talk} 06:19, 1 February 2007 (UTC)
Wikipedia:WikiProject Victoria Cross Reference Migration moved a heap of content into wikipedia, the old domain is now being squatted and we are providing heaps of links (over 1300) to it. Could someone with access to some sort of automated tool, find and replace (or just get rid of) the links to the external site with a link to the wikiproject instead please? -- Peta 02:01, 30 January 2007 (UTC)
This page has been [[Wikipedia:WikiProject Victoria Cross Reference Migration|migrated]] from the [http://www.victoriacross.net Victoria Cross Reference] '''with permission.'''''
Should be converted to:
This page has been migrated per the [[Wikipedia:WikiProject Victoria Cross Reference Migration|Victoria Cross Reference]] project '''with permission.'''''
And the list of pages to be checked is found here. -- John Broughton (☎☎) 03:20, 2 February 2007 (UTC)
I've removed most of them manually, and will finish the job soon, thanks anyway bot writers. -- Peta 08:33, 6 February 2007 (UTC)
I need a bot/AWBer to go through Special:Whatlinkshere/Stv, and change occurrences in articles of [[stv]] and [[STV|stv]] to [[STV]], in line with WP:MOSTM (the move requires divine intervention, which has been sought). I managed to get a bunch of the direct links out with a change to Template:ITV, but there look to be over 100 affected articles. Chris cheese whine 15:49, 5 February 2007 (UTC)
I'm not sure if this exists already, but due to the new protection feature I believe a bot that automatically removes protection templates from pages whose protections have expired would be a valuable asset. -- tariqabjotu 23:46, 22 January 2007 (UTC)
Could we have a bot to place {{ cycling project}} on the talk page of all articles within Category:Cycling and its sub-categories please? Mk3severo 14:42, 3 February 2007 (UTC)
Are there any bots currently running that can apply WikiProject tags on all the articles in a particular category and in its entire sub-category tree? If there's one available, there are three (very large) requests listed here that we'd love to have some help with! :-) Kirill Lokshin 19:41, 4 February 2007 (UTC)
I need a bot which adds category tags to people/states/etc. to the respective category. Currently, I am trying to clean up Phi Beta Sigma and am going to make a category for it. I have to put category tags on each person's page, which is EXTREMELY tedious. Can a bot please help out for this? Thanks. Real96 20:50, 4 February 2007 (UTC)
The WikiProject Spain has a list of Spain-related categories here. We will be asking you to place the WikiProject Spain template on the talk page of the articles in these categories. We will be asking you to go through these categories a chunk at a time, so as to cut down on any "over-inclusion" issues.
When that first run is complete, and we have dealt with issues (if any), then we will be asking you to move forward to the next chunk of categories. We would also ask that you not go down into the sub-categories of these at this time. Just add the tag to these categories only.
If all that makes sense:
Please let me know if you have any questions! Espana Viva 19:23, 7 February 2007 (UTC)
Excellent, thank you! Espana Viva 20:03, 7 February 2007 (UTC)
Certainly OK with me! Please let me know if you have any questions on my talk page. Espana Viva 21:17, 7 February 2007 (UTC)
Well, just thought of something, though . . . each category is going to have multiple articles in it. So, if your limit is 50-100 edits, you're going to have to do just a handful of categories. Take a look at the number of articles in the first few categories (starting with #1), and then select just the number of categories that will give you the number of edits you are aiming for. (This sounds like quite a bit of work, but I'm doing this in the interest of science!) Espana Viva 21:23, 7 February 2007 (UTC)
What do you think of the idea of a bot for users who are blocked, especially those who are completely blocked (i.e. from editing the talk page), are not unblocked and given the chance to insert an appeal/edit WP:ANI or WP:RFAR, and (for whatever reason) don't have e-mail with which to e-mail a request to a higher authority? Specification idea: This bot would use a CGI form page for blocked users to fill out with their username and information such as a reason for requesting the appeal. The bot would require that this is coming from the same IP address the user last used, by comparing the HTTP client's originating IP with the last-known IP with which the user was logged on at Wikipedia and making sure that IP address itself (and possibly its talk page) is/are blocked from editing. If this test passes, the bot would use a designated username to edit either some page such as WP:ANI, WP:RFAR, or some designated subpage specifically for entries by the bot (to "sandbox" it just in case of abuse of the feature, but the page would have to be watched like the other logistical pages). -- Blooper Glooper 01:45, 8 February 2007 (UTC)
An anti-vandal / spell-check bot that you tell to "crawl" Wikipedia for new changes and misspellings. A feature would be a tab for it on your wiki tab bar. It has a database and is always adding to it; it "learns" from your actions (Bayesian style of learning / database).
Settings: Auto (automatically fixes spellings, checks for vandalism, learns, tells you what it did on your talk page; incorporation of Lupin's anti-vandal tool), Learn (learns from your actions), Off (turns the bot off), Vote (on the bot's page, lets others vote on what to do), Actions (lists all the actions made by the bot), Fix this Page (makes the bot run on the current page and attempt to fix it).
It has an interface similar to Lupin's live spell check, but actually corrects words and fixes broken links and redirects. It can only be controlled by the user.
It also adds citations by looking up the fact needing a cite in Google, Yahoo, MSN, and Wikipedia, and adds a link to any site that appears in 2 out of 5 and up.
The bot is also capable of tagging with "Wikify" and "Clean up" tags if it cannot help via any other means.
While in vandal cleanup, it reports the user or IP to the vandal registry and leaves the do-not-vandalize message on the user's talk page.
The bot tries to watch what you do and learn from your actions. A script for this is also needed.
-- •Tbone55• (Talk) (Contribs) (UBX) (autographbook) 01:31, 7 February 2007 (UTC)
Look in the rules for bot creation: "No spell checkers." The other part seems doable. Redskunk Talk 02:45, 7 February 2007 (UTC)
23:53, 8 February 2007 (UTC)
The first trial run of the Satyrbot went very well. Satyr47's work is much appreciated! While Satyr47 is waiting for the "official" results of this trial run, we'd like to press forward with a full-fledged run of the following:
Please let me know when this might be done . . . much appreciated! Espana Viva 07:31, 8 February 2007 (UTC)
I made a very minor mistake when I tagged about 600 or 700 articles incorrectly with my AWB as part of WikiProject American Open Wheel Racing. I tagged them with {{Template:WikiProject American Open Wheel Racing}} when I should have tagged them {{WikiProject American Open Wheel Racing}}. The template displays correctly. I'd hate to sit there for several hours to correct this little stupid error that isn't causing any harm. Could this bot do it in an unattended fashion at some low usage time? Royalbroil T : C 14:59, 8 February 2007 (UTC)
I think that a robot that could revert blanked pages would be extremely valuable. Moreover, blanking is an immediately noticeable act, so it shouldn't be difficult to program a bot to do this kind of task.-- Orthologist 23:21, 9 February 2007 (UTC)
Forensics was just moved to forensic science. Can someone with AWB access fix up all those redirect links please? -- Selket Talk 00:23, 6 February 2007 (UTC)
I was thinking a new page patrol bot that would use Bayesian filtering and learn from other newpage patrollers (such as me) the general content of vandalism/attack/advert/joke pages.
It would go to new pages, check the length, check it for words like "gay" and "sucks", and if it raised a sufficent number of red flags, it would tag it with something like {{ Botspeedy}}.
And as an added bonus, it could also add tags like {{ cleanup}}, {{ unreferenced}}, and {{ uncategorized}} (which are simple business really).
Now we can't get too headstrong and assume that a bot will be right 100% of the time when tagging pages for deletion. So we would have to make a separate "bot flagged" template like {{tl|Botspeedy}}. It would say something like
This page has been flagged for speedy deletion by BayBot, an automated script which patrols Wikipedia's new pages. If this page has been flagged incorrectly and you are the page creator, it is advisable not to remove this tag yourself. A Wikipedia administrator will come by shortly to review the tagging of this page and decide if it should be deleted or not.
So yeah, I'm a genius.
~ Flameviper 16:49, 9 February 2007 (UTC)
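For illustration, the red-flag scoring this request describes could start far cruder than a real Bayesian classifier. The Python sketch below just counts trigger words and penalises very short pages; the word list, thresholds, and function names are all assumptions, and real Bayesian scoring would weight each token by P(word | vandalism) learned from patrollers' decisions:

```python
def red_flag_score(text, flag_words=("gay", "sucks"), min_length=200):
    """Count crude red flags on a new page: trigger-word occurrences
    plus a penalty for pages shorter than min_length characters."""
    score = 0
    lowered = text.lower()
    for word in flag_words:
        score += lowered.count(word)
    if len(text) < min_length:
        score += 1
    return score

def should_tag(text, threshold=2):
    """Flag with the proposed {{Botspeedy}} only past a threshold,
    leaving the final deletion call to a human administrator."""
    return red_flag_score(text) >= threshold
```

As the post says, the bot must never be trusted outright: the threshold only controls how often an admin gets asked to review a flagged page.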
Table markup conversion from HTML needed on two tables with some 200 rows between them. Chris cheese whine 05:54, 10 February 2007 (UTC)
I've been noticing that it has been necessary to add a lot of {{aero-specs}} and {{aero-table}} tags. It is very tedious. Can someone write a bot for that? I probably could, but I'm pretty busy. Redskunk Talk 04:16, 6 February 2007 (UTC)
I've been working on the Dead External Links Wiki Project. Many of the articles about various sessions of the US Congress have links to "Rules of the House" pages on clerk.house.gov. The URL has changed from clerk.house.gov/legisAct/legisProc/etc to clerk.house.gov/legislative/etc. Is there a way to do a global replace to fix this on every Wiki page? Sanfranman59 21:39, 10 February 2007 (UTC)
I've never put out a request for bot help before, so I hope I'm doing it right. I recently moved North Dakota Capitol to North Dakota State Capitol. Any help fixing the double redirects would be greatly appreciated. -- MatthewUND( talk) 06:42, 11 February 2007 (UTC)
At the moment my user page userboxes have a lot of gaps in between them. If I could get this fixed, and could be told how to prevent this in the future, it would be very helpful. ( Id Rather Be Hated For Who I Am, Than Loved For Who I Am Not 08:45, 11 February 2007 (UTC))
Wikipedia needs a bot that will automatically add the tag {{ unreferenced}} to articles where the majority of the material is unsourced.-- Sefringle 08:01, 12 February 2007 (UTC)
I believe that there are currently a large number of uncategorized templates that can't be found except by finding pages that they're used on, i.e. they aren't in the template categories - Category:Wikipedia templates and subcategories. I would like to see a bot that can check for these and place them in Category:Uncategorized templates, where they can subsequently be sorted into the appropriate template categories by hand. Are there any bot frameworks out there that would be able to do this, or would anyone be interested in putting together a bot to do this? Mike Peel 16:18, 10 February 2007 (UTC)
I'll do it.-- Balloonguy 22:14, 13 February 2007 (UTC)
Hi, I'm trying to standardize all filmographies etc. to the guidelines laid down by Wikipedia:Manual of Style (lists of works), specifically that items in filmographies, discographies etc. should be listed in chronological order, earliest first. At the moment I am manually searching for articles that have their filmographies listed upside-down (reverse-chronological) and manually tagging them with the {{ MOSLOW}} tag.
Obviously there will be thousands, including many I would never find manually. I wonder if there would be some way of recognizing an "upside-down filmography" automatically (e.g. by finding a section called "Filmography", then looking at two consecutive lines - if they have differing years, the second being lower than the first, then we can assume it's an upside-down filmography). In that case the page just needs to be tagged with {{ MOSLOW}}. The next stage, of actually fixing it, is not needed just yet (although User:Whilding87 has a nice script on his user page for that).
The bot doesn't have to recognize *every single* upside-down filmography (because there are many different formats), but if it can capture a sizeable number of articles with upside-down filmographies, it will be worth it. 82.9.25.163 18:15, 11 February 2007 (UTC)
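The year-comparison heuristic suggested above could be sketched like this in Python (the function name, year regex, and majority-vote threshold are illustrative assumptions; a real bot would first locate the "Filmography" section in the wikitext):

```python
import re

YEAR = re.compile(r'\b(19|20)\d{2}\b')

def is_reverse_chronological(section_lines):
    """Scan consecutive list lines in a filmography section; if the
    years found are mostly descending, assume the list is
    reverse-chronological and a candidate for the {{MOSLOW}} tag.
    Lines without a recognizable year are skipped."""
    years = []
    for line in section_lines:
        m = YEAR.search(line)
        if m:
            years.append(int(m.group(0)))
    if len(years) < 2:
        return False
    descending = sum(1 for a, b in zip(years, years[1:]) if b < a)
    return descending > len(years) // 2
```

A majority vote rather than a strict all-pairs check keeps the heuristic tolerant of the occasional out-of-order entry, matching the request's goal of catching "a sizeable number" rather than every case.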
Just fyi, there is currently no consensus for ordering filmographies chronologically, as pointed out in a ref for the guideline ( Wikipedia:Manual of Style (lists of works)#Ordering), and currently being discussed on its talkpage. (With prior discussion at Wikipedia talk:Filmographies#Chronological ordering.) There is also Wikipedia:Requests for comment/Filmography which could really use some more feedback. -- Quiddity 23:54, 14 February 2007 (UTC)
It's a bit complicated, but the main use would be to help with a mass reversion of unicode characters in sort keys. See a more detailed explanation at Wikipedia:Administrators'_noticeboard/Incidents#Bizarre_category_modifications. Thanks. -- Jeffrey O. Gustafson - Shazaam! - <*> 14:56, 14 February 2007 (UTC)
I just had a few great ideas and thought I would bring them here. I can't even code worth crap . . . Do what you want with my ideas, and let me know if any are going to be useful. -- Darkest Hour 22:15, 14 February 2007 (UTC)
This is a simple but very useful bot idea, and I would really like this bot to be made for me, since I don't have enough programming knowledge to make my own.
Anyway, over the month I have noticed numerous articles with the word "although" commonly misspelled as "althought". A bot that would change all the misspelled "although"s would be useful on Wikipedia, and it would save time and effort rather than doing this process manually.
If someone is willing to make this bot, hand the "code" over to me and teach me how to operate/set the bot up, etc. Please contact me on my talk page. Thank you :D -- Bhavesh Chauhan 02:31, 10 February 2007 (UTC)
I made some changes to {{ WikiProject Business & Economics}} and I recreated the categories you made with lower case, e.g., Category:High-importance Business and Economics articles to Category:High-importance business and economics articles. The bot created Wikipedia:Version 1.0 Editorial Team/Business and economics articles by quality which is I think what you wanted. Cheers, Oleg Alexandrov ( talk) 04:46, 16 February 2007 (UTC)
Can a bot that is free right now start putting the tag {{WikiProject Business & Economics|class=|importance=}} on the pages in the following categories, and if the bot finds that the page is a stub, put the tag {{WikiProject Business & Economics|class=stub|importance=mid}}? Please? -- Parker007 08:28, 15 February 2007 (UTC)
I've just created the article Christian heresy with text extracted from Heresy. There are lots of articles that link to Heresy. Most of them need to link to Christian heresy. What I need is a bot that will go through a set of links and either automatically or interactively change links to Heresy into links to Christian heresy. Where can I post such a request so that a kindly bot writer will see it and maybe help me with this? Or... is there a bot in existence that will do what I want done? -- Richard 07:13, 15 February 2007 (UTC)
Not sure I need a bot for this, but couldn't think of where else to ask. Wikipedia:Wikipedia is failing got me to thinking about finishing my study of article quality. I'd like a list of X random articles older than a certain threshold, say two years, with information such as the date created, the number of kb of text, the number of images, and the number of edits. Ideally I'd get links to oldids at various stages such as quarterly and the same data on the article at each of those points. Could certainly be run on an offline database copy. Anyway, would love to have the data. Could make a nice paper/Signpost writeup. - Taxman Talk 05:16, 16 February 2007 (UTC)
Is there a bot which checks the 'What links here' of pages and adds the 'orphan page' template if the number is less than a certain number? If not, can I make it? Smomo 22:09, 6 February 2007 (UTC)
I still think this is a good idea. Does anybody want to take this bot on and make it? Smomo 17:05, 12 February 2007 (UTC)
Would it be doable to change aero-spec tags to the newer aero-specs tag? Could you do that with AWB? RedSkunk 00:45, 17 February 2007 (UTC)
If I make three or four edits in a row, or ten, I would like them to all be condensed into one, as if I had used "preview" correctly. I'd like this mostly so people wouldn't say "hey I notice you don't use preview. Why don't you give it a try?", but I find that I keep fiddling with a sentence or paragraph even after I thought it was good. Martin | talk • contribs 07:31, 17 February 2007 (UTC)
Most Japanese military biography stubs are double-tagged with {{ asia-mil-bio-stub}} and {{ Japan-bio-stub}}. I'd like to request a bot to go through Category:Asian military personnel stubs looking for the Japan-bio-stub tag, and replace the combination of those two template tags with {{ Japan-mil-bio-stub}}. Please see this edit for one example I did manually. Thanks in advance!! Neier 13:39, 17 February 2007 (UTC)
I recently updated the infobox U.S. County template, and I wanted to see whether counties are actually using this template. I have just started looking at counties in Alabama, and I see that most of the county articles are not using the template! I manually added the {{infoboxneeded|infobox U.S. County}} infoboxneeded tag, but this will take forever. Is there a way for a bot to be created that will check each county's page — for all states — and, if it is not using the infobox U.S. County template, throw the tag on the top of the article? Thanks! Timneu22 ( talk · contribs) / Timneu22 00:50, 19 February 2007 (UTC)
There are a lot of double re-directs that need to be fixed in response to a page move I made. Georgia guy 14:35, 19 February 2007 (UTC)
I was recently nominating an article for deletion when I realized how tedious it is, plus this is something that everyone (not just admins) can do anyway. Here's how it works: a user places a tag on the article's page that looks something like {{subst:afd | cat=Category | text=Reason the page should be deleted}}; the bot, monitoring recent changes, creates Wikipedia:Articles for deletion/PageName with the preloaded debate and adds the page to that day's listings. Additionally, an intelligent bot could easily tell if there had been a previous listing and handle the page naming accordingly. Also, because this process involves creating a new page, the bot would simply remove the tag and take no action if an anonymous IP added it. Vicarious 08:49, 19 February 2007 (UTC)
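The first step for such a bot is pulling the parameters out of the tag. A naive sketch (Python; it assumes the parameter values contain no nested pipes, which a real parser would have to handle):

```python
import re

def parse_afd_tag(wikitext):
    """Extract the cat= and text= parameters from a {{subst:afd|...}} tag.
    Returns None when no tag is present."""
    m = re.search(r"\{\{\s*subst:afd\s*\|(.*?)\}\}", wikitext, re.DOTALL)
    if not m:
        return None
    params = {}
    for part in m.group(1).split("|"):
        if "=" in part:
            key, _, value = part.partition("=")
            params[key.strip()] = value.strip()
    return params
```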
Last night and on Saturday night, there was a vandal who moved pages at about 10 pages per minute. For example, if the page was Wikipedia, the vandal would move the page to Wikipedia2. Do you think someone could create a bot which moves pages back to the original destination once this vandal has been blocked? As always, a custom award and a barnstar for the designer. :-) Also, for further information, please see this. Thanks! Real96 17:44, 19 February 2007 (UTC)
This page is always clogged. It would be great if a bot could remove all deleted links, and move them to an archive section (ordered by day?). Proto ► 12:53, 20 February 2007 (UTC)
Per discussion at WP:ANI#A_Semi-protected_article_that_.22isn.27t.22, it seems like there might be some added value in a bot that would remove templates such as {{ sprotected2}} from pages when their semi-protection or full-protection has expired, so that misleading headers aren't left in place. I have very little experience with programming, and thus I thought that I might be better off leaving a message here. - Hit bull, win steak (Moo!) 14:42, 20 February 2007 (UTC)
This bot would wikify: it takes names and dates and wikifies them.
If the above bot isnt approved or this one turns out to be better: Another bot: This bot would report newly created accounts with names containing certain banned words and phrases, or that use certain special characters to WP:AIV. -- Darkest Hour 22:25, 15 February 2007 (UTC)
I have a python tool that helps me filter the IRC feed I could re-configure it to report both username's and WoW attacks to AIV fairly easy if anyone is interested. Betacommand ( talk • contribs • Bot) 18:26, 19 February 2007 (UTC)
What I have in mind is a bot that would scan an article for a section titled References, External links, Sources, or Bibliography. It would also scan an article for a tag providing a link to another WikiMedia project (such as the one on the right).
If the bot couldn't find either an appropriately-titled section or a tag then it would place the following template at the top of the article: {{ Unreferenced}} The bot would also be programmed to leave an article alone if the first word in its title was List, because some people have claimed that lists should be exempt from reference requirements if the articles they link to have references. What do you guys think? Galanskov 06:58, 17 February 2007 (UTC)
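A rough sketch of the detection heuristic described above (Python; the set of section names and the "List" exemption come from the proposal, while the exact regexes are my assumptions — as noted below in the thread, several percent false positives should be expected):

```python
import re

REF_SECTIONS = ("references", "external links", "sources", "bibliography")

def needs_unreferenced_tag(title, wikitext):
    """True when an article has no reference-like section heading, no
    <ref> tag, and no citation template. Titles starting with 'List'
    are exempted, per the discussion."""
    if title.split()[0].lower() == "list":
        return False
    if re.search(r"<ref[\s>]", wikitext, re.IGNORECASE):
        return False
    if re.search(r"\{\{\s*cit(e|ation)", wikitext, re.IGNORECASE):
        return False
    heading = re.compile(
        r"^==+\s*(%s)\s*==+" % "|".join(REF_SECTIONS),
        re.IGNORECASE | re.MULTILINE,
    )
    return not heading.search(wikitext)
```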
This is from Wikipedia_talk:Speedy_deletion_criterion_for_unsourced_articles#A_proposal
...we get a bot to put an unreferenced template onto any article without ref tags or a references section. Mostlyharmless 08:50, 6 January 2007 (UTC)
- I LOVE that idea. Agne 09:09, 6 January 2007 (UTC)
- Don't forget the {{ cite}} template and its kin, which predate the
ref
tag. . -- nae' blis 20:45, 8 January 2007 (UTC)
- Or plain external links, or parenthetical citations, .... Christopher Parham (talk) 01:44, 13 January 2007 (UTC)
- RefBot has recognized all those for several versions, so it can be done. I'll add {{ citation}} when I finish its new core. Tagging unreferenced articles is trivial, but if nobody cared enough to supply citations in the first place then why tag it? ( SEWilco 03:47, 20 February 2007 (UTC))
- Some time ago, I wrote a script to make a list of unreferenced math articles, and while it is not impossible, it is harder than you expect because of natural language issues. The section headings that are used for references in articles are quite varied. Not all referenced articles use any sort of citation template. You should expect several percent false positives the first time you search for unreferenced articles (which I estimate at around a hundred thousand errors if you scan the entire article space and exclude redirects and disambiguation pages). CMummert · talk 04:26, 13 January 2007 (UTC)
- Thanks for the response! I don't imagine it would be easy, and whoever did create such a bot would deserve a few barnstars! I'd think that the text of any template of this kind should include a statement asking for refs and inviting users to remove it where references are provided. Mostlyharmless 10:50, 13 January 2007 (UTC)
- Actually, mass-tagging those pages won't really help unless you've got a cadre of volunteers to do the actual referencing. We have a plethora of cleanup processes and nearly all of them are terminally backlogged. >Radiant< 12:34, 16 January 2007 (UTC)
- I agree. Cleanup lists aren't working all that well at the moment. This is more about giving a clear reminder to users that they have a duty to reference their articles, and getting them to do so voluntarily. I certainly don't think that most editors would do so, but a sizable minority would, particularly on anything that isn't an unwatched and unattended stub. And of course the template could go onto every new article. Mostlyharmless 02:11, 24 January 2007 (UTC)
A better idea might be this: after a new article is created and the vandal fighters have dealt with it, a bot goes and checks for references, and if it does not have references it tags the article and informs the editor.-- Balloonguy 19:21, 19 February 2007 (UTC)
Using this <span class="plainlinks" style="font-size: 100%;">[http://Yahoo.com/search Yahoo!]</span> which gets: Yahoo! over Yahoo!. Can there be a bot to do this? Doing them by hand will be a never-ending task. -- Darkest Hour| DarkeBot 00:04, 22 February 2007 (UTC)
Any reference to TNA iMPACT! and variants (suggest using just iMPACT as a case-sensitive search term) needs the iMPACT! changed to Impact!, per WP:MOSTM and WP:MOSCL. Chris cheese whine 09:57, 22 February 2007 (UTC)
To the best of my (limited) understanding, AWB is not able to do the following in a reasonably efficient way: a manually assisted bot with an easy user interface (so more than just the bot creator can use it) that points links to disambig pages to the correct page. First, the bot would find links to disambig pages that are likely to be incorrect, meaning it would ignore links from other disambig pages as well as anything but the main article space. Then it would show the user the few sentences around the link in the article on one side and the disambig page on the other; simply clicking the correct link on the disambig page completes the operation with no typing involved. Also, a well made bot could guess the correct link frequently enough for it to be worth adding a feature where the bot recommends an option to the user by highlighting it. Vicarious 07:47, 22 February 2007 (UTC)
List of dialing codes of Greece numerically - possible delink all the area code entries automatically? For instance:
:[[Greece dialing code 210|210]] - [[Athens]]-[[Piraeus]]-[[Eleusis]] area
... becomes ...
:210 - [[Athens]]-[[Piraeus]]-[[Eleusis]] area
There are dozens of entries on the page, so if this can be done automatically rather than picking through it manually, it would be much appreciated. Chris cheese whine 16:56, 22 February 2007 (UTC)
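The delinking shown above is a one-line regex substitution; a sketch (Python, with a backreference so only self-labelled area-code links are touched and the place-name links on the same line are left alone):

```python
import re

def delink_area_codes(line):
    """Turn [[Greece dialing code NNN|NNN]] piped links into plain NNN."""
    return re.sub(r"\[\[Greece dialing code (\d+)\|\1\]\]", r"\1", line)
```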
Could a bot operator configure a bot to do a single dummy edit (i.e. a single space at the end of a line, an enter carriage return, etc.) on every single file in the image namespace in Category:Non-free image copyright tags? I noticed that the cache has not updated and did not (for the image I tested) until I made a dummy edit. Thanks, Iamunknown 23:27, 22 February 2007 (UTC)
I think we need a bot like commons:User:FlickreviewR here. Images from Flickr are uploaded here all the time. It would be nice if we had a bot that could review these images, verify them, and then either tag them as verified (and {{ Move to Commons}}) or move them to a directory for either human review or deletion. It would be ideal if editors uploaded Flickr images straight to the Commons but we all know many do not, so having this bot that would either mark them for deletion or verify them and tag them to be moved to Commons would be very beneficial.↔ NMajdan• talk 18:33, 23 February 2007 (UTC)
A common mistake when linking to categories is to forgo the initial colon, i.e. write [[Category:Wikipedia]] instead of [[:Category:Wikipedia]]. A related mistake is when users copy an article over to user space and forget to <nowiki> or colon-link the categories and interwiki links, so the copy is included in them rather than linking to them. A similar thing happens when developing/reworking templates. Could a bot be made to regularly (say every month) scan all Wikipedia categories looking for these incidents and sorting them out? Note that extra care would have to be taken with respect to user categories and user-based templates, which obviously shouldn't be touched. Mike Peel 23:38, 23 February 2007 (UTC)
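The category half of this fix is a simple substitution; a sketch (Python; it handles only bare category links, leaves already colon-prefixed ones alone, and interwiki links would need analogous handling with their language prefixes):

```python
import re

def defuse_categories(wikitext):
    """Prefix bare [[Category:...]] links with a colon so a userspace
    copy links to its categories instead of being included in them."""
    return re.sub(r"\[\[(\s*)(Category\s*:)", r"[[\1:\2", wikitext,
                  flags=re.IGNORECASE)
```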
Is there a bot which finds fullurls to wikipedia pages? I've been running into cases where they are used for inline links, or to provide what appear to be "references". [4] Gimmetrow 23:00, 24 February 2007 (UTC)
I think there should be a bot that helps replace bitmap images with SVGs after the SVG has been uploaded. Often, a bitmap is used in many articles, and it is hard to replace every instance of the bitmap with the SVG by hand, right?
-- Jacklau96 09:07, 25 February 2007 (UTC)
Example:
Bitmap:
Image:Wheelchair.png and
SVG:
Image:Wheelchair.svg
I would really like to run a bot that goes through the new user log and automatically welcomes new users - it would make many editors actually edit - as most accounts never get round to this! I would like a bot that puts {{subst:welcome}} ~~~~ (or something to that effect) on new users talk pages. I've got no expertise in creating these kinds of scripts else I would create one myself RyanPostlethwaiteSee the mess I've created or let's have banter 23:06, 25 February 2007 (UTC)
Various good faith move and page creations have resulted in large numbers of Dungeons & Dragons articles having their name suffixed with "(Dungeons & Dragons)" even if there is no article at the unsuffixed title. If a bot could move all articles of the form "Foo (Dungeons & Dragons)" to "Foo" if "Foo" doesn't exist or is a redirect to "Foo (Dungeons & Dragons)", that would save a lot of tedious manual work. This has been suggested at Wikipedia talk:WikiProject Dungeons & Dragons#Disambiguation with no objections. Cheers -- Pak21 13:18, 26 February 2007 (UTC)
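The decision rule described above can be sketched as follows (Python; `existing_pages` and `redirects` stand in for live queries against the wiki, and the actual move would be done by the bot framework):

```python
SUFFIX = " (Dungeons & Dragons)"

def move_target(title, existing_pages, redirects):
    """Given 'Foo (Dungeons & Dragons)', return 'Foo' when the article
    should be moved there: either no page exists at 'Foo', or 'Foo' is
    a redirect back to the suffixed title. Otherwise return None.
    existing_pages is a set of titles; redirects maps title -> target."""
    if not title.endswith(SUFFIX):
        return None
    base = title[: -len(SUFFIX)]
    if base not in existing_pages:
        return base
    if redirects.get(base) == title:
        return base
    return None
```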
First, I have a request for a new task for a bot which substitutes tags. Per the album assessment project, we assess albums which have the {{albums}} tag. Instead, for assessing, I need to have this tag (shown below) placed on each album talk page, instead of the regular {{albums}} tag...because the regular album's tag makes the album harder to assess.
{{Album
|class=
|importance=
|attention=
|needs-infobox=
|auto=
}}
Second, what happened to Antivandalbot? Thanks. Real96 00:18, 25 February 2007 (UTC)
I was hoping that I could enlist some assistance in having a bot slap {{ WikiProject Law}} on the talk page of each article in Category:Law and each of its sub-categories and sub-sub-categories etc. (of which there are many), in connection with WikiProject Law?
Apologies if I have posted the request in the wrong place; we lawyers are not techies, and so are not very good at this sort of thing. But we do give the world lots of jokes at our own expense.
-- Legis ( talk - contributions) 14:47, 26 February 2007 (UTC)
I originally requested bot Ganeshbot to do this, but I haven't got a response and that bot hasn't had a contrib in 2 months, so I'm asking here. Can somebody please have a bot place {{WikiProject College basketball|class=|importance=}} on all article talk pages in Category:College basketball and its subcategories?↔ NMajdan• talk 17:52, 26 February 2007 (UTC)
Will someone please run a bot to change all
Full list can be found HERE; only the articles need changing (no talk/wikispace stuff). Users who browse IGN from the UK get put on UK servers; as not everyone who reads Wikipedia is from the UK, they shouldn't be sent to UK servers. It also uses up a little space in the article. Thanks.-- Empire Earth 19:55, 25 February 2007 (UTC)
Someone — don't remember who — had a brilliant suggestion on Wikipedia:Adminship survey (I think) and it's good enough that I thought it should be brought up here: A manually assisted script to help close AfDs more quickly. I could envision it as follows:
This would help a lot (for me at least) in clearing the AfD backlog. Removing the category, manually closing the article and especially removing links to the page is time consuming, and could be done by a bot, I think. I'm a mac user, I don't know if I'd be able to utilize it necessarily but I'm sure other people would appreciate it too. Comments? I don't know jack about programming (though I learned a little Basic and Logo in middle school or earlier.) Grand master ka 09:41, 28 February 2007 (UTC)
A bot is requested to iterate through Category:WikiProjects, find all projects that are inactive, move those to Category:Inactive WikiProjects, and add {{ inactive}} to the top of those pages. A project is defined as inactive if neither the project nor its talk page has been edited for two months. This should ideally be repeated every couple weeks or so. AllyUnion's bot ( User:Kurando-san) used to do this in the past. >Radiant< 12:46, 28 February 2007 (UTC)
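The inactivity test is straightforward date arithmetic; a sketch (Python; "two months" is approximated here as 61 days, which is my assumption, and the last-edit timestamps would come from the page histories):

```python
from datetime import datetime, timedelta

def is_inactive(project_last_edit, talk_last_edit, now=None):
    """True when neither the project page nor its talk page has been
    edited for two months (approximated as 61 days)."""
    now = now or datetime.utcnow()
    latest = max(project_last_edit, talk_last_edit)
    return now - latest > timedelta(days=61)
```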
Please run a bot to replace all
There are about 8-12. Then an admin can delete Empire Earth (video game). Thanks.-- Empire Earth 01:12, 25 February 2007 (UTC)
I am looking for a bot that will add the following to the mainspace of the article:
{{subst:prod|[[Wikipedia:No_original_research]] & does NOT include [[Wikipedia:Reliable_sources]] for [[WP:Verifiability]] of content. Leaving Message on article creator's talk page regarding this, and will ask him to add sources.}}
& add the following to the creator of the article's (first editor) talk page:
Your article [[---------]] has been proposed for deletion "[[Wikipedia:No_original_research]] & does NOT include [[Wikipedia:Reliable_sources]] for [[WP:Verifiability]] of content." Please add [[Wikipedia:Citing_sources|references]], or it will be deleted. --~~~~
-- Parker007 18:36, 25 February 2007 (UTC)
importScript('User:Dycedarg/easyprod.js'); importScript('Wikipedia:WikiProject User scripts/Scripts/Add LI menu'); importStylesheet('Wikipedia:WikiProject User scripts/Scripts/Add LI menu/css');
Could there be a bot that uses Lupin's bad word list and the RSS feed to revert vandalism? Rest 5-15 sec. Pearl. BadwordBot. -- Darkest Hour 17:35, 14 February 2007 (UTC)
-- Darkest Hour 18:16, 14 February 2007 (UTC)
This bot would greet new users. That is accounts being no more than 5 minutes old. Using the original welcome template.
:-)
Cbrown1023
talk 00:13, 1 March 2007 (UTC)
It would be basically the same as Werdnabot. The difference is that every month it starts a new archive for you (something I wish Werdnabot did).
This bot would need human input. It uses the RSS feed and the recent changes page and updates itself every minute. It then highlights possible vandalism, but a human would need to click a confirmation on whether it's vandalism or not. The bot would then act accordingly: if it's vandalism, revert it and add a warning to the offender's talk page; if not, ignore it. It would not be active unless you go to a certain page (that way you are not bugged every minute by a bot buzzing at you). This might need to be a monobook script to really work out properly.
Wouldn't it be possible to make a bot that works through Category:Copy to Wikimedia Commons and moves the files to Commons? A sort of combination between Push for commons and FlickrLickr. // Liftarn
Standard links to New York Times articles expire after a while, and direct the user to a paid archival service. These are the links one gets from simply copying the url of an news article they are viewing. However, the Times also provides permanent anchors for blogs. To get the permanent anchor, one can use this service, though there may also be some simple algorithm for it. I've probably fixed dozens of such links using this method, and the NYTimes must be one of the most common citation sources. So, it seems like a good project for a bored bot to run around and change the temporary links to permanent ones. As an example, here's one I just fixed, which inspired me to make this suggestion. Derex 02:56, 1 March 2007 (UTC)
Please would someone tag the talk page of all articles in Category:Urban studies and planning and all subcats with {{Planning|class=|importance=}}? Many thanks. -- Mcginnly | Natter 14:32, 1 March 2007 (UTC)
What about a bot to notify uploaders of images tagged as both {{imagewatermark}} and {{self}}? // Liftarn
We should have a bot that fixes wikilinks within an article. It would do 3 things on an article:
Pyrospirit Flames Fire 16:33, 2 March 2007 (UTC)
Greetings! I was wondering if a bot operator could direct their bot to tag all talk pages of articles within Category:Plants (and its subcategories) with {{Plants|class=|importance=}}. Articles will be assessed later by members of WikiProject Plants. Thanks! -- Rkitko 08:59, 3 March 2007 (UTC)
Hmm, well, I've been busy with random stuff lately, particularly minor edits. So far I've cleaned up the categories from 1 - 1 BC (after the 19th century in its entirety). My proposal would be for a bot to...
Now, I understand not all of this might be possible, but it's certainly labour intensive to do this by hand, especially when the same categories can simply popup again and again due to human error. Logical2u Review me! 18:22, 3 March 2007 (UTC)
Do you think HBC Helperbot could put a notice on the Defcon in order to tell administrators that WP:AIV is backlogged? Real96 06:02, 4 March 2007 (UTC)
I think a bot that monitors a series of backlogs and adjusts the Defcon is a great idea, but that is not my bot. I think this is a task better suited to a bot designed to do it, instead of tacked onto my existing bot. HighInBC (Need help? Ask me) 00:03, 7 March 2007 (UTC)
Would anybody be interested in coding up a bot for the purpose of bypassing redirects in templates, so as to preserve the bolding effect when templates are transcluded on pages with links pointing to that page? Many Wikipedians often forget to update these templates after a page move. The bot's logic should also include the ability to carry hyphenation, capitalization, accent marks, and macrons over to the link text if the redirect is similar. — Dispenser 02:18, 5 March 2007 (UTC)
Deletion review could use a bot to do routine tasks related to creation of daily and monthly log pages, and including completed days on the monthly log at the appropriate time. Monthly log to be created once a month, and presumably updated once a day, daily to be created once a day. See User:Trialsanderrors for further details. GRBerry 23:08, 5 March 2007 (UTC)
Reports to AIV for inappropriate usernames (i.e. words with .com/.org/.net, fuck, shit, damn, fagnut, cunt, whore, Wikipedia sucks, I hate you, you are evil, suck my ass, as well as racial epithets, etc.) Real96 02:38, 6 March 2007 (UTC)
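A sketch of the matching logic (Python; the patterns below are deliberately illustrative stand-ins, not the requested list — a real bot would load its blacklist from a protected project page rather than hard-coding it):

```python
import re

# Illustrative fragments only; the real list would live on a protected page.
BLOCKED_PATTERNS = [
    r"\.(com|org|net)\b",     # domain-name usernames
    r"wikipedia\s+sucks",     # attack phrases
    r"[^\x00-\x7F]{4,}",      # long runs of unusual special characters
]

def flag_username(name):
    """True when a new account name matches any blacklist pattern, so
    the bot can report it to WP:AIV for human review."""
    return any(re.search(p, name, re.IGNORECASE) for p in BLOCKED_PATTERNS)
```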
The double-redirects list is up and running:
http://en.wikipedia.org/wiki/Special:DoubleRedirects
Bots get to work, and stop slacking. -- Parker007 18:09, 6 March 2007 (UTC)
There are double redirects still left:
http://en.wikipedia.org/?title=Special:DoubleRedirects&limit=500&offset=500
http://en.wikipedia.org/?title=Special:DoubleRedirects&limit=500&offset=0
A simple bot that places certain templates at the end of the article(s) you specify. -- •Tbon e 55 •( T, C, UBX, Sign Here) 21:13, 6 March 2007 (UTC)
It always removed the language links, which were valid [6].-- Ksyrie 03:58, 7 March 2007 (UTC)
Another bot I found a little dysfunctional: User:Idioma-bot. See Gulf of Carpentaria and zh:卡奔塔利亚湾.-- Ksyrie 04:40, 7 March 2007 (UTC)
Third bot: User:Soulbot. See Mesoplodont whale and zh:安氏中喙鯨.-- Ksyrie 04:48, 7 March 2007 (UTC)
Fourth bot: User:STBotD. See Direct action and zh:直接行動-- Ksyrie 04:51, 7 March 2007 (UTC)
Is there a way to create a bot to automatically create redirects to American city articles from names that are incorrectly punctuated or capitalized? For example, there are many many city articles for which ONLY searching for "Baltimore, MD" will get you to the right article. "baltimore MD", "Baltimore MD", "baltimore md", sometimes even "baltimore, MD" either take you to a search results page or to nothing at all. As most of these articles were created by a bot to begin with, someone suggested that a bot could create these redirects. Thanks- Dmz5 *Edits* *Talk* 15:56, 8 March 2007 (UTC)
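Generating the variant titles to redirect from is mechanical; a sketch (Python; it assumes the canonical "City, ST" form, and the particular variants generated are my reading of the examples above, not an exhaustive list):

```python
def redirect_variants(canonical):
    """Generate mis-punctuated/mis-capitalized forms of a 'City, ST'
    title that should redirect to the canonical article."""
    city, _, state = canonical.partition(", ")
    variants = {
        "%s %s" % (city, state),            # missing comma
        "%s, %s" % (city.lower(), state),   # lowercase city
        "%s %s" % (city.lower(), state),
        "%s, %s" % (city.lower(), state.lower()),
        "%s %s" % (city.lower(), state.lower()),
    }
    variants.discard(canonical)
    return sorted(variants)
```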
This might be overstepping, but I couldn't find any bot pages on wiktionary so I thought I'd propose an idea here. There are a series of pages here that should only list redlinks and have instructions at the top to remove blue links, this would be an ideal task for a bot. Vicarious 08:25, 9 March 2007 (UTC)
I'm suggesting a bot that simply inserts the template {{talkheader}} at the beginning of every article's talk page.
As a side note, maybe this bot should also put {{archivebox|auto=yes}} automatically in every talk page.
This bot comes out of a discussion with Ray Saintonge on wikien-l.
The problem: we have a zillion schoolkids vandalizing Wikipedia from shared IP addresses. We clean it up some, and then we block. This a bunch of work for us, people other than the vandals see warning messages that don't apply to them, and then a lot of innocent people suffer until we unblock.
The solution: School administrators put a special template on the IP talk page to show their interest in a given IP. Whenever a warning or a block happens, the administrator gets an email notice. They then take some locally appropriate action, defusing the situation before we block. Or if a block happens, then when people come to them with questions, they already know the answer.
I don't have time to build this in the near future, but if anybody gets interested and wants to chat about it, feel free to get in touch. Thanks, William Pietri 17:12, 10 March 2007 (UTC)
I have a request/idea, if it is possible: re-make interwiki bots so that they add {{ commons}} to the other-language wikis. For example, if the bot sees {{ Commons}} on the Czech wiki, it would copy it to all the other wikis (en, fr, de, and so on) via the interwiki links, wherever the template doesn't already exist. I'm trying to do this by myself, but I wonder whether mechanical power couldn't be used for it :) The idea is that all wikis use the same template, while the categories on Commons are in English.
Hope somebody understands what I mean :) Thanks for your time, and I hope to see this new generation of bot working soon :) -- Chmee2 14:35, 11 March 2007 (UTC)
Per my suggestion at Wikipedia talk:Multilingual coordination#Tool to spread interlanguage links, assuming interlanguage links are symmetric and transitive it seems a bot could add the transitive closure of all interlanguage links to all language versions of an article in the closure set, and step through all articles (in, say, en:) doing this. Does anyone know of some reason not to do this? Is anyone interested in writing such a thing? -- Rick Block ( talk) 17:54, 11 March 2007 (UTC)
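Under the stated symmetry/transitivity assumption, the closure is a plain graph reachability computation; a sketch (Python; `links` stands in for the interlanguage link data the bot would scrape from each language version):

```python
from collections import deque

def closure(start, links):
    """All pages reachable from start via interlanguage links, by BFS.
    links maps a page (e.g. 'en:Foo') to the pages it links directly."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def missing_links(links):
    """For each page, the links it should gain so every member of its
    closure set points at every other member."""
    out = {}
    for page in links:
        reach = closure(page, links)
        out[page] = reach - set(links.get(page, ())) - {page}
    return out
```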
Category:WikiProject Irish Republicanism articles has just been renamed, and a large number of articles have disappeared from it that are still in the category according to their talk page. According to this I need to do a null edit, and someone at the VP says there's bots that do that? Thanks. One Night In Hackney 303 03:04, 12 March 2007 (UTC)
A bot is requested to iterate through Category:WikiProjects, find all projects that are inactive, move those to Category:Inactive WikiProjects, and add {{ inactive}} to the top of those pages. A project is defined as inactive if neither the project nor its talk page has been edited for two months. This should ideally be repeated every couple weeks or so. AllyUnion's bot ( User:Kurando-san) used to do this in the past. >Radiant< 13:13, 12 March 2007 (UTC)
I was thinking there should be a bot for automatically updating the tally for an RfA, e.g. (41/2/0). Sometimes people forget to change them, and it would be handy for something to change them if one forgets. Nol888( Talk)( Review me please) 21:42, 12 March 2007 (UTC)
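A sketch of the recount logic (Python; it assumes the common RfA layout of bolded '''Support'''/'''Oppose'''/'''Neutral''' section markers with one '#'-numbered vote per line, and treats '#:' replies as non-votes — both assumptions about the page format):

```python
import re

def rfa_tally(page_text):
    """Recount an RfA from its numbered vote sections and return the
    '(support/oppose/neutral)' string."""
    counts = {}
    section = None
    for line in page_text.splitlines():
        m = re.match(r"'''(Support|Oppose|Neutral)'''", line)
        if m:
            section = m.group(1)
            counts[section] = 0
        elif section and line.startswith("#") and not line.startswith("#:"):
            counts[section] += 1
    return "(%d/%d/%d)" % (counts.get("Support", 0),
                           counts.get("Oppose", 0),
                           counts.get("Neutral", 0))
```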
Make a bot that can scan through the user talk pages of IP addresses and detect a specific vandalism tag which includes the date of vandalism, then have it count the number of tags within a certain time period, and if a minimum number is met, post the IP address and a link to the user page on a "block page" for administrators to look at. Store a list of all IP address vandalism and check to see if vandalism tags are removed from user talk page. To save server access time, the list of IP addresses to block can be loaded to the page every x times.
Outline of the process: Check user page, counting all vandalism tags dated after x date. Also, count overall number of tags. Compare count to y number of allowed tags, if greater, move ip address to list to be updated to the server. Also, check overall number of tags against last overall number of tags count. If new overall number of tags is less than old, move ip address to list to be updated to server with comment about missing tags. If not, store new count of overall tags to client side data file. After z number of ip addresses have been found to be over the limit of number of allowable tags, append list to special "block consideration" page for administrators to view, along with any comments on why they should be blocked and a link to their talk page. D 16:33, 9 March 2007 (UTC)
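The counting step of the outline above can be sketched as follows (Python; {{vandal-warn|date=...}} is a hypothetical tag format used purely for illustration — the proposal's "specific vandalism tag which includes the date" would have to be agreed on first):

```python
import re
from datetime import datetime

# Hypothetical dated warning tag, for illustration only.
TAG = re.compile(r"\{\{vandal-warn\|date=(\d{4}-\d{2}-\d{2})\}\}")

def count_recent_tags(talk_text, since):
    """Return (recent, total): warning tags dated on or after `since`,
    and all warning tags on the page. A drop in `total` between runs
    would signal removed tags, per the outline."""
    total = 0
    recent = 0
    for m in TAG.finditer(talk_text):
        total += 1
        if datetime.strptime(m.group(1), "%Y-%m-%d") >= since:
            recent += 1
    return recent, total
```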
Trampton 17:19, 13 March 2007 (UTC).
Please could somebody make a bot that would automatically send welcome messages to new Wikipedians. This would help the welcoming committee. Thank you. Djmckee1 20:04, 15 March 2007 (UTC)
I might (try) making something like this myself, but I would like some feedback on whether it is a good idea first. Basically, it would take Category:Wikipedia_external_links_cleanup and iterate through each article in there, trying to detect whether the NoMoreLinks template has been added. If it has, it will simply continue to the next article; if not, it will add the "NoMoreLinks" template to the "External links" section. Tim.bounceback - TaLk 14:31, 12 March 2007 (UTC)
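For what it's worth, the per-article check described above is a few lines of string handling. A sketch (it assumes the plain "==External links==" heading form; a real run would have to tolerate spacing and capitalisation variants):

```python
# Skip articles already tagged; otherwise drop {{NoMoreLinks}} in at
# the top of the External links section.
def add_nomorelinks(text):
    if "{{NoMoreLinks}}" in text:
        return text  # already tagged, continue to the next article
    marker = "==External links=="
    if marker not in text:
        return text  # nothing to do on this page
    return text.replace(marker, marker + "\n{{NoMoreLinks}}", 1)
```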
Just a quick question - is it worth running multiple times, and if so, how often (how often are articles added to that category)? Tim.bounceback - TaLk 20:27, 16 March 2007 (UTC)
I'm encouraging anyone to make a bot that monitors bot approval requests. Taxman described the task of such a bot in a discussion on the bot approval time as follows.
The suggestion on a bot to track requests isn't a bad idea, something like WP:RFASUM that lists the last time each bot request has been edited and perhaps the last time edited by a BAG member. BAG wikilinked by Ocolon
Such a bot would help bot approval group members and all the other Wikipedians assisting with the bot improvement and approval process to keep track of approval requests and prevent requests from being neglected unintentionally. It would provide a good summary of the current approval process and make this more transparent. — Ocolon 17:09, 16 March 2007 (UTC)
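One row of the tracking table Taxman describes could be produced like this (the data shapes are assumptions; a real bot would read revision histories through the API):

```python
# Build one wikitable row: request name, last edit, last edit by a
# Bot Approvals Group member.
def summary_row(request, revisions, bag_members):
    last = max(r["timestamp"] for r in revisions)
    bag_edits = [r["timestamp"] for r in revisions if r["user"] in bag_members]
    last_bag = max(bag_edits) if bag_edits else "never"
    return "| %s || %s || %s" % (request, last, last_bag)
```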
I am wondering if there is a bot available to deliver WikiProject Massively multiplayer online games' newsletter on April 1. Instructions on who to send it to are available on our newsletter page. Thanks! Greeves ( talk • contribs) 17:22, 16 March 2007 (UTC)
Just as a side-note, you put in your bot request for monthly newsletters, ours will probably only be quarterly. Thanks! Greeves ( talk • contribs) 20:11, 16 March 2007 (UTC)
Purpose: The bot will clear off old IP warnings from 2006. Real96 03:20, 15 March 2007 (UTC)
I'd like to see scores for 50 randomly-selected articles, five of them featured, according to my draft scoring system. The system is an early draft and has not been discussed by the community in its present form. (The previous version, which the community rejected, called for user voting in combination with a system like this.) I'm thinking it would be better to come to the VP with an already tested system. Even if it is rejected as an official system, though, there has been enough interest expressed that someone will probably code it as an off-site tool if it proves adequate, in which case I will support them. Neon Merlin 19:18, 16 March 2007 (UTC)
P.S. Yes, I know the Thoroughness rating isn't quite finished yet, but I'm not sure how to improve it. I'd welcome input into the system itself. Neon Merlin 19:32, 16 March 2007 (UTC)
I noticed that there are quite a few deleted articles which still have a wikiproject template on their talk page. This is particularly bad for projects like WP:VG, where there are lots of articles that don't meet the policies for inclusion and are thus deleted. This clutters up statistics (deleted articles are often poorly rated) and unnecessarily adds extra articles to 'unassessed articles' categories. This could be added to an existing bot that browses talk pages. (Maybe one of the bots that add signatures to unsigned comments?) -- User:Krator ( t c) 17:20, 17 March 2007 (UTC)
I don't know if this is the place to put this, (but I was given a link). Could a bot please put the template {{ hiking-stub}} on all pages in the category Category:trail stubs? And/Or just on every page that has another stub tag in the following categories:
If this is possible, I'd be much obliged, as it will be very tedious to do this by hand. EDIT: Oh yah, hiking stubs is a subcategory of hiking... probably not a good idea to put any there because obviously it'll loop - Leif902 12:26, 18 March 2007 (UTC)
Thanks in advance, - Leif902 20:59, 18 March 2007 (UTC)
Hi, I would like to run a bot, and I am not really experienced with programming, so I was told to post here. I was thinking of a bot (maybe called FAQbot or equivalent) that watches Wikipedia:Help desk for changes, and if it finds a keyword match similar to a question in the FAQ pages, it will respond with an answer. This could also work with the {{ helpme}} requests at CAT:HELP. If someone would be kind enough to create this for me, I would be very grateful. Many thanks, Extranet ( Talk | Contribs) 06:59, 20 March 2007 (UTC)
Would it be possible for someone to run a bot through all transclusions of Template:Infobox UK station and perform the following changes:
These two fields are identical, the intention is to move all articles over to 'usage' which is a better description of the data. This will allow the template to be simplified and so make it easier to maintain.
The current use of the fields is:
Field | Number of uses | Percentage |
---|---|---|
usage0203 | 1 | 0.05% |
lowusage0203 | 17 | 0.92% |
usage0405 | 2 | 0.11% |
lowusage0405 | 24 | 1.30% |
exits | 29 | 1.57% |
lowexits | 29 | 1.57% |
lowexits0405 | 658 | 35.55% |
exits0405 | 1091 | 58.94% |
Total | 1851 |
So this change would require in the region of 1749 edits (use of lowexits0405 + use of exits0405). This is of course more than going the other way but as I've said, usage better describes the field. Some of the other fields in the list can also be merged together but I'll work through these manually as they need checking.
If anyone is able to do this, don't hesitate to contact me if there is anything you wish to verify. Thanks. Adambro 14:36, 25 March 2007 (UTC)
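The rename itself is mechanical. A sketch of the substitution step (the mapping below is inferred from the table above and should be double-checked before any run):

```python
import re

# Old field name -> new field name inside {{Infobox UK station}}.
RENAMES = {
    "exits0405": "usage0405",
    "lowexits0405": "lowusage0405",
    "exits": "usage",
    "lowexits": "lowusage",
}

def rename_fields(text):
    for old, new in RENAMES.items():
        # Require the trailing "=" so e.g. "exits" cannot match inside
        # "exits0405", and the leading "|" so it cannot match "lowexits".
        text = re.sub(r"\|\s*%s\s*=" % re.escape(old), "| %s =" % new, text)
    return text
```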
I don't know if this has been done before, or is technically feasible, but a bot that scans Wikipedia to locate articles tagged as orphaned and introduces internal links by putting brackets around words that match the titles of existing articles would be extremely helpful. It wouldn't introduce every link available, but it would reduce the amount of work.-- Orthologist
Links to non-existent page Wikipedia is not a soapbox should point at Wikipedia:Wikipedia is not a soapbox redirect page or to its destination, Wikipedia:What Wikipedia is not#Wikipedia is not a soapbox. -- CiaPan 15:15, 20 March 2007 (UTC)
Very simple, I think (at least for people who know how to code). So simple, indeed, that I suspect it had already been suggested and eventually refused.
A bot that transforms incoming links when they point to a redirect. Example: Seven virtues has a link to Seven Deadly Sins, which redirects to Seven deadly sins.
Such a “redirectbot” (proposed name) would edit the link in Seven virtues and change it to Seven deadly sins. In the long run, it would save some CPU cycles and, from day one, would remove a bit of the feeling of clutter from the Encyclopedia.
What do you think of it? Has this already been suggested? Is there a place to browse the various proposals and why they have not been implemented?
David Latapie ( ✒ | @) 12:17, 21 March 2007 (UTC)
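The mechanical part of the proposal is a link rewrite over a redirect map. A sketch (the map here is a toy stand-in for the real redirect table; piped links are left untouched for brevity):

```python
import re

# Toy redirect map; the real one would come from the database.
REDIRECTS = {"Seven Deadly Sins": "Seven deadly sins"}

def bypass_redirects(text):
    def fix(match):
        target = match.group(1)
        final = REDIRECTS.get(target, target)
        if final == target:
            return match.group(0)
        # Keep the words the editor wrote visible via a pipe.
        return "[[%s|%s]]" % (final, target)
    return re.sub(r"\[\[([^|\]]+)\]\]", fix, text)
```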
I recently created an addition to the Country Infobox template, namely the Gini coefficient.
Now, I started to add this coefficient to the infoboxes of 5 countries. But it's rather a repetitive and quite long and boring procedure. Additionally, it would be nice if a bot could take values from the page and update changes of the coefficients.
The pages where one can get the values from is here: List_of_countries_by_income_equality
And the value for each country's infobox to be completed/added is, e.g.:
|Gini = 33
|Gini_year = 2000
|Gini_rank = 33rd
|Gini_category = <font color="#00c600">low</font>
low (green bc. good), medium (orange), high(red because bad) have these color-intervals (found here
Gini_coefficient#Income_Gini_coefficients_in_the_world):
low:
medium:
high:
Anyone fancy to make this bot (if possible)?
Is there another todo list, taskforce, project I can place this?
Please tell me if I am barking up the wrong tree. —The preceding unsigned comment was added by R U Bn ( talk • contribs) 12:21, 18 March 2007 (UTC).
I had used a bot to create Indian town articles using census data. For the towns that already existed, the bot created the stub article in its sandbox. These are currently being merged into article space manually. After a merge, the article is struck out from the list. A merged sandbox article will not have a category; the category is removed by adding a colon to the syntax. See User:Ganeshbot/sandbox/Adilabad. Is it possible to go through this list and delete the sandbox articles that have already been merged (struck out or not)? This would help identify what is complete and what is not. Completed ones will show up as red links after the delete. Thanks, Ganeshk ( talk) 06:14, 20 March 2007 (UTC)
Would it be possible to create a bot that would go through all articles tagged with a specific WikiProject banner (i.e. all articles under Category:Caribbean articles by quality), find ones that are tagged with a cleanup template, and add Category:Caribbean articles needing attention to their talk pages? Jwillbur 22:00, 22 March 2007 (UTC)
I noticed that many articles in Wikipedia say things like " As of 2007", "As of September 2006", etc., referring to things that have not yet happened but could happen soon, or other similar situations. The problem is that many of these references are terribly outdated: I have found some that say "As of January 2005" and the like. I wondered if it was possible for a bot to update all these situations, so that they remain up to date until those things actually happen, and people can simply remove the "as of"s when it is adequate. The bot would only have to detect the phrase "as of" and, if it is followed by a date, it could update it to the current month and year. (I believe there are no situations where "as of (a specific date)" wouldn't need an update, do you??...) Just my humble first bot proposal... Kreachure 23:11, 22 March 2007 (UTC)
Yeah, the problem is that some of the "As of"'s refer to facts or statistics that are themselves outdated, so there could be instances where updating only the date would be inaccurate. I guess there's no way for a bot to ignore these while updating the others which deserve updating. But what if people put some sort of special marker or template along with the "As of"'s that would actually need an update, so that the bot would recognize these and update them once in a while? That's the only way I could see this bot working... thanks anyway! Kreachure 00:13, 23 March 2007 (UTC)
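Along those lines, the safe half of the job is detection rather than updating: flag stale phrases for human review. A sketch (the phrase pattern is a simplification, and the cutoff is passed in by the caller):

```python
import re

MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]
ASOF_RE = re.compile(r"[Aa]s of (%s) (\d{4})" % "|".join(MONTHS))

def stale_asofs(text, cutoff_year, cutoff_month):
    """List 'as of' dates older than the cutoff, for manual review."""
    hits = []
    for month, year in ASOF_RE.findall(text):
        if (int(year), MONTHS.index(month) + 1) < (cutoff_year, cutoff_month):
            hits.append("%s %s" % (month, year))
    return hits
```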
It may be useful if a bot could add some kind of tag to the articles that have (or that it thinks has) outdated info, then they could be addressed manually by users Akubhai 20:36, 27 March 2007 (UTC)
See Wikipedia:As of. Get more people to work on that project, and there won't be a problem. -- Rory096 22:36, 27 March 2007 (UTC)
There doesn't seem to be a bot that does this, so I'd like to request one.
The bot I envision would check the "File links" on images which include {{ screenshot}}, {{ logo}}, and other fair use templates. It would then remove the images from the Wikipedia space, template space, portal space, and, most importantly, user space. In the case of user space it would leave a message along the lines of "I have removed a Wikipedia:fair use image (or images), Image:Example.jpg, from your userpage (or user subpage) because Wikipedia policy doesn't allow them to be used on any pages besides articles. If you believe this removal was in error because the image was not a fair use image, please ask at Wikipedia:Media copyright questions."
Overall, it would function something like OrphanBot. How does this sound? Picaroon 21:44, 25 March 2007 (UTC)
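The per-page removal step might look like the sketch below. The namespace test by title prefix and the function name are assumptions; posting the user-space notice would be a separate edit:

```python
import re

NON_ARTICLE_PREFIXES = ("User:", "User talk:", "Wikipedia:", "Template:", "Portal:")

def strip_image(title, text, image):
    """Remove [[Image:...]] uses of a fair-use image outside article space."""
    if not title.startswith(NON_ARTICLE_PREFIXES):
        return text  # articles may keep fair-use images
    pattern = re.compile(r"\[\[%s(\|[^\]]*)?\]\]\n?" % re.escape(image))
    return pattern.sub("", text)
```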
I need a bot for my talk page that moves the </div> tag to the bottom of the page every time the </div> tag is not at the bottom. This is needed to combat the issue of new sections being placed below the </div> tag. This is an example of what I need it to do: User Talk:Andrew Hampe (diff) -- Andrew Hampe | Talk 16:33, 26 March 2007 (UTC)
Shoot. Sorry, screwed up. I need it to do this instead. Sorry about the goof up. -- Andrew Hampe | Talk 16:48, 26 March 2007 (UTC)
<!-- please add new sections above this comment --> |} </div> <!-- please add nothing below this comment -->
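A sketch of the fix itself, treating the closing markup as an opaque marker string (the single-line form here is an assumption for illustration):

```python
# Marker that must stay at the very bottom of the talk page.
FOOTER = "</div> <!-- please add nothing below this comment -->"

def fix_footer(text):
    if FOOTER not in text:
        return text  # page does not use the collapsing markup
    body = text.replace(FOOTER, "").rstrip()
    return body + "\n" + FOOTER
```

Running it again on already-fixed text changes nothing, so the bot could simply apply it on every edit to the page.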
We really need a bot to start tending to the newly created Wikipedia:Request an account page. At present, it takes several minutes to complete the clerical work surrounding an account creation. Specifically, the bot needs to check whether a user account has been created (either by a user, or by an admin to meet the request). If it is already created when the user posts the request, the bot should tag that account as already existing and post that in the user's entry. If the account is created by an admin, the bot should move that request to the relevant archive (current archive) using the formatting illustrated on that page.
Let me know if you need further clarification. I'd sincerely appreciate anyone's help with this. (There's a barnstar involved :D) alphachimp 18:56, 26 March 2007 (UTC)
I need a bot. Can someone create one for me that will search out words or misspellings on a page and replace them with a different word? Silverpelt 00:51, 27 March 2007 (UTC)
As I mentioned on the Patel Talk Page, the list in the article is not comprehensive. Can a bot search through Wikipedia and create a page such as List of Notable Patels (with the criterion for inclusion being that they have a Wikipedia page) with a link to each page? I'm guessing the bot will pick up extraneous pages such as Patel, but I can go through and remove those pretty easily. Thanks. Akubhai 20:53, 27 March 2007 (UTC)
Is there a bot that does boolean logic on categories? I'd like to be able to say things like:
IF an article is in a subcategory of Category:Calvinism AND is NOT in Category:WikiProject Calvinism THEN dump the list to such-and-such a place
Is there any bot that will do this sort of thing?
-- TimNelson 06:24, 27 March 2007 (UTC)
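Once the membership lists are in hand (from CatScan or the API), the boolean step above is plain set algebra. A sketch with toy data:

```python
def in_subcats_but_untagged(subcat_members, project_members):
    """Articles in any subcategory that are missing the project tag."""
    all_articles = set().union(*subcat_members.values())
    return sorted(all_articles - set(project_members))
```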
Thanks for the information, both of you. Unfortunately AWB doesn't run on Linux, but it's good to know that it can do that. I'll be using CatScan, though.
-- TimNelson 00:46, 29 March 2007 (UTC)
Where is AntiVandalBot? I found lots of test edits in pages listed in the "File Links" of [[Image:example.jpg]]. Although I am using VandalProof to track them, they keep vandalizing (the vandals are mostly IPs). -- Jacklau96 09:06, 29 March 2007 (UTC)
How's this: a bot which, if you save a link to a disambiguation page, asks you to adjust the link to make it a bit more specific.
What would be ubercool is if the bot would list the links provided by the disambiguation page, and just allow you to click to select which one you meant.
Paul Murray 03:08, 30 January 2007 (UTC)
someone pls do that so we can have more featured articles —The preceding unsigned comment was added by PWG99 ( talk • contribs) 21:07, 30 January 2007 (UTC).
Does anyone know of a bot that will crawl a category and create a list based on the "What links here" count? I'm looking for something to guide priority assessment in a Wikiproject (not that it would be the basis for priority - just a helpful tool). Morphh (talk) 02:24, 31 January 2007 (UTC)
Done -- SatyrTN ( talk | contribs) 21:25, 31 January 2007 (UTC)
A bot that does the trivial aspects of closing an AFD. A real person would close the AFD by placing a tag for the bot (perhaps {{close|keep}}), the bot would be watching the afd pages and see the tag being added. In the case of keep it would close the voting page, remove the afd template from the article and add the {{ oldafdfull}} tag with all the appropriate information to the talk page. The tag itself would probably just have some trivial message because it would only be on the page for a few seconds, something like, "A bot will come take care of this soon". The bot would be worthwhile if it only handled variations of keep (including "no consensus", "speedy keep" and "nomination withdrawn"), but could also be made to do deletes as well. For either just the delete or any use it could use a whitelist of users allowed to do the action, I'm not sure what the current policy is on who can close. Vicarious 05:01, 31 January 2007 (UTC)
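The trigger-detection part of this could be as small as the sketch below; the actual page edits (archiving the debate, removing the afd template, adding {{oldafdfull}}) would follow from the returned outcome. The tag syntax is taken from the proposal above:

```python
import re

KEEP_OUTCOMES = {"keep", "no consensus", "speedy keep", "nomination withdrawn"}
CLOSE_RE = re.compile(r"\{\{close\|([^}]+)\}\}")

def detect_close(afd_text):
    """Return (action, outcome) when a {{close|...}} tag is present."""
    m = CLOSE_RE.search(afd_text)
    if not m:
        return None
    outcome = m.group(1).strip().lower()
    return ("keep" if outcome in KEEP_OUTCOMES else "delete", outcome)
```

A whitelist check on who added the tag would sit in front of this.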
This has been moved to User:Betacommand/Bot Tasks 15:54, 31 January 2007 (UTC)
I just had an idea for a bot that goes through articles and fixes section headers.
Example: Changing
==Section title==
to
== Section title ==
and so on. Like changing "See Also" to "See also" and correctly sequencing "references", "see also", and "external links" at the end of pages.
A name: SecBot
Just an idea. ~ Flame vip e r Who's a Peach? 17:30, 31 January 2007 (UTC)
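The spacing fix plus a small sentence-case table covers the examples given; reordering the end-matter sections would need more bookkeeping. A sketch:

```python
import re

# Headers whose canonical sentence-case form is known.
KNOWN = {"see also": "See also", "references": "References",
         "external links": "External links"}

def fix_header(line):
    m = re.match(r"^(=+)\s*(.*?)\s*(=+)$", line)
    if not m:
        return line  # not a header line
    eq, title = m.group(1), m.group(2)
    return "%s %s %s" % (eq, KNOWN.get(title.lower(), title), eq)
```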
Can a bot please add {{ WikiProject Abkhazia}} to all articles in Category:Abkhazia? - Patricknoddy 4:57pm, January 31, 2007
I had an idea for a Copyvio bot (let's just call it, .V.iobot). It would search through articles and take segments from those articles and check them in Google, sans other Wiki links. If it finds matches to those segments, it adds the name of the article checked to the bot's userpage (or a subpage), entitled "Possible Copyvio Articles." Then I (or another editor) could go to that page and double-check they really are copyvios and then put them up for deletion.
Has this been done before or is currently being done? If not, is it a good idea? It sounds like it would be helpful. I'd really love to script it if the job is open. .V. Talk| Email 22:08, 31 January 2007 (UTC)
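The segment-picking step might look like this; querying the search engine and collating hits would follow. The length threshold is an arbitrary assumption:

```python
import re

def pick_segments(text, n=3, min_words=8):
    """Choose up to n reasonably long sentences to use as quoted
    search queries."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if len(s.split()) >= min_words][:n]
```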
A bot to fix the placement of {{fact}} tags relative to punctuation: .{{fact}} -> {{fact}}. and . {{fact}} -> .{{fact}}. And I mean an actual running bot, not just AWB. AWB is a very sloooww thing. {Slash -|- Talk} 06:34, 1 February 2007 (UTC)
How about a bot that scans through All pages (Talk namespace), and checks each page to see if it isn't an orphaned talk page. If it is, tag it with {{ db-talk}}. Or does this exist in some form already? – Tivedshambo (talk) 20:19, 1 February 2007 (UTC)
A bot adding templates to new pages which fall under individual Wikiprojects. Real96 13:14, 2 February 2007 (UTC)
It would be great if a bot that blanks/cleans personal user-sandboxes were to be created...
-- TomasBat ( Talk)( Sign) 19:02, 2 February 2007 (UTC)
Could there be a bot that goes down the new users page and posts {{subst:Welcome123}} or some other welcome template? There are a lot of new users every minute so maybe the bot could relieve people on the Wikipedia:Welcoming Committee of greeting new users. Orin LincolnÆ 04:21, 3 February 2007 (UTC)
Can a bot please put {{ WikiProject Afghanistan}} of the talk pages of articles in Category:Afghanistan? - Patricknoddy ( talk · contribs) 5:37pm, February 1, 2007
Please be more careful while listing categories for tagging. Category:Balochistan which is part of Geography of Afghanistan is also part of Geography of Iran as well as Pakistan. All those articles have got tagged as well. The categories need to be chosen much more carefully — Lost (talk) 13:44, 4 February 2007 (UTC)
Hi. How about a bot that removes images that do not exist from articles? I ask because I have encountered many broken images in articles, and they are rarely removed. My request is for a bot that removes images and descriptions that link to nonexistent images and have been in an article for more than 24h. This way we can edit without having to remove nonexistent images ourselves. Thanks. Suggestions? AstroHurricane001 ( Talk+ Contribs+ Ubx) 15:14, 3 February 2007 (UTC)
Hi. How about a bot that removes double redirects after a move? I have encountered many double redirects after an article containing redirects was moved, and the redirects became double. I have often had to fix them myself, because the people who moved the article forgot to follow instructions and did not fix the double redirects. I request a bot that automatically recognises double redirects after a move and fixes them, so we can edit articles without having to fix the double redirects. Thanks. Suggestions? AstroHurricane001 ( Talk+ Contribs+ Ubx) 15:19, 3 February 2007 (UTC)
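Resolving a double redirect is just following the chain to its end, with a guard against loops. A sketch over a toy redirect map:

```python
def resolve(redirects, page, max_hops=10):
    """Follow redirects to the final target; None on a loop."""
    seen = set()
    while page in redirects and len(seen) < max_hops:
        if page in seen:
            return None  # redirect loop
        seen.add(page)
        page = redirects[page]
    return page

def fix_double_redirects(redirects):
    """Repoint every redirect straight at its final target."""
    return dict((src, resolve(redirects, dst)) for src, dst in redirects.items())
```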
Since we're on a roll with WikiProject-related requests, would it be possible for a bot to put {{ Visual arts}} on all the pages within Category:Visual Arts? Appreciate the help. Planetneutral 05:45, 4 February 2007 (UTC)
Could you use your magic to tag all articles related to Spain for the {{WikiProject Spain}}? ¡Muchas gracias! Espana Viva 22:18, 1 February 2007 (UTC)
ST47 - well, I'm puzzled why Aztec Triple Alliance and Florentine Codex wound up on the list to be tagged, since I went through their categories and parent categories, and they don't seem to have an obvious tie to any of the categories we listed. Let's give this some thought so as how to avoid problems the next time! Perhaps run it so that articles in sub-categories are not automatically tagged? Also, may run it in batches of smaller groups so that issues can be managed as they arise on a smaller scale? Espana Viva 16:32, 4 February 2007 (UTC)
After my previous AIV Warning Bot went nowhere, I got an idea while reading an article, then I found out it was a previous request that was not kept alive. I was thinking how convenient it would be for a bot to turn all the large amounts of web citations on Wikipedia articles into the more formal reference tags such as {{cite web}}. As User:John_Broughton stated in the previous request, it would go to Special:Linksearch and get a list of all external links for a specific domain (e.g. www.msn.com). It should understand that when a URL appears twice in a page, there should not be two identical footnotes. So, the first reference/footnote gets a name="X" argument. It probably should avoid pages that have both a "Notes" and a "References" section. It probably shouldn't footnote a URL if the URL is in the "External links" section. And, obviously, it should put as much information as possible into the "cite" ({{cite web}}) template. If you are willing to create an excellent and much-needed bot that I would be glad to run and operate, it would be much appreciated. Thank you. -- Extranet ( Talk | Contribs) 07:08, 3 February 2007 (UTC)
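For the footnote-naming part of the request, here is a sketch of the mechanics. Field extraction (title, publisher, access date) is left out, since those values would have to be fetched from the cited page itself, and only plain bracketed links are handled:

```python
import re

def cite_bare_links(text):
    """Wrap bare [http://...] links in named <ref>{{cite web}}</ref>
    tags, reusing the name when a URL repeats."""
    seen = {}
    def repl(m):
        url = m.group(1)
        if url in seen:
            return '<ref name="%s" />' % seen[url]
        name = "auto%d" % (len(seen) + 1)
        seen[url] = name
        return '<ref name="%s">{{cite web |url=%s}}</ref>' % (name, url)
    return re.sub(r"\[(https?://[^\s\]]+)\]", repl, text)
```

Pages that already have hand-built Notes/References sections, or links sitting in External links, would be skipped as the request suggests.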
It's possible that there are some more standard (meta) fields than these (I didn't look through the entire source), or that one would have to write a bit of code for each major news source (PubMed is already standardized, and a converter already exists), but even if this bot only covered (say) the top 25 news sources, it could still make a major difference, if only to move editors from seeing articles with only embedded links to seeing what good footnotes look like. -- John Broughton (☎☎) 07:56, 3 February 2007 (UTC)
Giving this some more thought, I think that in addition to formatting the citations, it is also necessary to check the articles to make sure they actually support the cited statement. I think this is a very important job for the good of the Wiki, and it almost certainly requires human input. Would anyone be interested in starting a fact checking wikiproject to go around, check citations, and put references in the proper format? I would love to start such a thing if others are interested. -- Selket 20:47, 3 February 2007 (UTC)
Some people have used the {{PDFlink}} template inside the citation template as the format parameter. With the revised {{PDFlink}} usage, the first parameter is required. I would like somebody to run a bot which will convert {{PDFlink}} and {{PDF}} to the new PDF form. -- Dispenser 23:10, 3 February 2007 (UTC)
I'm sure there must be a bot that could do this already but I'm not sure which one. I'd like a bot to run through the sub-categories at Category:Proposed deletion. It would look for articles that had been prodded, had the tag removed and then replaced. If it finds any, they could be dumped into a file for a human to go through. I realise that the bot could remove the tag and notify the editor that had replaced it by mistake, but I don't think that would be a good idea. First of all, to have the bot re-remove the tag would give the appearance of the bot edit warring. Second, I've noticed that editors get upset when the prod tag is removed a second time, especially when it is for an article that has been justifiably prodded. I went through one of the sub-categories and removed 6 prod tags that had been restored improperly, so I do see the need for it. CambridgeBayWeather (Talk) 12:30, 4 February 2007 (UTC)
Could a bot please put {{ WikiProject Åland Islands}} on all the talk pages of the articles in Category:Åland? - Patricknoddy ( talk · contribs) 9:11am, February 4, 2007
May I request that you hold off on this project and other similar projects by User:Patricknoddy, pending the results of the discussion here. I have a concern that the creation of these large WikiProjects which may not have any active participants may create difficulties. Spamreporter1 15:32, 4 February 2007 (UTC)
Can a bot put {{ WikiProject Algeria}} on the talk pages of all the articles in Category:Algeria? - Patricknoddy ( talk · contribs) 9:36am, February 4, 2007
May I request that you hold off on this project and other similar projects by User:Patricknoddy, pending the results of the discussion here. I have a concern that the creation of these large WikiProjects which may not have any active participants may create difficulties. Spamreporter1 15:34, 4 February 2007 (UTC)
At present, when an article comes up for AfD, the Jayden54Bot inserts a warning on the user's talk page. Excellent idea!
Last week I had the Lasagna cell article vanish without warning. I was stunned, since there was no Talkpage discussion. Also there were no watchlist notifications. It disappeared from "my contributions," and all my old watchlist entries vanished too. I know that it hadn't been AfD'd. Most frightening: if I hadn't been recently using its talkpage, I might never have noticed its disappearance. Lacking experience with this, and lacking evidence, I had no idea what to do next. (Unfortunately I didn't notice the deletion resources link on the error page.) I ended up yelling to admin for help. It turned out that the article had been Speedy Deleted by an admin under incorrect SD criteria, without tagging, and who also hadn't seen the talk page.
I found the whole experience very unsettling. It was like seeing "Extraordinary Rendition" in action! (grin.) I had no clue that SD existed, or how to find what had happened. And... if I found it confusing, I assume that lots of others must have the same plight. I seem to have guessed right. Someone attempted an ineffective cure in the form of a SD Patrol started in late 2005, even with comment about others' saving good articles from accidental deletion. (Scary: if some have been saved, how many have been lost?!!)
Here's a much better fix: just make the original author responsible for saving the SD'd article. Use a bot similar to Jayden54Bot to announce on the original authors' talkpage the speedy-deletion event, as well as providing some resources for that author. This way legit articles won't just vanish silently with nobody noticing. Sample:
-- Wjbeaty 07:01, 27 January 2007 (UTC)
However, usually the original creator of an article is not present. How about also notifying everyone who's made 4 edits or more, or if it's speedied on sight, how about 3? It should also check whether a user has made an edit within the last 2 weeks. {Slash -|- Talk} 06:19, 1 February 2007 (UTC)
Wikipedia:WikiProject Victoria Cross Reference Migration moved a heap of content into wikipedia, the old domain is now being squatted and we are providing heaps of links (over 1300) to it. Could someone with access to some sort of automated tool, find and replace (or just get rid of) the links to the external site with a link to the wikiproject instead please? -- Peta 02:01, 30 January 2007 (UTC)
This page has been [[Wikipedia:WikiProject Victoria Cross Reference Migration|migrated]] from the [http://www.victoriacross.net Victoria Cross Reference] '''with permission.'''''
Should be converted to:
This page has been migrated per the [[Wikipedia:WikiProject Victoria Cross Reference Migration|Victoria Cross Reference]] project '''with permission.'''''
And the list of pages to be checked is found here. -- John Broughton (☎☎) 03:20, 2 February 2007 (UTC)
I've removed most of them manually, and will finish the job soon, thanks anyway bot writers. -- Peta 08:33, 6 February 2007 (UTC)
I need a bot/AWBer to go through Special:Whatlinkshere/Stv, and change occurrences in articles of [[stv]] and [[STV|stv]] to [[STV]], in line with WP:MOSTM (the move requires divine intervention, which has been sought). I managed to get a bunch of the direct links out with a change to Template:ITV, but there look to be over 100 articles affected. Chris cheese whine 15:49, 5 February 2007 (UTC)
I'm not sure if this exists already, but due to the new protection feature I believe a bot that automatically removes protection templates from pages whose protections have expired would be a valuable asset. -- tariqabjotu 23:46, 22 January 2007 (UTC)
Could we have a bot to place {{ cycling project}} on the talk page of all articles within Category:Cycling and its sub-categories please? Mk3severo 14:42, 3 February 2007 (UTC)
Are there any bots currently running that can apply WikiProject tags on all the articles in a particular category and in its entire sub-category tree? If there's one available, there are three (very large) requests listed here that we'd love to have some help with! :-) Kirill Lokshin 19:41, 4 February 2007 (UTC)
I need a bot which adds category tags to people/states/etc. to the respective category. Currently, I am trying to clean up Phi Beta Sigma and am going to make a category for it. I have to put category tags on each person's page, which is EXTREMELY tedious. Can a bot please help out for this? Thanks. Real96 20:50, 4 February 2007 (UTC)
The WikiProject Spain has a list of Spain-related categories here. We will be asking you to place the WikiProject Spain template on the talk page of the articles in these categories. We will be asking you to go through these categories a chunk at a time, so as to cut down on any "over-inclusion" issues.
When that first run is complete, and we have dealt with issues (if any), then we will be asking you to move forward to the next chunk of categories. We would also ask that you not go down into the sub-categories of these at this time. Just add the tag to these categories only.
If all that makes sense:
Please let me know if you have any questions! Espana Viva 19:23, 7 February 2007 (UTC)
Excellent, thank you! Espana Viva 20:03, 7 February 2007 (UTC)
Certainly OK with me! Please let me know if you have any questions on my talk page. Espana Viva 21:17, 7 February 2007 (UTC)
Well, just thought of something, though . . . each category is going to have multiple articles in it. So, if your limit is 50-100 edits you're going to have to do just a handful of categories. Take a look at the number of articles in the first few categories (starting with #1), and then select just that number of categories that will give you the number of edits you are aiming for. (This sounds like quite a bit of work, but I'm doing this in the interest of science!) Espana Viva 21:23, 7 February 2007 (UTC)
What do you think of the idea of a bot for users who are blocked, especially those who are completely blocked (i.e. from editing even their talk page), have not been unblocked or given the chance to post an appeal to WP:ANI or WP:RFAR, and (for whatever reason) don't have e-mail with which to send a request to a higher authority? Specification idea: this bot would use a CGI form page for blocked users to fill out with their username and information such as a reason for requesting the appeal. The bot would verify that the request is coming from the same IP address the user last used, by comparing the HTTP client's originating IP with the last-known IP with which the user was logged on at Wikipedia, and making sure that IP address (and possibly its talk page) is blocked from editing. If this test passes, the bot would use a designated username to post to some page such as WP:ANI, WP:RFAR, or some designated subpage specifically for entries by the bot (to "sandbox" it just in case of abuse of the feature, but the page would have to be watched like the other logistical pages). -- Blooper Glooper 01:45, 8 February 2007 (UTC)
An anti-vandal / spell-check bot that you tell to "crawl" Wikipedia for new changes and misspellings. A feature would be a tab for it on your wiki tab bar. It has a database and is always adding to it, "learning" from your actions (a Bayesian style of learning/database).
Settings: Auto (fixes spellings, checks for vandalism, learns, tells you what it did on your talk page; incorporates Lupin's anti-vandal tool), Learn (learns from your actions), Off (turns the bot off), Vote (on the bot's page, lets others vote on what to do), Actions (lists all the actions made by the bot), Fix this Page (makes the bot run on the current page and attempt to fix it).
It has an interface similar to Lupin's live spell check, but actually corrects words and fixes broken links and redirects. It can only be controlled by its user.
It also adds citations by looking up the fact needing a cite in Google, Yahoo, MSN and Wikipedia, and adds a link to any site that appears in at least 2 out of 5 results.
The bot is also capable of tagging with "Wikify" and "Clean up" tags if it cannot help by any other means.
While in vandal cleanup, it reports the user or IP to the vandal registry and leaves the do-not-vandalize notice on the user's talk page.
The bot tries to watch what you do and learn from your actions. A script for this is also needed.
--' •Tbon e 55• (Talk) (Contribs) (UBX) (autographbook) 01:31, 7 February 2007 (UTC)
Look in the rules for bot creation. "No spell checkers." Other part seems doable. RED skunk TALK 02:45, 7 February 2007 (UTC)
23:53, 8 February 2007 (UTC)
The first trial run of the Satyrbot went very well. Satyr47's work is much appreciated! While Satyr47 is waiting for the "official" results of this trial run, we'd like to press forward with a full-fledged run of the following:
Please let me know when this might be done . . . much appreciated! Espana Viva 07:31, 8 February 2007 (UTC)
I made a very minor mistake when I tagged about 600 or 700 articles incorrectly with my AWB as part of WikiProject American Open Wheel Racing. I tagged them with {{Template:WikiProject American Open Wheel Racing}} when I should have tagged them {{WikiProject American Open Wheel Racing}}. The template displays correctly. I'd hate to sit there for several hours to correct this little stupid error that isn't causing any harm. Could this bot do it in an unattended fashion at some low usage time? Royalbroil T : C 14:59, 8 February 2007 (UTC)
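Since the fix is a pure text substitution, the edit itself can be sketched offline. A minimal Python sketch of the substitution (the page fetching and saving that a real bot framework would handle is omitted, and the function name is hypothetical):

```python
import re

# The template name comes from the request above; only the redundant
# "Template:" prefix inside the transclusion needs to go.
TARGET = "WikiProject American Open Wheel Racing"

def fix_template_prefix(wikitext):
    # \s* tolerates stray whitespace after the opening braces or prefix.
    pattern = re.compile(r"\{\{\s*Template:\s*" + re.escape(TARGET))
    return pattern.sub("{{" + TARGET, wikitext)
```

Any parameters after the template name are untouched, since only the opening of the transclusion is matched.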
I think that a robot that could revert blanked pages would be extremely valuable. Moreover, blanking is an immediately noticeable act, so it shouldn't be difficult to program a bot to do this kind of task.-- Orthologist 23:21, 9 February 2007 (UTC)
Forensics was just moved to forensic science. Can someone with AWB access fix up all those redirect links please? -- Selket Talk 00:23, 6 February 2007 (UTC)
I was thinking a new page patrol bot that would use Bayesian filtering and learn from other newpage patrollers (such as me) the general content of vandalism/attack/advert/joke pages.
It would go to new pages, check the length, check it for words like "gay" and "sucks", and if it raised a sufficient number of red flags, it would tag it with something like {{ Botspeedy}}.
And as an added bonus, it could also add tags like {{ cleanup}}, {{ unreferenced}}, and {{ uncategorized}} (which are simple business really).
Now we can't get too headstrong and assume that a bot will be right 100% of the time when tagging pages for deletion. So we would have to make a separate "bot flagged" template like {{tl|Botspeedy}}. It would say something like
This page has been flagged for speedy deletion by BayBot, an automated script which patrols Wikipedia's new pages. If this page has been flagged incorrectly and you are the page creator, it is advisable not to remove this tag yourself. A Wikipedia administrator will come by shortly to review the tagging of this page and decide if it should be deleted or not.
So yeah, I'm a genius.
~ Flame vip e r 16:49, 9 February 2007 (UTC)
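For what it's worth, the red-flag heuristic described above can be prototyped without any machine learning at all; a real Bayesian filter would need a corpus of labelled patrol decisions to train on. A minimal Python sketch, with the keyword list and thresholds invented for illustration:

```python
# Keyword list and thresholds are illustrative only; a trained Bayesian
# filter would replace this hand-picked scoring.
RED_FLAG_WORDS = {"gay", "sucks"}
MIN_LENGTH = 200          # pages shorter than this earn one extra flag

def speedy_score(text):
    words = (w.strip(".,!?") for w in text.lower().split())
    score = sum(1 for w in words if w in RED_FLAG_WORDS)
    if len(text) < MIN_LENGTH:
        score += 1
    return score

def should_tag(text, threshold=2):
    # Enough red flags -> tag with the proposed {{Botspeedy}} template.
    return speedy_score(text) >= threshold
```

The separate "bot flagged" template proposed above is exactly the right safety valve for the false positives such a crude scorer will produce.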
Table markup conversion from HTML needed on two tables with some 200 rows between them. Chris cheese whine 05:54, 10 February 2007 (UTC)
I've been noticing that it has been necessary to add a lot of {{aero-specs}} and {{aero-table}} tags. It is very tedious. Can someone write a bot for that? I probably could but I'm pretty busy. RED skunk TALK 04:16, 6 February 2007 (UTC)
I've been working on the Dead External Links Wiki Project. Many of the articles about various sessions of the US Congress have links to "Rules of the House" pages on clerk.house.gov. The URL has changed from clerk.house.gov/legisAct/legisProc/etc to clerk.house.gov/legislative/etc. Is there a way to do a global replace to fix this on every Wiki page? Sanfranman59 21:39, 10 February 2007 (UTC)
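This particular change is a pure prefix rewrite, so the per-page edit is trivial; only the crawling needs a bot framework. A Python sketch of the text transformation (the function name is hypothetical):

```python
def fix_clerk_links(wikitext):
    # A straight substring replacement suffices: the old path prefix is
    # unambiguous, and the remainder of each URL is preserved as-is.
    return wikitext.replace(
        "clerk.house.gov/legisAct/legisProc/",
        "clerk.house.gov/legislative/",
    )
```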
I've never put out a request for bot help before, so I hope I'm doing it right. I recently moved North Dakota Capitol to North Dakota State Capitol. Any help fixing the double redirects would be greatly appreciated. -- MatthewUND( talk) 06:42, 11 February 2007 (UTC)
At The Moment My User Page Userboxes Have A lot Of Gaps In between Them. If I Could Get This Fixed And Could Be Told How To Prevent This In The Future Would Be Very Helpful. ( Id Rather Be Hated For Who I Am, Than Loved For Who I Am Not 08:45, 11 February 2007 (UTC))
Wikipedia needs a bot that will automatically add the tag {{ unreferenced}} to articles where the majority of the material is unsourced.-- Sefringle 08:01, 12 February 2007 (UTC)
I believe that there are currently a large number of uncategorized templates that can't be found except by finding pages that they're used on, i.e. they aren't in the template categories - Category:Wikipedia templates and subcategories. I would like to see a bot that can check for these and place them in Category:Uncategorized templates, where they can subsequently be sorted into the appropriate template categories by hand. Are there any bot frameworks out there that would be able to do this, or would anyone be interested in putting together a bot to do this? Mike Peel 16:18, 10 February 2007 (UTC)
I'll do it.-- Balloonguy 22:14, 13 February 2007 (UTC)
Hi, I'm trying to standardize all filmographies etc. to the guidelines laid down by Wikipedia:Manual of Style (lists of works), specifically that items in filmographies, discographies etc. should be listed in chronological order, earliest first. At the moment I am manually searching for articles that have their filmographies listed upside-down (reverse-chronological) and manually tagging them with the {{ MOSLOW}} tag.
Obviously there will be thousands, including many I would never find manually. I wonder if there would be some way of recognizing an "upside-down filmography" automatically (e.g. by finding a section called "Filmography" then looking at two consecutive lines - if they have differing years, the second being lower than the first, then we can assume it's an upside-down filmography). In that case the page just needs to be tagged with {{ MOSLOW}}. The next stage, of actually fixing it, is not needed just yet (although User:Whilding87 has a nice script on his user page for that).
The bot doesn't have to recognize *every single* upside-down filmography (because there are many different formats), but if it can capture a sizeable number of articles with upside-down filmographies, it will be worth it. 82.9.25.163 18:15, 11 February 2007 (UTC)
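The year-comparison heuristic described in this request is easy to prototype. A rough Python sketch (the regexes are illustrative, and it deliberately compares only the first and last years found, so mixed orderings will be missed, which fits the "doesn't have to catch every one" spirit):

```python
import re

YEAR = re.compile(r"\b(19|20)\d{2}\b")
SECTION = re.compile(r"==+\s*Filmography\s*==+(.*?)(?=\n==[^=]|\Z)", re.S)

def filmography_reversed(wikitext):
    """Heuristic: inside a Filmography section, if the first year found
    is later than the last one, assume reverse-chronological order."""
    m = SECTION.search(wikitext)
    if not m:
        return False
    years = [int(y.group()) for y in YEAR.finditer(m.group(1))]
    return len(years) >= 2 and years[0] > years[-1]
```

Pages that return True would simply be tagged with {{MOSLOW}} for a human to fix.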
Just fyi, there is currently no consensus for ordering filmographies chronologically, as pointed out in a ref for the guideline ( Wikipedia:Manual of Style (lists of works)#Ordering), and currently being discussed on its talkpage. (With prior discussion at Wikipedia talk:Filmographies#Chronological ordering.) There is also Wikipedia:Requests for comment/Filmography which could really use some more feedback. -- Quiddity 23:54, 14 February 2007 (UTC)
It's a bit complicated, but the main use would be to help with a mass reversion of unicode characters in sort keys. See a more detailed explanation at Wikipedia:Administrators'_noticeboard/Incidents#Bizarre_category_modifications. Thanks. -- Jeffrey O. Gustafson - Shazaam! - <*> 14:56, 14 February 2007 (UTC)
I just had a few great ideas, and thought I would bring them here. Can't even code worth crap . . . Do what you want with my ideas; let me know if any are going to be useful. -- Darkest Hour 22:15, 14 February 2007 (UTC)
This is a simple but very useful bot idea, and I would really like this bot to be made for me since I don't have enough programming knowledge to make my own.
Anyway, over the month I have noticed numerous articles with the word "although" commonly misspelled as "althought". A bot that would change all the misspelled "although"s would be useful on Wikipedia, and it would save time and effort rather than doing this process manually.
If someone is willing to make this bot, hand the "code" over to me and teach me how to operate/set the bot up, etc. Please contact me on my talk page. Thank you :D -- Bhavesh Chauhan 02:31, 10 February 2007 (UTC)
I made some changes to {{ WikiProject Business & Economics}} and I recreated the categories you made with lower case, e.g., Category:High-importance Business and Economics articles to Category:High-importance business and economics articles. The bot created Wikipedia:Version 1.0 Editorial Team/Business and economics articles by quality which is I think what you wanted. Cheers, Oleg Alexandrov ( talk) 04:46, 16 February 2007 (UTC)
Can a bot free right now start putting the tag {{WikiProject Business & Economics|class=|importance=}} on the pages in the following categories, and if the bot finds that the page is a stub, have it put the tag {{WikiProject Business & Economics|class=stub|importance=mid}}? Please? -- Parker007 08:28, 15 February 2007 (UTC)
I've just created the article Christian heresy with text extracted from Heresy. There are lots of articles that link to Heresy. Most of them need to link to Christian heresy. What I need is a bot that will go through a set of links and either automatically or interactively change links to Heresy into links to Christian heresy. Where can I post such a request so that a kindly bot writer will see it and maybe help me with this? Or... is there a bot in existence that will do what I want done? -- Richard 07:13, 15 February 2007 (UTC)
Not sure I need a bot for this, but couldn't think of where else to ask. Wikipedia:Wikipedia is failing got me to thinking about finishing my study of article quality. I'd like a list of X random articles older than a certain threshold, say two years, with information such as the date created, the number of kb of text, the number of images, and the number of edits. Ideally I'd get links to oldids at various stages such as quarterly and the same data on the article at each of those points. Could certainly be run on an offline database copy. Anyway, would love to have the data. Could make a nice paper/Signpost writeup. - Taxman Talk 05:16, 16 February 2007 (UTC)
Is there a bot which checks the 'What links here' of pages and adds the 'orphan page' template if the number is less than a certain number? If not, can I make it? Smomo 22:09, 6 February 2007 (UTC)
I still think this is a good idea. Does anybody want to take this bot on and make it? Smomo 17:05, 12 February 2007 (UTC)
Would it be doable to change aero-spec tags to the newer aero-specs tag? Could you do that with AWB? RedSkunk 00:45, 17 February 2007 (UTC)
If I make three or four edits in a row, or ten, I would like them to all be condensed into one, as if I had used "preview" correctly. I'd like this mostly so people wouldn't say "hey I notice you don't use preview. Why don't you give it a try?", but I find that I keep fiddling with a sentence or paragraph even after I thought it was good. Martin | talk • contribs 07:31, 17 February 2007 (UTC)
Most Japanese military biography stubs are double-tagged with {{ asia-mil-bio-stub}} and {{ Japan-bio-stub}}. I'd like to request a bot to go through Category:Asian military personnel stubs looking for the Japan-bio-stub tag, and replace the combination of those two template tags with {{ Japan-mil-bio-stub}}. Please see this edit for one example I did manually. Thanks in advance!! Neier 13:39, 17 February 2007 (UTC)
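The tag replacement itself is a single regex, tolerant of whitespace and of the two stub templates appearing in either order. A Python sketch of just that substitution (a real run would still iterate over the category members via a bot framework):

```python
import re

# Matches the {{asia-mil-bio-stub}} + {{Japan-bio-stub}} pair in either
# order, with optional whitespace/newlines inside and between the tags.
PAIR = re.compile(
    r"\{\{\s*asia-mil-bio-stub\s*\}\}\s*\{\{\s*Japan-bio-stub\s*\}\}"
    r"|\{\{\s*Japan-bio-stub\s*\}\}\s*\{\{\s*asia-mil-bio-stub\s*\}\}",
    re.IGNORECASE,  # over-approximates MediaWiki's first-letter case rule
)

def merge_stub_tags(wikitext):
    return PAIR.sub("{{Japan-mil-bio-stub}}", wikitext)
```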
I recently updated the infobox U.S. County template, and I wanted to verify that counties are actually using this template. I have just started looking at counties in Alabama, and I see that most of the county articles are not using the template! I manually added the {{infoboxneeded|infobox U.S. County}} tag, but this will take forever. Is there a way for a BOT to be created that will check each county's page — for all states — and if it is not using the infobox U.S. County template, it will throw the tag on the top of the article? Thanks! Timneu22 ( talk · contribs) / Timneu22 00:50, 19 February 2007 (UTC)
There are a lot of double re-directs that need to be fixed in response to a page move I made. Georgia guy 14:35, 19 February 2007 (UTC)
I was recently nominating an article for deletion when I realized how tedious it is, plus this is something that everyone (not just admins) can do anyway. Here's how it works, a user places a tag on the articles page that looks something like {{subst:afd | cat=Category | text=Reason the page should be deleted}}, the bot is monitoring recent changes, creates the Wikipedia:Articles for deletion/PageName with the preloaded debate, and adds the page to that days listings. Additionally an intelligent bot could easily tell if there had been a previous listing and handle the page naming accordingly. Also, because this process involves creating a new page the bot would simply remove the tag and take no action if an anonymous IP added it. Vicarious 08:49, 19 February 2007 (UTC)
Last night and on Saturday night, there was a vandal who moved pages at about 10 pages per minute. For example, if the page was Wikipedia, the vandal would move the page to Wikipedia2. Do you think someone could create a bot which moves pages back to the original destination once this vandal has been blocked? As always, a custom award and a barnstar for the designer. :-) Also, for further information, please see this. Thanks! Real96 17:44, 19 February 2007 (UTC)
This page is always clogged. It would be great if a bot could remove all deleted links, and move them to an archive section (ordered by day?). Proto ► 12:53, 20 February 2007 (UTC)
Per discussion at WP:ANI#A_Semi-protected_article_that_.22isn.27t.22, it seems like there might be some added value in a bot that would remove templates such as {{ sprotected2}} from pages when their semi-protection or full-protection has expired, so that misleading headers aren't left in place. I have very little experience with programming, and thus I thought that I might be better off leaving a message here. - Hit bull, win steak (Moo!) 14:42, 20 February 2007 (UTC)
This bot would wikify well: it takes names and dates and wikifies them.
If the above bot isn't approved, or if this one turns out to be better, another bot: this bot would report newly created accounts with names containing certain banned words and phrases, or that use certain special characters, to WP:AIV. -- Darkest Hour 22:25, 15 February 2007 (UTC)
I have a Python tool that helps me filter the IRC feed. I could reconfigure it to report both usernames and WoW attacks to AIV fairly easily if anyone is interested. Betacommand ( talk • contribs • Bot) 18:26, 19 February 2007 (UTC)
What I have in mind is a bot that would scan an article for a section titled References, External links, Sources, or Bibliography. It would also scan an article for a tag providing a link to another WikiMedia project (such as the one on the right).
If the bot couldn't find either an appropriately-titled section or a tag then it would place the following template at the top of the article: {{ Unreferenced}} The bot would also be programmed to leave an article alone if the first word in its title was List, because some people have claimed that lists should be exempt from reference requirements if the articles they link to have references. What do you guys think? Galanskov 06:58, 17 February 2007 (UTC)
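A rough Python sketch of the proposed check, with the heading list and the "List" exemption taken from the description above (the heading list is illustrative rather than exhaustive, and as noted later in this thread, real articles use far more varied section titles):

```python
import re

# Headings treated as evidence of referencing, per the proposal above.
REF_HEADINGS = {"references", "external links", "sources", "bibliography"}
HEADING = re.compile(r"==+\s*([^=\n]+?)\s*==+")

def needs_unreferenced_tag(title, wikitext):
    if title.split() and title.split()[0].lower() == "list":
        return False                       # lists are exempted above
    for m in HEADING.finditer(wikitext):
        if m.group(1).lower() in REF_HEADINGS:
            return False
    if "<ref" in wikitext.lower():         # inline citations count too
        return False
    return True
```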
This is from Wikipedia_talk:Speedy_deletion_criterion_for_unsourced_articles#A_proposal
...we get a bot to put an unreferenced template onto any article without ref tags or a references section. Mostlyharmless 08:50, 6 January 2007 (UTC)
- I LOVE that idea. Agne 09:09, 6 January 2007 (UTC)
- Don't forget the {{ cite}} template and its kin, which predate the <ref> tag. -- nae' blis 20:45, 8 January 2007 (UTC)
- Or plain external links, or parenthetical citations, .... Christopher Parham (talk) 01:44, 13 January 2007 (UTC)
- RefBot has recognized all those for several versions, so it can be done. I'll add {{ citation}} when I finish its new core. Tagging unreferenced articles is trivial, but if nobody cared enough to supply citations in the first place then why tag it? ( SEWilco 03:47, 20 February 2007 (UTC))
- Some time ago, I wrote a script to make a list of unreferenced math articles, and while it is not impossible, it is harder than you expect because of natural language issues. The section headings that are used for references in articles are quite varied. Not all referenced articles use any sort of citation template. You should expect several percent false positives the first time you search for unreferenced articles (which I estimate at around a hundred thousand errors if you scan the entire article space and exclude redirects and disambiguation pages). CMummert · talk 04:26, 13 January 2007 (UTC)
- Thanks for the response! I don't imagine it would be easy, and whoever did create such a bot would deserve a few barnstars! I'd think that the text of any template of this kind should include a statement asking for refs and inviting users to remove it where references are provided. Mostlyharmless 10:50, 13 January 2007 (UTC)
- Actually, mass-tagging those pages won't really help unless you've got a cadre of volunteers to do the actual referencing. We have a plethora of cleanup processes and nearly all of them are terminally backlogged. >Radiant< 12:34, 16 January 2007 (UTC)
- I agree. Cleanup lists aren't working all that well at the moment. This is more about giving a clear reminder to users that they have a duty to reference their articles, and getting them to do so voluntarily. I certainly don't think that most editors would do so, but a sizable minority would, particularly on anything that isn't an unwatched and unattended stub. And of course the template could go onto every new article. Mostlyharmless 02:11, 24 January 2007 (UTC)
A better idea might be this: after a new article is created and the vandal fighters have dealt with it, a bot goes and checks for references; if the article does not have references, it tags the article and informs the editor.-- Balloonguy 19:21, 19 February 2007 (UTC)
Using this <span class="plainlinks" style="font-size: 100%;">[http://Yahoo.com/search Yahoo!]</span>, which gives: Yahoo! (without the external-link icon) instead of Yahoo!. Can there be a bot to do this? Doing them by hand will be a never-ending task. -- Darkest Hour| DarkeBot 00:04, 22 February 2007 (UTC)
Any reference to TNA iMPACT! and variants (suggest using just iMPACT as a case-sensitive search term) needs the iMPACT! changed to Impact!, per WP:MOSTM and WP:MOSCL. Chris cheese whine 09:57, 22 February 2007 (UTC)
To the best of my (limited) understanding, AWB is not able to do the following in a reasonably efficient way: a manually assisted bot with an easy user interface, so more than the bot creator can use it, that points links to disambig pages to the correct page. First, the bot would find links to disambig pages that are likely to be incorrect, meaning it would ignore links from other disambig pages as well as anything but the main article space. Then it would show the user the few sentences around the link in the article on one side, and the disambig page on the other; simply clicking the link on the disambig page completes the operation with no typing involved. Also, a well-made bot could guess the correct link frequently enough for it to be worth adding a feature where the bot recommends an option to the user by highlighting it. Vicarious 07:47, 22 February 2007 (UTC)
List of dialing codes of Greece numerically - possible delink all the area code entries automatically? For instance:
:[[Greece dialing code 210|210]] - [[Athens]]-[[Piraeus]]-[[Eleusis]] area
... becomes ...
:210 - [[Athens]]-[[Piraeus]]-[[Eleusis]] area
There are dozens of entries on the page, so if this can be done automatically rather than picking through it manually, it would be much appreciated. Chris cheese whine 16:56, 22 February 2007 (UTC)
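Since every entry follows the same [[Greece dialing code NNN|NNN]] pattern, one backreferencing regex handles the whole page. A Python sketch:

```python
import re

def delink_area_codes(wikitext):
    # The captured code and the backreference \1 ensure only piped links
    # whose label equals the code itself are unlinked; the surrounding
    # place-name links are untouched.
    return re.sub(r"\[\[Greece dialing code (\d+)\|\1\]\]", r"\1", wikitext)
```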
Could a bot operator configure a bot to do a single dummy edit (i.e. a single space at the end of a line, an enter carriage return, etc.) on every single file in the image namespace in Category:Non-free image copyright tags? I noticed that the cache has not updated and did not (for the image I tested) until I made a dummy edit. Thanks, Iamunknown 23:27, 22 February 2007 (UTC)
I think we need a bot like commons:User:FlickreviewR here. Images from Flickr are uploaded here all the time. It would be nice if we had a bot that could review these images, verify them, and then either tag them as verified (and {{ Move to Commons}}) or moves them to a directory for either human review or deletion. It would be ideal if editors uploaded Flickr images straight to the Commons but we all know many do not so having this bot that would either mark them for deletion or verify them and tag them to be moved to Commons would be very beneficial.↔ NMajdan• talk 18:33, 23 February 2007 (UTC)
A common mistake when linking to categories is to forgo the initial colon, i.e. write [[Category:Wikipedia]] instead of [[:Category:Wikipedia]]. A related mistake is when users copy an article over to user space and forget to <nowiki> or link the categories and interwiki links rather than including the page in them. A similar thing happens when developing/reworking templates. Could a bot be made to regularly (say every month) scan all wikipedia categories looking for these incidents and sorting them out? Note that extra care would have to be taken with respect to user categories and user-based templates, which obviously don't want to be touched. Mike Peel 23:38, 23 February 2007 (UTC)
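The userspace half of this is mechanical: prefix a colon onto any bare category inclusion so the copied page drops out of the categories. A Python sketch of that step (deciding which pages it is safe to touch is the hard part, and is not shown):

```python
import re

def escape_categories(wikitext):
    """Turn bare [[Category:...]] inclusions into colon-prefixed links.
    Already-escaped [[:Category:...]] links are left alone, because the
    lookahead requires 'Category:' to follow the brackets directly."""
    return re.sub(r"\[\[(?=Category:)", "[[:", wikitext)
```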
Is there a bot which finds fullurls to wikipedia pages? I've been running into cases where they are used for inline links, or to provide what appear to be "references". [4] Gimmetrow 23:00, 24 February 2007 (UTC)
I think there should be a bot that helps replace bitmap images with SVGs after the SVG has been uploaded. Often, a bitmap is used in many articles; it is hard to replace every use of the bitmap with the SVG by hand, right?
--
Jacklau96 09:07, 25 February 2007 (UTC)
Example:
Bitmap:
Image:Wheelchair.png and
SVG:
Image:Wheelchair.svg
I would really like to run a bot that goes through the new user log and automatically welcomes new users - it would make many editors actually edit - as most accounts never get round to this! I would like a bot that puts {{subst:welcome}} ~~~~ (or something to that effect) on new users talk pages. I've got no expertise in creating these kinds of scripts else I would create one myself RyanPostlethwaiteSee the mess I've created or let's have banter 23:06, 25 February 2007 (UTC)
Various good faith move and page creations have resulted in large numbers of Dungeons & Dragons articles having their name suffixed with "(Dungeons & Dragons)" even if there is no article at the unsuffixed title. If a bot could move all articles of the form "Foo (Dungeons & Dragons)" to "Foo" if "Foo" doesn't exist or is a redirect to "Foo (Dungeons & Dragons)", that would save a lot of tedious manual work. This has been suggested at Wikipedia talk:WikiProject Dungeons & Dragons#Disambiguation with no objections. Cheers -- Pak21 13:18, 26 February 2007 (UTC)
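Given a list of existing titles, the move targets can be computed with simple string logic. A Python sketch (a plain set of titles stands in for real API lookups, and the "Foo exists but is a redirect back to the suffixed page" case is omitted, since it would need an extra query):

```python
SUFFIX = " (Dungeons & Dragons)"

def ddd_move_targets(existing_titles):
    """Map each suffixed title to its unsuffixed form whenever the
    shorter title is free."""
    moves = {}
    for title in existing_titles:
        if title.endswith(SUFFIX):
            base = title[:-len(SUFFIX)]
            if base not in existing_titles:
                moves[title] = base
    return moves
```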
First, I have a request for a new task for a bot which substitutes tags. Per the album assessment project, we assess albums which have the {{albums}} tag. Instead, for assessing, I need to have this tag (shown below) placed on each album talk page, instead of the regular {{albums}} tag...because the regular album's tag makes the album harder to assess.
{{Album
|class=
|importance=
|attention=
|needs-infobox=
|auto=
}}
Second, what happened to Antivandalbot? Thanks. Real96 00:18, 25 February 2007 (UTC)
I was hoping that I could enlist some assistance in having a bot slap {{ WikiProject Law}} on the talk page of each article in Category:Law and each of its sub-categories and sub-sub-categories etc. (of which there are many), in connection with WikiProject Law?
Apologies if I have posted the request in the wrong place; we lawyers are not techies, and so are not very good at this sort of thing. But we do give the world lots of jokes at our own expense.
-- Legis ( talk - contributions) 14:47, 26 February 2007 (UTC)
I originally requested bot Ganeshbot to do this, but I haven't got a response and that bot hasn't had a contrib in 2 months, so I'm asking here. Can somebody please have a bot place {{WikiProject College basketball|class=|importance=}} on all article talk pages in Category:College basketball and its subcategories?↔ NMajdan• talk 17:52, 26 February 2007 (UTC)
Will someone please run a bot to change all
Full list can be found HERE; only the articles need changing (no talk/wikispace stuff). Users who browse IGN from the UK get put on UK servers; as not everyone who reads Wikipedia is from the UK, they shouldn't be sent to UK servers. It also uses up a little space in the article. Thanks.-- Empire Earth 19:55, 25 February 2007 (UTC)
Someone — don't remember who — had a brilliant suggestion on Wikipedia:Adminship survey (I think) and it's good enough that I thought it should be brought up here: A manually assisted script to help close AfDs more quickly. I could envision it as follows:
This would help a lot (for me at least) in clearing the AfD backlog. Removing the category, manually closing the article and especially removing links to the page is time-consuming, and could be done by a bot, I think. I'm a Mac user, so I don't know if I'd necessarily be able to use it myself, but I'm sure other people would appreciate it too. Comments? I don't know jack about programming (though I learned a little BASIC and Logo in middle school or earlier.) Grand master ka 09:41, 28 February 2007 (UTC)
A bot is requested to iterate through Category:WikiProjects, find all projects that are inactive, move those to Category:Inactive WikiProjects, and add {{ inactive}} to the top of those pages. A project is defined as inactive if neither the project nor its talk page has been edited for two months. This should ideally be repeated every couple weeks or so. AllyUnion's bot ( User:Kurando-san) used to do this in the past. >Radiant< 12:46, 28 February 2007 (UTC)
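The two-month inactivity rule in this request is easy to express as a pure function. A minimal sketch, assuming the window is approximated as 60 days and with a made-up helper name (this is not any existing bot's code; fetching the real last-edit timestamps would be a separate API call):

```python
from datetime import datetime, timedelta

# Hypothetical helper: a WikiProject counts as inactive if NEITHER its
# project page nor its talk page has been edited within the window
# (two months, approximated here as 60 days).
def is_inactive(last_project_edit, last_talk_edit, now, window_days=60):
    cutoff = now - timedelta(days=window_days)
    return max(last_project_edit, last_talk_edit) < cutoff
```

A bot would run this over every member of Category:WikiProjects, then move and tag whichever projects come back inactive.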
Please run a bot to replace all
There is about 8-12. Then an admin can delete Empire Earth (video game). Thanks.-- Empire Earth 01:12, 25 February 2007 (UTC)
I am looking for a bot that will add the following to the mainspace of the article:
{{subst:prod|[[Wikipedia:No_original_research]] & does NOT include [[Wikipedia:Reliable_sources]] for [[WP:Verifiability]] of content. Leaving Message on article creator's talk page regarding this, and will ask him to add sources.}}
& add the following to the creator of the article's (first editor) talk page:
Your article [[---------]] has been proposed for deletion "[[Wikipedia:No_original_research]] & does NOT include [[Wikipedia:Reliable_sources]] for [[WP:Verifiability]] of content." Please add [[Wikipedia:Citing_sources|references]], or it will be deleted. --~~~~
-- Parker007 18:36, 25 February 2007 (UTC)
importScript('User:Dycedarg/easyprod.js'); importScript('Wikipedia:WikiProject User scripts/Scripts/Add LI menu'); importStylesheet('Wikipedia:WikiProject User scripts/Scripts/Add LI menu/css');
Could there be a bot that uses Lupin's bad word list and the RSS feed to revert vandalism? It would rest 5-15 sec between edits, be written in Perl, and be called BadwordBot. -- Darkest Hour 17:35, 14 February 2007 (UTC)
-- Darkest Hour 18:16, 14 February 2007 (UTC)
This bot would greet new users, that is, accounts no more than 5 minutes old, using the original welcome template.
:-)
Cbrown1023
talk 00:13, 1 March 2007 (UTC)
This bot would wikify: it takes names and dates and wikifies them.
It would be basically the same as Werdnabot. The difference is that every month it starts a new archive for you (something I wish Werdnabot did).
This bot would need human input. It uses the RSS feed and the recent changes page and updates itself every minute, highlighting possible vandalism. A human would then need to click a confirmation on whether it's vandalism or not, and the bot would act accordingly: if it's vandalism, revert it and add a warning to the offender's talk page; if not, ignore it. It would not be active unless you go to a certain page (that way you are not bugged every minute by a bot buzzing at you). This might need to be a monobook script to really work out properly.
Wouldn't it be possible to make a bot that works through Category:Copy to Wikimedia Commons and moves the files to Commons? A sort of combination between Push for commons and FlickrLickr. // Liftarn
Standard links to New York Times articles expire after a while, and direct the user to a paid archival service. These are the links one gets from simply copying the url of a news article they are viewing. However, the Times also provides permanent anchors for blogs. To get the permanent anchor, one can use this service, though there may also be some simple algorithm for it. I've probably fixed dozens of such links using this method, and the NYTimes must be one of the most common citation sources. So, it seems like a good project for a bored bot to run around and change the temporary links to permanent ones. As an example, here's one I just fixed, which inspired me to make this suggestion. Derex 02:56, 1 March 2007 (UTC)
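The scanning half of this task is straightforward string matching; the actual permalink conversion would go through the service linked above, whose interface is not reproduced here. A rough sketch of just the detection step (the pattern is an assumption, not exhaustive):

```python
import re

# Rough sketch: find New York Times article URLs in wikitext so they
# can be fed to the permalink service mentioned above.
NYT_URL = re.compile(r'https?://(?:www\.)?nytimes\.com/\S+')

def find_nyt_links(wikitext):
    return NYT_URL.findall(wikitext)
```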
Please would someone tag the talk page of all articles in Category:Urban studies and planning and all subcats with {{Planning|class=|importance=}}? Many thanks. -- Mcginnly | Natter 14:32, 1 March 2007 (UTC)
What about a bot to notify uploaders of images tagged as both {{imagewatermark}} and {{self}}? // Liftarn
We should have a bot that fixes wikilinks within an article. It would do 3 things on an article:
Pyrospirit Flames Fire 16:33, 2 March 2007 (UTC)
Greetings! I was wondering if a bot operator could direct their bot to tag all talk pages of articles within Category:Plants (and its subcategories) with {{Plants|class=|importance=}}. Articles will be assessed later by members of WikiProject Plants. Thanks! -- Rkitko 08:59, 3 March 2007 (UTC)
Hmm, well, I've been busy with random stuff lately, particularly minor edits. I've cleaned up from 1 - 1 BC (After the 19th century in its entirety) in the categories at current. My proposal would be for a bot to...
Now, I understand not all of this might be possible, but it's certainly labour-intensive to do this by hand, especially when the same categories can simply pop up again and again due to human error. Logical2u Review me! 18:22, 3 March 2007 (UTC)
Do you think HBC Helperbot could put a notice on Defcom in order to tell administrators that WP:AIV is backlogged? Real96 06:02, 4 March 2007 (UTC)
I think a bot that monitors a series of backlogs and adjusts the Defcon is a great idea, but that is not my bot. I think this task would be better suited to a bot designed to do it, instead of being tacked onto my existing bot. HighInBC (Need help? Ask me) 00:03, 7 March 2007 (UTC)
Would anybody be interested in coding up a bot for the purpose of bypassing redirects in template space, so as to preserve the bolding effect when templates are transcluded on pages with links pointing to that page? Many Wikipedians forget to update these templates after a page move. The bot's logic should also be able to carry hyphenation, capitalization, accent mark, and macron changes over to the link text if the redirect is similar. — Dispenser 02:18, 5 March 2007 (UTC)
Deletion review could use a bot to do routine tasks related to creation of daily and monthly log pages, and including completed days on the monthly log at the appropriate time. Monthly log to be created once a month, and presumably updated once a day, daily to be created once a day. See User:Trialsanderrors for further details. GRBerry 23:08, 5 March 2007 (UTC)
Reports to AIV for inappropriate usernames (i.e. words with .com/.org/.net, fuck, shit, damn, fagnut, cunt, whore, Wikipedia sucks, I hate you, you are evil, suck my ass, as well as racial epithets, etc.) Real96 02:38, 6 March 2007 (UTC)
The double redirects report is up and running:
http://en.wikipedia.org/wiki/Special:DoubleRedirects
Bots get to work, and stop slacking. -- Parker007 18:09, 6 March 2007 (UTC)
There are still double redirects left:
http://en.wikipedia.org/?title=Special:DoubleRedirects&limit=500&offset=500
http://en.wikipedia.org/?title=Special:DoubleRedirects&limit=500&offset=0
a simple bot, that places certain templates at the end of an article(s) you specify. -- •Tbon e 55 •( T, C, UBX, Sign Here) 21:13, 6 March 2007 (UTC)
It kept removing the language links, which are valid [6].-- Ksyrie 03:58, 7 March 2007 (UTC)
Another bot found to be a little dysfunctional: User:Idioma-bot. See Gulf of Carpentaria and zh:卡奔塔利亚湾.-- Ksyrie 04:40, 7 March 2007 (UTC)
Third bot, User:Soulbot: see Mesoplodont whale and zh:安氏中喙鯨.-- Ksyrie 04:48, 7 March 2007 (UTC)
Fourth bot, User:STBotD: see Direct action and zh:直接行動.-- Ksyrie 04:51, 7 March 2007 (UTC)
Is there a way to create a bot to automatically create redirects to American city articles from names that are incorrectly punctuated or capitalized? For example, there are many many city articles for which ONLY searching for "Baltimore, MD" will get you to the right article. "baltimore MD", "Baltimore MD", "baltimore md", sometimes even "baltimore, MD" either take you to a search results page or to nothing at all. As most of these articles were created by a bot to begin with, someone suggested that a bot could create these redirects. Thanks- Dmz5 *Edits* *Talk* 15:56, 8 March 2007 (UTC)
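Generating the candidate titles for such redirects is simple string work. A sketch under the assumption that the articles follow the "City, ST" naming pattern (the helper name is made up for illustration):

```python
# Hypothetical helper: generate the common mis-punctuated and
# mis-capitalized forms of a "City, ST" title, each of which would
# become a redirect to the real article.
def redirect_variants(name):
    city, _, state = name.partition(", ")
    forms = {
        f"{c}{sep}{s}"
        for c in (city, city.lower())
        for s in (state, state.lower())
        for sep in (", ", " ")
    }
    forms.discard(name)  # don't redirect the article to itself
    return sorted(forms)
```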
This might be overstepping, but I couldn't find any bot pages on wiktionary so I thought I'd propose an idea here. There are a series of pages here that should only list redlinks and have instructions at the top to remove blue links, this would be an ideal task for a bot. Vicarious 08:25, 9 March 2007 (UTC)
I'm suggesting a bot that simply inserts the template {{talkheader}} at the beginning of every article's talk page.
As a side note, maybe this bot should also put {{archivebox|auto=yes}} automatically in every talk page.
This bot comes out of a discussion with Ray Saintonge on wikien-l.
The problem: we have a zillion schoolkids vandalizing Wikipedia from shared IP addresses. We clean it up some, and then we block. This a bunch of work for us, people other than the vandals see warning messages that don't apply to them, and then a lot of innocent people suffer until we unblock.
The solution: School administrators put a special template on the IP talk page to show their interest in a given IP. Whenever a warning or a block happens, the administrator gets an email notice. They then take some locally appropriate action, defusing the situation before we block. Or if a block happens, then when people come to them with questions, they already know the answer.
I don't have time to build this in the near future, but if anybody gets interested and wants to chat about it, feel free to get in touch. Thanks, William Pietri 17:12, 10 March 2007 (UTC)
I have a request, or an idea: if possible, re-make the interwiki bots so that they add {{ commons}} to the other language wikis. For example, if the bot sees {{ Commons}} on the Czech wiki, it would copy it to all the other wikis (en, fr, de, ...) via the interwiki links wherever the template doesn't exist yet. I'm trying to do this by myself, but I wonder whether mechanical power couldn't be used for it :) The idea is that all wikis would use the same template, while the categories on Commons are in English.
Hope somebody understands what I mean :) Thanks for your time, and I hope to see this new generation of bot working soon :) -- Chmee2 14:35, 11 March 2007 (UTC)
Per my suggestion at Wikipedia talk:Multilingual coordination#Tool to spread interlanguage links, assuming interlanguage links are symmetric and transitive it seems a bot could add the transitive closure of all interlanguage links to all language versions of an article in the closure set, and step through all articles (in, say, en:) doing this. Does anyone know of some reason not to do this? Is anyone interested in writing such a thing? -- Rick Block ( talk) 17:54, 11 March 2007 (UTC)
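The closure computation itself is just graph traversal. A minimal sketch, with the interlanguage links represented as an adjacency map (an assumed data structure; a real bot would read the links out of each article):

```python
from collections import deque

def closure(graph, start):
    # Breadth-first traversal: every page reachable from `start` via
    # interlanguage links (treated as symmetric and transitive) should
    # end up linking to every other page in the resulting set.
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for neighbour in graph.get(page, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen
```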
Category:WikiProject Irish Republicanism articles has just been renamed, and a large number of articles have disappeared from it that are still in the category according to their talk page. According to this I need to do a null edit, and someone at the VP says there are bots that do that? Thanks. One Night In Hackney 303 03:04, 12 March 2007 (UTC)
A bot is requested to iterate through Category:WikiProjects, find all projects that are inactive, move those to Category:Inactive WikiProjects, and add {{ inactive}} to the top of those pages. A project is defined as inactive if neither the project nor its talk page has been edited for two months. This should ideally be repeated every couple weeks or so. AllyUnion's bot ( User:Kurando-san) used to do this in the past. >Radiant< 13:13, 12 March 2007 (UTC)
I was thinking there should be a bot for automatically updating the tally for an RfA, e.g. (41/2/0). Sometimes people forget to change them, and it would be handy to have something change them when people forget. Nol888( Talk)( Review me please) 21:42, 12 March 2007 (UTC)
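The recount itself is simple wikitext parsing. An illustrative sketch, assuming the usual section headings and "#" numbered votes (struck or indented entries are not handled here; this is not an existing bot's code):

```python
import re

def tally(rfa_wikitext):
    # Count "#" list items under the Support/Oppose/Neutral headings.
    counts = {}
    section = None
    for line in rfa_wikitext.splitlines():
        heading = re.match(r"=+\s*'*(Support|Oppose|Neutral)'*\s*=+", line)
        if heading:
            section = heading.group(1)
            counts.setdefault(section, 0)
        elif section and re.match(r"#[^#:*]", line):
            counts[section] += 1
    return counts
```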
Make a bot that can scan through the user talk pages of IP addresses and detect a specific vandalism tag which includes the date of vandalism, then have it count the number of tags within a certain time period, and if a minimum number is met, post the IP address and a link to the user page on a "block page" for administrators to look at. Store a list of all IP address vandalism and check to see if vandalism tags are removed from user talk page. To save server access time, the list of IP addresses to block can be loaded to the page every x times.
Outline of the process: check the user page, counting all vandalism tags dated after x date, and also count the overall number of tags. Compare the count to y, the number of allowed tags; if greater, add the IP address to the list to be posted to the server. Also, check the overall number of tags against the last overall count. If the new overall number of tags is less than the old one, add the IP address to the list with a comment about missing tags. If not, store the new overall count in a client-side data file. After z IP addresses have been found to be over the limit of allowable tags, append the list to a special "block consideration" page for administrators to view, along with any comments on why they should be blocked and a link to their talk pages. D 16:33, 9 March 2007 (UTC)
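The counting step can be sketched as follows. The dated-tag format used here ({{test3|date=...}}) is purely hypothetical, since real warning templates are not dated this way; a real bot would have to parse the timestamps out of the warning signatures instead:

```python
import re
from datetime import datetime

# Hypothetical dated-warning format, for illustration only.
TAG = re.compile(r"\{\{test\d\|date=(\d{4}-\d{2}-\d{2})\}\}")

def recent_warnings(talk_wikitext, since):
    # Count warning tags dated on or after `since`.
    dates = (datetime.strptime(d, "%Y-%m-%d") for d in TAG.findall(talk_wikitext))
    return sum(1 for d in dates if d >= since)
```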
Trampton 17:19, 13 March 2007 (UTC).
Please could somebody make a bot that would automatically send welcome messages to new Wikipedians. This would help the welcoming committee. Thank you. Djmckee1 20:04, 15 March 2007 (UTC)
I might (try) making something like this myself, but I would like some feedback on whether it is a good idea first. Basically, it would take Category:Wikipedia_external_links_cleanup and iterate through each article in there, trying to detect if the NoMoreLinks template has been added. If it has been added, it will simply continue to the next article; if not, it will add the "NoMoreLinks" template to the "External Links" section. Tim.bounceback - TaLk 14:31, 12 March 2007 (UTC)
Just a quick question - is it worth running multiple times, and if so, how often (how often are articles added to that category)? Tim.bounceback - TaLk 20:27, 16 March 2007 (UTC)
I'm encouraging anyone to make a bot that monitors bot approval requests. Taxman described the task of such a bot in a discussion on bot approval times as follows.
The suggestion of a bot to track requests isn't a bad idea: something like WP:RFASUM that lists the last time each bot request has been edited, and perhaps the last time it was edited by a BAG member. BAG wikilinked by Ocolon
Such a bot would help bot approval group members and all the other Wikipedians assisting with the bot improvement and approval process to keep track of approval requests and prevent requests from being neglected unintentionally. It would provide a good summary of the current approval process and make this more transparent. — Ocolon 17:09, 16 March 2007 (UTC)
I am wondering if there is a bot available to deliver WikiProject Massively multiplayer online games' newsletter on April 1. Instructions on who to send it to are available on our newsletter page. Thanks! Greeves ( talk • contribs) 17:22, 16 March 2007 (UTC)
Just as a side-note, you put in your bot request for monthly newsletters, ours will probably only be quarterly. Thanks! Greeves ( talk • contribs) 20:11, 16 March 2007 (UTC)
Purpose: The bot will clear off old IP warnings from 2006. Real96 03:20, 15 March 2007 (UTC)
I'd like to see scores for 50 randomly-selected articles, five of them featured, according to my draft scoring system. The system is an early draft and has not been discussed by the community in its present form. (The previous version, which the community rejected, called for user voting in combination with a system like this.) I'm thinking it would be better to come to the VP with an already tested system. Even if it is rejected as an official system, though, there has been enough interest expressed that someone will probably code it as an off-site tool if it proves adequate, in which case I will support them. Neon Merlin 19:18, 16 March 2007 (UTC)
P.S. Yes, I know the Thoroughness rating isn't quite finished yet, but I'm not sure how to improve it. I'd welcome input into the system itself. Neon Merlin 19:32, 16 March 2007 (UTC)
I noticed that there are quite a few deleted articles which still have a WikiProject template on their talk page. This is particularly bad for projects like WP:VG, where there are lots of articles that don't meet the policies for inclusion and are thus deleted. This clutters up statistics (deleted articles are often poorly rated) and unnecessarily adds extra articles to 'unassessed articles' categories. This could be added to an existing bot that browses talk pages. (Maybe one of the bots that adds signatures to unsigned comments?) -- User:Krator ( t c) 17:20, 17 March 2007 (UTC)
I don't know if this is the place to put this (but I was given a link). Could a bot please put the template {{ hiking-stub}} on all pages in the category Category:trail stubs? And/or just on every page that has another stub tag in the following categories:
If this is possible, I'd be much obliged, as it will be very tedious to do this by hand. EDIT: Oh yeah, hiking stubs is a subcategory of hiking... probably not a good idea to put any there because obviously it'll loop - Leif902 12:26, 18 March 2007 (UTC)
Thanks in advance, - Leif902 20:59, 18 March 2007 (UTC)
Hi, I would like to run a bot but I am not really experienced with programming, so I was told to post here. I was thinking of a bot (maybe called FAQbot or the equivalent) that watches Wikipedia:Help desk for changes, and if it finds a keyword match to a question in the FAQ pages, responds with an answer. This could also work with the {{ helpme}} requests at CAT:HELP. If someone would be kind enough to create this for me, I would be very grateful. Many thanks, Extranet ( Talk | Contribs) 06:59, 20 March 2007 (UTC)
Would it be possible for someone to run a bot through all transclusions of Template:Infobox UK station and perform the following changes:
These two fields are identical, the intention is to move all articles over to 'usage' which is a better description of the data. This will allow the template to be simplified and so make it easier to maintain.
The current use of the fields is:
Field | Number of uses | Percentage
---|---|---
usage0203 | 1 | 0.05% |
lowusage0203 | 17 | 0.92% |
usage0405 | 2 | 0.11% |
lowusage0405 | 24 | 1.30% |
exits | 29 | 1.57% |
lowexits | 29 | 1.57% |
lowexits0405 | 658 | 35.55% |
exits0405 | 1091 | 58.94% |
Total | 1851 |
So this change would require in the region of 1749 edits (use of lowexits0405 + use of exits0405). This is of course more than going the other way but as I've said, usage better describes the field. Some of the other fields in the list can also be merged together but I'll work through these manually as they need checking.
If anyone is able to do this, don't hesitate to contact me if there is anything you wish to verify. Thanks. Adambro 14:36, 25 March 2007 (UTC)
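The edit itself is a mechanical parameter rename. A sketch, assuming the intended mapping is exits0405 → usage0405 and lowexits0405 → lowusage0405 (values untouched; confirm the mapping with the requester before running anything):

```python
import re

RENAMES = {
    "lowexits0405": "lowusage0405",
    "exits0405": "usage0405",
}

def rename_fields(wikitext):
    # Rename only the parameter names; the "|" anchor in the pattern
    # keeps "exits0405" from matching inside "lowexits0405".
    for old, new in RENAMES.items():
        wikitext = re.sub(r"(\|\s*)%s(\s*=)" % re.escape(old),
                          r"\g<1>%s\g<2>" % new, wikitext)
    return wikitext
```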
I don't know if this has been done before, or is technically feasible, but a bot that scans Wikipedia to locate articles tagged as orphaned and introduces internal links by putting brackets around words that match the names of other articles would be extremely helpful. It wouldn't introduce every link available, but it would reduce the amount of work.-- Orthologist
Links to non-existent page Wikipedia is not a soapbox should point at Wikipedia:Wikipedia is not a soapbox redirect page or to its destination, Wikipedia:What Wikipedia is not#Wikipedia is not a soapbox. -- CiaPan 15:15, 20 March 2007 (UTC)
Very simple, I think (at least for people who know how to code). So simple, indeed, that I suspect it has already been suggested and eventually refused.
A bot that transforms incoming links when they point to a redirect. Example: [7] has a link to Seven Deadly Sins which redirects to Seven deadly sins.
Such a "redirectbot" (proposed name) would edit the link in Seven virtues and change it to Seven deadly sins. In the long run, it would save some CPU cycles, and from day one it would remove a bit of the feeling of clutter in the encyclopedia.
What do you think of it? Has this already been suggested? Is there a place to browse the various proposals and why they have not been implemented?
David Latapie ( ✒ | @) 12:17, 21 March 2007 (UTC)
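Given a known redirect map, the rewrite can keep the displayed text unchanged, so nothing changes for readers. A pure-logic sketch (the redirect map is assumed input; a real bot would also have to respect the cases where redirects are deliberately not bypassed):

```python
import re

def bypass_redirects(wikitext, redirects):
    # Rewrite [[Target]] or [[Target|label]] when Target is a known
    # redirect, preserving the text the reader sees.
    def fix(match):
        target, _, label = match.group(1).partition("|")
        final = redirects.get(target)
        if final is None:
            return match.group(0)
        shown = label or target
        if shown == final:
            return "[[%s]]" % final
        return "[[%s|%s]]" % (final, shown)
    return re.sub(r"\[\[([^\[\]]+)\]\]", fix, wikitext)
```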
I recently created an addition to the country infobox template, namely the Gini coefficient.
Now, I have started to add this coefficient to the infoboxes of 5 countries, but it's a rather repetitive, long, and boring procedure. Additionally, it would be nice if a bot could take values from the page and keep the coefficients updated.
The pages where one can get the values from is here: List_of_countries_by_income_equality
And the value for each country's infobox to be completed/added is, e.g.:
|Gini = 33 |Gini_year = 2000 |Gini_rank = 33rd |Gini_category = <font color="#00c600">low</font>
low (green because good), medium (orange), and high (red because bad) have these colour intervals (found here
Gini_coefficient#Income_Gini_coefficients_in_the_world):
low:
medium:
high:
Does anyone fancy making this bot (if possible)?
Is there another to-do list, taskforce, or project where I can place this?
Please tell me if I am barking up the wrong tree... —The preceding unsigned comment was added by R U Bn ( talk • contribs) 12:21, 18 March 2007 (UTC).
I used a bot to create Indian town articles using census data. For the towns that already existed, the bot created the stub article in its sandbox. These are currently being merged into article space manually. After a merge, the article is struck out from the list. A merged sandbox article will not have a category; the category is removed by adding a colon to the syntax. See User:Ganeshbot/sandbox/Adilabad. Is it possible to go through this list and delete the sandbox articles that have already been merged (struck or not)? This would help identify what is complete and what is not; completed ones will show up as red links after the delete. Thanks, Ganeshk ( talk) 06:14, 20 March 2007 (UTC)
Would it be possible to create a bot that would go through all articles tagged with a specific WikiProject banner (i.e. all articles under Category:Caribbean articles by quality), find ones that are tagged with a cleanup template, and add Category:Caribbean articles needing attention to their talk pages? Jwillbur 22:00, 22 March 2007 (UTC)
I noticed that many articles in Wikipedia say things like "As of 2007", "As of September 2006", etc., referring to things that have not yet happened but could happen soon, or other similar situations. The problem is that many of these references are terribly outdated: I have found some that say "As of January 2005" and the like. I wondered if it was possible for a bot to update all these, so that they remain current until those things actually happen, and people can simply remove the "as of"s when appropriate. The bot would only have to detect the phrase "as of" and, if it is followed by a date, update it to the current month and year. (I believe there are no situations where "as of (a specific date)" wouldn't need an update, do you??...) Just my humble first bot proposal... Kreachure 23:11, 22 March 2007 (UTC)
Yeah, the problem is that some of the "As of"s refer to facts or statistics that are themselves outdated, so there could be instances where updating only the date would be inaccurate. I guess there's no way for a bot to ignore these while updating the others which deserve updating. But what if people put some sort of special marker or template along with the "As of"s that would actually need an update, so that the bot would recognize these and update them once in a while? That's the only way I could see this bot working... thanks anyway! Kreachure 00:13, 23 March 2007 (UTC)
It may be useful if a bot could add some kind of tag to the articles that have (or that it thinks has) outdated info, then they could be addressed manually by users Akubhai 20:36, 27 March 2007 (UTC)
See Wikipedia:As of. Get more people to work on that project, and there won't be a problem. -- Rory096 22:36, 27 March 2007 (UTC)
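Given the concerns raised above, a safer bot would only detect and flag, not rewrite. A sketch of just the detection step:

```python
import re

MONTHS = ("January|February|March|April|May|June|July|August|"
          "September|October|November|December")
ASOF = re.compile(r"[Aa]s of ((?:%s) \d{4})" % MONTHS)

def find_as_of(text):
    # Return the dated "as of" phrases so a human can review each one.
    return ASOF.findall(text)
```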
There doesn't seem to be a bot that does this, so I'd like to request one.
The bot I envision would check the "File links" on images which include {{ screenshot}}, {{ logo}}, and other fair use templates. It would then remove the images from the Wikipedia space, template space, portal space, and, most importantly, user space. In the case of user space it would leave a message along the lines of "I have removed a Wikipedia:fair use image (or images), Image:Example.jpg, from your userpage (or user subpage) because Wikipedia policy doesn't allow them to be used on any pages besides articles. If you believe this removal was in error because the image was not a fair use image, please ask at Wikipedia:Media copyright questions."
Overall, it would function something like OrphanBot. How does this sound? Picaroon 21:44, 25 March 2007 (UTC)
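The namespace test at the heart of this is one line. A sketch with an assumed, deliberately incomplete prefix list (a real bot would check namespace numbers via the API rather than string prefixes):

```python
# Fair-use images may only appear in the main article namespace; any
# page with one of these prefixes should have the image removed.
# The prefix list here is illustrative, not complete.
DISALLOWED_PREFIXES = ("User:", "User talk:", "Wikipedia:",
                       "Template:", "Portal:")

def must_remove(page_title):
    return page_title.startswith(DISALLOWED_PREFIXES)
```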
I need a bot for my talk page that moves the </div> tag to the bottom of the page every time the </div> tag is not at the bottom. This is needed to combat the issue of new sections being placed below the </div> tag. This is an example of what I need it to do: User Talk:Andrew Hampe (diff) -- Andrew Hampe | Talk 16:33, 26 March 2007 (UTC)
Shoot. Sorry, screwed up. I need it to do this instead. Sorry about the goof up. -- Andrew Hampe | Talk 16:48, 26 March 2007 (UTC)
<!-- please add new sections above this comment --> |} </div> <!-- please add nothing below this comment -->
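The fix is plain string surgery. A sketch assuming the wrapper markup is exactly "|}" followed by "</div>" on the next line (the real talk page's markup may differ slightly):

```python
WRAPPER = "|}\n</div>"

def fix_wrapper(page_text):
    # Move the closing wrapper back to the bottom if new sections were
    # added after it; leave the page alone otherwise.
    if WRAPPER not in page_text:
        return page_text
    before, _, after = page_text.partition(WRAPPER)
    after = after.strip("\n")
    if not after:
        return page_text
    return before.rstrip("\n") + "\n" + after + "\n" + WRAPPER
```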
We really need a bot to start tending to the newly created Wikipedia:Request an account page. At present, it takes several minutes to complete the clerical work surrounding an account creation. Specifically, the bot needs to check whether a user account has been created (either by the user, or by an admin to meet the request). If it is already created when the user posts the request, the bot should tag that account as already existing and note that in the user's entry. If the account is created by an admin, the bot should move that request to the relevant archive (current archive) using the formatting illustrated on that page.
Let me know if you need further clarification. I'd sincerely appreciate anyone's help with this. (There's a barnstar involved :D) alphachimp 18:56, 26 March 2007 (UTC)
I need a bot. Can someone create one for me that will search out words or misspellings on a page and replace them with a different word? Silverpelt 00:51, 27 March 2007 (UTC)
As I mentioned on the Patel talk page, the list in the article is not comprehensive. Can a bot search through Wikipedia and create a page such as List of Notable Patels (with the criterion being that they have a Wikipedia page) with a link to each page? I'm guessing the bot will pick up extraneous pages such as Patel, but I can go through and remove those pretty easily. Thanks. Akubhai 20:53, 27 March 2007 (UTC)
Is there a bot that does boolean logic on categories? I'd like to be able to say things like:
IF an article is in a subcategory of Category:Calvinism AND is NOT in Category:WikiProject Calvinism THEN dump the list to such-and-such a place
Is there any bot that will do this sort of thing?
-- TimNelson 06:24, 27 March 2007 (UTC)
Thanks for the information, both of you. Unfortunately AWB doesn't run on Linux, but it's good to know that it can do that. I'll be using CatScan, though.
-- TimNelson 00:46, 29 March 2007 (UTC)
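For the record, the query above reduces to plain set algebra once the membership lists are in hand (which is what CatScan computes server-side):

```python
def calvinism_query(in_calvinism_subcats, in_wikiproject):
    # Articles in a subcategory of Category:Calvinism AND NOT in
    # Category:WikiProject Calvinism.
    return in_calvinism_subcats - in_wikiproject
```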
Where is AntiVandalBot? I found lots of test edits in pages listed in the "File links" of [[Image:example.jpg]]. Although I am using VandalProof to track them, they keep vandalizing (the vandals are mostly IPs). -- Jacklau96 09:06, 29 March 2007 (UTC)