This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Hello.
Please add Template:Administrative divisions of South Ossetia to the articles included in this template. Advisors ( talk) 12:24, 27 July 2009 (UTC)
I just ran this, it has already been done. Rich Farmbrough, 02:35, 1 August 2009 (UTC).
A bot is needed, please, to convert existing "first broadcast", "foundation", "founded", "opened", "released", or similar dates in infoboxes, to use {{ Start date}}, so that they are emitted as part of the included hCard or hCalendar microformats. Further details on my to-do page. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 22:25, 30 July 2009 (UTC)
There is a proposal at WP:AN that has considerable support, to rename the somewhat historic Wikipedia:Administrators' noticeboard/Incidents to the more obvious Wikipedia:requests for administrator assistance (RAA).
If this passes, then there are two coding or bot related issues related to implementation, and feedback is sought.
Discussion and input sought - when it's done it can then be summarized at AN to inform the discussion there. FT2 ( Talk | email) 17:56, 1 August 2009 (UTC)
<!-- Please see http://en.wikipedia.org/wiki/User:ClueBot_III/Documentation before editing this. --> {{User:ClueBot III/ArchiveThis |archiveprefix=Wikipedia:Requests for administrator assistance/Archives/ |format=Y/F/d |age=72 |index=no |minarchthreads=0 |minkeepthreads=0 |archivenow=<nowiki>{{User:ClueBot III/ArchiveNow}},{{resolved|,{{Resolved|,{{done}},{{Done}}</nowiki> |header=<nowiki>{{Talkarchive}}</nowiki> |headerlevel=2 |nogenerateindex=1 |archivebox=no |box-advert=no }}
Please remove lines whose first word is a blue link from the lists at the Wikipedia:Requested articles pages. -- Otterathome ( talk) 16:33, 2 August 2009 (UTC)
We'd like a bot that can find all counties in the US that don't have the Template:Infobox U.S. County infobox. Articles missing that template should be placed into Category:Missing U.S. County Infobox. Timneu22 ( talk) 12:15, 23 July 2009 (UTC)
BRFA filed: Wikipedia:Bots/Requests for approval/MadmanBot 9 — madman bum and angel 05:13, 26 July 2009 (UTC)
WP:LOMJ should contain a list of articles that don't exist.
It would be good if entries are removed if there is an article, and it contains either:
If a page does exist at the name of a journal, the bot should also apply the above logic for a page disambiguated with " (journal)". John Vandenberg ( chat) 02:31, 4 August 2009 (UTC)
I don't know how to program, but all the bot would have to do is: have the script [1] installed, then load articles from [2],
and go to the top of the page and click "check",
then wait to click save,
and then start the process over.
P.S. That's what I'm doing now manually, but it's very tedious. -- Tim1357 ( talk) 03:16, 4 August 2009 (UTC)
Is it possible to create a bot that would fix incorrectly formatted date parameters in maintenance tags?
For example:
{{unreferenced|Date=August 2009}}
would mean that the "Date=" parameter is ignored, as it should be "date=" instead.
So basically, the bot should change "Date=" to "date=". This would be similar to how SmackBot changes == External Links == to == External links ==.
I think this would be really useful - I have recently been working on undated unsourced articles, and come across this quite often! I've also done it myself, but normally catch it in the preview!
Regards, -- PhantomSteve ( Contact Me, My Contribs) 10:19, 3 August 2009 (UTC)
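If it helps a bot writer, the core replacement could be sketched in Python along these lines. The template list here is purely illustrative; a real bot would take it from a configuration page and would need a BRFA first:

```python
import re

# Hypothetical list of maintenance templates to check; a production bot
# would read this from a configuration page.
TEMPLATES = ["unreferenced", "refimprove", "orphan", "wikify"]

def _first_letter_insensitive(name):
    # MediaWiki treats only the first letter of a template name as
    # case-insensitive, so build e.g. "[Uu]nreferenced".
    return "[%s%s]%s" % (name[0].upper(), name[0].lower(), re.escape(name[1:]))

PATTERN = re.compile(
    r"(\{\{\s*(?:%s)\s*\|\s*)Date(\s*=)"
    % "|".join(_first_letter_insensitive(t) for t in TEMPLATES))

def fix_date_param(wikitext):
    """Change the ignored 'Date=' parameter to the expected 'date='."""
    return PATTERN.sub(r"\1date\2", wikitext)
```

This only handles "Date=" as the first parameter; templates with the parameter in another position would need a slightly more general pattern.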
Is it possible for a bot to check the articles in each of the following 641 categories and ensure that the Indiana project banner ({{WikiProject Indiana}}) is on the talk page? The bot could make the following decisions:
A. If the Indiana banner currently on the article talk page is {{WPINDIANA}}, change it to {{WikiProject Indiana}}, and if possible leave the assessment parameters in place. (This is something that could be done in perpetuity if there is a bot that does that.)
B. If no Indiana banner is present on the talk page of an article, place one. ({{WikiProject Indiana}})
C. If the item in the category is in the file namespace, tag it as such. ({{WikiProject Indiana|class=image}})
D. If the item is in the template namespace, tag it as such. ({{WikiProject Indiana|class=template}})
E. On the categories themselves, also check for an Indiana banner, and if none is present add one ({{WikiProject Indiana|class=category}})
F. If the item is in the portal namespace, tag it as such. ({{WikiProject Indiana|class=portal}})
G. If possible, could a list of all the articles that have been altered by the bot for this task also be created?
H. (added by TonyTheTiger ( talk · contribs) who would also like to use the bot for WP:CHICAGO's WP:CHIBOTCATS). Check to see if other projects have listed the article as class=GA, class=FA, or class=FL and use the same class.
I. (added by TonyTheTiger ( talk · contribs)) autostub class=stub articles with stub templates.
J. (added by TonyTheTiger ( talk · contribs)) If possible, mainspace articles with neither class=GA, FA, FL nor stub could be tagged with the most common class used by other projects.
Note: Not all of the subcategories of Category:Indiana are listed here, as some are deemed not to be within the project's scope, so just pointing at Category:Indiana and its subcats should not be done. Once the tagging is complete, project members will be able to go through and assess each article's quality and importance, and at that time make a final determination as to whether the article is within the scope of the project. The primary benefit of a bot completing this task would be that it will put all the newly tagged articles into our unassessed category automatically (because of the template syntax), making it easy to quickly go through them all. This will save the time of manually checking thousands of articles for banners. I expect that there are between 500 and 2,000 articles that are not tagged with banners, out of an estimated 8,000-9,000 articles in the categories. (I have not determined an easy way to count the articles.) — Charles Edward ( Talk | Contribs) 14:30, 9 July 2009 (UTC)
Charles Edward authorized my addendum of appropriate BOT actions H-J-- TonyTheTiger ( t/ c/ bio/ WP:CHICAGO/ WP:LOTM) 13:46, 19 July 2009 (UTC)
Ok, I think Xeno can do this task. (and is maybe doing it right now...?) @TonyTheTiger: Are there any bots that do your task in Category:WikiProject tagging bots or Category:Template substitution bots? Thanks. AHRtbA== Talk 17:56, 18 July 2009 (UTC)
Stepshep ( talk · contribs) seems to be inactive. He used to run ShepBot to add {{ ChicagoWikiProject}} to new articles in all the cats at WP:CHIBOTCATS that did not have any of the various redirected forms of our project tag. When he added the tag, he checked to see if other project tags listed the article as a WP:GA, WP:FA, or WP:FL. If not, he added the most common class that the other projects were using. I need another person to perform this task for us once a week or so.-- TonyTheTiger ( t/ c/ bio/ WP:CHICAGO/ WP:LOTM) 19:02, 15 July 2009 (UTC)
I have an idea for a bot. Three times now when I have tried to create an article on a single released by a rock/alternative band, I kept getting hindered by forum chatter, videos, or lyrics. I was thinking a bot could be created to bypass all that crud and get sources which would be actually necessary, and post the sources found on the talk pages. -- Dylan 620 ( contribs, logs) 20:53, 7 August 2009 (UTC)
Should be fairly easy to do, and should run on a regular basis (daily?, weekly?) after the initial run. Headbomb { ταλκ κοντριβς – WP Physics} 18:07, 5 August 2009 (UTC)
Yes, but the bot would only be permitted to edit once every 10 seconds, which means it would take about 30 (?) days. Also, since the template isn't rating the article (and can't, since it's a redirect), is there actually a need to tag all of these? You already have all the disambigs listed at Category:Disambiguation pages. Also, in reply to AHRtbA: yes, there are already bots to do this, and AWB is equipped to do it, but there aren't really enough bots approved for WikiProject tagging (I've been considering creating my own bot to do WikiProject tagging), so go ahead and create one if you like (don't forget to request approval), even if it doesn't end up doing this particular task. As for "and variants": you would have been loading transclusions of {{ dab}}; possibly Headbomb wasn't aware of the category? (You should just tag pages in Category:Disambiguation pages rather than those transcluding {{ dab}} and variants.) - Kingpin13 ( talk) 14:51, 6 August 2009 (UTC)
Ok. Actually, I'm writing it in PHP for when I get TS access, but I did write the code for that part. (I didn't know you just have to replace the colon (:) with "Talk:".) Thanks. AHRtbA== Talk 16:14, 6 August 2009 (UTC)
Ok, Once every day/two days I think would be good. I'll set it up for that, but it won't officially be on that schedule until I get TS access. Thanks. AHRtbA== Talk 19:46, 6 August 2009 (UTC)
I'm not sure if this is the right place, but it'd be awesome if there would be a bot out there that would regularly remove Template:Current and its sister templates from articles that are not a current event. Basically, Template:Current is supposed to be added to articles that are being edited rapidly due to them being a current event. The consequences of that (possibly out-dated information, possibly wrong information, possible vandalism, etc.) are something we need to warn our readers of. 2009 Hudson River mid-air collision would be a recent example. But this and similar templates are regularly added to articles that can be considered "current" by various definitions of the word, even though there's no need for a warning to our readers. So it would be nice if there'd be a bot that would remove these templates from articles that haven't been edited more than, say, 10 times in the last 24 hours or so. -- Conti| ✉ 13:56, 9 August 2009 (UTC)
Headbomb { ταλκ κοντριβς – WP Physics} 02:02, 4 August 2009 (UTC)
Per the consensus at Wikipedia talk:WikiProject Football/Archive 33#Formal petition to change the naming conventions, could someone please create a bot to move any articles in Category:2009 domestic football (soccer) leagues, Category:2009 domestic football (soccer) cups, Category:2009-10 domestic football (soccer) leagues and Category:2009-10 domestic football (soccer) cups so that the years are at the beginning? For example, Serie A 2009–10 should be moved to 2009–10 Serie A, Greek Cup 2009–10 to 2009–10 Greek Cup, Allsvenskan 2009 to 2009 Allsvenskan and Finnish Cup 2009 to 2009 Finnish Cup. -- Soccer-holic I hear voices in my head... 13:49, 11 August 2009 (UTC)
Can anyone fix redirects on the pages that link to "Premiere (pay television network)"? -- JSH-alive talk • cont • mail 12:37, 10 August 2009 (UTC)
A while ago, I created Wikipedia:WikiProject_Physics/Recognized_content, and I planned to update it every once in a while. Then today, instead of updating it, I got smart and thought, "Hey, that's a nice task for a bot!"
Basically what I do is take the intersection of the WikiProject's main category and that of, say, Featured articles. What is left is the Featured Articles of the project. So I lump them all in a section called "Featured articles", and sort them alphabetically (people by last name, so I guess a bot would use the default sort value). And then I move on to former FAs, then GAs, then former GAs, and so on. It would be relatively easy to make this a bot-handled process, which could then be used by ALL WikiProjects (on a subscription basis, much like how WP:AAlerts works). The bot would get the subscription from a template such as {{ WRCSubscription|ProjectCat=Physics articles}} (the default lists all FA, FFA, GA, FGA, DYK, and so on), or if a project chooses to opt out of DYKs, the template would look something like {{ WRCSubscription|ProjectCat=Physics articles|DYK=no}}. The lists could then be updated daily/weekly/whateverly.
Headbomb { ταλκ κοντριβς – WP Physics} 15:26, 25 July 2009 (UTC)
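The intersection step described above is pure set logic once the category members have been fetched; a minimal sketch (fetching via the API is left out, and the names are illustrative):

```python
def recognized_content(project_articles, recognition_cats):
    """Given the set of a WikiProject's articles and a mapping of
    section label -> set of article titles in that recognition
    category (Featured articles, Good articles, ...), return a
    sorted list per section. A real bot would sort biographies by
    their DEFAULTSORT key rather than the raw title."""
    return {label: sorted(project_articles & members)
            for label, members in recognition_cats.items()}
```

For example, intersecting a project set {"A", "B", "C"} with a Featured-articles set {"B", "D"} yields just ["B"] for that section.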
<!--Driver, Mini-->*{{Icon|FA}}[[Minnie Driver]]
<!--Myers, Mike-->*{{Icon|FA}}[[Mike Myers]]
<!--Nickelodeon-->*{{Icon|FA}}[[Nickelodeon]]
<!--Richardson, Matthew-->*{{Icon|FA}}[[Matthew Richardson]]
Not done
I need a bot to deliver a thanks to the people that participated in my recent admin nomination discussion. -- Jeremy ( blah blah • I did it!) 05:33, 4 August 2009 (UTC)
I made a suggestion here: [5] that described the need for a bot to get a list of articles that EOL has and Wikipedia doesn't (the list would be useful since it would be easy to access the missing material from EOL and then create articles that way, since the license is compatible). Would anyone with bot-making capacity be able to assist, either by helping user:Bob the wikipedian with making the bot or by making it yourself? Note that this is only to create the list of articles (with the whole [[ ]] between the names, all in a list format etc.) and not to create the actual articles themselves (those will be created by humans!). Cheers! Calaka ( talk) 13:14, 7 August 2009 (UTC)
Please could someone make a bot which would determine which of the two permitted date formats is predominant in an article (either dd mmmm yyyy or mmmm dd, yyyy), and then converts all the other dates in that article to that format. This would save a vast amount of editing time. Thank you! Alarics ( talk) 19:15, 10 August 2009 (UTC)
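The detection half of this request could be sketched as below. Counting which full-date style predominates is straightforward; the conversion half is the hard part (dates inside quotations and citations must be left alone), so this sketch only classifies:

```python
import re

MONTHS = ("January|February|March|April|May|June|July|"
          "August|September|October|November|December")
DMY = re.compile(r"\b\d{1,2} (?:%s) \d{4}\b" % MONTHS)   # 10 August 2009
MDY = re.compile(r"\b(?:%s) \d{1,2}, \d{4}\b" % MONTHS)  # August 10, 2009

def dominant_format(text):
    """Return 'dmy', 'mdy', or None on a tie; a cautious bot should
    skip articles where neither format clearly predominates."""
    dmy, mdy = len(DMY.findall(text)), len(MDY.findall(text))
    if dmy == mdy:
        return None
    return "dmy" if dmy > mdy else "mdy"
```

The minority dates in an article classified this way would then be rewritten to the majority style, skipping quotations, citation access dates, and similar protected contexts.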
Hello,
Following a renaming, the Biblical "Codex Vaticanus" (B, 03) is now "Codex Vaticanus Graecus 1209". Can a bot modify each internal link "[[Codex Vaticanus]]" or "[[Codex Vaticanus|B]]" into "[[Codex Vaticanus Graecus 1209|Codex Vaticanus]]" and "[[Codex Vaticanus Graecus 1209|B]]" from this list?:
Thanks,
Budelberger ( ) 13:51, 12 August 2009 (UTC).
Hi,
I am an active user of Hindi Wikipedia (hi) ( My user page at hi wiki ). We at Hindi Wikipedia are very short on the number of contributors, and none of our editors has the capability/time to create the bot I am requesting. So here is a description of the job I wish the new bot to do:
1. It will search for all the articles which do not have the "Discussion" page.
2. It will then create the Discussion page and populate it with a template.
3. This template is a general information template similar to the English wiki template {{ Talkheader}}.
4. The bot would perform this task periodically (say, every week).
We at hindi wiki would highly appreciate help of any sort.
Thanks,
Regards,
Gunjan ( talk) 12:26, 13 August 2009 (UTC)
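The four steps above map naturally onto a small bot loop. A sketch, written against a minimal interface so the logic is testable; pywikibot's Site and Page objects fit it (names like `seed_missing_talk_pages` are illustrative):

```python
def seed_missing_talk_pages(site, header="{{Talkheader}}\n",
                            summary="Bot: adding talk page header"):
    """For every mainspace article whose talk page is missing, create
    the talk page containing only the header template. `site` is any
    object exposing allpages(); pages expose toggleTalkPage(), and
    talk pages expose exists(), save(), title(), and a text attribute.
    Returns the titles of the pages created."""
    created = []
    for page in site.allpages(namespace=0):
        talk = page.toggleTalkPage()
        if not talk.exists():
            talk.text = header
            talk.save(summary)
            created.append(talk.title())
    return created
```

On hi.wikipedia the header would of course be the local equivalent of {{Talkheader}}, and the weekly schedule (step 4) would come from cron or the toolserver's job scheduler.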
User:Example is a user account used in various places as an example, or as a placeholder that's guaranteed not to exist as a real user. While the account does have legitimate subpages, most new subpage creations are by inexperienced users who were trying to create a sandbox in their own userspace and got lost. Is there any bot that could watch for these, move them to the creator's own userspace, and leave them a nice note about how to find the proper location? — Gavia immer ( talk) 16:43, 14 August 2009 (UTC)
Template {{Wikicite}} is deprecated. (The functionality of Wikicite has been completely subsumed by the Cite* family of templates.) There are now a few hundred articles in which the template call to Wikicite serves no purpose. Many of these fall into one class:
* {{Wikicite | id=Johnson-2000 | reference ={{Cite book | last=Johnson | first= ... }}}}
These can be replaced with
* {{Cite book | last=Johnson | first= ... }}
I think a regular expression based bot could fix these no problem. The edit summary could read
Removing unnecessary template {{tl|Wikicite}}
Thanks. See also notes at Template Talk:Wikiref. ---- CharlesGillingham ( talk) 15:27, 15 August 2009 (UTC)
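A minimal sketch of the regular-expression approach suggested above, assuming the inner citation contains no nested templates (nested cases would need a proper brace-matcher rather than a regex):

```python
import re

# Matches {{Wikicite | id=... | reference = {{Cite ...}} }} where the
# inner citation template itself contains no further braces.
WIKICITE = re.compile(
    r"\{\{\s*[Ww]ikicite\s*\|[^{}]*?\breference\s*=\s*(\{\{[^{}]*\}\})\s*\}\}")

def unwrap_wikicite(wikitext):
    """Replace the whole {{Wikicite}} wrapper with its inner citation."""
    return WIKICITE.sub(r"\1", wikitext)
```

Citations whose |reference= value contains nested templates (e.g. {{Cite book}} with a {{ISBN}} inside) would not match and would be left for manual cleanup, which is the safe failure mode.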
Following a discussion on the village pump, {{ Pui}} was changed to place images into categories based on dates (similar to how the proposed deletion category hierarchy is laid out). Only problem is, these categories need to be created. I'm not sure how PROD cats are created. I guess DumbBOT ( BRFA · contribs · actions log · block log · flag log · user rights) creates them. Would it be possible to have PUI cats created in the same way PROD cats are? Protonk ( talk) 08:55, 18 August 2009 (UTC)
Could a bot remove every instance of {{ Former WPFF Article}}? See Wikipedia:Templates for deletion/Log/2009 August 3#Template:FFCOTF candidate. I don't feel like making 200+ AWB edits. -- King of ♥ ♦ ♣ ♠ 17:13, 12 August 2009 (UTC)
There is a need to add "clade Heterobranchia" in every article that contains "informal group Pulmonata" but does not already mention Heterobranchia in its taxobox. Like this: http://en.wikipedia.org/?title=Physella&diff=308737416&oldid=301400813 Such articles can be found within the broad category Category:Gastropod families. -- Snek01 ( talk) 20:26, 18 August 2009 (UTC) Prose in this request was tweaked by Invertzoo ( talk) 21:27, 18 August 2009 (UTC)
I searched for "informal group Pulmonata" and added the clade to all instances. Tim1357 ( talk) 22:54, 18 August 2009 (UTC)
From http://strategy.wikimedia.org/wiki/Proposal:Spider_popular_project_page_interwikis
Please check interwikis to other language versions on important project pages, because someone on IRC during the strategy meeting said that this particular set of interwikis is in bad shape.
Make sure that correct interwikis to and from the Village Pumps, the Help Desks, the Reference Desk, everything else on http://en.wikipedia.org/wiki/Wikipedia:Questions and the project pages which appear on stats.grok.se/en/top (listed below) exist and return good pages with correct titles.
We need a report to show the state of those interwikis, so that problems can be addressed now and if they crop up, perhaps checking weekly or monthly if it's not too much load?
For those that don't exist or return bad page status values when accessed, or which have no page title, can those issues be fixed with a bot?
Please check the wiktionaries too. Thank you! 99.60.0.22 ( talk) 04:24, 19 August 2009 (UTC)
I have been editing Katipunan and the {{ Katipunan}} template recently. I am requesting bot assistance to add the following categories to the people mentioned in the Katipunan article and the {{ Katipunan}} template:
On the other hand, for the objects mentioned in Katipunan, please add:
Thanks for the assisting bot.-- JL 09 Talk to me! 06:45, 19 August 2009 (UTC)
There is a large backlog of images that need their size reduced. I propose that this hypothetical bot reduces the image's resolution by a set amount (perhaps 25%). -- Tim1357 ( talk) 23:23, 17 August 2009 (UTC)
There are several userboxes which have recently been moved from the template namespace to the user namespace per the userbox migration. All transclusions of these userboxes as templates must be replaced with their new locations in userspace. They are listed at Category:Wikipedia GUS userboxes. 95j || talk 22:10, 24 August 2009 (UTC)
Hello,
Can a bot take all the articles in the Category:Zoroastrianism category and all its subcategories, as well as Category:Parsis, and place a {{ WikiProject Zoroastrianism}} template on the pages that do not have this template? Warrior4321 17:37, 21 August 2009 (UTC)
See Wikipedia:VPT#Cascading_template_transclusions, until it is archived.
Any usage of the infobox settlement template that uses a flag in the field "subdivision_name" causes the template to load EVERY flag every time the template loads, and makes it impossible to use What Links Here to see flag template usage. Can someone fix this by making a bot that replaces the flag template usage with the linked country name and a hidden comment that the field should not contain flags?
Look at the What Links Here for Template:MEX as a start to see the extent of the problem. SchmuckyTheCat ( talk) 20:21, 24 August 2009 (UTC)
{{ Flagicon Mexico}} (or "Mexico" where that is consensual). —Preceding unsigned comment added by Rich Farmbrough ( talk • contribs) 16:51, 26 August 2009 (UTC)
User:ProcseeBot seems to be inactive, and I was wondering if anyone would be able to create a clone, for use on Simple English Wikipedia. Thanks, Majorly talk 18:17, 21 August 2009 (UTC)
Either
Or
because the archives have no history, which makes life difficult. Rich Farmbrough, 15:10, 23 August 2009 (UTC).
Could someone add:
{{talkheader}} {{WikiProject Palaeontology|class=stub|importance=low}}
...to the talk pages of the articles in Category:Synapsid stubs that need it? Abyssal ( talk) 14:19, 26 August 2009 (UTC)
Ok, so I hate it when people start discussions that have already existed, so please tell me if there is anything like this that I can research.
I want someone to help me build a bot that fixes dead links. The bot would work like this: it would find the said dead link from Wikipedia:Dead external links and look for it in the Internet Archive [archive.org]. Then, it would change the link to the newly found web page. If you have not used archive.org, it works like this: take the URL of any dead link, then simply paste http://web.archive.org/web/ in front of it and press enter. The archive searches for the most recent backup of the page and then produces it. All one must do is replace the dead link in the article with the new web archive URL. Please try this yourself: try any of the links here Wikipedia:Dead external links/404/a and it will work. I have NO programming experience, so please help me with this. Tim1357 ( talk) 03:07, 29 August 2009 (UTC)
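The prefix trick described above can be sketched directly; building the candidate URL is pure string work, and resolving it makes a live request (so a real bot would also need rate limiting and a dead-link re-check before editing):

```python
import urllib.error
import urllib.request

WAYBACK_PREFIX = "http://web.archive.org/web/"

def wayback_candidate(dead_url):
    """Build the lookup URL; the archive redirects it to the most
    recent snapshot of the page, exactly as described above."""
    return WAYBACK_PREFIX + dead_url

def resolve_archived(dead_url, timeout=10):
    """Follow the archive's redirect and return the concrete snapshot
    URL, or None if the archive has no copy. Live network call."""
    try:
        with urllib.request.urlopen(wayback_candidate(dead_url),
                                    timeout=timeout) as resp:
            return resp.geturl() if resp.status == 200 else None
    except urllib.error.URLError:
        return None
```

The bot would then substitute the resolved snapshot URL for the dead link in the article, ideally noting in the edit summary that the link is an archived copy.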
I need a bot to noinclude categories on the POTD templates. Basically make this edit to all the subtemplates of Template:POTD. You can see Template:POTD protected/2007-08-08 for the breakage that this is causing. — RockMFR 23:07, 29 August 2009 (UTC)
Hi fellas,
This is a task I tried to carry out with my own bot, but I had some issues programming it so I need your help. Basically, I think it would be interesting to add the CMOnline template and the BrahmsOnline template to the External Links section of contemporary classical music composers.
The first template links to excerpts from sound archives. There is a web service (SOAP method) that can tell the bot if there is content to be linked about a composer.
The second one links to a bio. I will provide a list of composers for which bios are available on the linked site, and the corresponding link, in an array.
So basically the task would be:
- parsing categories like "Contemporary music"
- for each composer name, checking whether a link should be added, and adding it if so.
I can only think of this as a benefit for the encyclopedia. Regards, -- LeMiklos ( talk) 10:04, 24 August 2009 (UTC)
There are over a hundred such links that can be created (search gives 519 hits). Changes should be from \[\[floodplain]] forest(s?) and floodplain forest(s?) to [[floodplain forest]]\1. I'm working on a translation from it:Foresta inondata. Balabiot ( talk) 13:57, 30 August 2009 (UTC)
This would be a user assisted bot.
The bot would operate on the thousands of genus stub pages that exist on Wikipedia. The bot would load the genus page and assess whether or not the page had a species section. If not, it would go to the corresponding (raw) page on Wikispecies. Then, it would copy all information between the end of the colon on Species: and the first equals sign. Then, it would go back to the Wikipedia article, create a new ==Species== heading, and paste the species there. I know this is possible, I just lack the know-how to build it. Tim1357 ( talk) 04:04, 31 August 2009 (UTC)
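The copy step could be sketched as below; the delimiters ("Species:" up to the first "=", which opens the next wikitext heading) are the ones given above, and the sketch assumes the raw Wikispecies wikitext has already been fetched:

```python
import re

def extract_species_listing(raw_wikitext):
    """Return the text between the 'Species:' label and the first '='
    (the start of the next wikitext heading), or None if absent."""
    m = re.search(r"Species:(.*?)=", raw_wikitext, re.DOTALL)
    return m.group(1).strip() if m else None

def species_section(listing):
    """Wrap the listing in a new Species heading for the genus stub."""
    return "==Species==\n%s\n" % listing
```

Since this is user-assisted, the operator would eyeball each extracted listing before the bot saves, catching pages where the Wikispecies layout differs.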
Since June 2009, the Slovenian geographic information system geopedia.si is also available in English. Could someone please run a bot and update links to the English version of geopedia.si like it has been done for Log pod Mangartom ( diff) or Arčoni ( diff)? The articles to be corrected are mainly located in Category:Cities, towns and villages in Slovenia. Thanks a lot. -- Eleassar my talk 21:08, 1 September 2009 (UTC)
I'm not entirely sure this is a bot request or can be done another way...but....
Basically, what I'd like to do is keep track of changes in page size for all articles tagged with {{ WikiProject Pharmacology}} (or that exists in a subcat of Category:Pharmacology articles by quality, whichever's easier) to have an idea of when an article might need its quality assessment changed. This would be done in 2 phases: one massive dump at first, and then an analysis every month or so. The output would be the article's name, the date it was last tagged, the size of the article then, the size of the article at runtime, and the change in size.
The first phase would scan ALL the articles, get their histories, find when they were assessed, compare page sizes, and so on. After that, it would only need to be done on articles updated within the last month (or whatever the frequency we choose is).
The actual criteria are more complex but, basically, it would let us know if there's an article assessed as a stub that is 20KB!
I'd also like it to dump a list of articles over 25KB as candidates for being split.
None of this would involve any editing at all.
Can this be done? I've been searching around a lot, but I can't find any tools or bots that work based on page size!
Thanks, Skittleys ( talk) 19:27, 1 September 2009 (UTC)
mysql> SELECT MIN(rev_len), MAX(rev_len), AVG(rev_len), STD(rev_len)
-> FROM page
-> JOIN revision on page_id = rev_page
-> WHERE /* Specify the page */
-> page_namespace = 0 AND page_title = "Tourette_syndrome"
-> AND /* Date range */ /* From now till 1 Jan 2009 */
-> "20090101004415" < rev_timestamp;
+--------------+--------------+--------------+--------------+
| MIN(rev_len) | MAX(rev_len) | AVG(rev_len) | STD(rev_len) |
+--------------+--------------+--------------+--------------+
| 68345 | 72617 | 70322.1774 | 689.5260 |
+--------------+--------------+--------------+--------------+
1 row in set (0.01 sec)
Rich Farmbrough, 02:57, 3 September 2009 (UTC).
Would this be the right place to make a request for a bot to replace about 1500 transclusions of Template:Importance with Template:Notability?
A few points:
Many thanks in advance for your attention, — Martin ( MSGJ · talk) 13:33, 26 August 2009 (UTC)
Actually SmackBot does this when it comes across the template. I will look into it. Rich Farmbrough, 12:36, 2 September 2009 (UTC).
I'm not sure if this has been proposed here before, but at the WebCite FAQ, they have a suggestion that a Wikipedia bot be developed to provide archive links to URLs.
develop a wikipedia bot which scans new wikipedia articles for cited URLs, submits an archiving request to WebCite®, and then adds a link to the archived URL behind the cited URL
This seems like a feasible idea. Smallman12q ( talk) 12:18, 2 September 2009 (UTC)
There are bots already doing it. But can someone run it for articles transcluding {{ Cleanup-link rot}} and then remove the template from the articles? Thanks. -- Magioladitis ( talk) 00:18, 30 August 2009 (UTC)
Could a bot be written to use Yahoo's Smush.it to optimize (in a lossless way) the most heavily used images? Smallman12q ( talk) 17:01, 2 September 2009 (UTC)
There is a large backlog of images that need their size reduced. I propose that this hypothetical bot reduces the image's resolution by a set amount (perhaps 25%). -- Tim1357 ( talk) 23:23, 17 August 2009 (UTC)
All these images can then be resized to 300px.
Note: this process can then be applied to a number of other uses, such as film posters, DVD covers, and video game covers.
Tim1357 ( talk) 22:08, 3 September 2009 (UTC)
Not sure this is bot work per se but asking here is certainly my best shot at getting this done. Could someone give me a rough count on the number of lists on the en.wiki? There are currently 1500 featured lists and I was wondering how that compared to the total. Better yet, is there something on toolserver that allows counting the number of unique articles in a category and its subcategories? Thanks, Pichpich ( talk) 21:08, 3 September 2009 (UTC)
Hi, the database Municipality Atlas Netherlands ca. 1868 (Kuyper Atlas) has been moved. That's why 500+ external links on the English-language Wikipedia became dead links. It would take a long time to update them one by one, so I was wondering if a bot could be helpful to take this action. I don't know much about the technical side, so maybe somebody could help me out.
example http://en.wikipedia.org/wiki/Vlodrop The external link * Map of the former municipality in 1868 * The old (dead) link is http://www.kuijsten.de/atlas/li/vlodrop.html The new link is http://www.atlas1868.nl/li/vlodrop.html
The directory structure is still the same, so if the string "kuijsten.de/atlas" can be changed to "atlas1868.nl" on all the 500+ Wikipedia pages, the links will be alive again!
Regards, Quarium, The Netherlands -- Quarium ( talk) 20:06, 4 September 2009 (UTC)
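Since the directory structure is unchanged, the per-page edit is a plain substring swap, e.g.:

```python
OLD_HOST = "kuijsten.de/atlas"
NEW_HOST = "atlas1868.nl"

def fix_atlas_link(wikitext):
    """Repoint old Kuyper Atlas links at the new host. The directory
    structure is unchanged, so a substring swap is enough."""
    return wikitext.replace(OLD_HOST, NEW_HOST)
```

A bot (or an AWB run) would apply this to every page found via an external-link search for kuijsten.de/atlas.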
Easy to fix; fixing. Rich Farmbrough, 04:00, 7 September 2009 (UTC).
In August, a large batch of stub articles was created by bot for New Zealand rivers, using information from the LINZ website. They are all marked with {{ LINZ}}, showing "accessdate=07/12/09". In New Zealand, that means December 7th. A bot would be useful to change all articles in Category:Rivers of New Zealand which have that access date on the LINZ template to read "12/07/09" (or "12 July 2009", for that matter). Cheers, Grutness... wha? 22:44, 6 September 2009 (UTC)
I don't know if this has been suggested before, but would it be possible to create a bot to move pages because there are lots that have hyphenated titles and should have spaced ndash titles. I have moved many manually in the past (e.g. this fun-filled morning) but it is very tedious. I was thinking this would be a simple task for a bot, and had hoped AWB might do it but I don't think it can. Is there anyone who could make or help a complete bot noob make one as there are hundreds more similar titles that need fixing. Thanks in advance, Rambo's Revenge (talk) 22:41, 5 September 2009 (UTC)
Well it's not urgent so it could run off database dumps. Rich Farmbrough, 19:54, 6 September 2009 (UTC).
This is not a very difficult task in terms of actually getting the list, the difficulty is in choosing whether or not to move the article. Either way, Coding... ( X! · talk) · @988 · 22:43, 7 September 2009 (UTC)
I really appreciate you guys taking this on in ways I don't know. Just to confirm (possibly going over some of what Rich says), I only request the " - "s be dealt with, because these should never exist as articles (only as redirects). That solves 1 & 5 (both ignored). Movie & TV titles (2) should be dealt with the same, and redirects (3) should be skipped. I don't know much about coding, but I guess this needs to come first. Target pages shouldn't exist for non-redirects, because they are nearly all moved from hyphen to ndash, and most (all?) editors directly creating articles at the spaced ndash know to create a convenience redirect. So 4 shouldn't be a problem, but I guess skip if it happens, because if there are both a non-redirect hyphenated page and an ndashed one, there will most likely be a parallel history problem there too (can these "(4) skipped cases" be categorised if they happen?). Thanks all, Rambo's Revenge (talk) 16:27, 8 September 2009 (UTC)
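With the constraint above (only spaced hyphens are touched), computing the move target is trivial; the decisions about redirects, existing targets, and parallel histories stay with the operator:

```python
def ndash_target(title):
    """Return the spaced-en-dash form of a title containing a spaced
    hyphen, or None if no move is needed. Only ' - ' is touched, per
    the constraint agreed above; checks for redirects and existing
    target pages happen elsewhere in the bot."""
    if " - " not in title:
        return None
    return title.replace(" - ", " \u2013 ")
```

Titles with unspaced hyphens (e.g. "Spider-Man") are deliberately left alone, since those are usually correct as hyphens.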
Ok, this is probably a bit of a pain to program, but it would be a big help. Wikipedia:Find-A-Grave famous people is a subproject of WP:MISSING that tries to identify missing articles about notable dead people. There's been quite a lot of work on these lists to classify the individuals, and in many cases articles exist in wikis in other languages. When this is the case, a link has been created to such articles. For instance, at the top of the list Wikipedia:Find-A-Grave famous people/A you find links to the es.wiki articles about Anny Ahlers and Anselmo Aieta. What I would like is to have a bot create translation requests for such entries. I estimate that there are a few hundred of these. I'll check the translation requests manually, but I'd be grateful if a bot can generate them. Let me know if you need any extra info for the task. Thanks, Pichpich ( talk) 15:31, 9 September 2009 (UTC)
Many citation templates have parameters such as |journal=[[Science]]. These should be replaced by |journal=[[Science (journal)|Science]].
In general the logic should be
Headbomb { ταλκ κοντριβς – WP Physics} 21:06, 25 August 2009 (UTC)
There are 6,000-odd links to Science, of which approximately 1% are non-dabbed journal= uses. Across the un-dabbed pages of the 400 journal titles there are 25,525 links, of which we may expect about 1% at most to be journal= entries, so this can safely be done with AWB, being unlikely to reach 200 edits. Rich Farmbrough, 17:45, 8 September 2009 (UTC).
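The substitution described above is mechanical once a list of ambiguous titles is in hand. A minimal sketch (the function name and the map entries are illustrative, not an actual bot):

```python
import re

# Map of ambiguous journal link targets to their disambiguated titles
# (illustrative entries; the real list would come from the ~400 titles
# mentioned in the request).
DAB_MAP = {
    "Science": "Science (journal)",
    "Nature": "Nature (journal)",
}

def dab_journal_links(wikitext):
    """Rewrite |journal=[[X]] to |journal=[[X (journal)|X]] for known titles."""
    def repl(m):
        title = m.group(1)
        target = DAB_MAP.get(title)
        if target is None:
            return m.group(0)  # leave unknown titles untouched
        return "|journal=[[%s|%s]]" % (target, title)
    return re.sub(r"\|\s*journal\s*=\s*\[\[([^\[\]|]+)\]\]", repl, wikitext)
```

Links that are already piped (`[[Science (journal)|Science]]`) don't match the pattern, so the pass is idempotent.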
Can't believe there is not a bot that does this already, but I regularly come across links formatted as full urls, i.e. http://en.wikipedia.org/wiki/Some_Article that instead should be Some Article. It seems like having a bot make this change would help us with article size issues, would correctly show red vs. blue links, and just be cleaner (and who doesn't like that?). Possible expansion to also convert urls for non-english WP pages (and other projects) to the appropriate interwikilink. Has this suggestion come up before? Thoughts, ideas, reactions? UnitedStatesian ( talk) 04:37, 9 September 2009 (UTC)
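The conversion requested here is a straightforward text pass. A sketch for the English-Wikipedia case only (the interwiki expansion would need a site-prefix table); names are illustrative:

```python
import re
from urllib.parse import unquote

# Matches bare links to English Wikipedia articles.
WIKI_URL = re.compile(r"https?://en\.wikipedia\.org/wiki/([^\s\]\[|{}<>]+)")

def url_to_wikilink(text):
    """Convert full en.wikipedia.org URLs into internal [[...]] links,
    decoding percent-escapes and turning underscores back into spaces."""
    def repl(m):
        title = unquote(m.group(1)).replace("_", " ")
        return "[[%s]]" % title
    return WIKI_URL.sub(repl, text)
```

A real bot would need to skip URLs inside `<nowiki>` and ref/external-link contexts where the full URL is intentional.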
The use of the {{ t}} template, outside of meta, is no different from that of {{ tl}}. Even on meta, it's {{ t0}} instead. Given the number of pages which use {{ t}} as {{ tl}}, it'd be good to have a bot change over instances where the former is used to the latter.
-- coldacid ( talk| contrib) 18:45, 10 September 2009 (UTC)
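The swap itself is a one-line replacement; the care is in not touching templates whose names merely start with "t". A sketch (function name assumed):

```python
import re

def t_to_tl(wikitext):
    """Switch {{t|...}} / {{t}} invocations to {{tl|...}} / {{tl}}.
    Only matches when the template name is exactly 't' (case-insensitive),
    i.e. immediately followed by a pipe or closing braces."""
    return re.sub(r"\{\{\s*[Tt]\s*(\||\}\})",
                  lambda m: "{{tl" + m.group(1), wikitext)
```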
Hi All,
Please tag all articles in Category:Redirects from Digimon's talk pages with {{ WikiProject DIGI|class=redirect}} (note that most are already tagged with {{ WikiProject DIGI}}). This is required as most of the talk pages in Category:WikiProject Digimon articles are for redirects, making it impossible to distinguish redirects from actual articles. This would also allow future categorisation by quality and by importance, as these redirects would not require assessment.
Regards,
G.A.S talk 08:39, 3 September 2009 (UTC)
Doing... They will be tagged with class=redirect using the "find and replace" method, so that each has {{WikiProject DIGI|class=redirect}} prepended on them.
Ok, so I hate it when people start discussions that already exist, so please tell me if there is anything like this that I can research.
I want someone to help me build a bot that fixes dead links. The bot would work like this: it would find a given dead link from Wikipedia:Dead external links and look for it in the Internet Archive [archive.org]. Then it would change the link to the newly found web page. If you have not used archive.org, it works like this: take the URL of any dead link, then simply paste http://web.archive.org/web/ in front of it and press enter. The archive searches for the most recent backup of the page and produces it. All one must do is replace the dead link in the article with the new web archive URL. Please try this yourself: try any of the links at Wikipedia:Dead external links/404/a and it will work. I have NO programming experience, so please help me with this. Tim1357 ( talk) 03:07, 29 August 2009 (UTC)
"Go": go to the next page in the list [What transcludes this page: Template:Dead Link]. Look for "{{Cite web [:any characters:] |url=[:any characters one:] |accessdate=[:any characters two:] {{Dead Link}}</ref>" — we call this "reference". If not found, "Go" (skip and start over). If found: search [:any characters two:] for 4 numbers starting with either 20 or 199 (as the internet existed only in the 90s and 2000s) and copy those 4 numbers — we'll call them "year". Look up web.archive.org/"year"/url — we call this "archive". If it does not exist, "Go" (skip and start over). If it exists, search the article for "reference" and replace it with: {{Cite web [:any characters:] |url=[:any characters one:] |accessdate=[:any characters two:] |archiveurl="archive"{{Dead Link}}</ref>. "Go".
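The steps above can be sketched as a text transformation on a single page. This is a simplified sketch, not a working bot: it handles only one-level, single-line citations, normalises the template name to "Cite web", and the actual existence check against the Wayback Machine (and error handling) is left out. Names are assumptions:

```python
import re

# A {{cite web}} immediately followed by a {{dead link}} tag.
DEAD_CITE = re.compile(
    r"\{\{[Cc]ite web([^{}]*)\}\}(\s*\{\{\s*[Dd]ead link\s*\}\})"
)

def wayback_url(url):
    """The prefix trick from the request: web.archive.org/web/<url>
    redirects to the Wayback Machine's most recent snapshot of <url>."""
    return "http://web.archive.org/web/" + url

def add_archive_urls(wikitext):
    """Add |archiveurl= to {{cite web}} templates tagged {{dead link}}."""
    def repl(m):
        params, dead_tag = m.group(1), m.group(2)
        urlm = re.search(r"\|\s*url\s*=\s*(\S+)", params)
        if urlm is None or "archiveurl" in params:
            return m.group(0)  # no URL found, or already archived
        return ("{{Cite web%s |archiveurl=%s}}"
                % (params, wayback_url(urlm.group(1))) + dead_tag)
    return DEAD_CITE.sub(repl, wikitext)
```

Before saving, a real bot would verify the snapshot actually exists rather than trusting the redirect blindly, as Gordon's note below asks.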
I guess persistence pays off, as I will go ahead and put this on my to-do list. That doesn't mean I'll get to it soon as I have several projects I'm working on, but it does mean I will do it. :) -- ThaddeusB ( talk) 01:42, 8 September 2009 (UTC)
Just a suggestion, as I'm all for automation, but I have seen sites change purpose completely as domain names are bought and sold, among other things, and the archive is very fallible.
(*) Log pages
The idea here is
Hi, Tim. Your inquiry got passed to me based on the assumption that when you speak of dead links, you mean to link to content in the Wayback Machine archive of web content.
Please feel free to have your automated checker/link-fixer make whatever requests are necessary. Our blanket robots.txt block is to prevent
indiscriminate crawling, and especially the case where archived content could be mistaken for original sites if indexed by search engines.
I do suggest you use an identifiable User-Agent on such requests, with a contact URL or email address, just in case your volume of requests creates a problem,
though I doubt that will be the case.
Also, please start slow -- say only one request pending at a time -- unless/until you absolutely need to go faster. Let me know if you have any other questions! - Gordon @ IA Web Archive
Tim1357 ( talk) 00:38, 10 September 2009 (UTC)
{{Infobox MMAstats}} is now converted to call {{ Infobox martial artist}}, and can be substituted. There are over 500 transclusions to deal with, please. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 17:40, 10 September 2009 (UTC)
Can a bot be used to put the {{Uncategorized}} tag on pages without categories (logically)? It sounds like it would be editing new pages a lot, too.-- OsirisV ( talk) 20:24, 9 September 2009 (UTC)
Bot should browse through all pages that contain templates with astronomical coordinates (e.g. Template:Starbox_observe, Template:Infobox galaxy).
While parsing the "dec" parameter, please be aware of the sign before the first number, as it can take many forms, e.g. "-", "—", "−", "&minus;", "+", "&plus;". If the parser is confused, then mark the page for manual review. There are 1000s of pages that need to be browsed and fixed. Examples of pages I fixed manually: [9], [10]. Thanks. friendlystar ( talk) 23:01, 13 September 2009 (UTC)
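The sign variants listed above can be normalised with a small table, returning None when nothing recognisable is found so the page can be flagged for manual review. A sketch (function name assumed):

```python
import re

# Sign spellings seen in |dec= values: ASCII hyphen, em dash,
# Unicode minus, HTML entities, and the plus sign / entity.
SIGNS = {
    "-": "-", "—": "-", "−": "-", "&minus;": "-",
    "+": "+", "&plus;": "+",
}

def parse_dec_sign(dec_value):
    """Return ('+'|'-', rest_of_value), or None if the sign is missing
    or unrecognised, in which case the page gets marked for review."""
    m = re.match(r"\s*(&minus;|&plus;|[-—−+])\s*(.*)", dec_value)
    if m is None:
        return None
    return SIGNS[m.group(1)], m.group(2)
```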
Something I just thought of; I don't really have the time to develop it myself, but I think it's a worthy goal. We have a few pairs of templates where one is supposed to be replaced by the other: the most prominent example is {{ coord}} and {{ coord missing}}: if coordinates are needed on a page, the template is added, and then when the coordinates are found they overwrite the needed template. But I bet there are some pages which have both templates, and there I expect the coord-needed template can be safely removed, certainly semi-auto. I had a SQL query running on the toolserver to see if I could count the situations where this is the case, but it died :( Thoughts? Happy‑melon 23:02, 10 September 2009 (UTC)
Would it be possible to write a bot that looked for articles without photos and first checked whether they had a link to another language wiki that used a Commons image, and, if that wasn't the case, searched by article name on Commons? It would then produce lists of articles with possible photos and reasons why they might match, much as per Wikipedia:WikiProject Red Link Recovery. We could then set up a project to go through and add the photos or mark the suggestion as spurious, so the bot would also need a facility to suppress suggestions previously marked as wrong. Ϣere SpielChequers 22:24, 16 September 2009 (UTC)
This Wikiproject category is full of articles and subcategories, instead of article talk pages and category talk pages. Can someone please organise a bot to transfer the category tags across to the talk pages? Hesperian 00:00, 15 September 2009 (UTC)
|MAIN_CAT = WikiProject Ancient Near East articles
Is it possible to have all the articles in Category:Sub-Roman Britain added to Category:Sub-Roman Britain task force articles? Thanks. Dougweller ( talk) 13:57, 18 September 2009 (UTC)
Hey, I want to build a list of CERN experiments, but it would be long and tedious, since there's a lot of it. So take a look at my sandbox (and how it's written) to see what sort of end result I'd want to get.
What the bot would need to do is go through http://www.slac.stanford.edu/spires/find/experiments/www2?ee=CERN-NA-001 through http://www.slac.stanford.edu/spires/find/experiments/www2?ee=CERN-NA-63, and retrieve the relevant information. (It's possible that not all the links will produce a result, since the latest experiments are rather new, and the older ones are not all documented).
Obviously this should be done for all the experiments found in the SPIRES database, namely EMU, IS, NA, PS, R, T, UA, and WA experiments. When all's said and done it needs to run up to EMU20 experiment, IS494 experiment (lots missing), NA63 experiment, PS215 experiment (lots missing), R808 experiment (lots missing), T250 experiment (lots missing), UA9 experiment, and WA103 experiment respectively. There'll be a shiny barnstar for whoever codes this. You can make the bot write straight in my User:Headbomb/Sandbox8/NA experiments and so on if you want. Headbomb { ταλκ κοντριβς – WP Physics} 17:46, 14 September 2009 (UTC)
We'd like a bot that can find all counties in the US that don't have the Template:Infobox U.S. County infobox. Articles missing that template should be placed into Category:Missing U.S. County Infobox. Timneu22 ( talk) 12:15, 23 July 2009 (UTC)
BRFA filed: Wikipedia:Bots/Requests for approval/MadmanBot 9 — madman bum and angel 05:13, 26 July 2009 (UTC)
WP:LOMJ should contain a list of articles that don't exist.
It would be good if entries are removed if there is an article, and it contains either:
If a page does exist at the name of a journal, the bot should also apply the above logic for a page disambiguated with " (journal)". John Vandenberg ( chat) 02:31, 4 August 2009 (UTC)
I don't know how to program, but all the bot would have to do is: have the script [1] installed, then load articles from [2],
go to the top of the page and click "check",
then wait to click save,
and then start the process over.
P.S. That's what I'm doing now manually, but it's very tedious. -- Tim1357 ( talk) 03:16, 4 August 2009 (UTC)
Is it possible to create a bot that would fix tags with incorrectly formatted date parameters?
For example:
{{unreferenced|Date=August 2009}}
would mean that the "Date=" parameter is ignored, as it should be "date=" instead.
So basically, the bot should change "Date=" to "date=". This would be similar to how SmackBot changes == External Links == to == External links ==.
I think this would be really useful - I have recently been working on undated unsourced articles, and come across this quite often! I've also done it myself, but normally catch it in the preview!
Regards, -- PhantomSteve ( Contact Me, My Contribs) 10:19, 3 August 2009 (UTC)
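The fix described above is a case correction scoped to known dated maintenance templates, so stray "Date=" in prose or citations is left alone. A sketch with an illustrative (not exhaustive) template list:

```python
import re

# Illustrative subset of the dated cleanup templates.
TAGS = "unreferenced|refimprove|wikify|orphan"

def fix_date_param(wikitext):
    """Change 'Date=' to 'date=' inside dated cleanup templates only,
    so the date parameter is no longer ignored."""
    pattern = r"(\{\{\s*(?:%s)\s*\|\s*)Date(\s*=)" % TAGS
    return re.sub(pattern, r"\1date\2", wikitext)
```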
Is it possible for a bot to check the articles in each of the following 641 categories and ensure that the Indiana project banner ({{WikiProject Indiana}}) is on the talk page? The bot could make the following decisions:
A. If the Indiana banner currently on the article talk page is {{WPINDIANA}}, change it to {{WikiProject Indiana}}, and if possible leave the assessment parameters in place. (This is something that could be done in perpetuity if there is a bot that does that.)
B. If no Indiana banner is present on the talk page of an article, place one. ({{WikiProject Indiana}})
C. If the item in the category is in the file namespace, tag it as such. ({{WikiProject Indiana|class=image}})
D. If the item is in the template namespace, tag it as such. ({{WikiProject Indiana|class=template}})
E. On the categories themselves, also check for a Indiana banner, and if none is present add one ({{WikiProject Indiana|class=category}})
F. If the item is in the portal namespace, tag it is such. ({{WikiProject Indiana|class=portal}})
G. If possible, could a list of all the article that have been altered by the bot for this task also be created?
H. (added by TonyTheTiger ( talk · contribs) who would also like to use the bot for WP:CHICAGO's WP:CHIBOTCATS). Check to see if other projects have listed the article as class=GA, class=FA, or class=FL and use the same class.
I. (added by TonyTheTiger ( talk · contribs)) Autostub class=stub articles with stub templates.
J. (added by TonyTheTiger ( talk · contribs)) If possible, mainspace articles with neither class=GA, FA, FL nor stub could be tagged with the most common class used by other projects.
Note: Not all of the subcategories of Category:Indiana are listed here, as some are deemed not to be within the project's scope, so just pointing at Category:Indiana and subcats should not be done. Once the tagging is complete, project members will be able to go through and assess each article's quality and importance, and at that time make a final determination on whether the article is within the scope of the project. The primary benefit of a bot completing this task would be that it will put all the newly tagged articles into our unassessed category automatically (because of the template syntax), making it easy to quickly go through them all. This will save the time of manually checking thousands of articles for banners. I expect that there are between 500 and 2000 articles that are not tagged with banners, out of an estimated 8-9 thousand articles in the categories. (I have not determined an easy way to count the articles.) — Charles Edward ( Talk | Contribs) 14:30, 9 July 2009 (UTC)
Charles Edward authorized my addendum of appropriate BOT actions H-J-- TonyTheTiger ( t/ c/ bio/ WP:CHICAGO/ WP:LOTM) 13:46, 19 July 2009 (UTC)
Ok, I think Xeno can do this task. (and is maybe doing it right now...?) @TonyTheTiger: Are there any bots that do your task in Category:WikiProject tagging bots or Category:Template substitution bots? Thanks. AHRtbA== Talk 17:56, 18 July 2009 (UTC)
Stepshep ( talk · contribs) seems to be inactive. He used to run ShepBot to add {{ ChicagoWikiProject}} to new articles in all the cats at WP:CHIBOTCATS that did not have any of the various redirected forms of our project tag. When he added the tag, he checked to see whether other project tags listed the article as a WP:GA, WP:FA, or WP:FL. If not, he added the most common class that the other projects were using. I need another person to perform this task for us once a week or so.-- TonyTheTiger ( t/ c/ bio/ WP:CHICAGO/ WP:LOTM) 19:02, 15 July 2009 (UTC)
I have an idea for a bot. Three times now, when I have tried to create an article on a single released by a rock/alternative band, I have been hindered by forum chatter, videos, or lyrics. I was thinking a bot could be created to bypass all that crud, get the sources that would actually be useful, and post the sources found on the talk pages. -- Dylan 620 ( contribs, logs) 20:53, 7 August 2009 (UTC)
Should be fairly easy to do, and should run on a regular basis (daily?, weekly?) after the initial run. Headbomb { ταλκ κοντριβς – WP Physics} 18:07, 5 August 2009 (UTC)
Yes, but the bot would only be permitted to edit once every 10 seconds, which means it would take about 30 (?) days. Also, since the template isn't rating the article (and can't, since it's a redirect), is there actually a need to tag all of these? You already have all the disambigs listed at Category:Disambiguation pages.
Also, in reply to AHRtbA: yes, there are already bots that do this, and AWB is equipped to do it, but there aren't really enough bots approved for WikiProject tagging (I've been considering creating my own bot to do WikiProject tagging), so go ahead and create one if you like (don't forget to request approval), even if it doesn't end up doing this particular task. The "and variants" was for when you would have been loading transclusions of {{ dab}}; possibly Headbomb wasn't aware of the category? (You should just tag pages in Category:Disambiguation pages rather than those transcluding {{ dab}} and variants.) - Kingpin13 ( talk) 14:51, 6 August 2009 (UTC)
Ok. Actually, I'm writing it in PHP for when I get on the TS, but I did write the code for that part. (I didn't know you just have to replace the colon (:) with " Talk:".) Thanks. AHRtbA== Talk 16:14, 6 August 2009 (UTC)
Ok, Once every day/two days I think would be good. I'll set it up for that, but it won't officially be on that schedule until I get TS access. Thanks. AHRtbA== Talk 19:46, 6 August 2009 (UTC)
I'm not sure if this is the right place, but it'd be awesome if there would be a bot out there that would regularly remove Template:Current and its sister templates from articles that are not a current event. Basically, Template:Current is supposed to be added to articles that are being edited rapidly due to them being a current event. The consequences of that (possibly out-dated information, possibly wrong information, possible vandalism, etc.) are something we need to warn our readers of. 2009 Hudson River mid-air collision would be a recent example. But this and similar templates are regularly added to articles that can be considered "current" by various definitions of the word, even though there's no need for a warning to our readers. So it would be nice if there'd be a bot that would remove these templates from articles that haven't been edited more than, say, 10 times in the last 24 hours or so. -- Conti| ✉ 13:56, 9 August 2009 (UTC)
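The removal rule proposed above (fewer than ~10 edits in the last 24 hours) reduces to a pure decision over revision timestamps, which a bot would fetch from the API. A sketch of just the decision step, with the thresholds from the request as defaults:

```python
from datetime import datetime, timedelta

def should_remove_current(revision_times, now, min_edits=10, window_hours=24):
    """Decide whether {{current}} should come off: fewer than min_edits
    revisions inside the last window_hours means the article is no
    longer being 'edited rapidly'."""
    cutoff = now - timedelta(hours=window_hours)
    recent = [t for t in revision_times if t >= cutoff]
    return len(recent) < min_edits
```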
Headbomb { ταλκ κοντριβς – WP Physics} 02:02, 4 August 2009 (UTC)
Per the consensus at Wikipedia talk:WikiProject Football/Archive 33#Formal petition to change the naming conventions, could someone please create a bot to move any articles in Category:2009 domestic football (soccer) leagues, Category:2009 domestic football (soccer) cups, Category:2009-10 domestic football (soccer) leagues and Category:2009-10 domestic football (soccer) cups so that the years are at the beginning? For example, Serie A 2009–10 should be moved to 2009–10 Serie A, Greek Cup 2009–10 to 2009–10 Greek Cup, Allsvenskan 2009 to 2009 Allsvenskan and Finnish Cup 2009 to 2009 Finnish Cup. -- Soccer-holic I hear voices in my head... 13:49, 11 August 2009 (UTC)
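The title transformation itself is simple; a sketch that moves a trailing season or year to the front and leaves anything else for manual handling (function name assumed):

```python
import re

def year_first(title):
    """Move a trailing season/year to the front:
    'Serie A 2009–10' -> '2009–10 Serie A'."""
    m = re.match(r"(.+?)\s+(\d{4}(?:–\d{2,4})?)$", title)
    if m is None:
        return title  # no trailing year: leave for manual handling
    return "%s %s" % (m.group(2), m.group(1))
```

The actual page moves (and redirect handling) would go through the normal move API with this as the target-title function.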
Can anyone fix redirects on the pages that link to "Premiere (pay television network)"? -- JSH-alive talk • cont • mail 12:37, 10 August 2009 (UTC)
A while ago, I created Wikipedia:WikiProject_Physics/Recognized_content, and I planned to update it every once in a while. Then today, instead of updating it, I got smart and thought, "Hey, that's a nice task for a bot!"
Basically what I do is take the intersection of the WikiProject's main category and that of, say, Featured articles. What is left is the Featured Articles of the project. So I lump them all in a section called "Featured articles", and sort them alphabetically (people by last name, so I guess a bot would use the default sort value). And then I move on to former FAs, then GAs, then former GAs, and so on. Now it would be relatively easy to make this a bot-handled process, which could then be used by ALL WikiProjects (on a subscription basis, much like how WP:AAlerts works). The bot would get the subscription from a template such as {{WRCSubscription|ProjectCat=Physics articles}} (the default lists all FA, FFA, GA, FGA, DYK, and so on), or if a project chooses to opt out of DYKs, then the template would look something like {{WRCSubscription|ProjectCat=Physics articles|DYK=no}}. The walls could then be updated daily/weekly/whateverly. Headbomb { ταλκ κοντριβς – WP Physics} 15:26, 25 July 2009 (UTC)
<!--Driver, Mini-->*{{Icon|FA}}[[Minnie Driver]]
<!--Myers, Mike-->*{{Icon|FA}}[[Mike Myers]]
<!--Nickelodeon-->*{{Icon|FA}}[[Nickelodeon]]
<!--Richardson, Matthew-->*{{Icon|FA}}[[Matthew Richardson]]
Not done
I need a bot to deliver a thanks to the people that participated in my recent admin nomination discussion. -- Jeremy ( blah blah • I did it!) 05:33, 4 August 2009 (UTC)
I made a suggestion here: [5] that described the need for a bot in order to get a list of articles that EOL has and Wikipedia doesn't (the list would be useful since it would be easy to access the missing material from EOL and then create articles that way, since the license is compatible). Would anyone with bot-making capacity be able to assist, either by helping user:Bob the wikipedian with making the bot or by making it yourself? Note that this is only to create the list of articles (with the whole [[ ]] between the names, all in a list format, etc.) and not to create the actual articles themselves (those will be created by humans!). Cheers! Calaka ( talk) 13:14, 7 August 2009 (UTC)
Please could someone make a bot which would determine which of the two permitted date formats is predominant in an article (either dd mmmm yyyy or mmmm dd, yyyy), and then converts all the other dates in that article to that format. This would save a vast amount of editing time. Thank you! Alarics ( talk) 19:15, 10 August 2009 (UTC)
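Determining the predominant format first keeps the bot from flip-flopping on mixed articles. A sketch of the detection step only (the rewrite step would then convert the minority dates); names are illustrative, and a tie is treated as "skip this article":

```python
import re

MONTHS = (r"(?:January|February|March|April|May|June|July|"
          r"August|September|October|November|December)")
DMY = re.compile(r"\b\d{1,2} %s \d{4}\b" % MONTHS)   # 4 July 1906
MDY = re.compile(r"\b%s \d{1,2}, \d{4}\b" % MONTHS)  # July 4, 1906

def predominant_format(text):
    """Return 'dmy', 'mdy', or None when counts tie (skip the article)."""
    dmy, mdy = len(DMY.findall(text)), len(MDY.findall(text))
    if dmy == mdy:
        return None
    return "dmy" if dmy > mdy else "mdy"
```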
Hello,
Following a renaming, the Biblical "Codex Vaticanus" (B, 03) is now "Codex Vaticanus Graecus 1209". Can a bot modify each internal link [[Codex Vaticanus]] or [[Codex Vaticanus|B]] into [[Codex Vaticanus Graecus 1209|Codex Vaticanus]] and [[Codex Vaticanus Graecus 1209|B]] respectively, from this list?
Thanks,
Budelberger ( ) 13:51, 12 August 2009 (UTC).
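Both link forms above reduce to one retargeting rule: point the link at the new title while preserving whatever text was displayed. A sketch (function name assumed):

```python
import re

def retarget_links(wikitext, old="Codex Vaticanus",
                   new="Codex Vaticanus Graecus 1209"):
    """Point [[old]] and [[old|label]] at the new title, keeping the
    displayed text unchanged in both cases."""
    def repl(m):
        label = m.group(1) if m.group(1) is not None else old
        return "[[%s|%s]]" % (new, label)
    return re.sub(r"\[\[%s(?:\|([^\]]*))?\]\]" % re.escape(old),
                  repl, wikitext)
```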
Hi,
I am an active user of Hindi Wikipedia (hi)( My user page at hi wiki ). We at Hindi Wikipedia are very short on the number of contributors. Also none of our editors have capability/time to create a bot which I am requesting. So here is the description of the job I wish the new bot to do.
1. It will search for all the articles which do not have the "Discussion" page.
2. It will then create the Discussion page and populate it with a template.
3. This template is a general information template similar to this english wiki template {{ Talkheader}}
4. The Bot would invoke this task periodically (say, every week)
We at Hindi wiki would highly appreciate help of any sort.
Thanks,
Regards,
Gunjan ( talk) 12:26, 13 August 2009 (UTC)
User:Example is a user account used in various places as an example, or as a placeholder that's guaranteed not to exist as a real user. While the account does have legitimate subpages, most new subpage creations are by inexperienced users who were trying to create a sandbox in their own userspace and got lost. Is there any bot that could watch for these, move them to the creator's own userspace, and leave them a nice note about how to find the proper location? — Gavia immer ( talk) 16:43, 14 August 2009 (UTC)
Template wikicite is deprecated. (The functionality of Wikicite has been completely subsumed by the Cite* family of templates). There are now a few hundred articles in which the template call to Wikicite serves no purpose. Many of these fall into one class:
* {{Wikicite | id=Johnson-2000 | reference ={{Cite book | last=Johnson | first= ... }}}}
These can be replaced with
* {{Cite book | last=Johnson | first= ... }}
I think a regular expression based bot could fix these no problem. The edit summary could read
Removing unnecessary template {{tl|Wikicite}}
Thanks. See also notes at Template Talk:Wikiref. ---- CharlesGillingham ( talk) 15:27, 15 August 2009 (UTC)
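The common one-level case shown above can indeed be handled with a regular expression; nested templates inside the Cite call would defeat it and need proper brace matching. A sketch under that one-level assumption:

```python
import re

# {{Wikicite | id=... | reference ={{Cite ...}}}} with exactly one
# level of nesting, as in the common class described above.
WIKICITE = re.compile(
    r"\{\{\s*[Ww]ikicite\s*\|\s*id=[^|{}]*\|\s*reference\s*=\s*"
    r"(\{\{.*?\}\})\s*\}\}",
    re.S,
)

def unwrap_wikicite(wikitext):
    """Replace the deprecated {{Wikicite}} wrapper with its inner
    Cite* template, keeping the inner template verbatim."""
    return WIKICITE.sub(r"\1", wikitext)
```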
Following a discussion on the village pump, {{ Pui}} was changed to place images into categories based on dates (similar to how the proposed deletion category hierarchy is laid out). Only problem is, these categories need to be created. I'm not sure how PROD cats are created. I guess DumbBOT ( BRFA · contribs · actions log · block log · flag log · user rights) creates them. Would it be possible to have PUI cats created in the same way PROD cats are? Protonk ( talk) 08:55, 18 August 2009 (UTC)
Could a bot remove every instance of {{ Former WPFF Article}}? See Wikipedia:Templates for deletion/Log/2009 August 3#Template:FFCOTF candidate. I don't feel like making 200+ AWB edits. -- King of ♥ ♦ ♣ ♠ 17:13, 12 August 2009 (UTC)
There is a need to add "clade Heterobranchia" in every article that contains "informal group Pulmonata" but does not already mention Heterobranchia in its taxobox. Like this: http://en.wikipedia.org/?title=Physella&diff=308737416&oldid=301400813 Such articles can be found within the broad category Category:Gastropod families. -- Snek01 ( talk) 20:26, 18 August 2009 (UTC) Prose in this request was tweaked by Invertzoo ( talk) 21:27, 18 August 2009 (UTC)
Searched for "informal group Pulmonata" and added the clade to all instances. Tim1357 ( talk) 22:54, 18 August 2009 (UTC)
From http://strategy.wikimedia.org/wiki/Proposal:Spider_popular_project_page_interwikis
Please check interwikis to other language versions on important project pages, because someone on IRC during the strategy meeting said that this particular set of interwikis is in bad shape.
Make sure that correct interwikis to and from the Village Pumps, the Help Desks, the Reference Desk, everything else on http://en.wikipedia.org/wiki/Wikipedia:Questions and the project pages which appear on stats.grok.se/en/top (listed below) exist and return good pages with correct titles.
We need a report to show the state of those interwikis, so that problems can be addressed now and if they crop up, perhaps checking weekly or monthly if it's not too much load?
For those that don't exist or return bad page status values when accessed, or which have no page title, can those issues be fixed with a bot?
Please check the wiktionaries too. Thank you! 99.60.0.22 ( talk) 04:24, 19 August 2009 (UTC)
I am doing edits on Katipunan and the {{ Katipunan}} template recently. I am requesting bot assistance to add the following categories to the people mentioned on the Katipunan article and in the {{ Katipunan}} template:
On the other hand, for the objects mentioned in Katipunan, please add:
Thanks for the assisting bot.-- JL 09 Talk to me! 06:45, 19 August 2009 (UTC)
There is a large backlog of images that need their size reduced. I propose that this hypothetical bot reduces the image's resolution by a set amount (perhaps 25%). -- Tim1357 ( talk) 23:23, 17 August 2009 (UTC)
There are several userboxes which have recently been moved from the template namespace to the user namespace per the userbox migration. All transclusions of these userboxes as templates must be replaced with their new locations in userspace. They are listed at Category:Wikipedia GUS userboxes. 95j || talk 22:10, 24 August 2009 (UTC)
Hello,
Can a bot take all the articles in the
Category:Zoroastrianism category and all its subcategories, as well as
Category:Parsis and place a {{
WikiProject Zoroastrianism}} template on the pages that do not have this template?
Warrior
4321 17:37, 21 August 2009 (UTC)
Wikipedia:VPT#Cascading_template_transclusions, until it is archived.
Any usage of the infobox settlement template that uses a flag in the field "subdivision_name" is causing the template to load EVERY flag every time the template loads, and makes it impossible to use What Links Here to see flag template usage. Can someone fix this by making a bot that replaces the flag template usage with the linked country name and a hidden comment that that field should not contain flags?
Look at the What Links Here for Template:MEX as a start to see the extent of the problem. SchmuckyTheCat ( talk) 20:21, 24 August 2009 (UTC)
{{
Flagicon Mexico}}
(or "Mexico" where that is consensual). —Preceding
unsigned comment added by
Rich Farmbrough (
talk •
contribs) 16:51, 26 August 2009 (UTC)
User:ProcseeBot seems to be inactive, and I was wondering if anyone would be able to create a clone, for use on Simple English Wikipedia. Thanks, Majorly talk 18:17, 21 August 2009 (UTC)
Either
Or
because the archives have no history, which makes life difficult. Rich Farmbrough, 15:10, 23 August 2009 (UTC).
Could someone add:
{{talkheader}} {{WikiProject Palaeontology|class=stub|importance=low}}
...to the talk pages of the articles in Category:Synapsid stubs that need it? Abyssal ( talk) 14:19, 26 August 2009 (UTC)
Ok, so I hate it when people start discussions that already exist, so please tell me if there is anything like this that I can research.
I want someone to help me build a bot that fixes dead links. The bot would work like this: it would find a dead link from Wikipedia:Dead external links and look for it in the Internet Archive [archive.org]. Then, it would change the link to the newly found web page. If you have not used archive.org, it works like this: take the URL of any dead link, then simply paste http://web.archive.org/web/ in front of it and press enter. The archive searches for the most recent backup of the page and then produces it. All one must do is replace the dead link in the article with the new web archive URL. Please try this yourself: try any of the links at Wikipedia:Dead external links/404/a and it will work. I have NO programming experience, so please help me with this. Tim1357 ( talk) 03:07, 29 August 2009 (UTC)
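The prefixing step described above is mechanical enough to sketch. A minimal illustration in Python; the function name is hypothetical, not from any existing bot:

```python
# Build a Wayback Machine lookup URL for a dead link, exactly as described
# above: paste http://web.archive.org/web/ in front of the original URL.
WAYBACK_PREFIX = "http://web.archive.org/web/"

def wayback_url(dead_url):
    """Return the URL that asks archive.org for its most recent snapshot."""
    return WAYBACK_PREFIX + dead_url

print(wayback_url("http://example.com/page.html"))
# prints http://web.archive.org/web/http://example.com/page.html
```

A real bot would then fetch this URL, confirm the archive actually holds a snapshot, and only then swap the link in the article.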
I need a bot to noinclude categories on the POTD templates. Basically make this edit to all the subtemplates of Template:POTD. You can see Template:POTD protected/2007-08-08 for the breakage that this is causing. — RockMFR 23:07, 29 August 2009 (UTC)
Hi fellas,
This is a task I tried to carry out with my own bot, but I had some issues programming it so I need your help. Basically, I think it would be interesting to add the CMOnline template and the BrahmsOnline template to the External Links section of contemporary classical music composers.
The first template links to excerpts from sound archives. There is a web service (SOAP method) that can tell the bot if there is content to be linked for a composer.
The second one links to a bio. I will provide a list of composers for which bios are available on the linked site, and the corresponding link in an array.
So basically the task would be:
- parsing categories like "Contemporary music"
- for each composer name, check whether a link is to be added, and add it if it has to be.
I can only think of this as a benefit for the encyclopedia. Regards, -- LeMiklos ( talk) 10:04, 24 August 2009 (UTC)
There are over a hundred such links that can be created (search gives 519 hits). Changes should be from \[\[floodplain]] forest(s?) and floodplain forest(s?) to [[floodplain forest]]\1. I'm working on a translation from it:Foresta inondata. Balabiot ( talk) 13:57, 30 August 2009 (UTC)
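The two substitutions requested above can be expressed directly. A sketch using Python's `re`; the sample sentence is invented for illustration:

```python
import re

text = "The river's [[floodplain]] forests and other floodplain forests flood yearly."

# Case 1: [[floodplain]] forest(s?) -> [[floodplain forest]]\1
text = re.sub(r"\[\[floodplain\]\] forest(s?)", r"[[floodplain forest]]\1", text)
# Case 2: bare floodplain forest(s?) -> [[floodplain forest]]\1, with guards
# so we don't re-match inside links produced by the first substitution.
text = re.sub(r"(?<!\[\[)floodplain forest(s?)(?!\]\])", r"[[floodplain forest]]\1", text)

print(text)
```

Keeping the plural "s" outside the brackets, as in the requested `[[floodplain forest]]\1` form, still renders as one blue link in MediaWiki.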
This would be a user-assisted bot.
The bot would operate on the thousands of genus stub pages that exist on Wikipedia. The bot would load the genus page and assess whether or not the page had a species section. If not, it would go to the corresponding (raw) page on Wikispecies. Then, it would copy all information between the end of the colon on Species: and the first equals mark. Then, it would go back to the Wikipedia article, create a new ==species== heading, and paste the species there. I know this is possible, I just lack the know-how to build it. Tim1357 ( talk) 04:04, 31 August 2009 (UTC)
Since June 2009, the Slovenian geographic information system geopedia.si is also available in English. Could someone please run a bot and update links to the English version of geopedia.si like it has been done for Log pod Mangartom ( diff) or Arčoni ( diff)? The articles to be corrected are mainly located in Category:Cities, towns and villages in Slovenia. Thanks a lot. -- Eleassar my talk 21:08, 1 September 2009 (UTC)
I'm not entirely sure whether this is a bot request or whether it can be done another way... but...
Basically, what I'd like to do is keep track of changes in page size for all articles tagged with {{ WikiProject Pharmacology}} (or that exists in a subcat of Category:Pharmacology articles by quality, whichever's easier) to have an idea of when an article might need its quality assessment changed. This would be done in 2 phases: one massive dump at first, and then an analysis every month or so. The output would be the article's name, the date it was last tagged, the size of the article then, the size of the article at runtime, and the change in size.
The first phase would scan ALL the articles, get their histories, find when they were assessed, compare page sizes, and so on. After that, it would only need to be done on articles updated within the last month (or whatever the frequency we choose is).
The actual criteria are more complex but, basically, it would let us know if there's an article assessed as a stub that is 20KB!
I'd also like it to dump a list of articles over 25KB as candidates for being split.
None of this would involve any editing at all.
Can this be done? I've been searching around a lot, but I can't find any tools or bots that work based on page size!
Thanks, Skittleys ( talk) 19:27, 1 September 2009 (UTC)
mysql> SELECT MIN(rev_len), MAX(rev_len), AVG(rev_len), STD(rev_len)
-> FROM page
-> JOIN revision on page_id = rev_page
-> WHERE /* Specify the page */
-> page_namespace = 0 AND page_title = "Tourette_syndrome"
-> AND /* Date range */ /* From now till 1 Jan 2009 */
-> "20090101004415" < rev_timestamp;
+--------------+--------------+--------------+--------------+
| MIN(rev_len) | MAX(rev_len) | AVG(rev_len) | STD(rev_len) |
+--------------+--------------+--------------+--------------+
| 68345 | 72617 | 70322.1774 | 689.5260 |
+--------------+--------------+--------------+--------------+
1 row in set (0.01 sec)
Rich Farmbrough, 02:57, 3 September 2009 (UTC).
Would this be the right place to make a request for a bot to replace about 1500 transclusions of Template:Importance with Template:Notability?
A few points:
Many thanks in advance for your attention, — Martin ( MSGJ · talk) 13:33, 26 August 2009 (UTC)
Actually SmackBot does this when it comes across the template. I will look into it. Rich Farmbrough, 12:36, 2 September 2009 (UTC).
I'm not sure if this has been proposed here before, but at the WebCite FAQ, they have a suggestion that a Wikipedia bot be developed to provide archive links to URLs.
develop a wikipedia bot which scans new wikipedia articles for cited URLs, submits an archiving request to WebCite®, and then adds a link to the archived URL behind the cited URL
This seems like a feasible idea. Smallman12q ( talk) 12:18, 2 September 2009 (UTC)
There are bots already doing it. But can someone run it for articles transcluding {{ Cleanup-link rot}} and then remove the template from the articles? Thanks. -- Magioladitis ( talk) 00:18, 30 August 2009 (UTC)
Could a bot be written to use Yahoo's Smush.it to optimize(in a lossless way) the most heavily used images? Smallman12q ( talk) 17:01, 2 September 2009 (UTC)
There is a large backlog of images that need their size reduced. I propose that this hypothetical bot reduces the image's resolution by a set amount (perhaps 25%). -- Tim1357 ( talk) 23:23, 17 August 2009 (UTC)
All these images can then be resized to 300px.
Note: this process can be then applied to a number of other uses, such as film posters, dvd covers, and video game covers.
Tim1357 ( talk) 22:08, 3 September 2009 (UTC)
Not sure this is bot work per se but asking here is certainly my best shot at getting this done. Could someone give me a rough count on the number of lists on the en.wiki? There are currently 1500 featured lists and I was wondering how that compared to the total. Better yet, is there something on toolserver that allows counting the number of unique articles in a category and its subcategories? Thanks, Pichpich ( talk) 21:08, 3 September 2009 (UTC)
Hi, the database Municipality Atlas Netherlands ca. 1868 (Kuyper Atlas) has been moved. That's why 500+ external links on the English-language Wikipedia became dead links. It would take a long time to update them one by one, so I was wondering if a bot could be helpful for this task. I don't know much about the technical side, so maybe somebody could help me out.
Example: http://en.wikipedia.org/wiki/Vlodrop, the external link "Map of the former municipality in 1868". The old (dead) link is http://www.kuijsten.de/atlas/li/vlodrop.html; the new link is http://www.atlas1868.nl/li/vlodrop.html
The directory structure is still the same, so if the string "kuijsten.de/atlas" can be changed to "atlas1868.nl" on all the 500+ Wikipedia pages, the links will be alive again!
Regards, Quarium, The Netherlands -- Quarium ( talk) 20:06, 4 September 2009 (UTC)
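Since the directory structure is unchanged, the fix really is a single string substitution in the wikitext. A minimal sketch in plain Python, no bot framework assumed:

```python
OLD = "kuijsten.de/atlas"
NEW = "atlas1868.nl"

def fix_atlas_links(wikitext):
    # Rewrite the moved Kuyper Atlas domain; paths after it stay the same.
    return wikitext.replace(OLD, NEW)

link = "[http://www.kuijsten.de/atlas/li/vlodrop.html Map of the former municipality in 1868]"
print(fix_atlas_links(link))
# prints [http://www.atlas1868.nl/li/vlodrop.html Map of the former municipality in 1868]
```

A bot run would apply `fix_atlas_links` to each page in Category:Cities, towns and villages in Slovenia-style categories that links the old domain.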
Easy to fix; fixing. Rich Farmbrough, 04:00, 7 September 2009 (UTC).
In August, a large batch of stub articles were created by bot for New Zealand rivers, using information from the LINZ website. They are all marked with {{
LINZ}}, showing "accessdate=07/12/09". In New Zealand, that means December 7th. A bot would be useful to change all articles in
Category:Rivers of New Zealand which have that access date on the LINZ template to read "12/07/09" (or "12 July 2009", for that matter). Cheers,
Grutness...
wha? 22:44, 6 September 2009 (UTC)
I don't know if this has been suggested before, but would it be possible to create a bot to move pages because there are lots that have hyphenated titles and should have spaced ndash titles. I have moved many manually in the past (e.g. this fun-filled morning) but it is very tedious. I was thinking this would be a simple task for a bot, and had hoped AWB might do it but I don't think it can. Is there anyone who could make or help a complete bot noob make one as there are hundreds more similar titles that need fixing. Thanks in advance, Rambo's Revenge (talk) 22:41, 5 September 2009 (UTC)
Well it's not urgent so it could run off database dumps. Rich Farmbrough, 19:54, 6 September 2009 (UTC).
This is not a very difficult task in terms of actually getting the list, the difficulty is in choosing whether or not to move the article. Either way, Coding... ( X! · talk) · @988 · 22:43, 7 September 2009 (UTC)
I really appreciate you guys taking this on in ways I don't know. Just to confirm (possibly going over some of what Rich says), I only request the " - "s be dealt with because these should never exist as articles (only as redirects). That solves 1 & 5 (both ignored). Movie & TV titles (2) should be dealt with the same, and redirects (3) should be skipped. I don't know much about coding, but I guess this needs to come first. Target pages shouldn't exist for non-redirects, because they are nearly all moved from hyphen to ndash, and most (all?) who directly create articles at the spaced ndash know to create a convenience redirect. So 4 shouldn't be a problem, but I guess skip if it happens, because if there is a non-redirect hyphenated page and an ndashed one there will most likely be a parallel history problem there too (can these "(4) skipped cases" be categorised if they happen?). Thanks all, Rambo's Revenge (talk) 16:27, 8 September 2009 (UTC)
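With the task restricted to spaced hyphens as clarified above, the title transformation itself is trivial. A sketch; the sample title is hypothetical:

```python
def ndash_title(title):
    # Only " - " is touched, per the request; unspaced hyphens (e.g. "2008-09")
    # are left alone because they may be legitimate.
    return title.replace(" - ", " \u2013 ")  # U+2013 is the en dash

print(ndash_title("Foo - Bar relations"))
```

The hard part, as noted in the discussion, is the move logic around redirects and pre-existing target pages, not the string replacement.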
Ok, this is probably a bit of a pain to program but it would be a big help. Wikipedia:Find-A-Grave famous people is a subproject of WP:MISSING that tries to identify missing articles about notable dead people. There's been quite a lot of work on these lists to classify the individuals, and in many cases articles exist in wikis in other languages. When this is the case, a link has been created to such articles. For instance, at the top of the list Wikipedia:Find-A-Grave famous people/A you find links to the es.wiki articles about Anny Ahlers and Anselmo Aieta. What I would like is to have a bot create translation requests for such entries. I estimate that there are a few hundred of these. I'll check the translation requests manually but I'd be grateful if a bot can generate them. Let me know if you need any extra info for the task. Thanks, Pichpich ( talk) 15:31, 9 September 2009 (UTC)
Many citation templates have parameters such as |journal=[[Science]]. These should be replaced by |journal=[[Science (journal)|Science]].
In general the logic should be
Headbomb { ταλκ κοντριβς – WP Physics} 21:06, 25 August 2009 (UTC)
There are 6,000-odd links to Science; approximately 1% are non-dabbed journal= uses. Across the total un-dabbed pages of the 400 journal titles there are 25,525 links, and we may expect that about 1% are journal= entries at most, so this can be safely AWB'd, being unlikely to reach 200 edits. Rich Farmbrough, 17:45, 8 September 2009 (UTC).
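The per-citation substitution could be sketched as below. The disambiguation mapping shown is an assumed one-entry example, not a complete list:

```python
import re

# Assumed mapping from ambiguous journal links to their disambiguated titles.
DAB = {"Science": "Science (journal)"}

def dab_journals(wikitext):
    def repl(m):
        name = m.group(1)
        target = DAB.get(name)
        if target is None:
            return m.group(0)  # not in the mapping: leave untouched
        return "|journal=[[%s|%s]]" % (target, name)
    return re.sub(r"\|journal=\[\[([^\]|]+)\]\]", repl, wikitext)

print(dab_journals("|journal=[[Science]]"))
# prints |journal=[[Science (journal)|Science]]
```

Unmapped journal names pass through unchanged, which keeps the edit safe for titles that are not actually ambiguous.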
Can't believe there is not a bot that does this already, but I regularly come across links formatted as full URLs, e.g. http://en.wikipedia.org/wiki/Some_Article, that instead should be Some Article. It seems like having a bot make this change would help us with article size issues, would correctly show red vs. blue links, and just be cleaner (and who doesn't like that?). Possible expansion: also convert URLs for non-English WP pages (and other projects) to the appropriate interwiki link. Has this suggestion come up before? Thoughts, ideas, reactions? UnitedStatesian ( talk) 04:37, 9 September 2009 (UTC)
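The conversion itself is a straightforward regex pass. A sketch for the en.wiki case (interwiki handling would extend the same pattern):

```python
import re
import urllib.parse

def urls_to_wikilinks(text):
    # Turn http://en.wikipedia.org/wiki/Some_Article into [[Some Article]],
    # decoding percent-escapes and underscores along the way.
    def repl(m):
        title = urllib.parse.unquote(m.group(1)).replace("_", " ")
        return "[[" + title + "]]"
    return re.sub(r"https?://en\.wikipedia\.org/wiki/([^\s\]|#]+)", repl, text)

print(urls_to_wikilinks("See http://en.wikipedia.org/wiki/Some_Article for details."))
# prints See [[Some Article]] for details.
```

A production bot would need to skip contexts where a wikilink is not wanted, e.g. URLs inside citation templates or external-link parameters.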
The use of the {{ t}} template, outside of meta, is no different than that of {{ tl}}. Even on meta, it's {{ t0}} instead. Given the number of pages which use {{ t}} as {{ tl}}, it'd be good to have a bot change over instances where the former is used to the latter.
-- coldacid ( talk| contrib) 18:45, 10 September 2009 (UTC)
Hi All,
Please tag all articles in Category:Redirects from Digimon's talk pages with {{ WikiProject DIGI|class=redirect}} (note that most are already tagged with {{ WikiProject DIGI}}). This is required as most of the talk pages in Category:WikiProject Digimon articles are for redirects, making it impossible to distinguish redirects from actual articles. This would also allow future categorisation by quality and by importance, as these redirects would not require assessment.
Regards,
G.A.S talk 08:39, 3 September 2009 (UTC)
{{WikiProject DIGI|class=redirect}} prepended on them.
class=redirect using the "find and replace" method.
{{WikiProject DIGI|class=redirect}} prepended on them.
Ok, so I hate it when people start discussions that already exist, so please tell me if there is anything like this that I can research.
I want someone to help me build a bot that fixes dead links. The bot would work like this: it would find a dead link from Wikipedia:Dead external links and look for it in the Internet Archive [archive.org]. Then, it would change the link to the newly found web page. If you have not used archive.org, it works like this: take the URL of any dead link, then simply paste http://web.archive.org/web/ in front of it and press enter. The archive searches for the most recent backup of the page and then produces it. All one must do is replace the dead link in the article with the new web archive URL. Please try this yourself: try any of the links at Wikipedia:Dead external links/404/a and it will work. I have NO programming experience, so please help me with this. Tim1357 ( talk) 03:07, 29 August 2009 (UTC)
"Go" Go to next page in list [What Transcludes this page:Template Dead Link] look for "{{Cite web [:any characters:] |url= [anycharacters one] |accessdate=[:any characters two:] {{ Dead Link}}</ref>" ---we call this "refeference" if not found, "Go"--- it skips and starts over if found: search [:anycharacters Two:] for 4 numbers, starting with eitther 20, or 199 -- as the internet existed only in the 90's and 2000's Copy those 4 numbers --we'll call them year lookup web.archive.org/"year"/url.--we call this "archive" If not exist, "Go"----skips and stars over if exist search artice for "refrence"-----remember? replace with: {{Cite web [:any characters:] |url= [anycharacters one] |accessdate=[:any characters two:] |archice url:"refrence"{{ Dead Link}}</ref> "Go"
I guess persistence pays off, as I will go ahead and put this on my to-do list. That doesn't mean I'll get to it soon as I have several projects I'm working on, but it does mean I will do it. :) -- ThaddeusB ( talk) 01:42, 8 September 2009 (UTC)
Just a suggestion, as I'm all for automation, but I have seen sites change purpose completely as domain names are bought and sold, among other things, and the archive is very fallible.
(*) Log pages
The idea here is
Hi, Tim. Your inquiry got passed to me based on the assumption that when you speak of dead links, you mean to link to content in the Wayback Machine archive of web content.
Please feel free to have your automated checker/link-fixer make whatever requests are necessary. Our blanket robots.txt block is to prevent
indiscriminate crawling, and especially the case where archived content could be mistaken for original sites if indexed by search engines.
I do suggest you use an identifiable User-Agent on such requests, with a contact URL or email address, just in case your volume of requests creates a problem,
though I doubt that will be the case.
Also, please start slow -- say only one request pending at a time -- unless/until you absolutely need to go faster. Let me know if you have any other questions! - Gordon @ IA Web Archive
Tim1357 (
talk) 00:38, 10 September 2009 (UTC)
{{Infobox MMAstats}}
is now converted to call {{
Infobox martial artist}}, and can be substituted. There are over 500 transclusions to deal with, please.
Andy Mabbett (User:Pigsonthewing);
Andy's talk;
Andy's edits 17:40, 10 September 2009 (UTC)
Can a bot be used to put the {{Uncategorized}} tag on pages without categories (logically)? It sounds like it would be editing new pages a lot, too.-- OsirisV ( talk) 20:24, 9 September 2009 (UTC)
The bot should browse through all pages that contain templates with astronomical coordinates (e.g. Template:Starbox_observe, Template:Infobox galaxy).
While parsing the "dec" parameter, please be aware of the sign before the first number, as it can come in many forms, e.g. "-", "—", "−", "&minus;", "+", "&plus;". If the parser is confused, then mark the page for manual review. There are thousands of pages that need to be browsed and fixed. Examples of pages I fixed manually: [9], [10]. Thanks. friendlystar ( talk) 23:01, 13 September 2009 (UTC)
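Sign handling like that can be centralised in one normaliser. A sketch; the function name and return convention are illustrative:

```python
# Recognised leading-sign spellings, entities listed first.
SIGN_TOKENS = [("&minus;", "-"), ("&plus;", "+"),
               ("\u2014", "-"),  # em dash sometimes typed for minus
               ("\u2212", "-"),  # real Unicode minus sign
               ("-", "-"), ("+", "+")]

def normalize_dec(dec):
    """Return (sign, rest) for a declination string, or None when the sign
    is unrecognisable, in which case the page goes to manual review."""
    dec = dec.strip()
    for token, sign in SIGN_TOKENS:
        if dec.startswith(token):
            return sign, dec[len(token):].lstrip()
    if dec[:1].isdigit():
        return "+", dec  # unsigned declinations treated as positive
    return None

print(normalize_dec("\u221241 15 24"))
# prints ('-', '41 15 24')
```

Anything that falls through to `None` matches the "mark page for manual review" branch in the request.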
Something I just thought of; I don't really have the time to develop it myself, but I think it's a worthy goal. We have a few pairs of templates where one is supposed to be replaced by the other: the most prominent example is {{
coord}}
and {{
coord missing}}
: if coordinates are needed on a page, the template is added, and then when the coordinates are found they overwrite the needed template. But I bet there are some pages which have both templates, and there I expect the coord-needed template can be safely removed, certainly semi-auto. I had a SQL query running on the toolserver to see if I could count the situations where this is the case, but it died :( Thoughts?
Happy‑
melon 23:02, 10 September 2009 (UTC)
Would it be possible to write a bot that looked for articles without photos and first checked if they had a link to another-language wiki that used a Commons image, and if that wasn't the case, searched by article name on Commons? It would then produce lists of articles with possible photos and reasons why they might match, much as per Wikipedia:WikiProject Red Link Recovery. We could then set up a project to go through and add the photos or mark the suggestion as spurious, so the bot would also need a facility to suppress suggestions previously marked as wrong. Ϣere SpielChequers 22:24, 16 September 2009 (UTC)
This Wikiproject category is full of articles and subcategories, instead of article talk pages and category talk pages. Can someone please organise a bot to transfer the category tags across to the talk pages? Hesperian 00:00, 15 September 2009 (UTC)
|MAIN_CAT = WikiProject Ancient Near East articles
Is it possible to have all the articles in Category:Sub-Roman Britain added to Category:Sub-Roman Britain task force articles? Thanks. Dougweller ( talk) 13:57, 18 September 2009 (UTC)
Hey, I want to build a list of CERN experiments, but it would be long and tedious, since there are a lot of them. So take a look at my sandbox (and how it's written) to see what sort of end result I'd want to get.
What the bot would need to do is go through http://www.slac.stanford.edu/spires/find/experiments/www2?ee=CERN-NA-001 through http://www.slac.stanford.edu/spires/find/experiments/www2?ee=CERN-NA-63, and retrieve the relevant information. (It's possible that not all the links will produce a result, since the latest experiments are rather new, and the older ones are not all documented).
Obviously this should be done for all the experiments found in the SPIRES database, namely EMU, IS, NA, PS, R, T, UA, and WA experiments. When all's said and done it needs to run up to EMU20 experiment, IS494 experiment (lots missing), NA63 experiment, PS215 experiment (lots missing), R808 experiment (lots missing), T250 experiment (lots missing), UA9 experiment, and WA103 experiment respectively. There'll be a shiny barnstar for whoever codes this. You can make the bot write straight in my User:Headbomb/Sandbox8/NA experiments and so on if you want. Headbomb { ταλκ κοντριβς – WP Physics} 17:46, 14 September 2009 (UTC)