This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
As decided on the WikiProject Basque discussion page, this Basque province's name is Labourd in the English-language Wikipedia, not Lapurdi. We therefore need all the contents of Category:Lapurdi to be moved to Category:Labourd. Thanks in advance. -- Xabier Armendaritz (talk) 12:32, 29 October 2010 (UTC)
I don't know if this has been requested before, but there should be a bot that replaces bare URLs (in those articles that are marked with Template:Bare URLs) with Template:Cite web. Essentially like this: http://www.website.com replaced with {{Cite web |url=http://www.website.com |title=Website title <!-- added by bot --> |accessdate={{subst:DATE}} <!-- added by bot --> }}. — bender235 (talk) 18:56, 29 October 2010 (UTC)
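For illustration, the substitution described above could be sketched in Python. This is only a sketch, not an existing bot: the `fetch_title` callback and the function names are hypothetical, and a real bot would need to skip URLs already inside citation templates or external-link brackets more robustly than this regex does.

```python
import re

CITE = ("{{Cite web |url=%s |title=%s <!-- added by bot --> "
        "|accessdate=%s <!-- added by bot -->}}")

def replace_bare_urls(wikitext, fetch_title, accessdate):
    """Replace bare http(s) URLs (not immediately preceded by a bracket,
    pipe, or equals sign) with a filled-in {{Cite web}}."""
    pattern = re.compile(r'(?<![\[|=])(https?://[^\s<>\[\]|{}]+)')
    return pattern.sub(
        lambda m: CITE % (m.group(1), fetch_title(m.group(1)), accessdate),
        wikitext)
```

The page title would come from fetching each URL, and the access date from the run date, as the request describes.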
WildBot had tagged talk pages of articles which had disambiguation links. However, its operator is busy in real life and these notices have become dated. Using the Toolserver, I have created a list where these tags can be safely removed. We now need a bot to remove those tags (up to 14,000). Any takers? — Dispenser 05:14, 30 October 2010 (UTC)
On occasion, I see articles with English-language references which contain {{en icon}}; see, for example, Task Force Danbi. It can be assumed that English-language sources are the default on the English-language Wikipedia, and so it seems unnecessary to identify a source as being in English. Assuming that large-scale removal of {{en icon}} in these situations would not be controversial, could a bot check for and remove instances of {{en icon}} contained in <ref>...</ref> tags? -- Black Falcon (talk) 06:24, 31 October 2010 (UTC)
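A sketch of the removal described above, assuming plain regex processing of the wikitext; a production bot would use a proper parser to cope with nested templates and odd ref markup.

```python
import re

def strip_en_icon_in_refs(wikitext):
    """Remove {{en icon}} from inside <ref>...</ref> pairs only,
    leaving any occurrences elsewhere on the page untouched."""
    def clean_ref(match):
        body = re.sub(r'\s*\{\{\s*en icon\s*\}\}\s*', ' ', match.group(2))
        return match.group(1) + body.strip() + '</ref>'
    # Match an opening <ref ...> (not self-closing), its body, and the close.
    return re.sub(r'(<ref[^>/]*>)(.*?)</ref>', clean_ref, wikitext,
                  flags=re.DOTALL | re.IGNORECASE)
```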
I'm looking for a very simple bot that searches for a word within pages and deletes it. Can you make it with the word "test"? I can't seem to find a simple program that just does that. Thanks! → ΑΧΧΟΝΝ fi re 21:47, 1 November 2010 (UTC)
user:CitationTool was a very useful bot that isn't running anymore. It automated the job of archiving external links in citations, among other tasks. The source code has been released by its owner, User:Lulu of the Lotus-Eaters. See User talk:CitationTool#How do I make it go? Is anyone willing to take over this bot and get it going again? A standalone tool with the same functions would also be useful. Will Beback talk 00:07, 4 November 2010 (UTC)
The usage instructions for Template:Talk header indicate that the template "should not be added to otherwise empty talk pages". Could a bot check transclusions of the template (and its redirects, see here) and generate two lists:
Furthermore, could the bot delete all pages in the first list (i.e., where the only edit was to add the template to an otherwise empty page)?
Thank you, -- Black Falcon ( talk) 03:38, 4 November 2010 (UTC)
SELECT IF(rev_parent_id=0, "Single", "Many") AS "revs",
CONCAT('[[Talk:', page_title, ']]')
FROM page
JOIN templatelinks ON tl_from=page_id
JOIN revision ON rev_id=page_latest
WHERE tl_namespace=10 /*Template:*/ AND tl_title="Talk_header"
AND page_namespace= 1 /*Talk: */ AND page_len < 30;
The Romanian Wikipedia has implemented a new policy this year of transitioning to the correct versions of ș and ț. The transition has practically been completed on the ro.wiki. I have started my own miniproject to modify the diacritics manually whenever I find them, but it's way too tedious for articles belonging to categories like Romanian place names, personalities, etc., where there are a lot of characters to change and pages to move. I recommend building a bot based on Strainu's bot for the en.wiki (and possibly extending it to other wikis) that will correct the diacritics in articles within certain categories and move them if necessary. Ayceman (talk)
Hi. The mentioned infobox went through a long history of renamings and mergers. All articles using the old names and boxes use old fields which need to be updated. There are five old templates (whose links have now been replaced):
These templates use old parameters which are now included in {{Infobox power station}} as DEPRECATED (in edit mode). These parameters should be found and replaced by the given parameters. Could this be done?
If so, could someone join Template talk:Infobox power station to help discuss the updates required? Kind regards. Reh man 13:05, 7 November 2010 (UTC)
Per Talk:Highways in Croatia#Recent title changes, a mess has been created by Nono64 (talk · contribs), which is a huge pain to undo manually. For example, A1 (Croatia) was moved to A1 road (Croatia) and then to A1 autocesta (Croatia). Now that I have moved it back to A1 (Croatia), the talk page move also needs to be reverted manually because the destination page already exists, doubling the amount of work. I'd appreciate it if someone made a bot that would just roll back all moves done by Nono64 to articles named "[A-Z][0-9]+ (Croatia)". Afterwards we might actually use the same kind of bot to pursue a similar move, but with the distinct difference that it would be a product of consensus and not a random user's whim. -- Joy [shallot] (talk) 16:17, 20 November 2010 (UTC)
User:RSElectionBot was employed for the purposes of maintaining this voter log for last year's ArbCom elections. We'd like to have the same thing running this year, but the bot's operator, User:Rspeer, seems to have taken a leave of absence. I imagine it's a relatively simple bot to run; would someone here be able to volunteer? Skomorokh 14:00, 15 November 2010 (UTC)
On behalf of the Indian WikiProject, I'd like to request if a bot could be made available to perform the delivery of the project's newsletter to its participants. The newsletter is located here and the list of participants on whose talk pages it is to be delivered is here. Any help would be appreciated, since our regular bot is down for a while. Regards, SBC-YPR ( talk) 13:07, 22 November 2010 (UTC)
Hello. Some time ago, SoxBot stopped monitoring new usernames for potential username violations. The bot-op has been pinged a couple of times ( here), but there hasn't been a response. Is it possible to have a new bot monitor usernames? Thanks! TN X Man 15:59, 17 November 2010 (UTC)
Would it be possible to create a bot that can unlink pages that have been deleted via an XfD discussion without having to resort to using Twinkle to do it instead? :| TelCo NaSp Ve :| 00:52, 21 November 2010 (UTC)
Per the discussion at Wikipedia:VPR#Input_randomisation (and others in the past which were rather more enthusiastic, but I forget where), can we please have a bot which will, in essence, randomly assign requests for input to RFC to members of a list of people who want such requests.
Some elements of how this will work:
Any takers? Rd232 talk 01:17, 27 November 2010 (UTC)
The service will manifest itself in the form of Wikipedia:Requests for comment/Comment duty. harej 03:28, 27 November 2010 (UTC)
This bot request is still outstanding, the original offer to complete it having lapsed. If it's too big a job for one person, perhaps it could be sub-divided? Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 12:08, 28 November 2010 (UTC)
Okay, so some of these are false positives, but I have been finding many uses of {{start box}} without the corresponding {{end box}} or {{end}} or whatever. This often goes unnoticed, since it is at the end of the page, but it is a real problem if it happens before the PERSONDATA template, since it then makes the "person data" information visible. Now, there are many different "end" templates which can effectively close the {{start box}}, which is why finding the open ones is a bit difficult. However, I was able to come up with this as a first attempt at a list. Perhaps someone can refine it. I would imagine that with such a list, it would be a relatively straightforward task with AWB. Plastikspork ―Œ(talk) 04:50, 2 December 2010 (UTC)
I just updated Template:FeaturedPicture to include a {{{type}}} parameter to distinguish animations and videos from pictures, as they are all nominated via WP:FPC but there are separate categories: Category:Featured videos, Category:Featured animations, and Category:Featured pictures. Could a bot go through Category:Featured videos and Category:Featured animations and add |type=video or |type=animation to Template:FeaturedPicture? /ƒETCHCOMMS/ 03:37, 3 December 2010 (UTC)
|type= parameter? - EdoDodo talk 07:58, 4 December 2010 (UTC)
Is there a way to possibly modify User:SoxBot so that it catches and reports inappropriate usernames that trip one or more of the edit filters listed near the top of WP:UAA and make a note of it in the bot's report instead of waiting till the user(s) edit? :| TelCo NaSp Ve :| 02:30, 5 December 2010 (UTC)
Is there a bot that can tag the talk page of the next higher taxon when a new taxon article is added? It would have to scan for new taxa articles every week or so. Say I add the Senecio biglovii article to Wikipedia one day. Later that week, the bot finds that article, scans its taxobox, finds out that the article I added was a species article, then tags the genus article talk page (Talk:Senecio) with a simple one-line alert, with the proper edit summary "New species article," in a new section:
"The species Senecio biglovii article was created on 2010-12-03."
For a new family article, the bot would tag the order page if it exists; if not, the class page; if not, the phylum page; if not, the kingdom page.
The same with dabs. An article is created whose name contains a word matching an existing disambiguation page, say Manija Fairbanks, and the bot adds the line to either Talk:Fairbanks (disambiguation) or Talk:Fairbanks (surname) (creating the page in the latter case) if the article is included in a biography category and the surname dab exists:
"The Manija Fairbanks article was created on 2010-12-01."
It need not be the same bot for both tasks, and maybe these bots exist already? -- Kleopatra (talk) 18:58, 5 December 2010 (UTC)
I don't know whether this task would be possible for a bot, and I've really come to ask the experts before developing a full request. Today English Heritage appears to have withdrawn the Images of England web site - see discussion at Wikipedia talk:WikiProject UK geography#www.imagesofengland.org.uk. A quick Google search for http://www.imagesofengland.org.uk within the domain http://en.wikipedia.org/ finds 7,610 Wikipedia articles (some of which, e.g. lists, may link to more than one Images of England page). The format of the new URL is very different, but all keep the same reference number within it and add resourceID=5 on the end. Example:
Is this the sort of task a bot could handle?— Rod talk 21:09, 3 December 2010 (UTC)
As per a discussion at WP:Tennis I would like you to help in fixing all the daviscup.com links on English Wikipedia. The official Davis Cup site got revamped causing hundreds of referring links to go dead. Fortunately most of these links have their equivalent on the new site. We've already put together a list of these links and I filtered them to two major categories that encompass 75% of aforementioned links and are still repairable.
Lajbi Holla @ me • CP 23:24, 5 December 2010 (UTC)
MerlLinkBot: als-10: 1 Page; fr-0: 3 Pages; it-0: 92 Pages; pt-0: 108 Pages; es-0: 93 Pages; he-0: 38 Pages; cs-0: 191 Pages; fi-0: 68 Pages; en-0: 29 Pages; nl-0: 15 Pages; de-0: 20 Pages; th-0: 2 Pages; ja-0: 167 Pages; bg-0: 115 Pages; vi-0: 1 Page; ast-0: 3 Pages; ka-0: 1 Page; no-0: 3 Pages; sv-0: 60 Pages; ru-0: 4 Pages; es-0: 4 Pages; ar-0: 15 Pages; ko-0: 5 Pages; kn-0: 2 Pages; ca-0: 5 Pages; sr-0: 23 Pages; tr-0: 4 Pages; ro-0: 12 Pages; zh-0: 7 Pages; pl-0: 36 Pages; ml-10: 1 Page; es-10: 1 Page; nl-10: 1 Page; ca-10: 1 Page; eo-10: 1 Page; fr-0: 71 Pages; pt-0: 24 Pages; eo-0: 1 Page; hr-0: 9 Pages; it-0: 18 Pages; nds-0: 9 Pages; en-0: 145 Pages; id-0: 8 Pages; lv-0: 7 Pages; eu-0: 6 Pages; cv-0: 5 Pages; pl-0: 1 Page; sl-0: 5 Pages; da-0: 4 Pages; uk-0: 3 Pages; mk-0: 2 Pages; ta-0: 2 Pages; te-0: 2 Pages; sh-0: 2 Pages; el-0: 2 Pages; hu-0: 1 Page; gu-0: 1 Page; cy-0: 1 Page; lb-0: 1 Page; hi-0: 1 Page; jv-0: 1 Page; also fixed several templates on other wikis. Merl issimo 05:08, 7 December 2010 (UTC)
Per the discussion at Wikipedia talk:Adopt-a-user#Bot to track progress and pairs, would it be possible to have a bot like that created? Note that there really aren't that many people active in the behind the scenes tasks at adopt-a-user, so those comments are pretty much all we're going to get from the program. The bot's task would be to periodically check a user's contributions for the latest timestamp and report it to a centralized tracking page at adopt-a-user. The centralized page would include every involved user's last edit information (date) and ideally, whether or not they are blocked. Users in the category Category:Wikipedians adopted in Adopt-a-user and other critical categories would be checked for activity so we can know whether or not the pairs are active. Thanks. Netalarm talk 04:12, 8 December 2010 (UTC)
Currently, CRWP articles are being tagged with {{WikiProject Canada}} with the |roads=yes parameter. We've resurrected and expanded {{Canada Roads WikiProject}}. We could use a bot to run through the articles and add the CRWP-specific template using the province, class and importance information from the Canada template. If an article is tagged with multiple provinces, the CRWP template can handle them as province1=XX through province10=XX, where the input is the postal abbreviation. The Ontario articles are tagged with {{WikiProject Ontario Roads}} as well, and those tags can be removed. (That template is being retired in this update.) Any assistance is appreciated. Imzadi1979 → 01:14, 3 December 2010 (UTC)
How about a bot that goes through all of the maintenance categories that have "undated articles/files" sections (e.g. Category:Articles lacking sources), finds the relevant tag in all of the undated articles, and adds the current month to each of them? -- vgmddg ( look | talk | do) 18:41, 2 November 2010 (UTC)
Hello? Anybody there? -- vgmddg ( look | talk | do) 20:30, 9 November 2010 (UTC)
Hello? (again) -- vgmddg ( look | talk | do) 20:32, 19 November 2010 (UTC)
- Go to category in question. ( Category:Articles lacking sources, Category:Articles with topics of unclear notability, etc.)
- Get every list item under section "Pages in category '{{PAGENAME}}'"
- Weed out whitelisted pages that are meant to be at the top level. (For example, the page Wikipedia:Requests for expansion is supposed to be in Category:Articles to be expanded)
- Repeat with every article in the remaining list:
- Go to article in question.
- Scan through entire article for template(s) that don't have dates on them that are supposed to have them. (You could probably talk to User:Rich Farmbrough (maker of SmackBot) for a list of the templates to look for.)
- Add date parameter to offending template(s).
- Return to category and repeat.
Once all of the categories are cleared, the bot will run in the background and wait for another article to be added to one of the categories on its list. The bot would be used for pretty much any category in Category:Wikipedia maintenance categories sorted by month that is added to via a template. I have laid out my plans. All that is needed is for someone to convert it into executable code. -- vgmddg ( look | talk | do) 01:09, 20 November 2010 (UTC)
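The core step of the plan above — stamping an undated maintenance template with the current month — could be sketched like this. The template list here is a three-name placeholder, not the real list (which, per the plan, would come from SmackBot's configuration or similar), and a real bot would handle nested templates more carefully.

```python
import re
from datetime import date

# Placeholder list: the real bot would load the full set of
# dated maintenance templates.
DATED_TEMPLATES = ('Unreferenced', 'Notability', 'Expand')

def add_missing_dates(wikitext, today=None):
    """Append |date=Month YYYY to any listed maintenance template
    that lacks a date parameter."""
    stamp = (today or date.today()).strftime('%B %Y')
    names = '|'.join(DATED_TEMPLATES)

    def fix(match):
        inner = match.group(1)
        if re.search(r'\|\s*date\s*=', inner):
            return match.group(0)  # already dated, leave alone
        return '{{%s|date=%s}}' % (inner, stamp)

    return re.sub(r'\{\{\s*((?:%s)\b[^{}]*)\}\}' % names, fix, wikitext)
```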
|date=month XXXX to |date=Month XXXX and |date=Day Month XXXX to |date=Month XXXX, making the method more efficient. -- Magioladitis (talk) 14:32, 22 November 2010 (UTC)
Wikipedia:Bots/Requests for approval/KarlsenBot 6 asks to perform the task. AWB already fixes some of the common mistakes found on dated templates, improving the success of this task. -- Magioladitis (talk) 00:00, 29 November 2010 (UTC)
|date= for the current date? Anomie ⚔ 21:31, 9 December 2010 (UTC)
I spent my entire day adding more dated templates and more redirects, and fixing many templates, like this one, which were supposed to be dated but weren't. I even fixed Multiple issues a bit. The bot part isn't the only one: we need someone to check all the templates and see if they work fine. -- Magioladitis (talk) 20:14, 12 December 2010 (UTC)
Per Wikipedia:Categories for discussion/Log/2010 November 29#Category:2100, I am formally requesting that categories Category:2031 through Category:2098 be created as follows: Category:20''ab'' should have:
{{ portal|History}}
{{ yearcat}}
{{ Decade category header|decade=20a0}}
[[Category:Years in the future]]
[[Category:Years]]
I argued for deletion of Category:2100, but if it is to be there, let's make it part of a pattern: Category:2099 was created (with incorrect sort tags) during the process, and I corrected the sort tags. I have been unable to get AWB to work for some time, or I'd attempt to do this, myself.
Once this is done, Category:Years in the future should be removed from the decade categories Category:2030s through Category:2090s. — Arthur Rubin (talk) 16:41, 13 December 2010 (UTC)
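The requested page bodies could be generated mechanically; here is a sketch following the layout listed above (actual saving would go through AWB or pywikibot; sort keys are omitted here because the specification above does not give any).

```python
def year_category_text(year):
    """Build the body of Category:<year> following the layout above."""
    decade = year - year % 10
    return '\n'.join([
        '{{portal|History}}',
        '{{yearcat}}',
        '{{Decade category header|decade=%d}}' % decade,
        '[[Category:Years in the future]]',
        '[[Category:Years]]',
    ])

def all_year_categories(start=2031, end=2098):
    """Map each requested category title to its page text."""
    return {'Category:%d' % y: year_category_text(y)
            for y in range(start, end + 1)}
```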
I have generated a list of talk pages with obsolete WildBot tags. They can be safely removed with the following regex:
\{\{(User:WildBot/m01|User:WildBot/msg)\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?
If nothing is left you may wish to speedy delete the page (WildBot used G7-author). — Dispenser 01:21, 24 November 2010 (UTC)
Wildbot would be doing this itself, but I think this edit turned it off. Could someone who is bot-clueful turn it back on? (Or explain what's involved in having it get back to work?)-- W☯W t/ c 20:13, 15 December 2010 (UTC)
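Applied in code, the regex quoted above plus an emptiness check yields the review list Dispenser describes; a sketch (any deletion of emptied pages would of course remain with a human reviewer, per the thread above).

```python
import re

# The removal regex quoted above, verbatim.
WILDBOT_RE = re.compile(
    r'\{\{(User:WildBot/m01|User:WildBot/msg)'
    r'\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?')

def strip_wildbot(wikitext):
    """Return the cleaned text plus a flag marking pages left empty,
    which become candidates for human-reviewed deletion."""
    cleaned = WILDBOT_RE.sub('', wikitext)
    return cleaned, cleaned.strip() == ''
```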
In dealing with WP:LINKROT, one of the suggestions is to use the on-demand web archiving service WebCite. A fine idea, but how often does it really happen that people do that? Linkrot is a particularly pressing problem for BLPs. Our article about Chante Jawan Mallard is a good example. Three of the five references have gone bad, leaving a CNN article and a Snopes page to confirm that she was convicted of an unusual murder. Could a script be written to trawl the external links in BLPs and request archiving for all of them at WebCite? A bot could then place a comment or template on the talk page noting that the webpages have been webcited. I'm not sure yet how we would need to coordinate with WebCite, given the demands this might make on their program.-- Chaser ( talk) 10:44, 12 December 2010 (UTC)
Amongst my routine cleanup, I often come across poorly declared infoboxes (see example). This makes them very editor-unfriendly (for both new and expert editors). So I was wondering if it would be possible to create an "infobox cleaner bot" that would crawl across infoboxes and do the following fixes:
That is, convert
{{Infobox disease | Name = Bob the Magic Disease | Image = FrankAvruchasBozo.JPG | Caption = Magic Caption of Doom | eMedicineSubj = 528 | }} '''Bob the Magic Disease''' is a magic disease named "Bob".
into
{{Infobox disease
 | Name           = Bob the Magic Disease
 | Image          = FrankAvruchasBozo.JPG
 | Caption        = Magic Caption of Doom
 | eMedicineSubj  = med
 | eMedicineTopic = 528
}}
'''Bob the Magic Disease''' is a magic disease named "Bob".
Now, most of these fixes could be incorporated into AWB for general cleanup, and would not warrant an edit on their own, but I think a bot restricting itself to articles which don't have pipes on the left, and/or which have more than one parameter per line, goes beyond general cleanup. A bot editing them would make the infoboxes much more editor-friendly and much MUCH less intimidating to newcomers, and would go a long way in preventing the propagation of horribleness through copy-pasting across different articles. Headbomb { talk / contribs / physics / books} 15:47, 13 December 2010 (UTC)
I have a framework for this somewhere that I used for Infobox Album and Infobox French commune. However, I was looking for somewhat more thorough cleanup. I will keep a watching brief on this while I do other stuff. Rich Farmbrough, 21:20, 15 December 2010 (UTC).
I'm going to have to put this off until sometime in January, there's too much I need to still get done. There are some rules already implemented in the commonfixes library (used by reflinks) to move pipes, but I intend to finish this up eventually. — Dispenser 12:58, 17 December 2010 (UTC)
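As a sketch of the kind of cleanup discussed here, a naive reflow of a one-line infobox into one parameter per line with left-hand pipes might look like this. It is deliberately simplistic: nested templates, piped links, and comments inside parameter values — exactly what the commonfixes rules mentioned above must handle — are ignored.

```python
import re

def reflow_infobox(wikitext):
    """Rewrite a one-line infobox as one '| param = value' per line with
    the pipe on the left. Naive: assumes no nested braces or piped
    links inside parameter values."""
    def fix(match):
        name, body = match.group(1).strip(), match.group(2)
        params = [p.strip() for p in body.split('|') if p.strip()]
        lines = ['{{%s' % name] + [' | %s' % p for p in params] + ['}}']
        return '\n'.join(lines)
    return re.sub(r'\{\{(Infobox [^|{}]+)\|([^{}]*)\}\}', fix, wikitext)
```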
Update articles in a certain category and its subcategories. Example of the work that needs to be done: [6]. Instructions: Wikipedia_talk:WikiProject_Gastropods#Taxonomy_of_Heterobranchia. -- Snek01 (talk) 12:10, 16 December 2010 (UTC)
Add missing taxon. Follow the link: Wikipedia_talk:WikiProject_Gastropods#Error in Littorinimorpha articles by Ganeshbot. -- Snek01 ( talk) 20:00, 20 December 2010 (UTC)
We're not going to require reconfirmation edits. This was a silly idea. Moving on. -- MZMcBride (talk) 19:01, 21 December 2010 (UTC)
The following discussion has been closed. Please do not modify it.
Can we construct some way for message bots to stop sending messages to retired users? For example, some way to turn off EdwardsBot so that it stops sending useless Signpost editions to retired users' talk pages and cluttering the entire page with useless threads? I find it wastes a lot of resources, and the talk page then becomes a dumping ground for these editions; furthermore, it hinders navigation when someone is looking for a particular thread related to the user's history (e.g. an ANI incident) and has to shuffle through the multiple Signpost issues to find the right thread to link to at ANI. To address this, I would prefer that we use some sort of confirmation page to ensure that the recipients of the message bots are still active enough to respond to them. :| TelCo NaSp Ve :| 16:16, 6 December 2010 (UTC)
Uh, the bot is definitely sending this message to active users, e.g. Brad101 and Catalan. May want to tweak your coding. Ed [talk] [majestic titan] 10:13, 21 December 2010 (UTC)
The WPMED project has a list of ~1400 articles currently tagged with {{ unref}} (including redirects and the multiple issues template). Recent experience indicates that about 20% of these actually contain a reference, but nobody bothered to remove the unref tag. Per Template:Unreferenced/doc, the template should not be placed on any page containing any sort of citation.
Would it be easy to have a bot remove the unref template from any article in my list that (1) contains a <ref> tag or (2) contains any URL (http://)? WhatamIdoing ( talk) 23:53, 7 December 2010 (UTC)
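The two checks in this request are easy to express; a sketch, assuming simple regex matching. The template-name alternation here is abbreviated — the real run would cover all redirects of {{unref}} and the multiple-issues case mentioned above.

```python
import re

def should_remove_unref(wikitext):
    """True when the page already cites something: (1) a <ref> tag or
    (2) any URL, per the two checks in the request."""
    return bool(re.search(r'<ref[\s>]', wikitext, re.IGNORECASE)
                or 'http://' in wikitext or 'https://' in wikitext)

def remove_unref(wikitext):
    """Drop the tag when a citation is present; the name list here is
    abbreviated, not the full redirect set."""
    if not should_remove_unref(wikitext):
        return wikitext
    return re.sub(r'\{\{\s*(Unreferenced|Unref)\s*(\|[^{}]*)?\}\}\n?', '',
                  wikitext, flags=re.IGNORECASE)
```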
<ref></ref>, however, is most likely indicative of the article having a real reference. There was a bot proposal some time ago about this; I cannot find it now. — HELLKNOWZ ▎TALK 13:02, 8 December 2010 (UTC)
Similar to the previous request, could someone please have a bot remove the deprecated |article=yes from {{Multiple issues}}? Thanks! GoingBatty (talk) 01:32, 20 December 2010 (UTC)
I'll need a list of all unknown parameters. I worked a bit with the list I got on August 24. Check also User_talk:Magioladitis#FR:_.23New_alert:_Unknown_parameters_to_Multiple_issues. I think we need to remove the |article= by bot (not only with the value yes) and fix the rest manually. I'll start doing it. -- Magioladitis (talk) 10:28, 20 December 2010 (UTC)
Doing... Removing article, do-attempt. Renaming OR and or. -- Magioladitis (talk) 13:40, 21 December 2010 (UTC)
Done Magioladitis (talk) 01:05, 22 December 2010 (UTC)
Could someone please have a bot remove the deprecated |bot=yes from {{BLP unsourced}}? This would allow AWB to make fixes to articles with this template, including merging it into {{Multiple issues}}. Thanks! GoingBatty (talk) 22:31, 19 December 2010 (UTC)
Do we have an estimate of how many pages have this parameter? Is there some tracking category? -- Magioladitis ( talk) 22:41, 19 December 2010 (UTC)
Could someone make another bot to replace the late Article Alert Bot? It would be extremely helpful. Wikipedia:Article alerts/Specification looks like a good reference. Arlen22 ( talk) 20:12, 31 August 2010 (UTC)
Oh, and to give a hint at what needs fixing, it was due to an API change back in April. User:Legoktm may know more about that. Headbomb { talk / contribs / physics / books} 03:35, 2 September 2010 (UTC)
If I could get the source emailed to me via the link on my userpage or talk page, I would gladly fix the issues mentioned above and take over the bot, if someone has not already volunteered. Joe Gazz84 user • talk • contribs • Editor Review 15:31, 4 September 2010 (UTC)
I don't know, this may take a while. Some other WikiProjects have removed Article Alerts because they feel that ArticleAlertBot is no longer active because of Wikipedia API changes. JJ98 ( Talk) 04:02, 18 October 2010 (UTC)
Almost ready! Drummers, get ready! Arlen22 ( talk) 23:16, 22 October 2010 (UTC)
Can we put the code publicly somewhere (meta wiki...?) so that if it goes down again it will be easy to get the source again? -- Piotr Konieczny aka Prokonsul Piotrus| talk 20:02, 17 November 2010 (UTC)
Still working on Java related problems. Arlen22 ( talk) 12:41, 24 November 2010 (UTC)
Since it's been three months now, and there's been plenty of time to update the bot, and it still isn't running, H3llkn0wz and I have decided to begin rewriting the bot from scratch (he codes, I comment). If the original bot is back up again before we're done, great. The new bot will be open-sourced and could be deployed on the other Wikis regardless of whether we use it on the English wikipedia. Headbomb { talk / contribs / physics / books} 02:11, 1 December 2010 (UTC)
We have given up on the old code. All stops pulled, let's go! Arlen22 ( talk) 00:41, 23 December 2010 (UTC)
See above for initial discussion of this problem. User:Magioladitis's Yobot blanked old WildBot tags from, as you can read above, 3782 talkpages. Many of these talkpages contained only the Wildbot tags, which leaves them completely blanked now. Some examples of this behavior from my watchlist: [10] [11] [12] [13]. I originally brought this to M's talkpage, and he asked me to post this here. If the only edits are by Wildbot and then by Yobot, and the page is blanked, the page should be deleted (There is no reason to have a page if there are no tags or discussion headings). If there are intermediate edits, but the page is still blanked, (Like this) then maybe the pages could be listed in a holding tank for review (Which I would be happy to do). I would estimate that a little less than 2000 pages would be affected, but my estimate could be far off. -- Fiftytwo thirty ( talk) 22:00, 1 December 2010 (UTC)
/* SLOW_OK - Empty talk pages with mostly bot edits */
SELECT CONCAT('[[Talk:',page_title,']]'), COUNT(*) AS Edits, COUNT(ug_group) AS BotEdits
FROM page
JOIN revision ON rev_page=page_id
LEFT JOIN user_groups ON ug_user=rev_user AND ug_group="bot"
WHERE page_namespace = 1
AND page_is_redirect = 0
AND page_len IS NOT NULL
AND page_len<5
GROUP BY page_id
HAVING BotEdits > 0 AND BotEdits + 1 >= Edits
Many of these are talk pages of redirects. A simple (and sane) option is to make them redirect to the appropriate talk page. Rich Farmbrough, 21:15, 15 December 2010 (UTC).
Recently I came across a template where {{ collapsible option}} was included in the documentation, but the template was not relaying its {{{state}}} parameter over to {{ navbox}}. It would be a good idea for a bot to monitor and auto-fix or untag these. -- Joy [shallot] ( talk) 10:40, 24 December 2010 (UTC)
There is a discussion at CFD on renaming a large group of categories to use endashes in their titles per WP:DASH.
To put this change in perspective, it would be useful to know how many category titles already use an endash. I tried using a search to get a figure, but it finds nothing at all.
Can anyone with access to bot-like tools do a count, without too much work? -- BrownHairedGirl (talk) • ( contribs) 15:35, 23 December 2010 (UTC)
[^0-9 ] *– *[^0-9 ]
— Dispenser 13:23, 24 December 2010 (UTC)
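A sketch of the requested count, assuming a list of category titles is available (e.g. from a database dump or the category table). The second figure uses a variant of Dispenser's pattern above: a non-digit on each side of the dash, so purely numeric ranges are excluded.

```python
import re

EN_DASH = '\u2013'
# Non-digit, optional spaces, en dash, optional spaces, non-digit:
# skips purely numeric ranges such as "1939-45".
WORDY = re.compile('[^0-9 ] *%s *[^0-9 ]' % EN_DASH)

def count_endash_titles(titles):
    """Return (titles containing an en dash at all,
               titles matching the non-numeric pattern)."""
    any_dash = sum(1 for t in titles if EN_DASH in t)
    non_numeric = sum(1 for t in titles if WORDY.search(t))
    return any_dash, non_numeric
```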
Could someone write, set up, and run a bot equivalent to the one that generated WP:AFDSUM? There have been several requests on the related talk pages for AFDSUM to be restarted since it stopped running. 65.95.13.158 ( talk) 07:51, 21 December 2010 (UTC)
AfD Keep # Delete # Merge # Redirect # Neutral # Expired Filed
for tools:~betacommand/AFD.html and nothing happened (no sort). Can that sort feature be added to/fixed for AFD.html? -- Uzma Gamal (talk) 01:42, 26 December 2010 (UTC)
User:Jasy jatere has requested to add ISO 639-3 language codes to three-letter disambiguations, e.g. a link to Amal language from AAD (aad is the ISO code for this language).
Here is a list of dabs missing such entries. Lines with no link in front of the colon mean that my bot didn't find a dab; they can be ignored (maybe I'll create redirects for these).
My problem is that a bot (as far as I can see) cannot find out which section in a given dab is the right one to add the link to. Any ideas how this could be done (semi-)automatically? Thanks, ἀνυπόδητος ( talk) 09:04, 20 December 2010 (UTC)
I was thinking of writing a film article and thought of asking a Wikipedian for help. The best Wikipedians to ask would seem to be the ones who have contributed to the most articles in Category:FA-Class film articles. There are a bunch of articles in the FA-Class film article category, and I wasn't sure of the easiest way to find those Wikipedians who have contributed to the most FA-Class film articles. Then I thought that it would be great to have such a list. We do have Wikipedia:List of Wikipedians by featured article nominations, but that doesn't associate the editor with the topic. Here is where you guys come in, as I think a bot can help generate such a table. The bot would:
There might be an easier way to do this, so please feel free to take that route. Here is how I see the table being formatted from the above 300 (film) example (if you can think of other stats to add to the table, please do so).
Featured Article | FA-Class film category | Significant contributor to the article | Notes
---|---|---|---
300 (film) | FA-Class film articles | Arcayne |
300 (film) | FA-Class film articles | Alientraveller |
Please place the results in a new subpage in my user space. I'll add it to project space once it is ready. Also, my request at Tips for writing filmologies inspired this bot request as well. -- Uzma Gamal ( talk) 13:26, 24 December 2010 (UTC)
Could someone create a new version of WildBot? The bot hasn't run since September, and Josh Parris, who maintained it, hasn't been around since July. The source code is available at the Toolserver, so you wouldn't need to write everything anew. Nyttend ( talk) 01:53, 17 December 2010 (UTC)
Would a mass upload to Commons of USGS publications, which are PD, be feasible? (I'm asking here since the material is primarily English.) Smallman12q ( talk) 16:15, 25 December 2010 (UTC)
Would it be possible to have a bot go through Filespace and identify local file description pages for a) Commons files and b) non-existent files? The pages should then be added to Category:Description pages missing files. The vast majority of these will be eligible for speedy deletion under WP:CSD#F2 - however, they do need to be reviewed by a human, as there are some legitimate uses ({{FeaturedPicture}} and so forth). Most, though, are either created in error or are simple vandalism. Kelly hi! 18:18, 29 December 2010 (UTC)
Per Wikipedia_talk:AutoWikiBrowser/Feature_requests#Remove_.7B.7Bcoord_missing.7D.7D_if_.7B.7Bcoord.7D.7D_exists, "Speaking of Bharati Bhavan Library, could a feature be added to AWB to remove {{ coord missing}} from an article if the article also contains {{ coord}}? GoingBatty ( talk) 03:34, 29 December 2010 (UTC)"
Since this is a trivial bot job, I made a list of pages having both {{ coord}} and {{ coord missing}}. I found 2,569 pages and I am fixing them right away. -- Magioladitis ( talk) 19:29, 29 December 2010 (UTC)
Done Magioladitis ( talk) 23:56, 29 December 2010 (UTC)
I have no idea how to create or use bots, but after receiving a comment at a Peer Review I wanted to ask if it would be possible to engineer a bot that could locate and fix links that go to a redirect page, so that when users/visitors click on a link in a Wikipedia article they go to the correct page the first time. Since this would only affect links in article pages, it would (in theory, anyway) not affect pages created as redirects, since those would still need to exist to make sure that terms entered into the search box go to the right page. I am aware of WP:NOTBROKEN, but think this could help Wikipedia in the long run by providing a degree of spell checking and by ensuring that article links do not encounter double redirects or redirects that simply return a user or visitor back to the original page. In the interest of fairness I will concede that the bot does not have to be automatic; if at all possible it could be designed to run only on request, but I think that it could be helpful to Wikipedia as a whole. TomStar81 ( Talk) 00:01, 30 December 2010 (UTC)
Hi. I recently learned of the existence of this log. Alison made a request on AN that any time IPBE is granted, it be logged. Easy enough, but I was wondering if there's a way to either a) have a bot get the list up-to-date, b) update the list as the right is granted, or c) both. Thanks! TN X Man 16:28, 24 December 2010 (UTC)
I need a bot that will automatically update a "Hot articles" list for a Wikiproject once a day. The bot should look through all of the articles under the purview of the project and compile a list of the 5 articles receiving the most edits in the last 3 days. For an example of a manually-updated version see Wikipedia:WikiProject_United_States_Public_Policy/Leaderboard/What's_hot. This will more than likely be run on the toolserver. It would also be nice (but not required) if the bot could be configured per project. For example, to show the top 10 articles receiving the most edits in the past 14 days. Kaldari ( talk) 22:42, 28 December 2010 (UTC)
Just be warned, the longer the timespan and the bigger the wikiproject, the longer the tool will take. Tim 1357 talk 03:06, 2 January 2011 (UTC)
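For the ranking step, here is a minimal Python sketch, independent of how the revision data would actually be fetched from the Toolserver. Everything here is illustrative: `revisions` is a hypothetical pre-fetched list of (title, timestamp) pairs for the project's articles, one per edit.

```python
import collections
import datetime

def hot_articles(revisions, days=3, top=5, now=None):
    """Rank a project's articles by edit count over a recent window.

    revisions: iterable of (title, timestamp) pairs, one pair per edit.
    Returns the `top` titles with the most edits in the last `days` days.
    """
    now = now or datetime.datetime.utcnow()
    cutoff = now - datetime.timedelta(days=days)
    # Count only edits inside the window, then take the most-edited titles.
    counts = collections.Counter(t for t, ts in revisions if ts >= cutoff)
    return [title for title, _ in counts.most_common(top)]
```

The per-project configuration mentioned above (top 10 over 14 days, say) maps directly onto the `days` and `top` parameters.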
I'm not sure where to post this suggestion, so please move if it isn't fit for bots. I'd like to suggest a bot run to change "not be able" to "be unable" in articles. Smallman12q ( talk) 00:10, 2 January 2011 (UTC)
250,000 leaked "cables" are expected to be published in the next few years by WikiLeaks. Main story: United States diplomatic cables leak. I checked the whole discussion about the policy for linking to the leaks, discussing their contents etc. There's a huge work going on to keep Contents of the United States diplomatic cables leak updated with summaries of the most relevant cables.
In my opinion, each of these cables is going to trigger important discussions worldwide and affect the future of world diplomacy. All mainstream media are giving frequent reports on the latest releases referring to individual items of the list.
Each cable has a unique ID (e.g. "10MADRID86") that follows a well-defined grammar (see Template:Cablegate). This is the way those cables are referred to in citations by secondary and tertiary sources. In my opinion, it makes sense to create an article for each cable with the ID as the article name. All such articles would contain basic information about the leak, like its origin, the date it was sent and the date it was leaked, its secrecy classification, a link to the WikiLeaks page and a few wikilinks (not the cable content). This is information a bot can fetch from WikiLeaks. Humans can add more information like a summary, background data and reactions to the leak, for those cables that end up impacting more on the public opinion.
I can also take care of creating the bot if there is consensus. I also have a few ideas about the details but let's discuss them only if the community thinks that this is a good idea. It is pretty unusual for me to google for an individual cable on the basis of its "official name" and not find a Wikipedia article explaining what it is. -- MauroVan ( talk) 16:19, 3 January 2011 (UTC)
I oppose the use of bots to create stubs. It's created a quagmire of non-notables specks from atlases, and there's no need to create another set of articles about memos. After each memo has been discussed in multiple independent sources, human editors can weigh those sources and create meaningful articles about them.— Kww( talk) 06:54, 5 January 2011 (UTC)
As discussed at Wikipedia talk:WikiProject National Register of Historic Places#Please change the standard citation to omit the link, there's a need to replace several variations of references to the National Register of Historic Places (NRHP)'s NRIS database with a call to template:NRISref instead. All background work is done, and consensus on the change has been reached. Draft articles being started using two outside systems (Elkman's NRHP infobox generator and my own batches-of-NRHP-articles-generator) are coming in with the template calls. User:Kumioko was helpful in the discussion and might comment here.
What's needed in general is to address 20,000-30,000 instances in wikipedia of references like:
<ref name="nris"> {{cite web|url=http://www.nr.nps.gov/ |title=National Register Information System |date=2009-03-13 |work=National Register of Historic Places |publisher=National Park Service}} </ref>
or in most cases all in one line:
<ref name="nris">{{cite web|url=http://www.nr.nps.gov/ |title=National Register Information System |date=2009-03-13|work=National Register of Historic Places |publisher=National Park Service}}</ref>
and having various dates. Half or more will have 2009-03-13 date. Most of the rest will have 2008-04-14 or 2007-01-23.
These are to be replaced by a call to the template, e.g. <ref name="nris">{{NRISref|version=2009a}}</ref>, where 2009a, 2008a, 2007a are versions specifically programmed into template:NRISref. If the bot operator could identify any other commonly used dates, then versions for those could be added to the template, but this covers most of them.
In many of the articles, there are later invocations of the reference by <ref name=nris/> or by <ref name="nris"/>, which should be left unchanged. If there are multiple outright definitions of the NRIS reference in one article, that's an error to be noted (listed in an errors page?) and fixed manually.
Also there are some cases where a URL other than url= http://www.nr.nps.gov/ is provided. These should be treated as errors and listed or put into a category somehow, too.
Can I provide any more info? Your consideration is appreciated. -- Doncram ( talk) 16:45, 30 December 2010 (UTC)
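For illustration, the matching and date-to-version mapping could be sketched in Python as below. This is an assumption-laden sketch: the target format {{NRISref|version=...}} and the version codes are taken from the discussion in this thread and should be confirmed against the template's documentation, and the regex only tolerates whitespace variation, not reordered parameters.

```python
import re

# Known NRIS snapshot dates -> NRISref version codes (assumed from the thread).
VERSIONS = {"2009-03-13": "2009a", "2008-04-14": "2008a", "2007-01-23": "2007a"}

# Match a full definition of the "nris" reference, tolerating whitespace
# variations; later invocations like <ref name="nris"/> are left untouched.
NRIS_RE = re.compile(
    r'<ref name="?nris"?>\s*\{\{cite web\s*\|url=http://www\.nr\.nps\.gov/\s*'
    r'\|title=National Register Information System\s*\|date=(\d{4}-\d{2}-\d{2})\s*'
    r'\|work=National Register of Historic Places\s*'
    r'\|publisher=National Park Service\}\}\s*</ref>')

def replace_nris(wikitext):
    """Return (new_text, errors); unknown dates are left alone and reported."""
    errors = []
    def repl(m):
        date = m.group(1)
        if date not in VERSIONS:
            errors.append(date)          # leave unrecognised dates for manual review
            return m.group(0)
        return '<ref name="nris">{{NRISref|version=%s}}</ref>' % VERSIONS[date]
    return NRIS_RE.sub(repl, wikitext), errors
```

The `errors` list corresponds to the "errors page" idea above: dates outside the known set are collected rather than rewritten.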
|version=Error (date)
or anything else besides the specific dates we have identified, the template will spit out exactly what you put in. If we want to categorize these later, we can just tack a category onto the end of the code where I made the note. Really, they don't even have to be tagged with an error at all... if they don't have one of the prescribed dates, they will trigger an error anyway. -- Dudemanfellabra ( talk) 17:16, 7 January 2011 (UTC)
user:CitationTool was a very useful bot that isn't running anymore. It automated the job of archiving external links in citations, among other tasks. The source code has been released by its owner, User:Lulu of the Lotus-Eaters. See User talk:CitationTool#How do I make it go? Is anyone willing to take over this bot and get it going again? A standalone tool with the same functions would also be useful. Will Beback talk 00:07, 4 November 2010 (UTC)
The usage instructions for Template:Talk header indicate that the template "should not be added to otherwise empty talk pages". Could a bot check transclusions of the template (and its redirects, see here) and generate two lists:
Furthermore, could the bot delete all pages in the first list (i.e., where the only edit was to add the template to an otherwise empty page)?
Thank you, -- Black Falcon ( talk) 03:38, 4 November 2010 (UTC)
SELECT IF(rev_parent_id=0, "Single", "Many") AS "revs",
CONCAT('[[Talk:', page_title, ']]')
FROM page
JOIN templatelinks ON tl_from=page_id
JOIN revision ON rev_id=page_latest
WHERE tl_namespace=10 /*Template:*/ AND tl_title="Talk_header"
AND page_namespace= 1 /*Talk: */ AND page_len < 30;
The Romanian Wikipedia has implemented a new policy this year of transitioning to the correct versions of ș and ț. The transition has practically been completed on the ro.wiki. I have started my own miniproject to modify the diacritics manually whenever I find them, but it's way too tedious for articles belonging to categories like Romanian place names, personalities, etc., where there are a lot of characters to change and pages to move. I recommend building a bot based on Strainu's bot for the en.wiki (and possibly extending it to other wikis) that will correct the diacritics in articles within certain categories and move them if necessary. Ayceman ( talk)
Hi. The mentioned infobox went through a long history of renamings and mergers. All articles using the old names and boxes use old fields which need to be updated. There are five old templates (whose links are now replaced):
These templates use old parameters which are now included in {{ Infobox power station}} as DEPRECATED (in edit mode). These parameters should be "find and replaced" with the given parameters. Could this be done?
If so, could someone join Template talk:Infobox power station to help discuss the updates required? Kind regards. Reh man 13:05, 7 November 2010 (UTC)
Per Talk:Highways in Croatia#Recent title changes, a mess has been created by Nono64 ( talk · contribs), which is a huge pain to undo manually. For example, A1 (Croatia) was moved to A1 road (Croatia) and then to A1 autocesta (Croatia). Now that I moved it back to A1 (Croatia), the talk page move also needs to be reverted manually because the destination one already exists, doubling the amount of work. I'd appreciate it if someone make a bot that would just roll back all moves done by Nono64 to articles named "[A-Z][0-9]+ (Croatia)". Afterwards we might actually use the same kind of bot to pursue a similar move, but with the distinct difference that it would be a product of consensus and not a random user's whim. -- Joy [shallot] ( talk) 16:17, 20 November 2010 (UTC)
User:RSElectionBot was employed for the purposes of maintaining this voter log for last year's ArbCom elections. We'd like to have the same thing running this year, but the bot's operator, User:Rspeer, seems to have taken a leave of absence. I imagine it's a relatively simple bot to run; would someone here be able to volunteer? Skomorokh 14:00, 15 November 2010 (UTC)
On behalf of the Indian WikiProject, I'd like to request if a bot could be made available to perform the delivery of the project's newsletter to its participants. The newsletter is located here and the list of participants on whose talk pages it is to be delivered is here. Any help would be appreciated, since our regular bot is down for a while. Regards, SBC-YPR ( talk) 13:07, 22 November 2010 (UTC)
Hello. Some time ago, SoxBot stopped monitoring new usernames for potential username violations. The bot-op has been pinged a couple of times ( here), but there hasn't been a response. Is it possible to have a new bot monitor usernames? Thanks! TN X Man 15:59, 17 November 2010 (UTC)
Would it be possible to create a bot that can unlink pages that have been deleted via an XfD discussion without having to resort to using Twinkle to do it instead? :| TelCo NaSp Ve :| 00:52, 21 November 2010 (UTC)
Per the discussion at Wikipedia:VPR#Input_randomisation (and others in the past which were rather more enthusiastic, but I forget where), can we please have a bot which will, in essence, randomly assign requests for input to RFC to members of a list of people who want such requests.
Some elements of how this will work:
Any takers? Rd232 talk 01:17, 27 November 2010 (UTC)
The service will manifest itself in the form of Wikipedia:Requests for comment/Comment duty. harej 03:28, 27 November 2010 (UTC)
This bot request is still outstanding, the original offer to complete it having lapsed. If it's too big a job for one person, perhaps it could be sub-divided? Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 12:08, 28 November 2010 (UTC)
Okay, so some of these are false positives, but I have been finding many uses of {{start box}} without the corresponding {{end box}} or {{ end}} or whatever. This often goes unnoticed, since it is at the end of the page, but it is a real problem if it happens before the PERSONDATA template, since it then makes the "person data" information visible. Now, there are many different "end" templates which can effectively close the {{start box}}, which is why finding the open ones is a bit difficult. However, I was able to come up with this as a first start at a list. Perhaps someone can refine it. I would imagine with such a list, it would be a relatively straightforward task with AWB. Plastikspork ―Œ(talk) 04:50, 2 December 2010 (UTC)
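A rough way to flag candidates is to count openings against closings. A sketch only: END_TEMPLATES here is a hypothetical three-item sample of the many possible "end" templates, so a real scan would need the full list to avoid false positives.

```python
import re

# Hypothetical sample; the real set of succession-box "end" templates is larger.
END_TEMPLATES = ("end box", "end", "s-end")

def _count(name, wikitext):
    # Match {{name}} or {{name|...}} without also matching longer template names.
    pattern = r"\{\{\s*%s\s*(\||\}\})" % re.escape(name)
    return len(re.findall(pattern, wikitext, re.I))

def has_open_start_box(wikitext):
    """True if {{start box}} occurrences outnumber known closing templates."""
    starts = _count("start box", wikitext)
    ends = sum(_count(t, wikitext) for t in END_TEMPLATES)
    return starts > ends
```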
I just updated Template:FeaturedPicture to include a {{{type}}} parameter to distinguish animations and videos from pictures, as they are all nominated via WP:FPC but there are separate categories: Category:Featured videos, Category:Featured animations, and Category:Featured pictures. Could a bot go through Category:Featured videos and Category:Featured animations and add |type=video or |type=animation to Template:FeaturedPicture? / ƒETCHCOMMS / 03:37, 3 December 2010 (UTC)
|type= parameter? - EdoDodo talk 07:58, 4 December 2010 (UTC)
Is there a way to possibly modify User:SoxBot so that it catches and reports inappropriate usernames that trip one or more of the edit filters listed near the top of WP:UAA and make a note of it in the bot's report instead of waiting till the user(s) edit? :| TelCo NaSp Ve :| 02:30, 5 December 2010 (UTC)
Is there a bot that can tag the talk page of the next higher taxon when a new taxon article is added? It would have to scan for new taxa articles every week or so. Say I add the Senecio biglovii article to Wikipedia one day. Later that week, the bot finds that article, scans its taxobox, finds out the article I added was a species article, then tags the genus article talk page ( Talk:Senecio) with a simple one-line alert, with proper edit summary "New species article," in a new section:
"The species Senecio biglovii article was created on 2010-12-03."
For a new family article, the bot would tag the order page if it exists; if not, the class page; if not, the phylum page; if not, the kingdom page.
The same with dabs. An article with a word in the name is created, say Manija Fairbanks, and the bot adds the line to either Talk:Fairbanks (disambiguation) or Talk:Fairbanks (surname) (creating the page in the latter case) if the article is included in a biography category and the surname dab exists:
"The Manija Fairbanks article was created on 2010-12-01."
It need not be the same bot for both tasks, and maybe these bots exist already? -- Kleopatra ( talk) 18:58, 5 December 2010 (UTC)
I don't know whether this task would be possible for a bot, and I've really come to ask the experts before developing a full request. Today English Heritage appears to have withdrawn the Images of England web site - see discussion at Wikipedia talk:WikiProject UK geography#www.imagesofengland.org.uk. A quick Google search for http://www.imagesofengland.org.uk within the domain http://en.wikipedia.org/ finds 7,610 Wikipedia articles (some of which, e.g. lists, may link to more than one Images of England page)! The format of the new URL is very different, but all keep the same reference number within it and add resourceID=5 on the end. Example:
Is this the sort of task a bot could handle?— Rod talk 21:09, 3 December 2010 (UTC)
As per a discussion at WP:Tennis, I would like your help in fixing all the daviscup.com links on the English Wikipedia. The official Davis Cup site was revamped, causing hundreds of referring links to go dead. Fortunately most of these links have equivalents on the new site. We've already put together a list of these links, and I filtered them into two major categories that encompass 75% of the aforementioned links and are still repairable.
Lajbi Holla @ me • CP 23:24, 5 December 2010 (UTC)
MerlLinkBot: als-10: 1 Page; fr-0: 3 Pages; it-0: 92 Pages; pt-0: 108 Pages; es-0: 93 Pages; he-0: 38 Pages; cs-0: 191 Pages; fi-0: 68 Pages; en-0: 29 Pages; nl-0: 15 Pages; de-0: 20 Pages; th-0: 2 Pages; ja-0: 167 Pages; bg-0: 115 Pages; vi-0: 1 Page; ast-0: 3 Pages; ka-0: 1 Page; no-0: 3 Pages; sv-0: 60 Pages; ru-0: 4 Pages; es-0: 4 Pages; ar-0: 15 Pages; ko-0: 5 Pages; kn-0: 2 Pages; ca-0: 5 Pages; sr-0: 23 Pages; tr-0: 4 Pages; ro-0: 12 Pages; zh-0: 7 Pages; pl-0: 36 Pages; ml-10: 1 Page; es-10: 1 Page; nl-10: 1 Page; ca-10: 1 Page; eo-10: 1 Page; fr-0: 71 Pages; pt-0: 24 Pages; eo-0: 1 Page; hr-0: 9 Pages; it-0: 18 Pages; nds-0: 9 Pages; en-0: 145 Pages; id-0: 8 Pages; lv-0: 7 Pages; eu-0: 6 Pages; cv-0: 5 Pages; pl-0: 1 Page; sl-0: 5 Pages; da-0: 4 Pages; uk-0: 3 Pages; mk-0: 2 Pages; ta-0: 2 Pages; te-0: 2 Pages; sh-0: 2 Pages; el-0: 2 Pages; hu-0: 1 Page; gu-0: 1 Page; cy-0: 1 Page; lb-0: 1 Page; hi-0: 1 Page; jv-0: 1 Page; also fixed several templates on other wikis. Merl issimo 05:08, 7 December 2010 (UTC)
Per the discussion at Wikipedia talk:Adopt-a-user#Bot to track progress and pairs, would it be possible to have a bot like that created? Note that there really aren't that many people active in the behind the scenes tasks at adopt-a-user, so those comments are pretty much all we're going to get from the program. The bot's task would be to periodically check a user's contributions for the latest timestamp and report it to a centralized tracking page at adopt-a-user. The centralized page would include every involved user's last edit information (date) and ideally, whether or not they are blocked. Users in the category Category:Wikipedians adopted in Adopt-a-user and other critical categories would be checked for activity so we can know whether or not the pairs are active. Thanks. Netalarm talk 04:12, 8 December 2010 (UTC)
Currently, CRWP articles are being tagged with {{ WikiProject Canada}} with the |roads=yes parameter. We've resurrected and expanded {{ Canada Roads WikiProject}}. We could use a bot to run through the articles and add the CRWP-specific template using the province, class and importance information from the Canada template. If an article is tagged with multiple provinces, the CRWP template can handle them as province1=XX through province10=XX where the input is the postal abbreviation. The Ontario articles are tagged with {{ WikiProject Ontario Roads}} as well, and those tags can be removed. (That template is being retired in this update.) Any assistance is appreciated. Imzadi1979 → 01:14, 3 December 2010 (UTC)
How about a bot that goes through all of the Maintenance categories that have "undated articles/files" sections (i.e. Category:Articles lacking sources), finds the relevant tag in all of the undated articles, and adds the current month to each of them. -- vgmddg ( look | talk | do) 18:41, 2 November 2010 (UTC)
Hello? Anybody there? -- vgmddg ( look | talk | do) 20:30, 9 November 2010 (UTC)
Hello? (again) -- vgmddg ( look | talk | do) 20:32, 19 November 2010 (UTC)
- Go to category in question. ( Category:Articles lacking sources, Category:Articles with topics of unclear notability, etc.)
- Get every list item under section "Pages in category '{{PAGENAME}}'"
- Weed out whitelisted pages that are meant to be at the top level. (For example, the page Wikipedia:Requests for expansion is supposed to be in Category:Articles to be expanded)
- Repeat with every article in the remaining list:
- Go to article in question.
- Scan through the entire article for template(s) that are supposed to have dates but don't. (You could probably talk to User:Rich Farmbrough (maker of SmackBot) for a list of the templates to look for.)
- Add date parameter to offending template(s).
- Return to category and repeat.
Once all of the categories are cleared, the bot will run in the background and wait for another article to be added to one of the categories on its list. The bot would be used for pretty much any category in Category:Wikipedia maintenance categories sorted by month that is added to via template. I have laid out my plans. All that is needed is for someone to convert them into executable code. -- vgmddg ( look | talk | do) 01:09, 20 November 2010 (UTC)
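The per-article step of the plan above could be sketched like this. A simplified sketch only: it dates just those tags that have no other parameters, and DATED is a hypothetical sample rather than the real (much longer) template list.

```python
import datetime
import re

# Hypothetical sample; the real list of dated maintenance templates is long.
DATED = ["Unreferenced", "Notability", "Expand"]

def add_dates(wikitext, today=None):
    """Add |date=Month Year to bare, undated maintenance templates."""
    today = today or datetime.date.today()
    stamp = today.strftime("%B %Y")          # e.g. "November 2010"
    for name in DATED:
        # Only {{Template}} with no parameters; parameterized tags that lack
        # a date= would need a real template parser.
        pattern = re.compile(r"\{\{\s*%s\s*\}\}" % re.escape(name), re.I)
        wikitext = pattern.sub("{{%s|date=%s}}" % (name, stamp), wikitext)
    return wikitext
```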
|date=month XXXX to |date=Month XXXX and |date=Day Month XXXX to |date=Month XXXX, making the method more efficient. -- Magioladitis ( talk) 14:32, 22 November 2010 (UTC)
Wikipedia:Bots/Requests for approval/KarlsenBot 6 asks to perform the task. AWB already fixes some of the common mistakes found on dated templates, improving the success of this task. -- Magioladitis ( talk) 00:00, 29 November 2010 (UTC)
|date= for the current date? Anomie ⚔ 21:31, 9 December 2010 (UTC)
I spent my entire day adding more dated templates and redirects and fixing many templates, like this one, which were supposed to be dated but weren't. I even fixed Multiple issues a bit. The bot part isn't the only one: we need someone to check all the templates and see if they work fine. -- Magioladitis ( talk) 20:14, 12 December 2010 (UTC)
Per Wikipedia:Categories for discussion/Log/2010 November 29#Category:2100, I am formally requesting that categories Category:2031 through Category:2098 be created as follows: Category:20''ab'' should have:
{{ portal|History}}
{{ yearcat}}
{{ Decade category header|decade=20a0}}
[[Category:Years in the future]]
[[Category:Years]]
I argued for deletion of Category:2100, but if it is to be there, let's make it part of a pattern: Category:2099 was created (with incorrect sort tags) during the process, and I corrected the sort tags. I have been unable to get AWB to work for some time, or I'd attempt to do this, myself.
Once this is done, Category:Years in the future should be removed from the decade categories Category:2030s through Category:2090s. — Arthur Rubin (talk) 16:41, 13 December 2010 (UTC)
I have generated a list of talk pages with obsolete WildBot tags. They can be safely removed with the following regex:
\{\{(User:WildBot/m01|User:WildBot/msg)\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?
If nothing is left you may wish to speedy delete the page (WildBot used G7-author). — Dispenser 01:21, 24 November 2010 (UTC)
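For illustration, Dispenser's regex drops straight into Python's re module. A sketch only; the real bot would still need to decide whether the page that remains is empty enough to qualify for speedy deletion.

```python
import re

# Dispenser's pattern, verbatim: matches a WildBot tag including any nested
# User:WildBot/ sub-templates, plus an optional trailing newline.
WILDBOT_RE = re.compile(
    r"\{\{(User:WildBot/m01|User:WildBot/msg)"
    r"\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?")

def strip_wildbot(talk_text):
    """Remove obsolete WildBot tags; the caller checks what is left."""
    return WILDBOT_RE.sub("", talk_text)
```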
Wildbot would be doing this itself, but I think this edit turned it off. Could someone who is bot-clueful turn it back on? (Or explain what's involved in having it get back to work?)-- W☯W t/ c 20:13, 15 December 2010 (UTC)
In dealing with WP:LINKROT, one of the suggestions is to use the on-demand web archiving service WebCite. A fine idea, but how often does it really happen that people do that? Linkrot is a particularly pressing problem for BLPs. Our article about Chante Jawan Mallard is a good example. Three of the five references have gone bad, leaving a CNN article and a Snopes page to confirm that she was convicted of an unusual murder. Could a script be written to trawl the external links in BLPs and request archiving for all of them at WebCite? A bot could then place a comment or template on the talk page noting that the webpages have been webcited. I'm not sure yet how we would need to coordinate with WebCite, given the demands this might make on their program.-- Chaser ( talk) 10:44, 12 December 2010 (UTC)
Amongst my routine cleanup, I often come across poorly declared infoboxes (see example). This makes them very editor-unfriendly (for both new and expert editors). So I was wondering if it would be possible to create an "infobox cleaner bot" that would crawl across infoboxes and make the following fixes:
That is, convert
{{Infobox disease | Name = Bob the Magic Disease | Image = FrankAvruchasBozo.JPG | Caption = Magic Caption of Doom | eMedicineSubj = 528 | }} '''Bob the Magic Disease''' is a magic disease named "Bob".
into
{{Infobox disease | Name = Bob the Magic Disease | Image = FrankAvruchasBozo.JPG | Caption = Magic Caption of Doom | eMedicineSubj = med | eMedicineTopic = 528 }} '''Bob the Magic Disease''' is a magic disease named "Bob".
Now most of these fixes could be incorporated into AWB for general cleanup, and would not warrant an edit on their own, but I think a bot restricting itself to articles which don't have pipes on the left, and/or which have more than one parameter per line, goes beyond general cleanup. A bot editing them would make the infoboxes much more editor-friendly and much MUCH less intimidating to newcomers, and would go a long way in preventing the propagation of horribleness through copy-pasting across different articles. Headbomb { talk / contribs / physics / books} 15:47, 13 December 2010 (UTC)
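The layout part of the cleanup (one parameter per line, pipes on the left) could be sketched as below. This naive version assumes the infobox contains no nested templates or piped links, and it makes no attempt at semantic fixes such as splitting eMedicineSubj into subject and topic.

```python
import re

def reflow_infobox(text):
    """Put each |param= of a leading single-line infobox on its own line.

    Naive sketch: splits on '|', so nested templates and piped wikilinks
    inside parameter values would break it; a real bot needs a parser.
    """
    m = re.match(r"\{\{(Infobox [^|}]+?)\s*\|(.*?)\}\}", text, re.S)
    if not m:
        return text
    name, params = m.group(1), m.group(2)
    lines = ["{{" + name.strip()]
    for p in params.split("|"):
        p = p.strip()
        if p:                      # drop empty trailing parameters like "| }}"
            lines.append("| " + p)
    lines.append("}}")
    return "\n".join(lines) + text[m.end():]
```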
I have a framework for this somewhere that I used for Infobox Album and Infobox French commune. However I was looking for somewhat more thorough cleanup. I will keep a watching brief on this while I do other stuff. Rich Farmbrough, 21:20, 15 December 2010 (UTC).
I'm going to have to put this off until sometime in January, there's too much I need to still get done. There are some rules already implemented in the commonfixes library (used by reflinks) to move pipes, but I intend to finish this up eventually. — Dispenser 12:58, 17 December 2010 (UTC)
Update articles in certain category/subcategories. Example of work needs to be done: [6]. Instructions: Wikipedia_talk:WikiProject_Gastropods#Taxonomy_of_Heterobranchia. -- Snek01 ( talk) 12:10, 16 December 2010 (UTC)
Add missing taxon. Follow the link: Wikipedia_talk:WikiProject_Gastropods#Error in Littorinimorpha articles by Ganeshbot. -- Snek01 ( talk) 20:00, 20 December 2010 (UTC)
We're not going to require reconfirmation edits. This was a silly idea. Moving on. -- MZMcBride ( talk) 19:01, 21 December 2010 (UTC)
The following discussion has been closed. Please do not modify it.
Can we construct some way for messagebots to stop sending messages to retired users? For example, if there was some way to turn off EdwardsBot so that it could just stop sending useless Signpost editions to some retired users' talkpage and clutter the entire page with useless threads? I find it wastes a lot of resources and the talkpage would then become some dumping ground for these editions, and furthermore, it hinders someone's navigation when they are looking for a particular thread related to the users' histories (e.g. an ANI incident) and have to shuffle through the multiple Signpost articles to look for the right thread to link to ANI. To compensate for these, I would prefer that we use some sort of confirmation page to ensure that the recipients of the message bots are still active enough to respond to them. :| TelCo NaSp Ve :| 16:16, 6 December 2010 (UTC)
Uh, the bot is definitely sending this message to active users, e.g. Brad101 and Catalan. May want to tweak your coding. Ed [talk] [majestic titan] 10:13, 21 December 2010 (UTC)
The WPMED project has a list of ~1400 articles currently tagged with {{ unref}} (including redirects and the multiple issues template). Recent experience indicates that about 20% of these actually contain a reference, but nobody bothered to remove the unref tag. Per Template:Unreferenced/doc, the template should not be placed on any page containing any sort of citation.
Would it be easy to have a bot remove the unref template from any article in my list that (1) contains a <ref> tag or (2) contains any URL (http://)? WhatamIdoing ( talk) 23:53, 7 December 2010 (UTC)
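The check WhatamIdoing describes is simple enough to sketch. Assumptions: the tag is matched by a rough regex rather than a proper template parser, and only the "Unreferenced"/"Unref" names are covered, not every redirect or the {{Multiple issues}} embedding.

```python
import re

# Rough match for the unref tag with any parameters, e.g. {{Unreferenced|date=...}}.
UNREF_RE = re.compile(r"\{\{\s*(Unreferenced|Unref)\b[^}]*\}\}\n?", re.I)

def maybe_remove_unref(wikitext):
    """Drop the {{unref}} tag if the article shows any sign of sourcing:
    (1) a <ref> tag, or (2) any http:// URL."""
    if "<ref" in wikitext.lower() or "http://" in wikitext:
        return UNREF_RE.sub("", wikitext)
    return wikitext
```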
<ref></ref>, however, is most likely indicative of the article having a real reference. There was a bot proposal some time ago about this; I cannot find it now. — HELLKNOWZ ▎ TALK 13:02, 8 December 2010 (UTC)
Similar to the previous request, could someone please have a bot remove the deprecated |article=yes from {{ Multiple issues}}? Thanks! GoingBatty ( talk) 01:32, 20 December 2010 (UTC)
I'll need a list of all unknown parameters. I worked a bit with the list I got on August 24. Check also User_talk:Magioladitis#FR:_.23New_alert:_Unknown_parameters_to_Multiple_issues. I think we need to remove the |article= by bot (not only with value yes) and fix the rest manually. I'll start doing it. -- Magioladitis ( talk) 10:28, 20 December 2010 (UTC)
Doing... Removing article, do-attempt. Renaming OR and or. -- Magioladitis ( talk) 13:40, 21 December 2010 (UTC) Done Magioladitis ( talk) 01:05, 22 December 2010 (UTC)
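The removal Magioladitis describes (dropping |article= regardless of its value) could be sketched as follows; a sketch that assumes at most one level of template nesting inside {{Multiple issues}}.

```python
import re

def drop_article_param(wikitext):
    """Remove the deprecated |article=... parameter from {{Multiple issues}}."""
    def repl(m):
        # Strip the article parameter from the captured parameter string.
        inner = re.sub(r"\|\s*article\s*=\s*[^|}]*", "", m.group(1))
        return "{{Multiple issues" + inner + "}}"
    # Match the template body, allowing one level of nested {{...}}.
    return re.sub(r"\{\{\s*Multiple issues((?:[^{}]|\{\{[^{}]*\}\})*)\}\}",
                  repl, wikitext, flags=re.I)
```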
|bot=yes from {{ BLP unsourced}}
Could someone please have a bot remove the deprecated |bot=yes from {{ BLP unsourced}}? This would allow AWB to make fixes to articles with this template, including merging it into {{ Multiple issues}}. Thanks! GoingBatty ( talk) 22:31, 19 December 2010 (UTC)
Do we have an estimate of how many pages have this parameter? Is there some tracking category? -- Magioladitis ( talk) 22:41, 19 December 2010 (UTC)
Could someone make another bot to replace the late Article Alert Bot? It would be extremely helpful. Wikipedia:Article alerts/Specification looks like a good reference. Arlen22 ( talk) 20:12, 31 August 2010 (UTC)
Oh, and to give a hint at what needs fixing, it was due to an API change back in April. User:Legoktm may know more about that. Headbomb { talk / contribs / physics / books} 03:35, 2 September 2010 (UTC)
If I could get the source emailed to me, via the link on my userpage or talk page, I would gladly fix the issues mentioned above and take over the bot if someone has not already volunteered. Joe Gazz84 user• talk• contribs• Editor Review 15:31, 4 September 2010 (UTC)
I don't know; this may take a while. Some other WikiProjects have removed Article Alerts because they feel that ArticleAlertBot is no longer active due to Wikipedia API changes. JJ98 ( Talk) 04:02, 18 October 2010 (UTC)
Almost ready! Drummers, get ready! Arlen22 ( talk) 23:16, 22 October 2010 (UTC)
Can we put the code publicly somewhere (meta wiki...?) so that if it goes down again it will be easy to get the source again? -- Piotr Konieczny aka Prokonsul Piotrus| talk 20:02, 17 November 2010 (UTC)
Still working on Java related problems. Arlen22 ( talk) 12:41, 24 November 2010 (UTC)
Since it's been three months now, and there's been plenty of time to update the bot, and it still isn't running, H3llkn0wz and I have decided to begin rewriting the bot from scratch (he codes, I comment). If the original bot is back up again before we're done, great. The new bot will be open-sourced and could be deployed on the other Wikis regardless of whether we use it on the English wikipedia. Headbomb { talk / contribs / physics / books} 02:11, 1 December 2010 (UTC)
We have given up on the old code. All stops pulled, let's go! Arlen22 ( talk) 00:41, 23 December 2010 (UTC)
See above for initial discussion of this problem. User:Magioladitis's Yobot blanked old WildBot tags from, as you can read above, 3782 talkpages. Many of these talkpages contained only the Wildbot tags, which leaves them completely blanked now. Some examples of this behavior from my watchlist: [10] [11] [12] [13]. I originally brought this to M's talkpage, and he asked me to post this here. If the only edits are by Wildbot and then by Yobot, and the page is blanked, the page should be deleted (There is no reason to have a page if there are no tags or discussion headings). If there are intermediate edits, but the page is still blanked, (Like this) then maybe the pages could be listed in a holding tank for review (Which I would be happy to do). I would estimate that a little less than 2000 pages would be affected, but my estimate could be far off. -- Fiftytwo thirty ( talk) 22:00, 1 December 2010 (UTC)
/* SLOW_OK - Empty talk pages with mostly bot edits */
SELECT CONCAT('[[Talk:',page_title,']]'), COUNT(*) AS Edits, COUNT(ug_group) AS BotEdits
FROM page
JOIN revision ON rev_page=page_id
LEFT JOIN user_groups ON ug_user=rev_user AND ug_group="bot" -- flags revisions made by accounts in the bot group
WHERE page_namespace = 1        -- Talk: namespace
AND page_is_redirect = 0
AND page_len IS NOT NULL
AND page_len < 5                -- effectively blank pages
GROUP BY page_id
HAVING BotEdits > 0 AND BotEdits + 1 >= Edits  -- at most one non-bot edit
Many of these are talk pages of redirects. A simple (and sane) option is to make them redirect to the appropriate talk page. Rich Farmbrough, 21:15, 15 December 2010 (UTC).
Recently I came across a template where {{ collapsible option}} was included in the documentation, but the template was not relaying its {{{state}}} parameter over to {{ navbox}}. It would be a good idea for a bot to monitor and auto-fix or untag these. -- Joy [shallot] ( talk) 10:40, 24 December 2010 (UTC)
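A monitoring bot could use a rough check along these lines (a Python sketch of my own; the regex is an assumption about how templates usually relay the parameter, not code from any existing bot):

```python
import re

def relays_state(template_wikitext):
    """Rough check: does this template's wikitext pass a {{{state...}}}
    parameter through to its {{Navbox}} (or similar) call?"""
    return bool(re.search(r'\|\s*state\s*=\s*\{\{\{\s*state', template_wikitext))

# A template containing "|state={{{state|autocollapse}}}" relays the
# parameter; one whose navbox call never mentions {{{state}}} does not,
# and would be a candidate for fixing or untagging.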
There is a discussion at CFD on renaming a large group of categories to use endashes in their titles per WP:DASH.
To put this change in perspective, it would be useful to know how many category titles already use an endash. I tried using a search to get a figure, but it finds nothing at all.
Can anyone with access to bot-like tools do a count, without too much work? -- BrownHairedGirl (talk) • ( contribs) 15:35, 23 December 2010 (UTC)
[^0-9 ] *– *[^0-9 ]. — Dispenser 13:23, 24 December 2010 (UTC)
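For anyone who wants to reproduce the count offline, a Python sketch over a list of category titles (e.g. from a database dump) might look like this; the non-digit guard mirrors Dispenser's expression above, which skips plain year ranges like 1939–1945:

```python
import re

# Endash between non-digit, non-space characters (ignores year ranges).
ENDASH = re.compile(r'[^0-9 ] *– *[^0-9 ]')

def count_endash_titles(category_titles):
    """Count how many category titles contain a 'word – word' endash."""
    return sum(1 for title in category_titles if ENDASH.search(title))
```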
Could someone write, set up, and run a bot equivalent to the one that generated WP:AFDSUM? There have been several requests on the related talk pages for AFDSUM to be restarted since it stopped running. 65.95.13.158 ( talk) 07:51, 21 December 2010 (UTC)
The table at tools:~betacommand/AFD.html has columns AfD | Keep # | Delete # | Merge # | Redirect # | Neutral # | Expired | Filed, but clicking a column header did nothing (no sort). Can that sort feature be added to/fixed for AFD.html? -- Uzma Gamal ( talk) 01:42, 26 December 2010 (UTC)
User:Jasy jatere has requested to add ISO 639-3 language codes to three-letter disambiguations, e.g. a link to Amal language from AAD (aad is the ISO code for this language).
Here is a list of dabs missing such entries. Lines with no link in front of the colon mean that my bot didn't find a dab; they can be ignored (maybe I'll create redirects for these).
My problem is that a bot (as far as I can see) cannot find out which section in a given dab is the right one to add the link to. Any ideas how this could be done (semi)automatically? Thanks, ἀνυπόδητος ( talk) 09:04, 20 December 2010 (UTC)
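One semi-automatic option: let the bot pick a section only when a heading obviously mentions languages, and queue everything else for a human. A Python sketch of that heuristic (my own assumption, not an existing bot's logic):

```python
import re

def pick_language_section(dab_wikitext):
    """Return the heading of the dab section most likely to hold a language
    entry, or None so a human can decide."""
    headings = re.findall(r'^==+\s*(.*?)\s*==+\s*$', dab_wikitext, re.MULTILINE)
    for heading in headings:
        if re.search(r'language|linguistic', heading, re.IGNORECASE):
            return heading
    return None
```

Pages where this returns None (probably the majority) would go on a worklist for manual placement.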
I was thinking of writing a film article and thought of asking a Wikipedian for help. The best Wikipedians to ask would seem to be the ones who have contributed to the most articles in Category:FA-Class film articles. There are a bunch of articles in the FA-Class film article category, and I wasn't sure of the easiest way to find those Wikipedians who have contributed to the most FA-Class film articles. Then I thought that it would be great to have such a list. We do have Wikipedia:List of Wikipedians by featured article nominations, but that doesn't associate the editor with the topic. Here is where you guys come in, as I think a bot can help generate such a table. The bot would:
There might be an easier way to do this, so please feel free to take that route. Here is how I see the table being formatted from the above 300 (film) example (if you can think of other stats to add to the table, please do so).
Featured Article | FA-Class film category | Significant contributor to the article | Notes
---|---|---|---
300 (film) | FA-Class film articles | Arcayne |
300 (film) | FA-Class film articles | Alientraveller |
Please place the results in a new subpage in my user space. I'll add it to project space once it is ready. Also, my request at Tips for writing filmologies inspired this bot request as well. -- Uzma Gamal ( talk) 13:26, 24 December 2010 (UTC)
Could someone create a new version of WildBot? The bot hasn't run since September, and Josh Parris, who maintained it, hasn't been around since July. The source code is available at the Toolserver, so you wouldn't need to write everything anew. Nyttend ( talk) 01:53, 17 December 2010 (UTC)
Would a mass upload of USGS publications which are PD to the commons be feasible? (I'm asking here since the material is primarily English)? Smallman12q ( talk) 16:15, 25 December 2010 (UTC)
Would it be possible to have a bot go through Filespace and identify local file description pages for a) Commons files and b) non-existent files? The pages should then be added to Category:Description pages missing files. The vast majority of these will be eligible for speedy deletion under WP:CSD#F2 - however, they do need to be reviewed by a human, as there are some legit uses ({{ FeaturedPicture}} and so forth). Most, though, are either created in error or are simple vandalism. Kelly hi! 18:18, 29 December 2010 (UTC)
Per Wikipedia_talk:AutoWikiBrowser/Feature_requests#Remove_.7B.7Bcoord_missing.7D.7D_if_.7B.7Bcoord.7D.7D_exists, "Speaking of Bharati Bhavan Library, could a feature be added to AWB to remove {{ coord missing}} from an article if the article also contains {{ coord}}? GoingBatty ( talk) 03:34, 29 December 2010 (UTC)"
Since this is a trivial bot job, I made a list of pages having both {{ coord}} and {{ coord missing}}. I found 2,569 pages and I am fixing right away. -- Magioladitis ( talk) 19:29, 29 December 2010 (UTC)
Done Magioladitis ( talk) 23:56, 29 December 2010 (UTC)
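For the record, the rule is simple enough to sketch in a few lines of Python (function name and exact regexes are my own; AWB's implementation may differ):

```python
import re

def drop_coord_missing(wikitext):
    """Remove {{coord missing}} (any capitalization, any parameters) only
    when the page already transcludes {{coord|...}}."""
    # {{coord| or {{coord}} -- deliberately does NOT match {{coord missing}}.
    has_coord = re.search(r'\{\{\s*coord\s*[|}]', wikitext, re.IGNORECASE)
    if not has_coord:
        return wikitext
    return re.sub(r'\{\{\s*coord[ _]missing[^{}]*\}\}\n?', '', wikitext,
                  flags=re.IGNORECASE)
```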
I have no idea how to create or use bots, but after receiving a comment at a Peer Review I wanted to ask if it would be possible to engineer a bot that could locate and fix links that go to a redirect page, so that when users/visitors click on a link in a Wikipedia article they go to the correct page the first time. Since this would only affect links in article pages, it would (in theory, anyway) not affect pages created as redirects, since those would still need to exist to make sure that terms entered into the search box go to the right page. I am aware of WP:NOTBROKEN, but think this could help Wikipedia in the long run by providing a degree of spell checking and by making sure that article links do not encounter double redirects or redirects that simply return a user or visitor back to the same page. In the interest of fairness, I will concede that the bot does not have to be automatic; if at all possible it could be designed to run only on requested pages, but I think that it could be helpful to Wikipedia as a whole. TomStar81 ( Talk) 00:01, 30 December 2010 (UTC)
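The core operation such a bot would need (whether run automatically or on request, and noting WP:NOTBROKEN generally discourages mass bypassing) is just resolving a redirect to its target. A minimal Python sketch, assuming the bot already has the page wikitext:

```python
import re

# Redirect pages start with "#REDIRECT [[Target]]" (case-insensitive).
REDIRECT_RE = re.compile(r'#REDIRECT\s*\[\[([^\]|]+)', re.IGNORECASE)

def redirect_target(page_wikitext):
    """If the page is a redirect, return its target title, else None.
    A link-fixing bot would map each linked title through this before
    rewriting [[...]] links."""
    m = REDIRECT_RE.match(page_wikitext.lstrip())
    return m.group(1).strip() if m else None
```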
Hi. I recently learned of the existence of this log. Alison made a request on AN that any time IPBE is granted, it be logged. Easy enough, but I was wondering if there's a way to either a) have a bot get the list up-to-date, b) update the list as the right is granted, or c) both. Thanks! TN X Man 16:28, 24 December 2010 (UTC)
I need a bot that will automatically update a "Hot articles" list for a Wikiproject once a day. The bot should look through all of the articles under the purview of the project and compile a list of the 5 articles receiving the most edits in the last 3 days. For an example of a manually-updated version see Wikipedia:WikiProject_United_States_Public_Policy/Leaderboard/What's_hot. This will more than likely be run on the toolserver. It would also be nice (but not required) if the bot could be configured per project. For example, to show the top 10 articles receiving the most edits in the past 14 days. Kaldari ( talk) 22:42, 28 December 2010 (UTC)
Just be warned, the longer the timespan and the bigger the wikiproject, the longer the tool will take. Tim 1357 talk 03:06, 2 January 2011 (UTC)
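The ranking itself is cheap once the edit data is in hand; the expensive part is pulling the revisions for a big project, as Tim notes. A Python sketch of the leaderboard logic (names and the `(title, timestamp)` input shape are my own assumptions):

```python
from collections import Counter
from datetime import datetime, timedelta

def hot_articles(edits, now, days=3, top=5):
    """`edits` is an iterable of (title, timestamp) pairs for a project's
    articles; return the `top` titles with the most edits in the last
    `days` days, most-edited first."""
    cutoff = now - timedelta(days=days)
    counts = Counter(title for title, ts in edits if ts >= cutoff)
    return [title for title, n in counts.most_common(top)]
```

Making `days` and `top` parameters is what would allow the per-project configuration mentioned above.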
I'm not sure where to post this suggestion, so please move if it isn't fit for bots. I'd like to suggest a bot run to change "not be able" to "be unable" in articles. Smallman12q ( talk) 00:10, 2 January 2011 (UTC)
250,000 leaked "cables" are expected to be published in the next few years by WikiLeaks. Main story: United States diplomatic cables leak. I checked the whole discussion about the policy for linking to the leaks, discussing their contents, etc. There's a huge amount of work going on to keep Contents of the United States diplomatic cables leak updated with summaries of the most relevant cables.
In my opinion, each of these cables is going to trigger important discussions worldwide and affect the future of world diplomacy. All mainstream media are giving frequent reports on the latest releases referring to individual items of the list.
Each cable has a unique ID (e.g. "10MADRID86") that follows a well-defined grammar (see Template:Cablegate). This is the way those cables are referred to in citations by secondary and tertiary sources. In my opinion, it makes sense to create an article for each cable with the ID as the article name. All such articles would contain basic information about the leak, like its origin, the date it was sent and the date it was leaked, its secrecy classification, a link to the WikiLeaks page and a few wikilinks (not the cable content). This is information a bot can fetch from WikiLeaks. Humans can add more information like a summary, background data and reactions to the leak, for those cables that end up impacting more on the public opinion.
I can also take care of creating the bot if there is consensus. I also have a few ideas about the details but let's discuss them only if the community thinks that this is a good idea. It is pretty unusual for me to google for an individual cable on the basis of its "official name" and not find a Wikipedia article explaining what it is. -- MauroVan ( talk) 16:19, 3 January 2011 (UTC)
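To make the proposal concrete, parsing the ID is straightforward; a Python sketch (the exact grammar is at Template:Cablegate, and this two-digit-year form is an assumption covering the common case):

```python
import re

# e.g. "10MADRID86" -> year "10", origin "MADRID", serial 86
CABLE_ID = re.compile(r'^(\d{2})([A-Z]+?)(\d+)$')

def parse_cable_id(cable_id):
    """Split a Cablegate ID into (year, origin, serial), or None if it
    doesn't fit the assumed pattern."""
    m = CABLE_ID.match(cable_id)
    if not m:
        return None
    year, origin, serial = m.groups()
    return ('20' + year, origin, int(serial))
```

A creation bot would use these fields to fill the basic facts (origin post, year) in each stub, if the community wants such stubs at all.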
I oppose the use of bots to create stubs. It's created a quagmire of non-notable specks from atlases, and there's no need to create another set of articles about memos. After each memo has been discussed in multiple independent sources, human editors can weigh those sources and create meaningful articles about them.— Kww( talk) 06:54, 5 January 2011 (UTC)
As discussed at Wikipedia_talk:WikiProject National Register of Historic Places#Please change the standard citation to omit the link, there's a need to replace several variations of references to the National Register of Historic Places (NRHP)'s NRIS database with a call to template:NRISref instead. All background work is done, and consensus on the change has been reached. Draft articles being started using two outside systems (Elkman's NRHP infobox generator and my own batches-of-NRHP-articles-generator) are coming in with the template calls. User:Kumioko was helpful in the discussion and might comment here.
What's needed in general is to address 20,000-30,000 instances in wikipedia of references like:
<ref name="nris"> {{cite web|url=http://www.nr.nps.gov/ |title=National Register Information System |date=2009-03-13 |work=National Register of Historic Places |publisher=National Park Service}} </ref>
or in most cases all in one line:
<ref name="nris">{{cite web|url=http://www.nr.nps.gov/ |title=National Register Information System |date=2009-03-13|work=National Register of Historic Places |publisher=National Park Service}}</ref>
and having various dates. Half or more will have 2009-03-13 date. Most of the rest will have 2008-04-14 or 2007-01-23.
These are to be replaced by:
where 2009a, 2008a, 2007a are versions specifically programmed in the template:NRISref. If the bot operator could identify any other commonly used dates, then versions for those could be added to the template, but this is most of them.
In many of the articles, there are later invocations of the reference by <ref name=nris/> or by <ref name="nris"/>, which should be left unchanged. If there are multiple outright definitions of the NRIS reference in one article, that's an error to be noted (listed in an errors page?) and fixed manually.
Also there are some cases where a URL other than url= http://www.nr.nps.gov/ is provided. These should be treated as errors and listed or put into a category somehow, too.
Can I provide any more info? Your consideration is appreciated. -- Doncram ( talk) 16:45, 30 December 2010 (UTC)
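A Python sketch of the whole job as described (the regex covers only the two spacing variants shown above, and the exact call `{{NRISref|version=...}}` is my assumption about the template's interface):

```python
import re

# NRIS snapshot dates -> NRISref versions, from the discussion above.
VERSIONS = {'2009-03-13': '2009a', '2008-04-14': '2008a', '2007-01-23': '2007a'}

NRIS_REF = re.compile(
    r'<ref name="?nris"?>\s*\{\{cite web\s*\|url=http://www\.nr\.nps\.gov/\s*'
    r'\|title=National Register Information System\s*\|date=(\d{4}-\d{2}-\d{2})\s*'
    r'\|work=National Register of Historic Places\s*'
    r'\|publisher=National Park Service\}\}\s*</ref>')

def replace_nris(wikitext, errors):
    """Swap full <ref name=nris> definitions for the NRISref template call,
    leave <ref name="nris"/> reuses alone, and log oddities to `errors`
    (unknown dates, multiple outright definitions) for manual review."""
    def repl(m):
        version = VERSIONS.get(m.group(1))
        if version is None:
            errors.append('unknown date: ' + m.group(1))
            return m.group(0)  # leave unrecognized dates untouched
        return '<ref name="nris">{{NRISref|version=%s}}</ref>' % version
    new_text, n = NRIS_REF.subn(repl, wikitext)
    if n > 1:
        errors.append('multiple nris definitions')
    return new_text
```

References with a different URL simply won't match the pattern, so they stay as-is; a separate pass could list those pages for manual fixing.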
If you put |version=Error (date) or anything else besides the specific dates we have identified, the template will spit out exactly what you put in. If we want to categorize these later, we can just tack a category onto the end of the code where I made the note. Really, they don't even have to be tagged with an error at all... if they don't have one of the prescribed dates, they will trigger an error anyway.-- Dudemanfellabra ( talk) 17:16, 7 January 2011 (UTC)