This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Could somebody insert {{mobile IP|[[Vodafone]] UK (212.183.128.0/20)}} at the top of any existing user_talk: page in the aforementioned range? Only, of course, if it is not already there, as in user talk:212.183.140.15. I had hoped to do it manually, but became discouraged when I looked at Special:PrefixIndex/user talk:212.183.128.
It would be especially good if the bot also replaced alternative templates such as {{ shared IP}} or {{ dynamic IP}}. Incnis Mrsi ( talk) 16:22, 29 July 2013 (UTC)
I think a bot that messaged a user when they added a typo to a page would be a good idea and would help nip a lot of typos in the bud. This would work similarly to how BracketBot (by @ A930913:) does now, posting a message to the user's talk page with a link to their edit and a snapshot of all typos on the page. Potentially some typos would be omitted by the bot, particularly those with a high chance of being a false positive. Jamesmcmahon0 ( talk) 14:46, 28 July 2013 (UTC) The message added to the user's talk page should also explain how to use templates such as {{ Typo}}, {{ Not a typo}}, {{ As written}} etc. to reduce false positives in the future. Jamesmcmahon0 ( talk) 14:52, 28 July 2013 (UTC)
Let me offer my opinion as someone with many years of experience correcting spelling mistakes by bot on the Hungarian Wikipedia. (See this page to learn what kind of experience I mean.) I may seem unconstructive, but the best thing is to realize that this task is not worth the effort. Either you include such a small set of typos that the project will be ridiculous, or it will get slower and slower and soon won't be able to keep up with the flow of new edits. On the other hand, as you add newer typos to the same bot, the likelihood of false positives will increase. I began with 2 fixes (small packages containing the patterns of known typical mistakes), and now I have several dozen of them which are run separately, and I am always experimenting to invent new ones. It's not so simple. Bináris ( talk) 20:13, 30 July 2013 (UTC)
Could someone respond at Wikipedia:Bots/Requests for approval/Theo's Little Bot 24, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:05, 30 July 2013 (UTC)
A bot is needed to make a list of all articles in the category Minor planets that, like 11451_Aarongolden, don't have enough in them to pass WP:NASTRO. You might also want to tag each one with whatever tag you judge appropriate or helpful for keeping track of them. You be the judge; just don't delete them or anything like that, just yet.
Please give the list a descriptive title with a date, such as "Minor Planet articles that might fail NASTRO - Phase One" with the date the list was completed.
To explain: at the moment, this is just to know how many such articles there still are, despite earlier efforts, so we can know the size of the situation and discuss what to do, if anything, with the articles. Later, if we decide to pursue it, WP:NASTRO requires a multi-step "good faith effort" to establish notability that you can read about there if you wish; if that fails, NASTRO asks that the articles be converted into redirects to the List of minor planets, with the info in them transferred there, and so on - suffice it to say that it looks like a big job, so let's not get ahead of ourselves. I just offer this second paragraph in case you want to know the reason for this request.
If you need/want more information or more specific instructions or something, please do ask. Chrisrus ( talk) 04:44, 30 July 2013 (UTC)
… {{Infobox astro object}} or {{Infobox planet}}, tagged with {{Notability}} (which should be {{Notability|Astro}}). GoingBatty ( talk) 13:59, 30 July 2013 (UTC)
The bot is generating a list now; be forewarned that it might take a while. Theopolisme ( talk) 19:47, 30 July 2013 (UTC)
It really would be nice to have a bot update Wikipedia:Peer review/backlog/items automatically, rather than relying upon editors to do it. Perhaps there is already a bot for backlogs that can merely have that page added to its tasks.... -- Aunva6 talk - contribs 18:59, 2 August 2013 (UTC)
Over at Wikipedia:Dispute resolution noticeboard we have accrued an ad hoc combination of scripts and templates, assisted by EarwigBot and MiszaBot. In particular, we seem to be asking The Earwig for a lot. He has been very responsive and has been great about our constant stream of requests, but rather than dumping more and more on him I am wondering whether someone who is really good at automation has the time and inclination to do a proper job of re-engineering all of our DRN automation tools from top to bottom. If we manage to get a smooth-running system working, other noticeboards might be interested in using the same system. Is anyone interested in working on this? -- Guy Macon ( talk) 06:59, 28 July 2013 (UTC)
A bot is needed to perform the good-faith effort to establish notability specified in the "Dealing with minor planets" section of NASTRO: Wikipedia:NASTRO#Dealing_with_minor_planets. Here is the list: Wikipedia:Minor_planet_articles_that_might_fail_NASTRO. The bot should add to that list how many hits, if any, the name of each object gets on the database NASTRO specifies.
Thank you for your kind attention to this matter. Chrisrus ( talk) 04:12, 6 August 2013 (UTC)
Hi, while going through many of the medical stubs, I encountered a lot of stubs (and some other articles) which have been turned into redirects but are still classified in the assessment (WPMED) on their talk page as an article. Could a bot set the class to redirect in those articles and remove any importance rating? -- WS ( talk) 14:27, 8 August 2013 (UTC) And, as a secondary request, would it be possible to find all articles which have an Infobox disease, Infobox symptom, Interventions infobox or Diagnostic infobox and tag them with WPMED if not already done? -- WS ( talk) 11:19, 9 August 2013 (UTC)
For some time now, I've been doing this manually, as I edited templates or their documentation, but I've hardly scratched the surface. Can someone assist, please?
The process is:
We should also do the same thing for /sandbox, /testcases and other subpages' talk pages.
This prevents fragmentation of discussion between the various talk pages; particularly as documentation pages seem to be under-watched. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:35, 2 August 2013 (UTC)
Over the past several months, those of us working in WP:SPI have confirmed a whole raft of related paid-editor sockpuppets — over 300 so far, according to SPI reports, and that's probably the tip of the iceberg.
These socks all have a similar editing pattern: make some minor random edits to get autoconfirmed status, sleep for a while, maybe a month or two, and then build a seemingly well-referenced but often-promotional article in their sandbox, which they then move to main space. The identity of the company creating these socks is known but I won't reveal it here.
Right now we're in whack-a-mole mode. The socks aren't identified until they actually post an article into main space and a patroller familiar with the SPI case notices the pattern and reports it.
Therefore, it would be really useful if there were a bot that maintained a page that listed new sandbox creations. Such a list would be most useful if it showed the date of creation, link to the sandbox, and a snippet of the lead sentence.
Patrollers could then more easily report potential socks in advance to SPI, and a checkuser could verify them and block them, before the material goes to main space.
A bot maintaining such a page would help us get a jump on this army of socks, thereby denying them their revenue and perhaps, eventually, convincing this company to work with the community rather than resort to block evasion and other tactics. ~ Amatulić ( talk) 23:03, 7 August 2013 (UTC)
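A minimal sketch (mine, not from the discussion) of the listing step such a bot would perform. The entry shape mirrors what the MediaWiki recentchanges API returns for new-page creations in the User namespace, but the "snippet" field is an assumption here; in practice the lead sentence would have to be fetched separately.

```python
# Sketch of the sandbox-listing step. Input entries are shaped like
# MediaWiki recentchanges results (rctype=new, rcnamespace=2); the
# "snippet" field is a stand-in for a separately fetched lead sentence.

def list_sandbox_creations(changes):
    """Return wiki-table-style rows for newly created user sandboxes."""
    rows = []
    for rc in changes:
        title = rc.get("title", "")
        if not title.endswith("/sandbox"):
            continue  # only User:Foo/sandbox creations are of interest
        snippet = rc.get("snippet", "")[:80]  # first ~80 chars of the lead
        rows.append("| {ts} || [[{t}]] || {s}".format(
            ts=rc.get("timestamp", ""), t=title, s=snippet))
    return rows

sample = [
    {"title": "User:Example/sandbox", "timestamp": "2013-08-07T12:00:00Z",
     "snippet": "Acme Corp is a leading provider of..."},
    {"title": "User:Example", "timestamp": "2013-08-07T12:05:00Z",
     "snippet": "Hello"},
]
for row in list_sandbox_creations(sample):
    print(row)
```

The bot would simply rebuild the report page from these rows on each run, so patrollers always see the newest sandbox creations.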
Instances of {{ Infobox Korean name}} which are underneath a biographical infobox (for example {{ Infobox person}}) need, where possible, to be made a module of that infobox, as in this edit. I'm not asking for anyone to start work immediately, but - so that I can draw up a plan and get consensus - can anyone please advise whether that's something a bot could likely do, or would there be too many false positives? Could we perhaps work on the basis of proximity? Say, "if there is no subheading between them, make the edit"? Or "if nothing but white space separates them"?
Also, would it be possible, please, for someone to draw up a list of articles which use both of the infoboxes named above? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:46, 8 August 2013 (UTC)
Hello, I'm Castigonia, and I want to make a bot request. The reason I want to make a bot is that I want to contribute to Wikipedia and other wikis as much as possible, even when I am not able to do so myself (while sleeping, on vacation, at school, etc.).
If this request were accepted, I would follow the instructions word for word and make a bot that would make major contributions. It would make Wikipedia a better encyclopedia, and improve other wikis as well. I have always looked up to bots such as User:ClueBot NG and now want to create a bot like them.
I hope you accept my request. I am an autoconfirmed user and want to accomplish more on Wikipedia and its sister projects. So please think about your decision and let me know when you have. Castigonia ( talk) 13:45, 10 August 2013 (UTC)
Sorry about that, I really want to make this bot. Thank you for the information! Castigonia ( talk) 14:18, 10 August 2013 (UTC)
Transclusions of Template:Languages of Angola should go below the ref section. — kwami ( talk) 08:31, 14 August 2013 (UTC)
I have been told that T:AH is in use on 33,000 pages (which sounds like a low estimate to me). I need to know how many of these pages have four=no. four=no is used by WP:FOUR to distinguish the WP:FAs that have been WP:GA and WP:DYK according to T:AH but are rejected from four=yes. The four=no pages should populate Category:Wikipedia articles rejected for Four awards in the near future, and if the bot both counted and categorized, that would be optimal, although I just need the count for now. Because of this, it could take weeks for this category to populate itself. The expected number is that between 300 and 1300 of the 3980 FAs have four=no. -- TonyTheTiger ( T/ C/ WP:FOUR/ WP:CHICAGO/ WP:WAWARD) 06:14, 2 August 2013 (UTC)
O.K. I think we are getting close to done. I found a 500-edit sample of my contribution history with 43 four=no edit summary samples, 40 of which are already in the category. I guess we are over 90% of the way, but I still don't know how I will know when we are done.-- TonyTheTiger ( T/ C/ WP:FOUR/ WP:CHICAGO/ WP:WAWARD) 03:30, 14 August 2013 (UTC)
An editor has been tagging the 2000 census data in articles with {{not in reference}} ( [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24]) because the {{GR|2}} citation template now leads to a redirect page at the Census Bureau website. Is there any way this can be fixed with a bot? It affects hundreds of thousands of articles. Thanks for looking into this. 71.139.153.14 ( talk) 14:42, 8 August 2013 (UTC)
Off-topic discussion
The dead link can be fixed in Template:Geographic reference (GR2). I'm almost sure there is a copy of that broken reference in another location: the Wayback Machine, mirrors, etc. (Probably this helps). emijrp ( talk) 13:53, 9 August 2013 (UTC)
( edit conflict) One of the problems with FactFinder is that (to my knowledge) on Wikipedia, citations to the site are usually very general (for example, in [25] that the IP provided, the url simply goes to the homepage). This makes it very difficult to somehow magically generate a link to the new FactFinder for the 2010 Census, as The Rambling Man said. It looks like they do have a deep linking guide, so perhaps it would be possible to construct a new template that could take a "geographic identifier" as input and output an actual functional citation...but that may be out of the scope of this discussion. Theopolisme ( talk) 21:28, 9 August 2013 (UTC)
Off-topic discussion
I think a lot of the blame lies with the Census Bureau's web developers for making it so difficult (if not altogether almost impossible for end users) to link to specific pieces of data. The community decision in 2008 (do you have a link, by the way, for reference?) was probably significantly influenced by this. tl;dr, FactFinder needs a revamp. Theopolisme ( talk) 22:08, 9 August 2013 (UTC)
Any news on this from the bot people or do we need to start tagging these sections as {{ unreferenced section}}'s? The Rambling Man ( talk) 16:31, 19 August 2013 (UTC)
A bot to tag items in Category:G13 eligible AfC submissions that have not been edited since before 2013 for deletion. The main reason is that as I type this the category is at 29,055 refreshes page make that 29,057 and growing fast, due to the sheer size of the backlog it is not really practical for humans to review that many entries. All of this is moot if someone with the bot flag could set up similar parameters in AWB. PantherLeapord| My talk page| My CSD log 05:57, 13 August 2013 (UTC)
We need some automated means of dealing with the ever-growing backlog. In the time it would take one person to CSD one entry, two or more have already taken its place! The backlog is now at 46,151 and counting! Even if we were able to process one per second it would take over 12 hours non-stop to process them. Considering the average time to process one is about 10 seconds (for me anyway) it will take over five DAYS! These figures do not account for more entries coming in. PantherLeapord| My talk page| My CSD log 23:44, 16 August 2013 (UTC)
As a result of the page moves executed per this discussion at Talk:Hotel California (Eagles album)#Requested move we are in need of a bot to make the necessary changes per the discussion here at User talk:Tariqabjotu#Hotel California page move. Thank you. -- RacerX11 Talk to me Stalk me 06:05, 18 August 2013 (UTC)
Could someone please parse the list of editors at Wikipedia:Wikipedians_with_articles; check their contributions, and append (if not already there) the text "Not active." to anyone who hasn't edited in a period (say, 9 months?); and remove it from any that have? This task might usefully be re-run on a periodic basis. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:42, 19 August 2013 (UTC)
For the pages on the Category:Pages with archiveurl citation errors page where the archiveurl is from archive.org, generate the archivedate and url from the archiveurl if they don't exist. For example with a cite web that has |archiveurl=http://web.archive.org/web/20060904135528/http://www.vetmed.auburn.edu/news/bailey.htm the bot would generate |archivedate=20060904135528 and |url=http://www.vetmed.auburn.edu/news/bailey.htm . I'm not sure this could be done with webcitation, but it might be worth looking at. Naraht ( talk) 00:36, 20 August 2013 (UTC)
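The split described above is mechanical, since archive.org snapshot URLs embed a 14-digit timestamp followed by the original URL. A minimal sketch (the regex is mine; the parameter names come from the request):

```python
import re

# An archive.org snapshot URL looks like:
#   http://web.archive.org/web/<14-digit timestamp>/<original url>
# so |archivedate= and |url= can be recovered by splitting on that shape.
ARCHIVE_RE = re.compile(r"^https?://web\.archive\.org/web/(\d{14})/(.+)$")

def split_archiveurl(archiveurl):
    """Return (archivedate, url) from an archive.org snapshot URL,
    or None if the URL doesn't match the expected shape."""
    m = ARCHIVE_RE.match(archiveurl)
    if not m:
        return None
    return m.group(1), m.group(2)

print(split_archiveurl(
    "http://web.archive.org/web/20060904135528/http://www.vetmed.auburn.edu/news/bailey.htm"))
# → ('20060904135528', 'http://www.vetmed.auburn.edu/news/bailey.htm')
```

WebCite URLs would indeed need different handling, since webcitation.org short links do not embed the original URL at all.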
I recently changed the title to She's the One (Bruce Springsteen song). Perhaps you can fix the ambiguity? -- George Ho ( talk) 05:03, 17 August 2013 (UTC)
Hi people, I'm hoping someone could quickly slap a bot together to perform the tasks in Wikipedia:WikiProject Spirits/Assessment Drives/September 2013, that is, check daily if there are newly assessed articles, tally those, and print them out in a table. Does anyone have something lying around, or can someone slap something together quickly? Cheers, Martijn Hoekstra ( talk) 21:22, 20 August 2013 (UTC)
Moved from The Star Wars Holiday Special per RM. -- George Ho ( talk) 01:10, 23 August 2013 (UTC)
Since the parameters in the info boxes are inconsistently ordered, we sometimes end up with duplicated fields. This can cause editing problems. I'd like a list of all articles with duplicate parameters in transclusions of {{ Infobox language}} and {{ Infobox language family}}, with the parameters that are duplicated, even if the fields are blank. (A blank field will still override a filled one, and will also invite future additions that may not display properly). Since there's no actual changes of the articles by the bot, I hope this will be easy to approve. — kwami ( talk) 23:23, 24 August 2013 (UTC)
@ Kwamikagami: The following is the result (sorry, it wasn't filtered by namespace):
Hazard SJ 08:08, 29 August 2013 (UTC)
Redirect page is The Best Is Yet to Come (song), with pages linking to the redirect. The target page is the very old song, so "(song)" should be dropped in linked articles. -- George Ho ( talk) 23:27, 26 August 2013 (UTC)
Template:Infobox soap character 2 should only be used for EastEnders characters, where relationships play a strong role. Nowadays, the infobox has been abused to import unencyclopedic text into infoboxes. A bot (or a willing editor) should replace Template:Infobox soap character 2 with Template:Infobox soap character on all non-EastEnders characters. This can be done by simply removing the number 2. -- Magioladitis ( talk) 13:42, 27 August 2013 (UTC)
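The mechanical part of this rename ("simply removing the number 2") could look like the sketch below; deciding which articles are non-EastEnders characters would still need a category check or human review first.

```python
import re

# Sketch: rename {{Infobox soap character 2}} to {{Infobox soap character}},
# leaving all parameters untouched. Which pages qualify (non-EastEnders
# characters) must be determined separately before editing.
def downgrade_soap_infobox(wikitext):
    return re.sub(r"\{\{\s*Infobox soap character 2\s*([|}])",
                  r"{{Infobox soap character\1", wikitext)

print(downgrade_soap_infobox("{{Infobox soap character 2|name=Jane}}"))
# → {{Infobox soap character|name=Jane}}
```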
Andy Mabbett check this. -- Magioladitis ( talk) 14:24, 27 August 2013 (UTC)
On reflection, I've nominated both templates for a merger discussion at TfD. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:30, 27 August 2013 (UTC)
With new error checking, we currently have 45,000+ pages in Category:Pages using citations with accessdate and no URL. This is mostly due to citation templates that include |accessdate= but do not have |url=. -- Gadget850 talk 19:23, 27 August 2013 (UTC)
I'm intrigued by the possibility of creating bots to work with Wikipedia tools like Huggle or Igloo or Vandal Fighter. That could probably be useful....-- User:SmartyPantsKid 20:53, 30 August 2013 (UTC)
Number of pages affected : ~2000
Task : Flag all unused file redirects with {{db-g6|rationale=Unused file redirect}}. 'Unused' means no incoming links at all.
This will hopefully end the eclipsing of some files at Commons, as well as dealing with a backlog that has built up in respect of Wikipedia:Database reports/Unused file redirects.
Sfan00 IMG ( talk) 14:36, 2 September 2013 (UTC)
But some kind of log so mistakes can be found would be appreciated.
Note not all usages would have a File: prefix, some infoboxes for example add the File/Image prefix internally... Sfan00 IMG ( talk) 14:56, 3 September 2013 (UTC)
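Given the caveat just above, the usage check would have to try every form under which a file title can appear. A sketch of that check (the helper names are mine, not an existing API):

```python
# Sketch of the "is it really unused?" check. A file may appear in wikitext
# with a File: or Image: prefix, or bare (when an infobox adds the prefix
# internally), so all three forms have to be searched for.
def title_variants(redirect_title):
    """All forms under which 'File:Foo.jpg' might appear in wikitext."""
    bare = redirect_title
    for prefix in ("File:", "Image:"):
        if bare.startswith(prefix):
            bare = bare[len(prefix):]
    return {bare, "File:" + bare, "Image:" + bare}

def is_used(redirect_title, wikitext):
    return any(v in wikitext for v in title_variants(redirect_title))

print(is_used("File:Foo.jpg", "{{Infobox x|image=Foo.jpg}}"))  # → True
print(is_used("File:Foo.jpg", "plain text, no file here"))     # → False
```

In practice the bot would run this against the pages reported by the links table rather than raw text alone, but the bare-title case is the one the database report can miss.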
This is a read-only request (no bot edits asked, just list pages): please list all template calls from Template:RailGauge.
rowid | namespace | pagename | |1= | Redirect from | other params used
---|---|---|---|---|---
explained: output row id | name or number | {{PAGENAME}} | unnamed param 1 input value as typed ("1=" might be omitted) | {{railgauge}}, {{gauge}} | list of params, n is unknown
data example a: 1 | 0 | Standard gauge | 57 | |
data example b: 4837 | 0 | Bradway Tunnel | ussg | Template:Gauge | wrap=y, al=on
data example c: 7865 | 0 | Indian Railways | 1676mm | |
data example d: 124 | 0 | Indian Railways | 1435mm | |
data row (CSV example): "7865", "0", "Indian Railways", "1676mm", "", "allk=on", "foo=nonsensetext", "wrap=yes"
Unnamed parameters |1= … are optional and so unknown in number (0 …). These n can better be rightmost. Please split them into "" each, if that can be done easily. Not my primary request, clearly. - DePiep ( talk) 02:13, 3 September 2013 (UTC)

It was suggested to me at Wikipedia:Village pump (technical)#Finding ACTIVE foreign language speakers that a bot would be a good option for updating Wikipedia:Translators available. The bot to do this job would, I am supposing on a once or twice monthly schedule, glance across the contributions of the editors named therein, and indicate the last month in which each such editor had edited. This would give an idea of which editors with translation skills would be around to provide them in a pinch, as asked. Blessings!! DeistCosmos ( talk) 02:17, 6 September 2013 (UTC)
Through many pages/months/years? of discussion at WT:NRHP and its archives, it has been decided that a bot is needed to tag any articles with only a single reference to the National Register Information System (NRIS) with the {{ NRIS-only}} template, which encourages the addition of extra sources and puts the articles in cleanup categories, "Articles sourced only to the NRIS since MONTH YEAR". The shortcomings of NRIS are explained here.
To find articles that are only sourced to NRIS, a list of all the pages on which {{ NRISref}} is transcluded could be a starting point. There may also be pages that link directly to older versions of the NRIS website, which would include the string "www.nr.nps.gov" somewhere in their content. From this giant list of articles, those with a single reference need to be picked off and tagged.
After its initial run, the project would like for the bot to continually check new articles and tag them if they are NRIS-only and prevent the removal of the NRIS-only template from existing articles unless a second source is added. Is this possible?-- Dudemanfellabra ( talk) 05:06, 28 August 2013 (UTC)
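A rough sketch (mine) of the "exactly one reference, and it's NRIS" test at the heart of this request. A production run would also have to handle named and list-defined refs and the old www.nr.nps.gov deep links; treat this as the skeleton only.

```python
import re

# Skeleton of the NRIS-only test: find the <ref>...</ref> pairs, and if
# there is exactly one, check whether it is an NRIS citation ({{NRISref}}
# or an old-style link to www.nr.nps.gov).
REF_RE = re.compile(r"<ref[^>/]*>.*?</ref>", re.DOTALL | re.IGNORECASE)

def is_nris_only(wikitext):
    refs = REF_RE.findall(wikitext)
    if len(refs) != 1:
        return False
    return "NRISref" in refs[0] or "www.nr.nps.gov" in refs[0]

print(is_nris_only("Text.<ref>{{NRISref|version=2010a}}</ref>"))
# → True
print(is_nris_only("Text.<ref>{{NRISref}}</ref><ref>Smith 1990.</ref>"))
# → False (a second source is present)
```

Articles matching the test would then be tagged with {{NRIS-only|date=...}}; the "prevent removal" half of the request would instead need the bot to re-check tagged articles on a schedule, since bots cannot block edits.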
Hello, WP:JAZZ would like to have a 'bot add the {{ WikiProject Jazz}} banner to jazz-related pages that aren't already tagged, and/or auto-assess pages that aren't already assessed.
{{WikiProjectBannerShell}} if and when it was able to do so.
Essentially, I am asking for a repeat of what Xenobot Mk V originally did for us in 2010 (see above). We wish to add {{ WikiProject Jazz}} to articles in jazz-related categories. The list of relevant categories is located at Wikipedia:WikiProject Jazz/Categories, but please note that there are actually three lists of categories at that page, and they each need to be tagged slightly differently:
Add {{WikiProject Jazz}} to the articles (or rather, the talk pages) within the /General sub-listing.
To the best of my knowledge, /Categories represents all applicable categories and sub-categories (I deliberately omitted those that are outside the project's scope), so you should not need to worry about sub-category depth. I finished updating these listings a few minutes before posting this.
Furthermore, we wish to auto-assess those pages that do not already have an assessment (including those already tagged with {{WikiProject Jazz}}):
Inherit class= from other WikiProjects (if any): class= if only a single rating is available; class= if two or more ratings are available (in the event of an auto-stub/inherit conflict, inherit the most frequent, or highest, class= rating), marking the edit with |auto=yes, |auto=inherit, |auto=length, or |autoi=yes as applicable.
Assign class=stub based on either of the following criteria: …
Do not inherit importance= ratings.
And, add {{WikiProjectBannerShell}} when possible/applicable.
I have an additional request, but I am not sure whether it's technically possible (see comments). I'd be interested in having the 'bot add needs-infobox=yes to the {{WikiProject Jazz}} template, if the article does not have an {{Infobox foo}} template; or if {{WikiProject Jazz}} can inherit this setting from another WikiProject banner, or it can inherit this setting if the talk page already has {{Infobox requested}}.
Let me know if I can clarify anything, either leave me a message here or at WT:JAZZ.
Thanks in advance, -- Gyrofrog (talk) 17:09, 6 September 2013 (UTC)
Gyrofrog I can do it for you. I have some questions: 1) should any |importance= be removed? 2) is "autoi" a typo? Do you mean auto? -- Magioladitis ( talk) 10:59, 9 September 2013 (UTC)
1) An article may already have |importance=; do not remove it. But if it is blank for WP:JAZZ, we do not want to inherit that parameter from other WikiProjects.
2) |autoi= is not a typo, but apparently it is not currently being used, either (anyway, it doesn't work for WP:JAZZ – I may have seen it on a different template [28]).
Pages that already have an assessment should not have |auto= or |autoi= applied. However, a quick test suggests that for |auto=, "length" and "inherit" are the only two options that work. |auto=yes and |auto=stub do not work.
(This includes |class=Category, |class=Template, etc., although I believe the template will automatically handle all of these without having to specify). -- Gyrofrog (talk) 16:06, 9 September 2013 (UTC)

Redirect page is Upstairs, Downstairs; many links link to the redirect page of the 1971 series. Perhaps fix ambiguation? -- George Ho ( talk) 17:21, 6 September 2013 (UTC)
I was asked by Adam Cuerden, the main contributor to the Signpost's tech report, if I would write a short piece about how to make Wikipedia articles more accessible to screen reader users. I promptly did so and it's now been published along with the rest of the tech report. Adam then suggested on my talk page that the first two problems I noted could be fixed in articles by a bot, which is where you guys come in ... (here's the relevant conversation). In particular, as noted there:
What do you all think? Graham 87 05:05, 7 September 2013 (UTC)
(I'd asked about this in late July. Someone else had offered to help with it, but they have since become inactive.)
DASHBot ( talk · contribs) used to create, and/or periodically update, a list of unreferenced biographies of living persons for a given WikiProject (see User:DASHBot/Wikiprojects). However, that bot has been blocked since March. I'm wondering if another one can accomplish this same task. I'm asking on behalf of WP:JAZZ (whose list is at Wikipedia:WikiProject Jazz/Unreferenced BLPs) but there were a lot of other WikiProjects on that list, as well (I'd already removed WP:JAZZ, though). Thanks in advance, -- Gyrofrog (talk) 13:31, 6 September 2013 (UTC)
User:Theopolisme has taken a temporary hiatus, and I would request that the already approved TAFI-related tasks from his bot be taken up by another bot. The python scripts are available here. Performing all these tasks manually is painful. It needs to run once at 00:00 UTC on Mondays, and can basically run all tasks once a week then. -- Nick Penguin( contribs) 04:02, 10 September 2013 (UTC)
A bot is needed to go through Category:Canoeists and change the name of any pages that are currently called 'NAME (Canoer)' to 'NAME (Canoeist)' as per the discussion at WP Kayaking Jamesmcmahon0 ( talk) 10:06, 9 September 2013 (UTC)
I'd been doing some of these link fixes manually, but figured it's really an AWB or bot task.
The list contains image pairs.
The basic task is, for each image pair in the list:
Could an automated task for doing this be developed? Sfan00 IMG ( talk) 22:07, 9 September 2013 (UTC)
Hello: Can someone take over the new article alert job from TedderBot? This page has the link to the source code. The bot has been out of operation since August 22. Please see here the requests from various project owners to get bot working again. Thanks. — Ganeshk ( talk) 00:33, 30 August 2013 (UTC)
Thanks for getting it up and running again. As it is a very important bot for a number of wikiprojects, and Tedder notes he is very busy IRL, it would indeed be good to have a backup ready to kick in on short notice, so this request should still be seen as open for anyone willing to develop a backup capability. Better to have it ready now than after another x-week delay and grief in the future. -- Piotr Konieczny aka Prokonsul Piotrus| reply here 04:06, 14 September 2013 (UTC)
Good apps — Preceding unsigned comment added by 106.203.17.125 ( talk) 01:59, 18 September 2013 (UTC)
WP:NRHP maintains lists of locations that are on the National Register of Historic Places for everywhere in the USA; an example is National Register of Historic Places listings in Franklin County, Kentucky. Just about all of these lists have tables composed of transclusions of a dedicated template, {{ NRHP row}}; these were just added a year or two ago, replacing hardcoded tables, and it's possible that occasional articles got missed and still have one or more entries with the hardcoded table. For example, the Franklin County KY list had a single entry that appeared different from the rest, and viewing the source showed me that there was a single line with the old hardcoded table; I've since fixed it. Is there any way (other than manually viewing everything) to check for pages that still have the hardcoded table? My first thought is a bot that would check as follows:
The hardcoded tables always used templates whose names end with "color" for all entries, and without the last item, we'd miss any lists in which there's an old-style table entry below the last of the NRHP rows. I don't think we need to worry much about false positives, because the bot wouldn't be doing anything more than logging what it finds. This is a rare enough problem that we're unlikely to find many such pages; the only reason I'm requesting it is that we have more than 3,500 pages whose names begin with "National Register of Historic Places listings in", so it would take an extremely long amount of time for a human to check all of them. Nyttend ( talk) 03:43, 19 September 2013 (UTC)
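Since the scan is log-only, the core of it is just a pattern match for transclusions of templates whose names end in "color". A sketch under that assumption (the example row below is invented):

```python
import re

# Log-only scan: flag any transclusion of a template whose name ends in
# "color", the marker of the old hardcoded NRHP table rows. Template names
# cannot contain '|', '{' or '}', so the character class is safe.
COLOR_TMPL_RE = re.compile(r"\{\{\s*([^|{}]*?color)\s*[|}]", re.IGNORECASE)

def old_style_rows(wikitext):
    """Return the names of row-coloring templates still present."""
    return COLOR_TMPL_RE.findall(wikitext)

page = "{{NRHP row|name=Foo}}\n|- {{NRHP color|type=NHL}} | Old Entry"
print(old_style_rows(page))  # → ['NRHP color']
```

Run over the ~3,500 "National Register of Historic Places listings in ..." pages, any page with a non-empty result would be logged for human review; false positives cost nothing, as the request notes.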
Could I get someone to have a bot tag all the templates listed in the collapsed sections of the following two discussions:
It's very simple: you just add {{subst:tfd}} at the top, assuming this runs before the end of September 20 wiki time. They are all unused. Thank you. Frietjes ( talk) 17:05, 20 September 2013 (UTC)
Currently Wikipedia:Templates for discussion/Log/2013 September 11#Template:WikiProject United States shows an overwhelming consensus to destroy the WikiProject United States collaboration and associated template. It is requested that someone with a bot demerge all the projects and reallocate all the articles to the appropriate project. For all those who don't like WikiProject United States here is your chance to end it once and for all and restore about 100 dead projects to their previous state of inactivity. If you don't like Kumioko here is your chance to step up and show them how to build a bot to do assessments the right way! Either way, a bot is needed as this task is just too massive for one person to do. It took Kumioko years to get it to this point, how fast can we restore things back to their natural order? 138.162.8.57 ( talk) 19:22, 20 September 2013 (UTC)
For example, converting {{WikiProject United States|class=foo|importance=blah|ID=yes|ID-importance=bar}} to {{WikiProject Idaho|class=foo|importance=bar}}? GoingBatty ( talk) 02:52, 23 September 2013 (UTC)
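A hypothetical sketch of the demerge transformation GoingBatty describes. The mapping of sub-parameter codes to state banners is an assumption here (only Idaho is shown); a real run would need the full table from the template's documentation, plus real template parsing rather than naive splitting.

```python
# Hypothetical demerge: move a state's sub-parameters out of the merged
# {{WikiProject United States}} banner into a standalone state banner.
STATE_BANNERS = {"ID": "WikiProject Idaho"}  # assumed, incomplete mapping

def demerge(call):
    """Turn one {{WikiProject United States|...|XX=yes|XX-importance=...}}
    call into the matching state banner (naive parameter parsing)."""
    params = dict(p.split("=", 1) for p in call.strip("{}").split("|")[1:])
    for code, banner in STATE_BANNERS.items():
        if params.get(code) == "yes":
            return "{{%s|class=%s|importance=%s}}" % (
                banner, params.get("class", ""),
                params.get(code + "-importance", ""))
    return call  # no state flag found; leave the banner unchanged

print(demerge("{{WikiProject United States|class=foo|importance=blah"
              "|ID=yes|ID-importance=bar}}"))
# → {{WikiProject Idaho|class=foo|importance=bar}}
```

Banners carrying several state flags would need to emit one banner per flag, which the sketch above does not attempt.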
Yeah, I know the subject line don't make a lot of sense, ok? :) I guess I'm asking if there is any way to get a bot developed which can go through one or more of the pdf files of encyclopedias at commons' encyclopedia category and maybe generate a list of articles contained in that file. It would probably be great if it could have some sort of indicator of relative length of the article, things like maybe 4 lines, 1/2 page, 2 pages, or whatever. But I do think, maybe, for some of the smaller countries, with less generally spoken languages, and maybe some of the less popular topics, having something like that which could give editors interested in a given topic a quick list of articles which can be sourced from readily available files might be very useful. Maybe, as a first step, considering most reference books have "titles" at the top of each page, a bot which could see if the same "title" is included on two or three consecutive pages, and then list such titles. John Carter ( talk) 01:15, 23 September 2013 (UTC)
I think this is difficult, but.... I just removed a few external links, with link named
I think the chance that any such links are legitimate is nil. Would it be possible for a bot to remove, from all Wikipedia articles,
* [URL Per DTI-NCR Permit No. nnnn, Series of mmmm]
Where:
This might require scanning a database dump for DTI-NCR.
— Arthur Rubin (talk) 09:49, 26 September 2013 (UTC)
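A sketch of what the removal could look like (Python; the regex and helper name are mine, the URL/number formats are assumptions from the pattern quoted above, and hits would still want human review):

```python
import re

# Sketch only: matches a bulleted external link whose label cites a
# DTI-NCR permit, e.g. "* [URL Per DTI-NCR Permit No. 1234, Series of 2013]".
DTI_NCR_LINK = re.compile(
    r"^\*\s*\[\S+ Per DTI-NCR Permit No\. \d+, Series of \d{4}\]\s*$\n?",
    re.MULTILINE,
)

def strip_dti_ncr_links(wikitext: str) -> str:
    """Drop bulleted external links whose label cites a DTI-NCR permit."""
    return DTI_NCR_LINK.sub("", wikitext)

sample = (
    "== External links ==\n"
    "* [http://example.com Official site]\n"
    "* [http://spam.example Per DTI-NCR Permit No. 1234, Series of 2013]\n"
)
cleaned = strip_dti_ncr_links(sample)
```

A dump scan would apply this page by page and log each removal for review.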
Hi, I've been approached by an Art Gallery who are aware that many of the thousands of links that Wikipedians have made to their sites over the years are now dead. They may be able to provide a map or logic for the move, if they can would someone be willing to run a bot to update the links? They think it is mostly on EN wiki but will probably be on many other languages as well. They would also be interested in a report listing all of our articles that reference their sites, and presumably any in external link sections. Hundreds, perhaps more than a thousand of the articles involved will be related to living artists so there could be quite a lot of BLPs where the sourcing is improved by this. Jonathan Cardy (WMUK) ( talk) 16:00, 27 September 2013 (UTC)
== Bot to delete a bunch of redirects ==
Looking for some bot or script to clean up some spam in the form of 200+ redirects to a single article [[Măeriște]]. I suspect the original author was unhappy about the original article being deleted. As a result, the article for this town [[Măeriște]] now has redirects from a multitude of people names, geographic names, newspapers, museums, ethnicities, religions, etc. I consider this spam as in [[Wikipedia:CSD#G11|G11]]: Unambiguous advertising or promotion. I would like to delete (or edit with {{dp-spam}}) all of the redirects to this article (https://toolserver.org/~dispenser/cgi-bin/rdcheck.py?page=M%C4%83eri%C8%99te) except the following:
*Criștelec
*Doh, Sălaj
*Giurtelecu Șimleului
*Kerestelek
*Krasznahídvég
*Maladia
*Maladé
*Măerişte
*Somlyógyőrtelek
*Somlyóújlak
*Uileacu Şimleului
*Wüst Görgen
Thanks [[User:Hollomis|Hollomis]] ([[User talk:Hollomis|talk]]) 22:40, 2 October 2013 (UTC)
:Mass RfD is the third door on your left, see [[WP:RFD]] [[User:Werieth|Werieth]] ([[User talk:Werieth|talk]]) 22:43, 2 October 2013 (UTC)

== Edit-a-thons at UC Riverside ==
FYI: The UCR Libraries are hosting three edit-a-thons focusing on their great special collections (science fiction, water resources, the Inland Empire and more) on Oct. 12, 2013, Oct. 26, 2013, and Nov. 23, 2013. Please participate if you can, and please help publicize to friends in the area! Details here. All are welcome, new and experienced editors alike! -- phoebe / ( talk to me) 22:44, 1 October 2013 (UTC)
A discussion at Wikipedia talk:WikiProject Football#FB team templates has determined that these templates (thousands of them!) should be replaced by direct wikilinks instead (the templates have nothing but a link in them).
What basically needs to be done is; find all instances of a template, replace it with the link (the contents of the template) instead (simply subst the template?), and delete the then unused template. The final part can't be done by bot, but it can be tagged as empty, or we can wait until the task is done and then do one mass-deletion, similar to Wikipedia:Templates for discussion/Log/2013 September 20#Unused fb team templates.
Details on this can be discussed with the members of the Football project at the above discussion, but are there any bot operators who in principle are willing to take this on? Fram ( talk) 09:45, 25 September 2013 (UTC)
Any takers? Giant Snowman 10:54, 8 October 2013 (UTC)
User:Blake305 appears to be inactive. Could someone please pick up this pending task, which has community support and for which his bot already has approval for a trial run? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:45, 4 October 2013 (UTC)
Ok, a lot of work has been done, adding old sources from http://www.archive.org to articles, especially in the Israel/Palestine area.
Some time ago, archive.org redirected all these links to a link without www, so if you look at http://www.archive.org/details/palestineundermo00lestuoft you will be redirected to http://archive.org/details/palestineundermo00lestuoft
Ideally, all these links should be stripped of their www's, like I have done here. Looks like the work for a bot? Huldra ( talk) 22:56, 7 October 2013 (UTC)
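The substitution itself is a one-liner; a minimal sketch (Python, helper name mine):

```python
import re

# Minimal sketch: drop the "www." from archive.org links, which now
# merely redirect to the www-less form.
WWW_ARCHIVE = re.compile(r"(https?://)www\.(archive\.org/)")

def normalize_archive_links(wikitext: str) -> str:
    return WWW_ARCHIVE.sub(r"\1\2", wikitext)

before = "See http://www.archive.org/details/palestineundermo00lestuoft for the source."
after = normalize_archive_links(before)
```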
Chris G ( talk · contribs) has left a message on his talk page that real life has taken priority and he's stopped his bots. But he has offered the code if somebody wants to get them running again. Is there anybody willing to take a look at getting RFC bot ( talk · contribs) running again? I'm most interested in the WP:DASHBOARD updating, but the RfC stuff is probably a higher priority. Thanks. 64.40.54.22 ( talk) 06:09, 6 September 2013 (UTC)
This request was made a while ago and taken up by User:Hasteur (who may still have preliminary code available). However, there was a bit of a scuffle at WP:NRHP which resulted in one editor being indefinitely topic banned from areas related to the project, and that turned Hasteur off to the task. Would anyone else be willing to pick this up? The initial bot task is to find all articles whose only inline citation is to the National Register Information System via {{ NRISref}} and tag them with {{ NRIS-only}}. A rough pseudo-code logical procedure is provided below, courtesy of User:Hasteur:
For each page that includes a transclusion of {{ NRISref}}:
After the initial run which tags existing articles, the bot should scan recent changes and tag any newly created articles citing only the NRIS. Also, the bot would need to prevent removal of the NRIS-only template unless multiple references are present on the page. All of this has gained consensus with members of the project.
Hasteur had run this through several test runs of the first task and generated a list in his userspace of articles that would be tagged on a live run, and we were in the process of refining the procedure to better suit the project's needs, but it seems Hasteur has either gotten too busy or begun to ignore this. If anyone could pick this up, WP:NRHP would be very grateful! If any further elaboration is needed, let me know!-- Dudemanfellabra ( talk) 17:28, 10 October 2013 (UTC)
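To make the pseudo-code concrete, here's a hedged sketch of the decision step in Python; the "tag when every <ref> is an NRISref" counting heuristic is my simplification, not the procedure the project agreed on:

```python
import re

# Assumed heuristic (my simplification of Hasteur's pseudo-code): tag a
# page when it cites {{NRISref}} and carries no other <ref>, and is not
# already tagged with {{NRIS-only}}.
REF_TAG = re.compile(r"<ref[\s>]", re.IGNORECASE)
NRISREF = re.compile(r"\{\{\s*NRISref", re.IGNORECASE)
ALREADY = re.compile(r"\{\{\s*NRIS-only", re.IGNORECASE)

def needs_nris_only_tag(wikitext: str) -> bool:
    refs = len(REF_TAG.findall(wikitext))
    nris = len(NRISREF.findall(wikitext))
    return nris >= 1 and refs <= nris and not ALREADY.search(wikitext)

def tag(wikitext: str) -> str:
    if needs_nris_only_tag(wikitext):
        # the date parameter value here is illustrative
        return "{{NRIS-only|date=October 2013}}\n" + wikitext
    return wikitext

nris_only = "'''Foo House''' is listed.<ref>{{NRISref|version=2010a}}</ref>"
multi = nris_only + "<ref>Smith, ''Houses'', 2005.</ref>"
```

The recent-changes follow-up run would apply the same check to newly created articles.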
Several times, I've seen an article name with a parenthetical sub-title, even though the non-parenthetical title was free. For instance, naming an article "Such and Such (film)" when "Such and Such" without any suffix wasn't in use. Is there a way that a bot can find instances of such titles? Ten Pound Hammer • ( What did I screw up now?) 00:39, 11 October 2013 (UTC)
Done I found 35,281 non-redirect article-space pages that were unnecessarily disambiguated. The first ten thousand of these are listed at User:Theo's Little Bot/unnecessary dab. Theopolisme ( talk) 22:40, 18 October 2013 (UTC)
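For the curious, the scan itself is simple; a toy version (Python, operating on an in-memory title list rather than a dump, and ignoring the redirect subtleties a real report must handle):

```python
import re

# Toy version of the check: a title is flagged when it ends in a
# parenthetical disambiguator and the base title does not exist.
DAB_SUFFIX = re.compile(r"^(.+) \([^()]+\)$")

def unnecessarily_disambiguated(titles):
    existing = set(titles)
    hits = []
    for title in titles:
        m = DAB_SUFFIX.match(title)
        if m and m.group(1) not in existing:
            hits.append(title)
    return hits

sample = ["Such and Such (film)", "Other Thing (film)", "Other Thing"]
found = unnecessarily_disambiguated(sample)
```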
Is there a way that a bot can find all instances of {{ Infobox album}} and {{ Infobox single}} have left the producer field vacant, and maybe categorize them? I think this might be useful in categorizing works with unknown producers (e.g. Ronan (song) and All Cried Out (Kree Harrison song)). Ten Pound Hammer • ( What did I screw up now?) 02:02, 15 October 2013 (UTC)
Hi, I have requested this before during the summer, but without any response, and it seems that there are more people active here now: while going through many of the medical stubs, I encountered a lot of stubs (and some other articles) which have been turned into redirects and are still classified in the assessment (WPMED, WPAN, WPPHARM) on their talkpage as an article. I have corrected many of those, but there must still be hundreds more. Could a bot set the class to redirect in those articles and remove any importance rating? And, as a secondary request, would it be possible to find all articles which have a Infobox disease, Infobox symptom, Interventions infobox or Diagnostic infobox and tag them with WPMED if not already done? -- WS ( talk) 12:14, 17 October 2013 (UTC)
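The reassessment step could be sketched like this (Python; the banner parsing is deliberately simplified and the regexes are mine, so treat it as an illustration, not a ready bot):

```python
import re

# Hedged sketch: given a talk page's banner wikitext and the fact that
# the subject page is now a redirect, force |class=redirect and blank
# any |importance= rating. A real bot would restrict this to the
# relevant project banners rather than rewriting every parameter seen.
def fix_assessment(talk_wikitext: str, subject_is_redirect: bool) -> str:
    if not subject_is_redirect:
        return talk_wikitext
    text = re.sub(r"\|\s*class\s*=\s*[^|}]*", "|class=redirect", talk_wikitext)
    text = re.sub(r"\|\s*importance\s*=\s*[^|}]*", "|importance=", text)
    return text

before = "{{WPMED|class=stub|importance=low}}"
after = fix_assessment(before, True)
```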
As a botop, I would do it myself. But I require admin rights to carry it out. The task is simple. A bot is to downgrade all fully protected templates to template protection, so editor with the template editor rights can actually edit the template.— cyberpower Online Trick or Treat 16:35, 17 October 2013 (UTC)
select CONCAT('*[[Template:', page_title, ']]') FROM page_restrictions JOIN page ON pr_page=page_id WHERE page_namespace=10 AND pr_type='edit' AND pr_level='sysop' and pr_expiry='infinity';
NOTE: You can get a list of the fully protected templates with this link -- WOSlinker ( talk) 10:35, 19 October 2013 (UTC)
I was wondering if it would be useful to create a list of articles that do not yet exist based on categories at WikiCommons. There are I'm sure many groups of images just waiting for an enwiki article, and if someone thinks that something is deserving of a category there it is likely that it is deserving of an article here. The bot could simply look through all the categories and produce a list of those which do not contain an image linked from enwiki. This would likely be a one-off task but could be run periodically if deemed useful. violet/riga [talk] 09:20, 19 October 2013 (UTC)
Could some respond at Wikipedia:Bots/Requests for approval/Theo's Little Bot 24, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:05, 30 July 2013 (UTC)
A bot is needed to make a list of all articles in the category Minor planets that are just like 11451_Aarongolden in that they don't have enough in them to pass WP:NASTRO. Also, you might want to tag each one with whatever tag you judge appropriate or helpful in terms of keeping track of them. You be the judge; just don't delete them or anything like that, just yet.
Please give the list a descriptive title with a date, such as "Minor Planet articles that might fail NASTRO - Phase One" with the date the list was completed.
To explain, at the moment, this is just to know how many such articles there still are, despite earlier efforts, so we can know the size of the situation and discuss what to do, if anything, with the articles. Later, if we decide to pursue it, WP:NASTRO requires a multi-step "good faith effort" to establish notability, which you can read about there if you wish; but if that fails, NASTRO asks that the articles be converted into redirects to the List of minor planets and the information in them transferred there, and so on. Suffice it to say that it looks like a big job, so let's not get ahead of ourselves. I just offer this second paragraph in case you want to know the reason for this request.
If you need/want more information or more specific instructions or something, please do ask. Chrisrus ( talk) 04:44, 30 July 2013 (UTC)
<ref
{{Infobox astro object}} or {{Infobox planet}} tagged with {{Notability}} (which should be {{Notability|Astro}}). GoingBatty ( talk) 13:59, 30 July 2013 (UTC)
The bot is generating a list now; be forewarned that it might take a while. Theopolisme ( talk) 19:47, 30 July 2013 (UTC)
really would be nice to have a bot to update Wikipedia:Peer review/backlog/items automatically, rather than relying upon editors to do it. perhaps there is already a bot for backlogs that can merely have that page added to its tasks.... -- Aunva6 talk - contribs 18:59, 2 August 2013 (UTC)
Over at Wikipedia:Dispute resolution noticeboard we have accrued an ad hoc combination of scripts and templates, assisted by EarwigBot and MiszaBot. In particular, we seem to be asking The Earwig for a lot. He has been very responsive and has been great about our constant stream of requests, but rather than dumping more and more on him I am wondering whether someone who is really good at automation has the time and inclination to do a proper job of re-engineering all of our DRN automation tools from top to bottom. If we manage to get a smooth-running system working, other noticeboards might be interested in using the same system. Is anyone interested in working on this? -- Guy Macon ( talk) 06:59, 28 July 2013 (UTC)
A bot is needed to perform the good faith effort to establish notability specified in the "Dealing with minor planets" section of NASTRO: Wikipedia:NASTRO#Dealing_with_minor_planets Here is the list: Wikipedia:Minor_planet_articles_that_might_fail_NASTRO. The bot should add to that list how many hits, if any, the name of the object gets on the database that NASTRO specifies.
Thank you for your kind attention to this matter. Chrisrus ( talk) 04:12, 6 August 2013 (UTC)
Hi, while going through many of the medical stubs, I encountered a lot of stubs (and some other articles) which have been turned into redirects and are still classified in the assessment (WPMED) on their talkpage as an article. Could a bot set the class to redirect in those articles and remove any importance rating? -- WS ( talk) 14:27, 8 August 2013 (UTC) And, as a secondary request, would it be possible to find all articles which have a Infobox disease, Infobox symptom, Interventions infobox or Diagnostic infobox and tag them with WPMED if not already done? -- WS ( talk) 11:19, 9 August 2013 (UTC)
For some time now, I've been doing this manually, as I edited templates or their documentation, but I've hardly scratched the surface. Can someone assist, please?
The process is:
We should also do the same thing for /sandbox, /testcases, and other subpages' talk pages.
This prevents fragmentation of discussion between the various talk pages; particularly as documentation pages seem to be under-watched. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:35, 2 August 2013 (UTC)
Over the past several months, those of us working in WP:SPI have confirmed a whole raft of related paid-editor sockpuppets — over 300 so far, according to SPI reports, and that's probably the tip of the iceberg.
These socks all have a similar editing pattern: make some minor random edits to get autoconfirmed status, sleep for a while, maybe a month or two, and then build a seemingly well-referenced but often-promotional article in their sandbox, which they then move to main space. The identity of the company creating these socks is known but I won't reveal it here.
Right now we're in whack-a-mole mode. The socks aren't identified until they actually post an article into main space and a patroller familiar with the SPI case notices the pattern and reports it.
Therefore, it would be really useful if there were a bot that maintained a page that listed new sandbox creations. Such a list would be most useful if it showed the date of creation, link to the sandbox, and a snippet of the lead sentence.
Patrollers could then more easily report potential socks in advance to SPI, and a checkuser could verify them and block them, before the material goes to main space.
A bot maintaining such a page would help us get a jump on this army of socks, thereby denying them their revenue and perhaps, eventually, convincing this company to work with the community rather than resort to block evasion and other tactics. ~ Amatulić ( talk) 23:03, 7 August 2013 (UTC)
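A sketch of the report formatting only (Python); the entry fields here are my assumption of what a recent-changes feed would supply, and the title filter is deliberately simplistic:

```python
# Sketch (not a live bot): given new-page entries, keep sandbox creations
# and emit one report line each with creation date, a link, and a lead
# snippet. Field names ("title", "timestamp", "lead") are assumptions.
def sandbox_report(entries, snippet_len=80):
    lines = []
    for e in entries:
        if not e["title"].endswith("/sandbox"):
            continue  # keep only User:Foo/sandbox creations
        snippet = e["lead"][:snippet_len]
        lines.append("* {} - [[{}]] - {}".format(e["timestamp"], e["title"], snippet))
    return "\n".join(lines)

entries = [
    {"title": "User:Example/sandbox", "timestamp": "2013-08-07",
     "lead": "Acme Widgets is a leading global provider of widgets."},
    {"title": "User:Example", "timestamp": "2013-08-07", "lead": "Hi!"},
]
report = sandbox_report(entries)
```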
Instances of {{ Infobox Korean name}} which are underneath a biographical infobox (for example {{ Infobox person}}) need, where possible, to be made a module of that infobox, as in this edit. I'm not asking for anyone to start work immediately, but - so that I can draw up a plan and get consensus - can anyone please advise whether that's something a bot could likely do, or would there be too many false positives? Could we perhaps work on the basis of proximity? Say, "if there is no subheading between them, make the edit"? Or "If nothing but white space separates them"?
Also, would it be possible, please, for someone to draw up a list of articles which use both of the infoboxes named above? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:46, 8 August 2013 (UTC)
Hello, I'm Castigonia, and I want to make a bot request. The reason I would want to ever make a bot is because I would want to contribute to Wikipedia and any other "Wikis" as much as possible, but may not be able to do so (such as sleeping, vacation, school, etc.).
If this request was accepted, I would follow the instructions word-by-word and make a bot that would make major contributions. It would make accomplishments to Wikipedia that would make it a better encyclopedia. It would make accomplishments to other wikis and make them better as well. I have always looked up to a bot such as User:ClueBot NG and now want to create a bot like it.
I hope you accept my request. I am an autoconfirmed user and want to accomplish more on Wikipedia and its sister projects. So please think about your decision and let me know when you have. Castigonia ( talk) 13:45, 10 August 2013 (UTC)
Sorry about that, I really want to make this bot. Thank you for the information! Castigonia ( talk) 14:18, 10 August 2013 (UTC)
Transclusions of Template:Languages of Angola should go below the ref section. — kwami ( talk) 08:31, 14 August 2013 (UTC)
I have been told that T:AH is in use in 33,000 pages (which sounds like a low estimate to me). I need to know how many of these pages have four=no. four=no is used by WP:FOUR to distinguish the WP:FAs that have been WP:GA and WP:DYK according to T:AH that are rejected from the four=yes. The four=no should populate Category:Wikipedia articles rejected for Four awards in the near future, and if the bot both counted and categorized, that would be optimal, although I just need the count for now. Because of this, it could take weeks for this category to populate itself. The expectation is that between 300 and 1300 of the 3980 FAs have four=no. -- TonyTheTiger ( T/ C/ WP:FOUR/ WP:CHICAGO/ WP:WAWARD) 06:14, 2 August 2013 (UTC)
O.K. I think we are getting close to done. I found a 500 edit sample of my contribution history with 43 four=no edit summary samples 40 of which are already in the category. I guess we are over 90% of the way, but I still don't know how I will know if we are done.-- TonyTheTiger ( T/ C/ WP:FOUR/ WP:CHICAGO/ WP:WAWARD) 03:30, 14 August 2013 (UTC)
An editor has been tagging the 2000 census data in articles with {{not in reference}} ( [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24]) because the {{GR|2}} citation template now leads to a redirect page at the Census Bureau website. Is there any way this can be fixed with a bot? It affects hundreds of thousands of articles. Thanks for looking into this. 71.139.153.14 ( talk) 14:42, 8 August 2013 (UTC)
Off-topic discussion
The dead link can be fixed in Template:Geographic reference (GR2). Almost sure there is a copy for that broken reference in another location, WayBack machine, mirrors, etc. (Probably this helps). emijrp ( talk) 13:53, 9 August 2013 (UTC)
( edit conflict) One of the problems with FactFinder is that (to my knowledge) on Wikipedia, citations to the site are usually very general (for example, in [25] that the IP provided, the url simply goes to the homepage). This makes it very difficult to somehow magically generate a link to the new FactFinder for the 2010 Census, as The Rambling Man said. It looks like they do have a deep linking guide, so perhaps it would be possible to construct a new template that could take a "geographic identifier" as input and output an actual functional citation...but that may be out of the scope of this discussion. Theopolisme ( talk) 21:28, 9 August 2013 (UTC)
Off-topic discussion
I think a lot of the blame lies with the Census Bureau's web developers for making it so difficult (if not altogether almost impossible for end users) to link to specific pieces of data. The community decision in 2008 (do you have a link, by the way, for reference?) was probably significantly influenced by this. tl;dr, FactFinder needs a revamp. Theopolisme ( talk) 22:08, 9 August 2013 (UTC)
Any news on this from the bot people or do we need to start tagging these sections as {{ unreferenced section}}'s? The Rambling Man ( talk) 16:31, 19 August 2013 (UTC)
A bot to tag items in Category:G13 eligible AfC submissions that have not been edited since before 2013 for deletion. The main reason is that as I type this the category is at 29,055 refreshes page make that 29,057 and growing fast, due to the sheer size of the backlog it is not really practical for humans to review that many entries. All of this is moot if someone with the bot flag could set up similar parameters in AWB. PantherLeapord| My talk page| My CSD log 05:57, 13 August 2013 (UTC)
We need some automated means of dealing with the ever growing backlog. In the time it would take one person to CSD one entry, two or more have already taken its place! The backlog is now at 46,151 and counting! Even if we were able to process one per second it would take over 12 hours non-stop to process them. Considering the average time to process one is about 10 seconds (for me anyway) it will take over five DAYS! These figures do not account for more entries coming in. PantherLeapord| My talk page| My CSD log 23:44, 16 August 2013 (UTC)
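The eligibility check itself is trivial to sketch (Python; the cutoff and the {{db-g13}} tag follow the request above, everything else is illustrative):

```python
from datetime import datetime

# Sketch of the eligibility check only; actual tagging would go through
# bot approval. Pages whose last edit predates 2013 qualify per the
# request above. Timestamp format is the MediaWiki API's ISO form.
CUTOFF = datetime(2013, 1, 1)

def g13_taggable(last_edit_iso: str) -> bool:
    return datetime.strptime(last_edit_iso, "%Y-%m-%dT%H:%M:%SZ") < CUTOFF

def tag_for_g13(wikitext: str, last_edit_iso: str) -> str:
    # {{db-g13}} is the speedy-deletion tag for stale AfC submissions.
    if g13_taggable(last_edit_iso):
        return "{{db-g13}}\n" + wikitext
    return wikitext
```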
As a result of the page moves executed per this discussion at Talk:Hotel California (Eagles album)#Requested move we are in need of a bot to make the necessary changes per the discussion here at User talk:Tariqabjotu#Hotel California page move. Thank you. -- RacerX11 Talk to me Stalk me 06:05, 18 August 2013 (UTC)
Could someone please parse the list of editors at Wikipedia:Wikipedians_with_articles; check their contributions, and append (if not already there) the text "Not active." to anyone who hasn't edited in a period (say, 9 months?); and remove it from any that have? This task might usefully be re-run on a periodic basis. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:42, 19 August 2013 (UTC)
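A sketch of the annotation pass, assuming the last-edit dates have already been fetched from the API (the names, threshold, and line format are illustrative):

```python
from datetime import date, timedelta

# Sketch under assumptions: a real bot would fetch each editor's last
# edit from the API; here the dates are supplied directly. Editors idle
# longer than the threshold get " Not active." appended; others do not.
THRESHOLD = timedelta(days=270)  # roughly the nine months suggested above

def annotate(editors, last_edits, today):
    out = []
    for name in editors:
        inactive = (today - last_edits[name]) > THRESHOLD
        out.append("* [[User:%s]]%s" % (name, " Not active." if inactive else ""))
    return out

rows = annotate(
    ["Alice", "Bob"],
    {"Alice": date(2013, 9, 1), "Bob": date(2012, 6, 1)},
    date(2013, 10, 1),
)
```

Re-running the pass also removes the note from editors who have since returned, as requested.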
For the pages on the Category:Pages with archiveurl citation errors page where the archiveurl is from archive.org, generate the archivedate and url from the archiveurl if they don't exist. For example with a cite web that has |archiveurl=http://web.archive.org/web/20060904135528/http://www.vetmed.auburn.edu/news/bailey.htm the bot would generate |archivedate=20060904135528 and |url=http://www.vetmed.auburn.edu/news/bailey.htm . I'm not sure this could be done with webcitation, but it might be worth looking at. Naraht ( talk) 00:36, 20 August 2013 (UTC)
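The derivation described above could be sketched as follows (Python; whether |archivedate= should hold the raw 14-digit stamp, as in the example, or a formatted date is left open):

```python
import re

# Sketch: split an archive.org archiveurl into the 14-digit snapshot
# stamp and the original URL, per the example in the request.
ARCHIVE_URL = re.compile(r"https?://web\.archive\.org/web/(\d{14})/(.+)")

def split_archiveurl(archiveurl: str):
    m = ARCHIVE_URL.match(archiveurl)
    if m is None:
        return None  # not an archive.org snapshot URL (e.g. webcitation)
    return {"archivedate": m.group(1), "url": m.group(2)}

parts = split_archiveurl(
    "http://web.archive.org/web/20060904135528/http://www.vetmed.auburn.edu/news/bailey.htm"
)
```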
I recently changed title into She's the One (Bruce Springsteen song). Perhaps you can fix ambiguity? -- George Ho ( talk) 05:03, 17 August 2013 (UTC)
Hi people, I'm hoping someone could quickly slap a bot together to perform the tasks in Wikipedia:WikiProject Spirits/Assessment Drives/September 2013, that is check daily if there are newly assessed articles, tally those, and print them out in a table. Does anyone have something lying around, or can anyone slap something together quickly? Cheers, Martijn Hoekstra ( talk) 21:22, 20 August 2013 (UTC)
Moved from The Star Wars Holiday Special per RM. -- George Ho ( talk) 01:10, 23 August 2013 (UTC)
Since the parameters in the info boxes are inconsistently ordered, we sometimes end up with duplicated fields. This can cause editing problems. I'd like a list of all articles with duplicate parameters in transclusions of {{ Infobox language}} and {{ Infobox language family}}, with the parameters that are duplicated, even if the fields are blank. (A blank field will still override a filled one, and will also invite future additions that may not display properly). Since there's no actual changes of the articles by the bot, I hope this will be easy to approve. — kwami ( talk) 23:23, 24 August 2013 (UTC)
@ Kwamikagami: The following is the result (sorry, it wasn't filtered by namespace):
Hazard SJ 08:08, 29 August 2013 (UTC)
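The duplicate-parameter check itself could be sketched as below (Python; the naive "|"-splitter assumes no nested templates or piped links, which a production scan cannot, so a proper wikitext parser would be needed for real runs):

```python
from collections import Counter

# Simplified sketch: find parameter names that appear more than once in
# a single transclusion. Assumes no nested templates or piped links.
def duplicated_params(transclusion: str):
    body = transclusion.strip().strip("{}")
    names = []
    for part in body.split("|")[1:]:  # part [0] is the template name
        if "=" in part:
            names.append(part.split("=", 1)[0].strip())
    return sorted(n for n, count in Counter(names).items() if count > 1)

sample = "{{Infobox language|name=Foo|speakers=10|name=}}"
dups = duplicated_params(sample)
```

Note the blank `|name=` still counts, matching the point above that a blank field overrides a filled one.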
Redirect page is The Best Is Yet to Come (song), with pages linking to the redirect. The target page is the very old song, so "(song)" should be dropped in linked articles. -- George Ho ( talk) 23:27, 26 August 2013 (UTC)
Template:Infobox soap character 2 should only be used for EastEnders characters where relationships play a strong role. Nowadays, the infobox has been misused to import unencyclopedic text into infoboxes. A bot (or a willing editor) should replace Template:Infobox soap character 2 with Template:Infobox soap character on all non-EastEnders characters. This can be done by simply removing the number 2. -- Magioladitis ( talk) 13:42, 27 August 2013 (UTC)
Andy Mabbett check this. -- Magioladitis ( talk) 14:24, 27 August 2013 (UTC)
On reflection, I've nominated both templates for a merger discussion at TfD. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:30, 27 August 2013 (UTC)
With new error checking, we currently have 45,000+ pages in Category:Pages using citations with accessdate and no URL. This is mostly due to citation templates that include |accessdate= but do not have |url=. -- Gadget850 talk 19:23, 27 August 2013 (UTC)
I'm intrigued by the possibility of creating bots to work with Wikipedia tools like Huggle or Igloo or Vandal Fighter. That could probably be useful....-- User:SmartyPantsKid 20:53, 30 August 2013 (UTC)
Number of pages affected : ~2000
Task : Flag all unused file redirects with {{db-g6|rationale=Unused file redirect}}. 'Unused' here means no incoming links at all.
This will hopefully end the eclipsing of some files at Commons, as well as dealing with a backlog that has built up in respect of Wikipedia:Database reports/Unused file redirects
Sfan00 IMG ( talk) 14:36, 2 September 2013 (UTC)
But some kind of log so mistakes can be found would be appreciated.
Note not all usages would have a File: prefix, some infoboxes for example add the File/Image prefix internally... Sfan00 IMG ( talk) 14:56, 3 September 2013 (UTC)
This is a read-only request (no bot edits asked, just list pages): please list all template calls from Template:RailGauge.

rowid | namespace | pagename | |1= | Redirect from | other params used
explained: | output row id | name or number | {{PAGENAME}} | unnamed param 1 input value as typed ("1=" might be omitted) | {{railgauge}} or {{gauge}} | list of params, n is unknown
data example a: | 1 | 0 | Standard gauge | 57 | |
b: | 4837 | 0 | Bradway Tunnel | ussg | Template:Gauge | wrap=y|al=on
c: | 7865 | 0 | Indian Railways | 1676mm | |
d: | 124 | 0 | Indian Railways | 1435mm | |
data row (CSV example): "7865", "0", "Indian Railways", "1676mm", "", "allk=on", "foo=nonsensetext", "wrap=yes"
Params other than |1= are optional and so unknown in number (0 ...). These n can better be rightmost. Please split into "" each, if it can be done easily. Not my primary request, clearly. - DePiep ( talk) 02:13, 3 September 2013 (UTC)

It was suggested to me at Wikipedia:Village pump (technical)#Finding ACTIVE foreign language speakers that a bot would be a good option for updating Wikipedia:Translators available. The bot to do this job would, I am supposing, run on a once or twice monthly schedule, glance across the contributions of editors named therein, and indicate the last month in which each such editor had edited. This would thus provide an idea of which editors with translation skills would be around to provide such skills in a pinch, as asked. Blessings!! DeistCosmos ( talk) 02:17, 6 September 2013 (UTC)
Through many pages/months/years? of discussion at WT:NRHP and its archives, it has been decided that a bot is needed to tag any articles with only a single reference to the National Register Information System (NRIS) with the {{ NRIS-only}} template, which encourages the addition of extra sources and puts the articles in cleanup categories, "Articles sourced only to the NRIS since MONTH YEAR". The shortcomings of NRIS are explained here.
To find articles that are only sourced to NRIS, a list of all the pages on which {{ NRISref}} is transcluded could be a starting point. There may also be pages that link directly to older versions of the NRIS website, which would include the string "www.nr.nps.gov" somewhere in their content. From this giant list of articles, those with a single reference need to be picked off and tagged.
After its initial run, the project would like for the bot to continually check new articles and tag them if they are NRIS-only and prevent the removal of the NRIS-only template from existing articles unless a second source is added. Is this possible?-- Dudemanfellabra ( talk) 05:06, 28 August 2013 (UTC)
Hello, WP:JAZZ would like to have a 'bot add the {{ WikiProject Jazz}} banner to jazz-related pages that aren't already tagged, and/or auto-assess pages that aren't already assessed.
{{WikiProjectBannerShell}} if and when it was able to do so. Essentially, I am asking for a repeat of what Xenobot Mk V originally did for us in 2010 (see above). We wish to add {{ WikiProject Jazz}} to articles in jazz-related categories. The list of relevant categories is located at Wikipedia:WikiProject Jazz/Categories, but please note that there are actually three lists of categories at that page, and they each need to be tagged slightly differently:
* Add {{WikiProject Jazz}} to the articles (or rather, the talk pages) within the /General sub-listing.
To the best of my knowledge, /Categories represents all applicable categories and sub-categories (I deliberately omitted those that are outside the project's scope), so you should not need to worry about sub-category depth. I finished updating these listings a few minutes before posting this.
Furthermore, we wish to auto-assess those pages that do not already have an assessment (including those already tagged with {{WikiProject Jazz}}):
* Inherit |class= from other WikiProjects (if any): take the other banner's |class= if only a single rating is available; if two or more ratings are available, in the event of auto-stub/inherit conflict, inherit the most frequent (or highest) |class= rating. Mark such assessments with |auto=yes, |auto=inherit, |auto=length or |autoi=yes as appropriate.
* Assign |class=stub based on either of the following criteria, without inheriting |importance= ratings.
And, add {{WikiProjectBannerShell}} when possible/applicable.
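The "most frequent (or highest) class" inheritance rule could be sketched as follows. The helper `inherit_class` and the class ordering are illustrative assumptions for standard ratings only; the real bot would parse the other projects' banners to collect the ratings.

```python
from collections import Counter

# Assessment classes from lowest to highest (assumed ordering,
# standard article classes only).
CLASS_ORDER = ["stub", "start", "c", "b", "ga", "a", "fa"]

def inherit_class(other_ratings):
    """Pick a |class= to inherit from other projects' banners:
    the most frequent rating, breaking ties by the higher class."""
    if not other_ratings:
        return None
    counts = Counter(r.lower() for r in other_ratings)
    return max(counts,
               key=lambda c: (counts[c], CLASS_ORDER.index(c)))
```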
I have an additional request, but I am not sure whether it's technically possible (see comments). I'd be interested in having the 'bot add needs-infobox=yes to the {{WikiProject Jazz}} template, if the article does not have an {{Infobox foo}} template; or if {{WikiProject Jazz}} can inherit this setting from another WikiProject banner, or it can inherit this setting if the talk page already has {{Infobox requested}}.
Let me know if I can clarify anything, either leave me a message here or at WT:JAZZ.
Thanks in advance, -- Gyrofrog (talk) 17:09, 6 September 2013 (UTC)
Gyrofrog, I can do it for you. I have some questions: 1) Should any |importance= be removed? 2) Is "autoi" a typo? Do you mean auto? -- Magioladitis ( talk) 10:59, 9 September 2013 (UTC)
1) Keep |importance=; do not remove it. But if it is blank for WP:JAZZ, we do not want to inherit that parameter from other WikiProjects.
2) |autoi= is not a typo, but apparently it is not currently being used, either (anyway, it doesn't work for WP:JAZZ – I may have seen it on a different template [28]). As for |auto= or |autoi= as applied: a quick test suggests that for |auto=, "length" and "inherit" are the only two options that work; |auto=yes and |auto=stub do not work. (Likewise |class=Category, |class=Template, etc., although I believe the template will automatically handle all of these without having to specify.) -- Gyrofrog (talk) 16:06, 9 September 2013 (UTC)
The redirect page is Upstairs, Downstairs; many links link to the redirect page of the 1971 series. Perhaps fix the disambiguation? -- George Ho ( talk) 17:21, 6 September 2013 (UTC)
I was asked by Adam Cuerden, the main contributor to the Signpost's tech report, if I would write a short piece about how to make Wikipedia articles more accessible to screen reader users. I promptly did so, and it's now been published along with the rest of the tech report. Adam then suggested on my talk page that the first two problems I noted could be fixed in articles by a bot, which is where you guys come in ... (here's the relevant conversation). In particular, as noted there:
What do you all think? Graham 87 05:05, 7 September 2013 (UTC)
(I'd asked about this in late July. Someone else had offered to help with it, but they have since become inactive.)
DASHBot (
talk ·
contribs) used to create, and/or periodically update, a list of unreferenced biographies of living persons for a given Wikiproject (see
User:DASHBot/Wikiprojects). However, that bot has been blocked since March. I'm wondering if another one can accomplish this same task. I'm asking on behalf of
WP:JAZZ (whose list is at
Wikipedia:WikiProject Jazz/Unreferenced BLPs) but there were a lot of other WikiProjects on that list, as well (I'd already removed WP:JAZZ, though). Thanks in advance, --
Gyrofrog
(talk) 13:31, 6 September 2013 (UTC)
User:Theopolisme has taken a temporary hiatus, and I would request that the already approved TAFI-related tasks from his bot be taken up by another bot. The python scripts are available here. Performing all these tasks manually is painful. It needs to run once at 00:00 UTC on Mondays, and can basically run all tasks once a week then. -- Nick Penguin( contribs) 04:02, 10 September 2013 (UTC)
A bot is needed to go through Category:Canoeists and change the name of any pages that are currently called 'NAME (Canoer)' to 'NAME (Canoeist)' as per the discussion at WP Kayaking Jamesmcmahon0 ( talk) 10:06, 9 September 2013 (UTC)
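A minimal sketch of the title rewrite (the `canoeist_title` helper is hypothetical; the actual page moves would of course go through the API, with redirects handled as usual):

```python
def canoeist_title(title):
    """Rewrite 'NAME (Canoer)' to 'NAME (Canoeist)'; leave other titles alone."""
    for old, new in (("(Canoer)", "(Canoeist)"), ("(canoer)", "(canoeist)")):
        if title.endswith(old):
            return title[: -len(old)] + new
    return title
```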
I'd been doing some of these linkfixes manually, but figured it's really a WB or bot task.
The list contains image pairs.
The basic task is, for each image pair in the list:
Could an automated task for doing this be developed? Sfan00 IMG ( talk) 22:07, 9 September 2013 (UTC)
Hello: Can someone take over the new article alert job from TedderBot? This page has the link to the source code. The bot has been out of operation since August 22. Please see here the requests from various project owners to get bot working again. Thanks. — Ganeshk ( talk) 00:33, 30 August 2013 (UTC)
Thanks for getting it up and running again. As it is a very important bot for a number of wikiprojects, and Tedder notes he is very busy IRL, it would indeed be good to have a backup ready to kick in on short notice, so this request should still be seen as open for anyone willing to develop a backup capability. Better to have it ready now than after another x-week delay and grief in the future. -- Piotr Konieczny aka Prokonsul Piotrus| reply here 04:06, 14 September 2013 (UTC)
Good apps — Preceding unsigned comment added by 106.203.17.125 ( talk) 01:59, 18 September 2013 (UTC)
WP:NRHP maintains lists of locations that are on the National Register of Historic Places for everywhere in the USA; an example is National Register of Historic Places listings in Franklin County, Kentucky. Just about all of these lists have tables composed of transclusions of a dedicated template, {{ NRHP row}}; these were just added a year or two ago, replacing hardcoded tables, and it's possible that occasional articles got missed and still have one or more entries with the hardcoded table. For example, the Franklin County KY list had a single entry that appeared different from the rest, and viewing the source showed me that there was a single line with the old hardcoded table; I've since fixed it. Is there any way (other than manually viewing everything) to check for pages that still have the hardcoded table? My first thought is a bot that would check as follows:
The hardcoded tables always used templates whose names end with "color" for all entries, and without the last item, we'd miss any lists in which there's an old-style table entry below the last of the NRHP rows. I don't think we need to worry much about false positives, because the bot wouldn't be doing anything more than logging what it finds. This is a rare enough problem that we're unlikely to find many such pages; the only reason I'm requesting it is that we have more than 3,500 pages whose names begin with "National Register of Historic Places listings in", so it would take an extremely long amount of time for a human to check all of them. Nyttend ( talk) 03:43, 19 September 2013 (UTC)
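The "templates whose names end with color" scan might be sketched like so. The helper `hardcoded_rows` and its template-name regex are assumptions; since the bot only logs findings, false positives are cheap, as noted above.

```python
import re

def hardcoded_rows(wikitext):
    """Return transcluded template names ending in 'color', which mark
    old hardcoded NRHP table rows (per the heuristic described above)."""
    names = re.findall(r"\{\{\s*([^|{}]+?)\s*(?:\||\}\})", wikitext)
    return [n for n in names if n.lower().endswith("color")]
```

Running this over the ~3,500 "National Register of Historic Places listings in ..." pages and logging non-empty results would give the list to check by hand.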
could I get someone to have a bot tag all the templates listed in the collapsed section of the following two discussions:
it's very simple, you just add {{subst:tfd}} at the top, assuming this runs before the end of September 20 wiki time. they are all unused. thank you. Frietjes ( talk) 17:05, 20 September 2013 (UTC)
Currently Wikipedia:Templates for discussion/Log/2013 September 11#Template:WikiProject United States shows an overwhelming consensus to destroy the WikiProject United States collaboration and associated template. It is requested that someone with a bot demerge all the projects and reallocate all the articles to the appropriate project. For all those who don't like WikiProject United States here is your chance to end it once and for all and restore about 100 dead projects to their previous state of inactivity. If you don't like Kumioko here is your chance to step up and show them how to build a bot to do assessments the right way! Either way, a bot is needed as this task is just too massive for one person to do. It took Kumioko years to get it to this point, how fast can we restore things back to their natural order? 138.162.8.57 ( talk) 19:22, 20 September 2013 (UTC)
{{WikiProject United States|class=foo|importance=blah|ID=yes|ID-importance=bar}}
to {{WikiProject Idaho|class=foo|importance=bar}}
?
GoingBatty (
talk) 02:52, 23 September 2013 (UTC)
Yeah, I know the subject line don't make a lot of sense, ok? :) I guess I'm asking if there is any way to get a bot developed which can go through one or more of the pdf files of encyclopedias at commons' encyclopedia category and maybe generate a list of articles contained in that file. It would probably be great if it could have some sort of indicator of relative length of the article, things like maybe 4 lines, 1/2 page, 2 pages, or whatever. But I do think, maybe, for some of the smaller countries, with less generally spoken languages, and maybe some of the less popular topics, having something like that which could give editors interested in a given topic a quick list of articles which can be sourced from readily available files might be very useful. Maybe, as a first step, considering most reference books have "titles" at the top of each page, a bot could check whether the same "title" is included on two or three consecutive pages, and then list such titles. John Carter ( talk) 01:15, 23 September 2013 (UTC)
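The "same title on consecutive pages" idea could be sketched roughly like this, assuming the header line of each page has already been extracted from the PDF (the extraction itself is the hard part); `repeated_titles` is a hypothetical name.

```python
def repeated_titles(page_headers, min_run=2):
    """Given the header line extracted from each PDF page, list headers
    that repeat on at least `min_run` consecutive pages (likely the
    longer articles the request is after)."""
    found, run = [], 1
    for prev, cur in zip(page_headers, page_headers[1:]):
        run = run + 1 if cur == prev else 1
        if run == min_run:   # report each header once, when its run qualifies
            found.append(cur)
    return found
```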
I think this is difficult, but.... I just removed a few external links, with link named
I think the chance that any such links are legitimate is nil. Would it be possible for a bot to remove, from all Wikipedia articles,
* [URL Per DTI-NCR Permit No. nnnn, Series of mmmm]
Where:
This might require scanning a database dump for DTI-NCR.
— Arthur Rubin (talk) 09:49, 26 September 2013 (UTC)
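A possible sketch of the removal, assuming the links always appear as bulleted lines in exactly the quoted form; the regex is my assumption and would need checking against real examples before any live run.

```python
import re

# Hypothetical pattern for the spam links described above: a bulleted
# external link labelled "Per DTI-NCR Permit No. nnnn, Series of mmmm".
PATTERN = re.compile(
    r"^\* *\[\S+ Per DTI-NCR Permit No\. \d+, Series of \d{4}\]\n?",
    re.MULTILINE)

def strip_dti_ncr(wikitext):
    """Delete whole list lines that match the spam-link pattern."""
    return PATTERN.sub("", wikitext)
```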
Hi, I've been approached by an Art Gallery who are aware that many of the thousands of links that Wikipedians have made to their sites over the years are now dead. They may be able to provide a map or logic for the move, if they can would someone be willing to run a bot to update the links? They think it is mostly on EN wiki but will probably be on many other languages as well. They would also be interested in a report listing all of our articles that reference their sites, and presumably any in external link sections. Hundreds, perhaps more than a thousand of the articles involved will be related to living artists so there could be quite a lot of BLPs where the sourcing is improved by this. Jonathan Cardy (WMUK) ( talk) 16:00, 27 September 2013 (UTC)
== Bot to delete a bunch of redirects ==
Looking for some bot or script to clean up some spam in the form of 200+ redirects to a single article [[Măeriște]]. I suspect the original author was unhappy about the original article being deleted. As a result, the article for this town [[Măeriște]] now has redirects from a multitude of people names, geographic names, newspapers, museums, ethnicities, religions, etc. I consider this spam as in [[Wikipedia:CSD#G11|G11]]: Unambiguous advertising or promotion. I would like to delete (or edit with {{dp-spam}}) all of the redirects to this article (https://toolserver.org/~dispenser/cgi-bin/rdcheck.py?page=M%C4%83eri%C8%99te) except the following:
*Criștelec
*Doh, Sălaj
*Giurtelecu Șimleului
*Kerestelek
*Krasznahídvég
*Maladia
*Maladé
*Măerişte
*Somlyógyőrtelek
*Somlyóújlak
*Uileacu Şimleului
*Wüst Görgen
Thanks [[User:Hollomis|Hollomis]] ([[User talk:Hollomis|talk]]) 22:40, 2 October 2013 (UTC)
:Mass RfD is the third door on your left, see [[WP:RFD]] [[User:Werieth|Werieth]] ([[User talk:Werieth|talk]]) 22:43, 2 October 2013 (UTC)
==Edit-a-thons at UC Riverside==
FYI: The UCR Libraries are hosting three edit-a-thons focusing on their great special collections (science fiction, water resources, the Inland Empire and more) on Oct. 12, 2013, Oct. 26, 2013, and Nov. 23, 2013. Please participate if you can, and please help publicize to friends in the area! Details here. All are welcome, new and experienced editors alike! -- phoebe / ( talk to me) 22:44, 1 October 2013 (UTC)
A discussion at Wikipedia talk:WikiProject Football#FB team templates has determined that these templates (thousands of them!) should be replaced by direct wikilinks instead (the templates have nothing but a link in them).
What basically needs to be done is: find all instances of a template, replace it with the link (the contents of the template) instead (simply subst the template?), and delete the then-unused template. The final part can't be done by bot, but it can be tagged as empty, or we can wait until the task is done and then do one mass-deletion, similar to Wikipedia:Templates for discussion/Log/2013 September 20#Unused fb team templates.
Details on this can be discussed with the members of the Football project at the above discussion, but are there any bot operators who in principle are willing to take this on? Fram ( talk) 09:45, 25 September 2013 (UTC)
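The subst step could be sketched as below. The `subst_fb_team` helper is hypothetical, with the template contents assumed to be pre-fetched into a dict; in practice a simple subst via the normal editing API would do the same thing.

```python
import re

def subst_fb_team(wikitext, template_contents):
    """Replace {{fb team X}} transclusions with the template's stored
    wikilink content (a stand-in for substituting the template)."""
    def repl(m):
        name = m.group(1).strip()
        return template_contents.get(name, m.group(0))  # leave unknown ones
    return re.sub(r"\{\{\s*(fb team [^|{}]+?)\s*\}\}", repl, wikitext)
```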
Any takers? Giant Snowman 10:54, 8 October 2013 (UTC)
User:Blake305 appears to be inactive. Could someone please pick up this pending task, which has community support and for which his bot already has approval for a trial run? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:45, 4 October 2013 (UTC)
Ok, a lot of work has been done adding old sources from http://www.archive.org to articles, especially in the Israel/Palestine area.
Some time ago, archive.org redirected all these links to a link without www, so if you look at http://www.archive.org/details/palestineundermo00lestuoft you will be redirected to http://archive.org/details/palestineundermo00lestuoft
Ideally, all these links should be stripped of their "www"s, like I have done here. Looks like work for a bot? Huldra ( talk) 22:56, 7 October 2013 (UTC)
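Since the fix is a plain prefix rewrite, the bot's core would be nearly a one-liner (a sketch; `fix_archive_links` is a made-up name, and https links, if any, would need the same treatment):

```python
def fix_archive_links(wikitext):
    """Rewrite http://www.archive.org/... to http://archive.org/...,
    matching the redirect described above."""
    return wikitext.replace("http://www.archive.org/", "http://archive.org/")
```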
Chris G ( talk · contribs) has left a message on his talk page that real life has taken priority and he's stopped his bots. But he has offered the code if somebody wants to get them running again. Is there anybody willing to take a look at getting RFC bot ( talk · contribs) running again? I'm most interested in the WP:DASHBOARD updating, but the RfC stuff is probably a higher priority. Thanks. 64.40.54.22 ( talk) 06:09, 6 September 2013 (UTC)
This request was made a while ago and taken up by User:Hasteur (who may still have preliminary code available). However, there was a bit of a scuffle at WP:NRHP which resulted in one editor being indefinitely topic banned from areas related to the project, and that turned Hasteur off to the task. Would anyone else be willing to pick this up? The initial bot task is to find all articles whose only inline citation is to the National Register Information System via {{ NRISref}} and tag them with {{ NRIS-only}}. A rough pseudo-code logical procedure is provided below, courtesy of User:Hasteur:
For each page that includes a transclusion of {{ NRISref}}:
After the initial run which tags existing articles, the bot should scan recent changes and tag any newly created articles citing only the NRIS. Also, the bot would need to prevent removal of the NRIS-only template unless multiple references are present on the page. All of this has gained consensus with members of the project.
Hasteur had run this through several test runs of the first task and generated a list in his userspace of articles that would be tagged on a live run, and we were in the process of refining the procedure to better suit the project's needs, but it seems Hasteur has either gotten too busy or begun to ignore this. If anyone could pick this up, WP:NRHP would be very grateful! If any further elaboration is needed, let me know!-- Dudemanfellabra ( talk) 17:28, 10 October 2013 (UTC)
Several times, I've seen an article name with a parenthetical sub-title, even though the non-parenthetical title was free. For instance, naming an article "Such and Such (film)" when "Such and Such" without any suffix wasn't in use. Is there a way that a bot can find instances of such titles? Ten Pound Hammer • ( What did I screw up now?) 00:39, 11 October 2013 (UTC)
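Given a dump of all article titles, the check could be sketched like this. The `unnecessarily_disambiguated` helper is an assumption; notably it ignores redirects (a base title that exists only as a redirect may be intentional), which the real task would need to consider.

```python
def unnecessarily_disambiguated(titles):
    """Given the set of all article titles, list those with a parenthetical
    disambiguator whose base title does not exist."""
    titles = set(titles)
    hits = []
    for t in titles:
        if t.endswith(")") and " (" in t:
            base = t[: t.rindex(" (")]
            if base not in titles:
                hits.append(t)
    return sorted(hits)
```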
Done I found 35,281 non-redirect article-space pages that were unnecessarily disambiguated. The first ten thousand of these are listed at User:Theo's Little Bot/unnecessary dab. Theopolisme ( talk) 22:40, 18 October 2013 (UTC)
Is there a way that a bot can find all instances where {{ Infobox album}} and {{ Infobox single}} have left the producer field vacant, and maybe categorize them? I think this might be useful in categorizing works with unknown producers (e.g. Ronan (song) and All Cried Out (Kree Harrison song)). Ten Pound Hammer • ( What did I screw up now?) 02:02, 15 October 2013 (UTC)
Hi, I have requested this before during the summer, but without any response, and it seems that there are more people active here now: while going through many of the medical stubs, I encountered a lot of stubs (and some other articles) which have been turned into redirects and are still classified in the assessment (WPMED, WPAN, WPPHARM) on their talk page as an article. I have corrected many of those, but there must still be hundreds more. Could a bot set the class to redirect in those articles and remove any importance rating? And, as a secondary request, would it be possible to find all articles which have an Infobox disease, Infobox symptom, Interventions infobox or Diagnostic infobox and tag them with WPMED if not already done? -- WS ( talk) 12:14, 17 October 2013 (UTC)
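The talk-page fix could be sketched as follows, simplified to a single banner string; `reassess_redirect` is a hypothetical helper, and a real bot would handle multiple banners and banner shells.

```python
import re

def reassess_redirect(banner):
    """For a talk-page banner on an article that is now a redirect,
    set |class=redirect and drop any |importance= parameter."""
    banner = re.sub(r"\|\s*class\s*=\s*[^|}]*", "|class=redirect", banner)
    banner = re.sub(r"\|\s*importance\s*=\s*[^|}]*", "", banner)
    return banner
```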
As a botop, I would do it myself, but I require admin rights to carry it out. The task is simple: a bot is to downgrade all fully protected templates to template protection, so editors with the template editor right can actually edit them.— cyberpower Online Trick or Treat 16:35, 17 October 2013 (UTC)
select CONCAT('*[[Template:', page_title, ']]') FROM page_restrictions JOIN page ON pr_page=page_id WHERE page_namespace=10 AND pr_type='edit' AND pr_level='sysop' and pr_expiry='infinity';
NOTE: You can get a list of the fully protected templates with this link -- WOSlinker ( talk) 10:35, 19 October 2013 (UTC)
I was wondering if it would be useful to create a list of articles that do not yet exist based on categories at WikiCommons. There are I'm sure many groups of images just waiting for an enwiki article, and if someone thinks that something is deserving of a category there it is likely that it is deserving of an article here. The bot could simply look through all the categories and produce a list of those which do not contain an image linked from enwiki. This would likely be a one-off task but could be run periodically if deemed useful. violet/riga [talk] 09:20, 19 October 2013 (UTC)
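A crude sketch of the comparison, assuming both title lists are already downloaded; it matches Commons category names to enwiki article titles by exact name, which is a simplification of the "no image linked from enwiki" check suggested above.

```python
def missing_articles(commons_categories, enwiki_titles):
    """List Commons category names that have no same-named enwiki
    article (candidates for the requested red-link report)."""
    have = set(enwiki_titles)
    return sorted(c for c in commons_categories if c not in have)
```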