This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 45 | ← | Archive 47 | Archive 48 | Archive 49 | Archive 50 | Archive 51 | → | Archive 55 |
Is there a bot that can scan all Wikipedia articles and make a report of all cases where links to Wikipedia articles exist between ref tags? See for example ref no. 7 at Data (Star Trek) for what I mean. Then I could go through that list and fix them systematically. I don't know whether the fixing could or should also be done by a bot. If it could/should be fixed by a bot, then of course there is no need to do it manually. -- Toshio Yamaguchi ( tlk− ctb) 13:50, 15 July 2012 (UTC)
Also, I don't know how common a problem that is. Perhaps it's just a few cases but I have no idea how to find out. -- Toshio Yamaguchi ( tlk− ctb) 14:40, 15 July 2012 (UTC)
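For what it's worth, the core of the scan Toshio describes is a simple pattern match. Here is a minimal Python sketch (the function name and regexes are my own, and it only handles plain, non-self-closing ref tags, not every edge case of wiki syntax):

```python
import re

def wikilinks_in_refs(wikitext):
    """Return wikilink targets that appear inside <ref>...</ref> pairs."""
    links = []
    # Non-greedy match of each ref body; self-closing <ref name=... /> tags
    # are skipped because the character class before '>' excludes '/'.
    for ref in re.findall(r'<ref[^>/]*>(.*?)</ref>', wikitext,
                          re.DOTALL | re.IGNORECASE):
        # Capture the target (part before any pipe) of each [[...]] link
        links.extend(re.findall(r'\[\[([^\]|]+)(?:\|[^\]]*)?\]\]', ref))
    return links
```

A bot would run something like this over a dump and report any page where the list is non-empty; whether the links should then be removed automatically is a separate editorial question.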
I've already coded a bot for updating US cities to 2010 census data, and I've been informed I'm supposed to request permission to run it. I've never done any other significant editing on Wikipedia before, just a few small edits here and there for current events, so I was unaware of this policy. I've already updated Iowa, North Dakota, South Dakota, Nebraska, Kansas and half of Missouri, if you want to see what my work looks like. The person who posted on my talk page said I could get banned if I don't get permission, so here I am. Is it okay with the community if I continue to update cities to the 2010 census data or not? Jamo2008 ( talk) 17:16, 15 July 2012 (UTC)
Is it possible to generate a list of biography articles (all those with a 'XXXX births' category) that lack a {{ WPBiography}} tag on their talk pages? I've noticed a lot of these around, and was wondering if anyone was able to help clear that backlog or find out how large it is? I think there were bots or editors who dealt with this in the past, but it doesn't seem to be being done any more. Carcharoth ( talk) 12:12, 2 July 2012 (UTC)
Yobot can add the banner to all the pages that need it and don't have it. In the old days I was doing only this, but at some point we ran out of pages. :) -- Magioladitis ( talk) 03:50, 6 July 2012 (UTC)
I generated a list of untagged bio articles and put it at http://toolserver.org/~cbm/untagged.txt . The process I used to create it is described at [1] so that someone else with toolserver access could recreate it later. I tried to save it to a wiki page but the system choked on it :(. There is a 0 or 1 beside each title to say whether the page is in Category:Living people. — Carl ( CBM · talk) 12:29, 7 July 2012 (UTC)
Thanks to Carl for generating the list. How many articles are on that list? It is not particularly urgent, but is there any reason Yobot has to do it rather than another bot? It seems a bit pointless unblocking a bot to do this if the owner is busy doing other stuff. I'd be happy to wait until after Wikimania, but I think Izno is right that this should be a database report that would allow more people to help do this, rather than leaving it all to one person or one bot. Carcharoth ( talk) 01:24, 9 July 2012 (UTC)
Grrrr, thanks a lot guys. I've had Category:Biography articles without listas parameter and Category:Biography articles without living parameter cleaned out for a while. Now you go and make work for me again. Think of the poor schmuck (aka me) who gets to add parameters to the tag.
I do have one request. Could LaraBot go through the list first? LaraBot will add living and stub class to the Biography banner for living biographies. That will help save me some time. I'm currently the only one who tags new living articles, adds living and adds listas, plus I add class and work-groups. Anyone want to volunteer to help out? I use AWB with some scripts by Magioladitis and Kumioko that make things go faster. Bgwhite ( talk) 04:39, 9 July 2012 (UTC)
The bot is running as fast as it can, but I wonder if we should use group runs in the future to reduce the time huge tasks take to finish. -- Magioladitis ( talk) 04:06, 17 July 2012 (UTC)
The New Zealand Electronic Text Centre recently moved domain from http://www.nzetc.org/ to http://nzetc.victoria.ac.nz/. The old links redirect to the new links; no content was removed, and all the links should redirect to the same content on the new domain. There are 2455 links in en Wikipedia (mainly in references) to this site. Some of the links include fragment identifiers pointing to specific places in long pages of text. Would it be possible to change these in an automated or semi-automated fashion? Ideally I'd also like (a) to be able to check that links actually work, putting a list of broken links in my user space or user talk space for me to fix, and/or (b) to check for links on foreign wikipedias. See previous discussion of the move. I have a conflict of interest with this website. Stuartyeates ( talk) 00:19, 13 July 2012 (UTC)
I've made a request at Commons:Commons:Bots/Work requests, section "Category merger", for a bot to merge categories for nearly two thousand images, but the only help I've gotten so far is a pointer to a bot that hasn't been active in six months. Could anyone here write a bot to help there? Nyttend ( talk) 02:45, 17 July 2012 (UTC)
Chzz has been retired since Jan/Feb and his bot stopped in May, so would somebody take over this task? mabdul 15:45, 11 July 2012 (UTC)
Wikilivres is a site operated by Wikimedia Canada to host texts and images that are free in Canada but can't be hosted on WMF projects (i.e. Commons requires works to be free both in the US and the source country; due to differing copyright durations and the URAA restoration, many works are now free in their source country but remain copyrighted in the US). It appears that they recently moved from wikilivres.info to wikilivres.ca. The old URLs are not forwarding, and this has left us with a fair number of broken links that could be easily fixed, either by replacing "wikilivres.info" with "wikilivres.ca" or by using the interwikilink prefix wikilivres:. Thanks, cmadler ( talk) 15:51, 18 July 2012 (UTC)
Is someone with a bot able to turn the data at User:Ryan Vesey/Minnesota State Legislators into a table with 4 or 5 columns? Right now the information exists as "Legislator and link", "Term/district/body of legislature", "link to website", "citation". It would be great if that could be turned into a table and a notes column could be added. Ryan Vesey Review me! 15:57, 20 July 2012 (UTC)
{| class="wikitable"
! Legislator !! District !! Link !! Citation !! Notes
|-
| Bruce W. Anderson || House 1977-82 (District 26A); House 1983-84 (District 28A) || Anderson, Bruce W. "Buzz" || {{cite web|title=Anderson, Bruce W. "Buzz"|url=http://www.leg.state.mn.us/legdb/fulldetail.aspx?ID=10014|work=Legislators Past and Present|publisher=Minnesota Legislative Reference Library|accessdate=19 June 2012}} ||
|}
User:RM bot is currently down, and causing problems for WP:RM. So, a fix, or replacement would be useful. See the discussions at Wikipedia:Village_pump_(technical)#RM_bot_inactive and WT:RM#What's the deal? -- 76.65.131.160 ( talk) 08:58, 23 July 2012 (UTC)
Okay, the Latin music task force was converted to its own WikiProject. When it was still a task force, it used the Template:WikiProject Latin America banner, but now it has its own banner. What I need is for a bot to replace the previous banner with the new banner. If it's possible, carry the assessment over from the old banner to the new one, that is, the quality rating of the Latin American articles and the music-importance parameter. Here are the categories whose articles need the banner replaced: Category:South American music, Category:Central American music, Category:Mexican music, Category:Cuban music, Category:Dominican Republic music, Category:Puerto Rican music.
In addition to the above, I need a bot to tag these articles, as the scope of the project has expanded: Category:Spanish music, Category:Portuguese music, Category:Cape Verdean music.
Thanks! Erick ( talk) 02:52, 24 July 2012 (UTC)
As far as I can tell the majority of the tasks previously done by Rich F's bots have not been picked up by anyone. I'm not going to write a long comment about it because I believe few if any besides me care so there's no reason to waste my time. I just want to say that there are still a lot of tasks that Rich's bots did that still need adopting if anyone feels so inclined. One in particular is the WikiProject Watchlists that Femtobot was creating. Kumioko ( talk) 18:13, 7 July 2012 (UTC)
Kumioko, I just ran Catscan, according to the above instructions. There was an output, all right. It's not in any kind of format that would be usable to the projects. For the targeted audience, the casual lurker or reader at a given project, this is all gobbledygook that doesn't make sense. I don't even know the why or the wherefore of its output. It's just a long list of anything and everything in the project, with no rhyme or reason why it's there. There is nothing clickable in the output, so that right there torpedoes its usefulness. And when you tell your browser to go back to the Catscan page, there's a list (yes, clickable) that just seems to be a random listing of any old article in the project. Big whoop. But to give you an example: in the last week, I've created three different articles for the Hawaii project. None of them showed up in the output. So, if I narrow the search to one of those three articles I actually created, the output is this:
== Subset == {| border='1' ! |- |} ---- ;querytime_sec:1.480544090271 ;total_categories_searched:0 ;query_url:http://toolserver.org/~magnus/catscan_rewrite.php?ns%5B0%5D=1&ns%5B4%5D=1&templates_yes=mahi+beamer&format=wiki
If clicked, that just takes you back to Catscan and nothing useful. Whatever else this tool is, it's no comparison to what Femtobot did for the projects. At least Rich's bots were automated. Even if Catscan were more user-friendly, it would still take a dedicated individual to run it manually. Do you know any projects that have someone with nothing else to do with their time, who plans on being around and available to the projects indefinitely? Somewhere, I swear I hear Rich Farmbrough singing, "One of these days...you're gonna miss me, honey..." Maile66 ( talk) 13:35, 19 July 2012 (UTC)
<span class="mw-title"><a href="[^<">]*?/wiki/([^<">]*)"
with the "HTML Scraper (Advanced Regex)" with Group 1. — Dispenser 13:36, 25 July 2012 (UTC)
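To illustrate how the scraper regex Dispenser quotes could be applied outside AWB, here is a Python sketch; the sample HTML in the usage test and the function name are invented for the example, and group 1 captures the wiki title just as in the AWB setting described above:

```python
import re

# The regex Dispenser quotes, verbatim; group 1 captures the /wiki/ title.
PATTERN = re.compile(r'<span class="mw-title"><a href="[^<">]*?/wiki/([^<">]*)"')

def scrape_titles(html):
    """Return all wiki titles found in title-span links in the given HTML."""
    return PATTERN.findall(html)
```

Running it over the saved Catscan HTML output should yield the list of article titles that the raw wiki-format output above fails to present usably.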
Hi, can you please provide the updated number of articles in Wikipedia:WikiProject Chennai and their respective ratings? Thank you so much. Challengethelimits ( talk) 11:06, 24 July 2012 (UTC)
Thank you so much Challengethelimits ( talk) 04:10, 25 July 2012 (UTC)
I think when we get drive-by IPs adding articles and the basis for a stub, we put off a lot of editing by them not knowing how to add references or how to see the finished result. So what about a bot (similar to the "this article cites no sources" message) which adds the references section to the bottom of articles that don't already have one, to make it easier? It's a pain for me to do it without copying and pasting from another article, so a bot would be a big help. Thanks ツ Jenova 20 ( email) 10:45, 25 July 2012 (UTC)
I want to make a bot able to detect vandalism. ( Charlie22712 ( talk) 11:22, 25 July 2012 (UTC))
For those which don't already have one, could you add a section link to redirects to List of Latin-script digraphs? That is, Dr (digraph) should redirect to List of Latin-script digraphs#D (just use the first letter of the title of the redirect page, ignoring acute accents on vowels; anything funky has been done manually). There are about 250 of them. Exceptions are variants of the article title: anything which has "digraphs" in the plural.
Also, could rd's to Trigraph (orthography) be redirected to List of Latin-script trigraphs, also with a letter-section link? (Don't worry about matching; there are anchors for each letter.)
Thanks, — kwami ( talk) 20:58, 17 July 2012 (UTC)
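kwami's rule (first letter of the redirect title, with acute accents on vowels stripped) can be sketched as a small Python helper; the function name is my own, and anything "funky" would still need manual handling, as noted above:

```python
import unicodedata

def digraph_redirect_target(title):
    """Target section for a redirect like 'Dr (digraph)':
    use the first letter of the title, ignoring acute accents on vowels."""
    first = title[0].upper()
    # NFD decomposition splits e.g. 'Á' into 'A' + combining acute accent;
    # keeping index 0 retains just the base letter.
    first = unicodedata.normalize('NFD', first)[0]
    return 'List of Latin-script digraphs#' + first
```

The same shape of helper would serve the trigraph case, pointing at List of Latin-script trigraphs instead.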
Hi, I've seen this: Wikipedia:PEACOCK#Puffery,
and I think that a bot could be implemented to automatically remove some of these words. For example, the word "virtuoso" could easily be removed. This word appears almost exclusively in the first sentence, and the formula is always similar:
Ferdinand David (19 June 1810 – 18 July 1873[1]) was a German virtuoso violinist and composer.
or:
Shlomo Mintz (born October 30, 1957) is an Israeli violin virtuoso and conductor.
In the first example, the word "virtuoso" could simply be removed, whereas in the second, the word should be removed and "violin" turned into "violinist".
What do you think?
Thank you-- Fauban 16:25, 21 July 2012 (UTC)
Well, I've chosen the word "virtuoso" because of the formulaic nature of most sentences where it appears. It's always the first sentence, and it involves the wordings "virtuoso + [instrumentalist]" or "[instrument] + virtuoso".
The bot's actions could be restricted to the first sentence, and only when it's one of these two kinds:
[person's name] ([dates]) [was/is] [a] [nationality... (may not appear)] VIRTUOSO [pianist/violinist...] [other stuff (may not appear)] [.]
[person's name] ([dates]) [was/is] [a] [nationality... (may not appear)] [piano/violin...] VIRTUOSO [other stuff(may not appear)] [.]
By restricting the bot's actions to this, we would avoid mistakes entirely.-- Fauban 17:34, 21 July 2012 (UTC)
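As a rough illustration of how narrowly such a bot could be scoped, here is a Python sketch of the two first-sentence patterns Fauban outlines; the instrument list is deliberately tiny and invented for the example, and a real bot would need a fuller list plus the first-sentence-only restriction discussed above:

```python
import re

# Pattern 1: "virtuoso [instrumentalist]" -> drop "virtuoso"
VIRTUOSO_PLAYER = re.compile(r'\bvirtuoso (violinist|pianist|guitarist)\b')
# Pattern 2: "[instrument] virtuoso" -> instrument becomes the player noun
INSTRUMENT_VIRTUOSO = re.compile(r'\b(violin|piano|guitar) virtuoso\b')
PLAYER_NOUN = {'violin': 'violinist', 'piano': 'pianist', 'guitar': 'guitarist'}

def depuff_first_sentence(sentence):
    """Apply Fauban's two substitutions to a (first) sentence."""
    sentence = VIRTUOSO_PLAYER.sub(r'\1', sentence)
    sentence = INSTRUMENT_VIRTUOSO.sub(lambda m: PLAYER_NOUN[m.group(1)],
                                       sentence)
    return sentence
```

Anything the two patterns don't match, such as sourced discussion of virtuosity in article bodies, would be left untouched, which matches the point about Franz Liszt below.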
Sorry, but if you check some good articles about undisputed virtuosos (e.g. Franz_Liszt), you will see that the thing is different; and that if only the suggestions I wrote above are followed, these sourced claims would be left untouched.-- Fauban 09:48, 24 July 2012 (UTC)
Hello, everyone.
Category:Good articles without an oldid collects articles which have been tagged as good articles but do not have oldids in the talk page box linking to the reviewed version of the article. I have been regularly clearing articles from it by going through the list and adding the oldids, but considering that the process is relatively straightforward, I wanted to inquire whether it was possible to automate it. I'm not sure if this is something that GA bot might do or not.
Here is the process I use. There are several ways to find the oldid of a GA version and add it to the talk page, but this is the one I follow. I figured that this might possibly be automated.
Just a suggestion! Keep up the great work, bot owners! Kind regards, Matt ( talk) 22:01, 29 July 2012 (UTC)
I'd like to get this test-page working: here, which is a list of countries participating in the Olympics this year. I'll then copy and paste it to: here, where one of our users has started doing it manually. You would need to use both country names (in Welsh) and the IOC column which you will find on this page, and redirect them to the existing flags (same Welsh names). Any problems or can we do it? Many thanks. Llywelyn2000 ( talk) 15:03, 29 July 2012 (UTC)
Hello-- I clear a lot of red links, and we could eliminate a substantial number with bot-created redirects for certain articles starting with "The," specifically rivers and battles for starters. For instance:
I think this would be a great bot job. I did some for the major rivers by hand the other day (a dozen redirects fixed about a dozen red links), but it was way too tedious. This is a pretty standard procedure in indexing, and it would make the site more navigable. River seems like a pretty easy one: any article ending in text string "River." There may be false positives, but they are just redirects and would not be visible to most readers. Battles seem a little more complicated, but I suppose the string "battle of" would get most of them. If it goes well, we could also do ships or other things down the road. Let me know your thoughts. Thanks! Jokestress ( talk) 01:10, 30 July 2012 (UTC)
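The river half of Jokestress's proposal is essentially a string transform, so it is easy to sketch; a minimal Python version (function name mine; battles and the "Battle of" string would need their own handling):

```python
def the_redirect(title):
    """For an article ending in 'River', return the 'The ...' redirect
    page title and its wikitext body; return None for non-river titles."""
    if not title.endswith('River'):
        return None
    return ('The ' + title, '#REDIRECT [[' + title + ']]')
```

False positives, as noted above, would just be extra redirects and largely invisible to readers.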
Not done - late.
Can I ask for a filtered template usage (transclusion) overview here? I'd like to check a template's param input (600 transclusions, so AWB would be cumbersome for me and still not reliable).
Topic: {{IPAlink}} has to-be-deprecated params (awkward names & background). I'd like to have a list of pages that use these params off-regular:
* 2= > or [ (param 2 has a closing bracket)
* bracket= > or [ (named param bracket has a closing bracket)
* errortext= (named param errortext is used)
* name= (named param name is used)
- DePiep ( talk) 23:31, 20 July 2012 (UTC)
{{#if:{{{2|}}}{{{bracket|}}}{{{errortext|}}}{{{name|}}}|[[Category:IPAlink articles with invalid parameters|{{NAMESPACE}} {{PAGENAME}}]]}} If you want to do that let me know and I can do it in the sandbox and we can test to see if it will work. Kumioko ( talk) 01:02, 26 July 2012 (UTC)
{{\s*IPAlink(.*?)(2|bracket|errortext|name)\s*([\|}{<\n])
Kumioko ( talk) 19:35, 26 July 2012 (UTC)
param1=x, or catch like (param3)x?) AWB cannot catch these 100% (does the regex really catch the input value, or just the input?). That is why I turned to this bot's page. I was only just asking: can some bot do that more easily and systematically (catch & select the values of a param, 100%) and throw me a list? No. Also, the responses being this late, for such a simple question, is disappointing. - DePiep ( talk) 01:10, 28 July 2012 (UTC)
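For what DePiep is asking, catching the *values* of the params rather than just their presence, a script outside AWB can parse the transclusion body directly. A Python sketch (function name mine; it only handles simple calls without nested templates, and the positional numbering assumes positional params come before named ones):

```python
import re

DEPRECATED = ('bracket', 'errortext', 'name')

def ipalink_flagged_params(wikitext):
    """For each simple {{IPAlink|...}} call, report which of the deprecated
    params listed above are present, with their values."""
    hits = []
    for body in re.findall(r'\{\{\s*IPAlink\s*\|([^{}]*)\}\}', wikitext):
        params = {}
        for i, part in enumerate(body.split('|')):
            if '=' in part:
                key, _, val = part.partition('=')
                params[key.strip()] = val.strip()
            else:
                params[str(i + 1)] = part.strip()
        # Flag named deprecated params, and param 2 containing > or [
        flagged = {k: v for k, v in params.items()
                   if k in DEPRECATED
                   or (k == '2' and any(c in v for c in '>['))}
        if flagged:
            hits.append(flagged)
    return hits
```

A bot or Toolserver script could run this over the 600 transclusions and emit the requested list of pages with the offending values, which a plain AWB find-regex cannot reliably isolate.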
If you look at Category:Olympics stubs, you'll see it contains about 600 articles. About 1/3 of those have "2012" somewhere in their title (e.g., Canoeing at the 2012 Summer Olympics – Men's slalom C-1, Chad at the 2012 Summer Olympics).
Those articles all contain either {{Olympics-stub}} or {{Olympic-stub}} where they should have {{2012-Olympic-stub}}.
Is there a bot that can easily update all these articles? I've been doing them by hand slowly, but a bot seems to make a lot more sense.
Alternatively, given that this would likely be a one-time process, is there a simple way someone fairly geeky can babysit a process that does it semi-automatically? If it doesn't involve using Windows, I'm up for it.
Thanks, Dori ☾ Talk ☯ Contribs☽ 23:45, 31 July 2012 (UTC)
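The swap Dori describes is mechanical enough to sketch; a Python example of the per-page edit (function name mine, using the title check and the two stub template names from the request):

```python
import re

# Matches {{Olympic-stub}} and {{Olympics-stub}}, with optional whitespace
STUB_RE = re.compile(r'\{\{\s*Olympics?-stub\s*\}\}')

def retag_2012_stub(title, wikitext):
    """If the article title mentions 2012, swap the generic Olympic stub
    template for {{2012-Olympic-stub}}; otherwise leave the text alone."""
    if '2012' not in title:
        return wikitext
    return STUB_RE.sub('{{2012-Olympic-stub}}', wikitext)
```

Wrapped in any bot framework's category-walk over Category:Olympics stubs, this would be exactly the babysittable semi-automatic process asked for.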
Please refer to this TfD section where Template:Gotras of Jats is discussed.
The template is intended for navigation of the entire set of articles contained in the article List of Jat Clans, and should be applied to all named articles in that list. However, the template both does not contain all the articles in that list and contains articles that are not in that list. For the template to be useful, and the discussion is heading towards a weak consensus that it is useful, it must be 100% in agreement with the list article, and the articles in the list and the template must have the template applied to them.
So the Bot should:
A bot is needed because the number of Gotras (clans) of Jats is likely to be 2,700 in the end, and manual maintenance is next to impossible. A full analysis may yield a better procedure than my outline above. Fiddle Faddle ( talk) 14:32, 27 July 2012 (UTC)
Would it be possible for a bot to remove all links to http://soccerdatabase.eu ? It appears to be an illegal website which is simply copying content from the now-defunct http://www.playerhistory.com - the owner of the playerhistory website is Polarman ( talk · contribs) and he has said he is launching legal action against soccerdatabase. Regards, Giant Snowman 13:52, 2 August 2012 (UTC)
It was decided in a recent RM to use "X baronets" instead of "X Baronets" in all baronetcy article titles. This would mean renaming all the non-redirect pages in that category to use the lowercase version. I hope this can be done by bot as it involves moving a very large number of pages. Jafeluv ( talk) 09:11, 25 July 2012 (UTC)
I've requested this before, but perhaps I made it too elaborate.
We have a search engine that allows the reader to look up a language with its ISO639-3 code. However, it only works as well as the redirects are maintained, and they're badly out of date: either directing to the wrong page, or in many cases missing altogether.
Create or update rd pages for all piped blue links at Wikipedia:WikiProject Languages/Articles by code:
For each link of the form
[[ABC|xyz]]
a rd should be located at
ISO 639:xyz
and the syntax should be
#REDIRECT [[ABC]]{{R from ISO 639|XYZ}}
For instance, the first link is
[[Ghotuo language|aaa]]
so the rd at
ISO 639:aaa
should read
#REDIRECT [[Ghotuo language]]{{R from ISO 639|AAA}}
(as indeed it does, so this one would be skipped as not requiring any change)
Some of these links will themselves be to redirects, but that can be fixed by one of the automated link patrollers. (Unless you want this bot to follow the rd chain? That might be a better use of server load, as it would avoid changing rd's only to have them changed back for articles which are not at their ISO name.)
It would be nice if the bot would verify that the article at the end of the rd chain contains an infobox listing the ISO code in question, but that starts getting more involved. If it wouldn't be too much trouble, maybe flag the rd with an error category for manual review if it doesn't?
— kwami ( talk) 10:50, 3 August 2012 (UTC)
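The mapping kwami specifies from piped links to redirect pages is deterministic, so here is a minimal Python sketch of the page-generation step (function name mine; following redirect chains and checking infoboxes, as discussed above, are left out):

```python
import re

# Piped links whose display text is a three-letter lowercase ISO 639-3 code
LINK_RE = re.compile(r'\[\[([^\]|]+)\|([a-z]{3})\]\]')

def iso639_redirects(wikitext):
    """From links like [[Ghotuo language|aaa]], build a dict mapping the
    redirect page title 'ISO 639:aaa' to its required wikitext body."""
    pages = {}
    for target, code in LINK_RE.findall(wikitext):
        pages['ISO 639:' + code] = (
            '#REDIRECT [[%s]]{{R from ISO 639|%s}}' % (target, code.upper()))
    return pages
```

A bot would then skip any page whose existing text already matches the generated body, as in the Ghotuo example above, and only create or update the rest.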
Many of the articles in Category:United States Supreme Court cases and its subcategories (all of which I've checked; there are no irrelevant subcategories, such as Category:American military personnel killed in the Gulf War being a subcategory of Category:Morocco) cover individual Supreme Court cases; almost all of these articles are entitled "PARTY1 v. PARTY2". One sometimes sees "PARTY1 v PARTY2" in ordinary writing, and it's easy to omit the punctuation after the "v". Could a bot go around and create redirects for all of these articles, using Virginia v West Virginia as a model? Nyttend ( talk) 21:25, 27 July 2012 (UTC)
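Nyttend's request is another deterministic title transform; a small Python sketch (function name mine, modeled on the Virginia v West Virginia example):

```python
def unpunctuated_v_redirect(title):
    """For a case article 'PARTY1 v. PARTY2', return the unpunctuated
    redirect title and its wikitext body; None for non-matching titles."""
    if ' v. ' not in title:
        return None
    # Only the first ' v. ' is changed, which covers the standard case style
    redirect_title = title.replace(' v. ', ' v ', 1)
    return (redirect_title, '#REDIRECT [[' + title + ']]')
```

Run over the members of Category:United States Supreme Court cases and its subcategories, this would emit the redirect pages to create wherever they don't already exist.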
As the AFC reviewer bears some of the responsibility, can somebody write a bot which monitors the articles listed at AfD (or XfD) and informs the AFC reviewer/accepter? The reviewer is listed in the AFC project template on the talk page, in the parameter |reviewer=. So it's a rather simple task. ;-)
mabdul 11:58, 4 August 2012 (UTC)
RM bot has maintained the WP:Requested moves page for quite a while now, but it's owner has stopped editing, and the bot has shut down. We desperately need someone to adopt it and start it up again, as the Requested moves page is currently being manually updated. The source code is at User:RM bot/requestedmoves.php for anyone that is capable of compiling and adopting it. Thank you.-- Aervanath ( talk) 14:49, 4 August 2012 (UTC)
Could you help me make a tag cloud for this site, adrenalin-css.3dn.ru? Write the code in your answer, I mean the HTML code of the blog that would result. — Preceding unsigned comment added by 178.122.3.140 ( talk) 20:18, 5 August 2012 (UTC)
See the "Question about protection" section of WP:VP/T (which will probably end up at Wikipedia:Village pump (technical)/Archive 101 rather soon) — an admin recently put temporary full protection on an article that had been indefinitely semiprotected, but due to the nature of protection, the page will be completely unprotected at the expiration of full protection. Since (at least right now) software doesn't permit any alternatives, we'll have to go back to the page and add semiprotection after the full protection expires (or cut short the time for full protection), but this requires that someone realise that it's time to semiprotect. What if we had a bot to post a notice at WP:AN to remind admins to protect pages like this? The bot could have a page in its userspace where someone could list pages that would need protection and times; the bot could be programmed always to notify WP:AN a certain number of minutes before the protection was needed. Because I'm asking only for a notification bot, not a protection bot, we wouldn't need the bot to be an admin, and the list page wouldn't need to be protected.
Five days ago, I posted a request at the proposals VP asking for input on this idea; the only response was "I take it as obvious that the best approach would be a change to the software, but this is a temporary measure, until such changes are made. Sounds like a good idea to me." by SPhilbrick. Nyttend ( talk) 00:37, 10 August 2012 (UTC)
I often post at WP:MFD. Wikipedia:User pages lets users know that they may not have excessive unrelated content in their user pages. I'm looking for a bot to populate a series of administration categories entitled Category:User pages with excessive unrelated content, having subcategories Category:User pages with excessive unrelated content 2010, Category:User pages with excessive unrelated content 2009, 2008, 2007, ..., 2001 (based on the date the page was created), each of which could be a subcategory of the MfD project, such as Category:Miscellaneous pages for deletion. The bot would look for only those user pages which meet each of the following four criteria:
1. of users who have only posted edits to any user space page (user and user talk),
2. user pages that are at least two years old (user pages that were created in the year 2010 or earlier),
3. of users that have at least 5 edits collectively in all user space pages, and
4. where the user page now has at least 10,000 bytes.
I got the idea from the user page User:Skverma1949's listing at MfD. [2] The Skverma1949 page originated in August 2009, the user has only posted edits to user space (three different user space pages of users having similar user names), has more than 5 edits to all user space pages, and the page now has 17,000+ bytes. There are many user pages where the user has one or two edits total, usually to their user page. I'm looking to screen those out, at this time, via the requirement of at least 5 edits collectively in all user space pages, to focus further human screening of each page listed in the categories on Wikipedia:User pages and WP:NOTWEBHOST issues. -- Uzma Gamal ( talk) 15:07, 10 August 2012 (UTC)
[[Category:...]] tag, and then the bot might get reverted (although most of these users are probably long gone) and you'd never know. It's more likely you just want a list of these users/pages, which someone with Toolserver access could probably create for you easily enough. Anomie ⚔ 20:07, 10 August 2012 (UTC)
It would be useful to us at WP:GA to have an easy way to keep track of who is reviewing Good articles and who is submitting them. It will potentially help in identifying trends, identifying new or inexperienced reviewers so we can guide them, making it easier to fix problems from poor or bad-faith reviewers, and identifying potential new reviewers. To nominate an article someone puts this {{subst:GAN}} template on the page. This will list the article at WP:GAN. To review the article the reviewer will create the review page (it will be in the form Talk:articlename/GAreview number, e.g. Talk:Prussian Homage (painting)/GA1). The more information we could get the better, but even the raw numbers (nominations/reviews) would be useful. Other details that may prove useful would be a link to the articles and/or reviews, how many have failed, passed, or been re-assessed, and the dates the nominations/reviews were conducted.
If a new page could be updated with this information, that would be great. The most useful presentation, though, would be to have it next to new nominations at WP:GA. Something like:
where N=Nominations and R=reviews. That list is currently populated by GA bot ( talk · contribs) so maybe it could pull information from the new page and add it when posting? Any help much appreciated. N.B. Chris (the owner of GA bot) has been notified of this thread. AIRcorn (talk) 09:39, 8 August 2012 (UTC)
Looks like all varieties of palm.com URLs have been discontinued. We have >> 100 links at Special:Linksearch/*.palm.com. It would be great if someone could do a detox run through them to {{ deadlink}} them all. Thanks. — billinghurst sDrewth 16:22, 11 August 2012 (UTC)
I don't know whether DASHBot still performs this task or not, but according to User:DASHBot/Logs the bot didn't remove any NFCC#9 violations since February 2012. If DASHBot is no longer performing this task, then I think a replacement is needed. -- Toshio Yamaguchi ( tlk− ctb) 08:47, 12 August 2012 (UTC)
Could a friendly bot go through my userspace and add {{ Noindex}} to every subpage wrapped in include only tags? (Some pages are transcluded or substituted). I keep trying to add them, but continually find ones I've missed. Thanks. Ryan Vesey 07:05, 12 August 2012 (UTC)
As per Wikipedia talk:WikiProject Automobiles#Image formatting in articles, we have decided to implement some changes to:
* {{Infobox automobile}} (4332 transclusions)
* {{Infobox automobile platform}} (68 transclusions)
* {{Infobox automobile engine}} (305 transclusions)
However, the scope of these changes is beyond what could reasonably be done manually, so I, on behalf of WikiProject Automobiles, would like to request the necessary assistance.
Before a bot is required, Stepho-wrs will update the templates to include a new "image_file" parameter.
Stage 1:
{{Infobox automobile}} only:
{{Infobox automobile}} only:
Not done Stage 2:
Regards, OSX ( talk • contributions) 09:29, 14 August 2012 (UTC)
{{Infobox electric vehicle}} but that presumably should also be included in the above changes to ensure consistency? Warren ( talk) 14:39, 14 August 2012 (UTC)
Stage 2 is not needed. I added support for bare filenames to the existing parameter. I also added a tracking category, and after I fix all the pages in there I'll update the code once more. -- Magioladitis ( talk) 19:38, 14 August 2012 (UTC)
I updated my script to convert to bare filenames and move caption to the correct position. But the script still needs some adjustments to run automatically. Otherwise, I'll have to run it manually for 4,000 pages. -- Magioladitis ( talk) 09:04, 15 August 2012 (UTC)
Pages that still use the old format can be found in Category:Infobox automobile image param needs updating. -- Magioladitis ( talk) 11:53, 15 August 2012 (UTC)
WP:VG could use a bot to automatically archive our deletion page. This page features transclusions from the main deletion form, and keeping up with archiving completed discussions can be a bit much. Even if the bot could only move the discussions to the closed section at the bottom of the page, that would be appreciated. Currently the "goal" is to archive discussions moved to the closed sections monthly into Wikipedia:WikiProject Video games/Deletion/2012, but if that can't be automated, moving discussions to the closed section alone would still be appreciated. Thanks. -- Teancum ( talk) 14:11, 16 August 2012 (UTC)
There are a lot of Olympic athletes whose only link is to the main Olympics page. Can a bot be created that would crawl Category:Competitors at the 2012 Summer Olympics, find pages whose only source is http://www.london2012.com/, then find the link to the athlete? It might be something that a bot can't do without human assistance. Ryan Vesey 21:24, 15 August 2012 (UTC)
Yes, what we need is something which can search for the athlete's name, retrieve the URL, replace the main page link, and also read the birth date. It's definitely bot-programmable, but it's a matter of finding somebody to do it. ♦ Dr. Blofeld 21:37, 15 August 2012 (UTC)
Hello. I'd like a bot to help with the following task. There's a newly created series of by-year-categories Category:Transport infrastructure by year of completion. In part, these have to be populated manually but here's the part I'd like to have performed by a bot. The category Category:Transport infrastructure completed in XXXX should at a minimum contain the following six subcategories:
So it's a fairly straightforward task: add [[Category:Transport infrastructure completed in XXXX]] to the six corresponding categories. As an extra task: not all transport infrastructure by year categories already exist so the bot would have to create them if need be. They should be created using the following 1992 example.
{{10years|20th|Transport infrastructure completed in|1992|Transport infrastructure by year of completion}}
[[Category:Transport infrastructure by year of completion|1992]]
[[Category:Infrastructure completed in 1992]]
[[Category:1992 in transport|Infrastructure]]
It might make sense to start with the years of the 20th and 21st centuries, as Vegaswikian is still trying to figure out the best way to populate these categories fully. Thanks in advance for any help. Pichpich ( talk) 21:39, 16 August 2012 (UTC)
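If a bot ends up creating the missing year categories, the page text can be generated mechanically from the 1992 example above. Here is a minimal sketch; the century ordinal is computed from the year, and the template and sort-key layout are taken directly from the example:

```python
def ordinal(n):
    """Return an ordinal string such as '20th' or '21st'."""
    if 10 <= n % 100 <= 20:
        suffix = "th"
    else:
        suffix = {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
    return f"{n}{suffix}"

def category_page_text(year):
    """Build the wikitext for Category:Transport infrastructure completed
    in <year>, following the 1992 example given above."""
    century = ordinal((year - 1) // 100 + 1)
    return "\n".join([
        "{{10years|%s|Transport infrastructure completed in|%d|Transport infrastructure by year of completion}}" % (century, year),
        "[[Category:Transport infrastructure by year of completion|%d]]" % year,
        "[[Category:Infrastructure completed in %d]]" % year,
        "[[Category:%d in transport|Infrastructure]]" % year,
    ])
```

A real bot would only create the page when it does not already exist, and would skip years whose parent categories are missing.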
I think I stumbled across a big problem that involves all referencing, and probably copyvio, by a blocked user named Billy Hathorn across the entire state of Louisiana. Louisiana is not my usual territory. In recently creating an article there, I found everything created by Billy Hathorn used bare URLs that are now primarily dead links. One of the reasons he got blocked is cut-and-paste copyvio, so that's an additional possible problem there. He made over 100,000 edits, many of them in Louisiana. Big mess with the Louisiana articles. Is it possible - and is anyone willing - to create one or more bots to search all Louisiana articles affiliated with Billy Hathorn? If you need examples, please read this thread. Kumioko has created a table of Hathorn's edits to help. Maile66 ( talk) 01:10, 17 August 2012 (UTC)
It seems that http://www.findarticles.com now redirects to search.com and archive.org is blocked by robots.txt. Should a bot add {{ dead link}} to all the urls pointing to findarticles? Smallman12q ( talk) 21:07, 18 August 2012 (UTC)
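If this gets approval, the tagging step itself is simple wikitext surgery. A rough sketch, assuming the links are bracketed external links and that the {{dead link}} date parameter shown here is the desired one:

```python
import re

# Pattern for a bracketed external link to findarticles.com in wikitext,
# not already followed by a {{dead link}} tag.
FINDARTICLES = re.compile(
    r"(\[https?://(?:www\.)?findarticles\.com[^\]\s]*(?:\s[^\]]*)?\])"
    r"(?!\s*\{\{\s*[Dd]ead link)"
)

def tag_dead_findarticles(wikitext):
    """Append {{dead link}} after each findarticles.com external link."""
    return FINDARTICLES.sub(r"\1{{dead link|date=August 2012}}", wikitext)
```

The negative lookahead keeps the edit idempotent, so rerunning the bot over an already-tagged page makes no change.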
Old ITF links don't work anymore. Here are the old and new links:
-- Stryn ( talk) 15:10, 15 August 2012 (UTC)
It seems that category is currently not used and empty. Bulwersator ( talk) 11:37, 19 August 2012 (UTC)
I recently tried getting good information on a particular concept that doesn't have "straightforward" translations: While the concept itself may be said to exist in several cultures and languages, it is not conceptualised in the same way; Germans conceptualise it with a link to "images" (Bildung), Norwegians conceptualise it with a link to "forming" (dannelse), and the English language has no particular translation (thus the German word is usually used to signify the concept in English). Now, then, if I go to the Norwegian wiki for it, and see that the information is both short and partly erroneous (or at the very least I haven't seen anything that documents the truth of what appears to be a common myth), and I want therefore to check out the other languages, I get redirected to manners in English, and Umgangsformen in German. If I go to the German page for Bildung, and want to see whether there is some good information on it in English, I get sent to the page for education rather than the English entry for Bildung (which does exist); if someone who speaks both English and a bit of German wants to know what the German wiki says on education, they will be sent to Bildung (instead of Ausbildung, which is more correct); and if they want to get the Danish sense of Bildung from the German article, they'll get sent to the Danish page for Uddannelse (rather than dannelse); etc., etc.
In short, someone, at some time, has made some sort of link somewhere that seems to have taken on a life of its own -- probably through bots -- and that it is now very difficult to untangle, simply because there are so many wrongly linked pages (not to mention the fact that it is highly unlikely that any single individual knows enough languages to know whether the link from the English page on education really links to both the German and the Malay page on education (rather than bildung or something completely different altogether)). However, the solution may be simple: If someone could make a bot that could be prompted to visit a certain page and erase all interwiki links from all the linked wikis (in a short enough time that no "competing" interwiki-linking bot could re-establish them), the linking process could be started anew by users (who are now effectively left helpless against the bots), and hopefully getting it right this time around (possibly also with the added possibility of adding nobot-tags to all the pages to avoid autolinking if the concepts eventually turn out to be too confused). It does, of course, carry a real possibility for misuse, and I don't know enough about how bots work, but perhaps activation could be left up to some moderator?
Der Zeitgeist ( talk) 09:29, 21 August 2012 (UTC)
Hey everyone! So, I have a question that I realized a bot would be best to fix. A few months ago, I tried to tackle the "Requested moves" category backlog, but to no avail. It is simply too large and populates itself too quickly to be adequately tackled by a user. One of the issues that I noticed is that a lot of the time, someone would slap a tag onto the article, but not initiate a discussion. As a result, probably half of the tags do not have any reasoning behind them, which not only creates false positives but also makes the categories unnecessarily huge. Is there any way that a bot could be coded to rectify this problem? If this is AutoWikiBrowser-doable, I would be willing to do it, but I fear that this is something only a bot can accomplish. Thanks a lot, and I look forward to your suggestions! Kevin Rutherford ( talk) 20:42, 22 August 2012 (UTC)
Hello. This spring Armenian Wikipedia changed its name from "Վիքիփեդիա" to "Վիքիպեդիա", and so did its "Wikipedia" namespace. Articles were moved from the old namespace to the new one, without leaving redirects. That broke interwiki links and, as it seems, confused IW bots. Some interwikis were removed from articles; some remain but point to nonexistent pages. What I'd like to ask is to look for articles in en:wp containing [[hy:Վիքիփեդիա: and change it to [[hy:Վիքիպեդիա:. This will restore interwikis between the English and Armenian Wikipedias, and I hope bots will update it in other Wikipedias after that. Thanks. -- Aleksey Chalabyan a.k.a. Xelgen ( talk) 01:44, 24 August 2012 (UTC)
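Since only one character in the namespace prefix changed, the per-page fix is a literal string replacement. A minimal sketch (a real bot would find candidate pages via a search for the old prefix and save with an explanatory edit summary):

```python
OLD_PREFIX = "[[hy:Վիքիփեդիա:"   # old namespace name (with "փ")
NEW_PREFIX = "[[hy:Վիքիպեդիա:"   # new namespace name (with "պ")

def fix_armenian_interwiki(wikitext):
    """Replace interwiki links to the renamed Armenian project namespace."""
    return wikitext.replace(OLD_PREFIX, NEW_PREFIX)
```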
Is there a bot that removes {{ Orphan}} from pages with more than one incoming link? I've got a couple of articles that are currently orphaned, but will be de-orphaned as I create more articles; however, I might not know when they are de-orphaned. Does a bot currently remove the tag and if not, is there any reason one can't? Ryan Vesey 01:45, 24 August 2012 (UTC)
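The check itself is straightforward once the incoming links are known. A sketch of the per-page logic, assuming the caller supplies the list of incoming article links (in a live bot these would come from Special:WhatLinksHere, filtered to mainspace non-redirects):

```python
import re

# Matches an {{Orphan}} tag, with any parameters, plus a trailing newline.
ORPHAN_RE = re.compile(r"\{\{\s*[Oo]rphan[^{}]*\}\}\n?")

def deorphan(wikitext, incoming_article_links):
    """Remove the {{Orphan}} tag when the page has more than one
    incoming link from other articles; otherwise leave the text alone."""
    if len(incoming_article_links) > 1:
        return ORPHAN_RE.sub("", wikitext, count=1)
    return wikitext
```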
Please remove {{ Newinfobox}}/{{ New infobox}} from these talk pages, because I have replaced infoboxes with new ones. Thanks. -- Makecat Talk 07:47, 24 August 2012 (UTC)
Please add the navbox {{ John Zorn}} to all existing articles that are linked from the template. -- Marek Koudelka ( talk) 16:30, 26 August 2012 (UTC)
Let's say there's a wikilink to a specific section of an article. Then, let's say that section is renamed. Is there a bot in place for this particular scenario? Said bot should search all of Wikipedia, upon any section being renamed, for any links to that section, and then rename the link to correspond to the new section name.
For example, suppose the section heading "====Quantum world====" were changed to "====Quantum realm====". The bot would then search all other pages for links such as "[[Determinism#Quantum_world]]" and change them to read "[[Determinism#Quantum_realm]]". The bot would scan for edits to existing lines beginning and ending in equals signs. If the entire line were removed, and the bot found a link to the removed section heading, it could either retarget any links to point to the section that contained the removed section, or simply retarget any links to the main article at large (i.e. "[[Determinism#Quantum_world]]" becomes "[[Determinism]]" if the heading line were removed).
If such a bot already exists, please post its link here, so that I might scrutinize it further. If, on the other hand, such a bot does not exist (and is not simply inactive at this time), I would be interested in attempting to make it myself. Thank you. JimsMaher ( talk) 17:57, 27 August 2012 (UTC)
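For what it's worth, here is a minimal sketch of the retargeting step described above, assuming the article title and the old and new section names are already known (e.g. extracted from a diff of the heading lines). Spaces and underscores are interchangeable in section links, so both spellings are matched:

```python
import re

def _flexible(name):
    # Spaces and underscores are interchangeable in wiki links.
    return "[ _]".join(re.escape(part) for part in re.split(r"[ _]", name))

def retarget_section_links(wikitext, article, old_section, new_section):
    """Rewrite [[article#old_section]] links (with or without a |label)
    so they point at the renamed section."""
    pattern = re.compile(
        r"\[\[" + _flexible(article) + "#" + _flexible(old_section) + r"(\||\]\])"
    )
    return pattern.sub(
        lambda m: "[[%s#%s%s" % (article, new_section, m.group(1)), wikitext
    )
```

A production bot would also need to handle redirects to the article and links relative to the same page ([[#Section]]), which this sketch ignores.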
There are several articles with a broken link to the IUCN Red List ( iucnredlist.org) entry. Is anyone able to generate a list of such articles? -- Leyo 22:29, 28 August 2012 (UTC)
I have created the navbox Template:Louisiana Political Museum and Hall of Fame and have inserted it at the bottom of Louisiana Political Museum and Hall of Fame. Can anyone please run a bot for me that will insert this template at the bottom of the existing articles listed on the template? And...if possible..at the same time, remove any Orphan Tags that may exist on those articles, as this navbox will resolve the orphan issue. Thanks for your help. Maile66 ( talk) 14:59, 30 August 2012 (UTC)
Thank you. Maile66 ( talk) 17:12, 30 August 2012 (UTC)
Could a programmer write a bot to go through NRHP place articles and check for availability of an NRHP nomination document at the National Park Service website, and add a reference to ones that are found, if not already included in the article? This would be hugely helpful in updating thousands of NRHP articles, as the National Park Service puts more and more states' nomination documents on-line.
For example, there are 2,543 NRHP-listed places in California, indexed from List of RHPs in CA. There are articles already for about half of those, I think. Few have references yet to the relevant online NRHP nomination documents, because the National Park Service only just made them all available for that state. The bot would draw a reference number from the NRHP infobox in an article, and use that to look up the nomination document at the National Park Service. It would add a reference (such as <ref name=nrhpdoc>{{cite web|url=http://pdfhost.focus.nps.gov/docs/NRHP/Text/REFNUM.pdf |title=National Register of Historic Places Inventory/Nomination: NAME |author= |date= |publisher=National Park Service}} and [http://pdfhost.focus.nps.gov/docs/NRHP/Photos/REFNUM.pdf accompanying photos]</ref> but with REFNUM and NAME filled in by field values from the NRHP infobox in the article) into the article, just above the references section. Such as in this diff adding NRHP nomination doc reference to a California NRHP article
A complication stems from the fact that the National Park Service's website returns a dummy document saying "Not yet digitised" in cases where the real document is not available. We want the reference to be added only when a real document is in fact available at the expected URL.
I expect this is more difficult to program than #bot to bring photos from NRHP list-articles to individual NRHP place articles item above, but it is actually far more important and useful. -- do ncr am 20:13, 31 August 2012 (UTC)
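The reference-building part of this request is mechanical once REFNUM and NAME have been read from the infobox. A sketch (the URL pattern and citation layout are copied from the request above; a real bot would additionally fetch the PDF first and skip the edit if it gets the "not yet digitised" placeholder, e.g. by checking the response size or content):

```python
def nrhp_nomination_ref(refnum, name):
    """Build the reference wikitext for an NRHP nomination document,
    using the URL pattern and citation layout described above.
    refnum and name would be read from the article's NRHP infobox."""
    text_url = "http://pdfhost.focus.nps.gov/docs/NRHP/Text/%s.pdf" % refnum
    photos_url = "http://pdfhost.focus.nps.gov/docs/NRHP/Photos/%s.pdf" % refnum
    return (
        '<ref name=nrhpdoc>{{cite web|url=%s '
        '|title=National Register of Historic Places Inventory/Nomination: %s '
        '|author= |date= |publisher=National Park Service}} and '
        '[%s accompanying photos]</ref>' % (text_url, name, photos_url)
    )
```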
Hi,
Wikimedia India Chapter is organising WLM in India, and hence would like to invite all Indian Wikipedians to participate. Find the invitation message here at my sandbox: User:Karthikndr/sandbox. The message needs to be sent to all the WikiProject India members.
Needs to be delivered by tomorrow. Thanks! -- ♪Karthik♫ ♪Nadar♫ 18:45, 31 August 2012 (UTC)
Following Wikipedia:Categories for discussion/Log/2012 July 22#Category:Film redirects, we need the following changes for every member of Category:Film redirects (currently 80 or so):
{{WikiProject Film|class=redirect}} (note that {{Film}} may have been used instead of {{WikiProject Film}})
Once the category is empty we will deal with it speedily. Thanks. -- Mirokado ( talk) 21:20, 23 August 2012 (UTC)
{{WikiProject_Film/class}}. Probably better to wait until that support is in place before making these changes and I will post here once that has happened. -- Mirokado ( talk) 11:39, 28 August 2012 (UTC)
Hi. I was thinking we could create a bot that closes Wikipedia A-class reviews after two supports, and updates the article's talk page to reflect that. TBrandley 02:19, 1 September 2012 (UTC)
Is there a way we can get a bot to remove all transclusions of {{ wikify}} now that the template has been deprecated at TFD? Ten Pound Hammer • ( What did I screw up now?) 18:55, 30 August 2012 (UTC)
I merely closed the discussion. If people want to have a discussion about the most prudent way to remove the template, that sounds like a good thing, regardless of whether the changeover is done manually from the start (replacing each instance with whatever more specific templates are appropriate), or automatically (removing every instance, and leaving a specific hidden category so that every page can be then dealt with manually eventually) or whatever. As long as we have conscientious Wikipedians working on this, I would hope that everything comes to a fair result. - jc37 20:26, 2 September 2012 (UTC)
Is there any way to find every article that has a section reading "Trivia", "Miscellaneous", "Miscellany", or any variation thereof, and automatically tag it with {{ trivia}}? Ten Pound Hammer • ( What did I screw up now?) 00:10, 3 September 2012 (UTC)
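The tagging half of this is easy to sketch; the hard part is the list of heading variants, which is an assumption here. A rough cut that anchors on heading lines only, to avoid matching the word "trivia" in running text:

```python
import re

# Heading variants are an assumption; extend the alternation as needed.
TRIVIA_HEADING = re.compile(
    r"^(={2,4}\s*(?:Trivia|Miscellaneous|Miscellany|Miscellanea)\s*={2,4})\s*$",
    re.IGNORECASE | re.MULTILINE,
)

def tag_trivia_sections(wikitext):
    """Insert {{trivia}} directly below each trivia-style heading,
    unless the tag is already present somewhere in the text."""
    if "{{trivia" in wikitext.lower():
        return wikitext
    return TRIVIA_HEADING.sub(r"\1\n{{trivia|date=September 2012}}", wikitext)
```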
The National Portrait Gallery, London has a website with a very well-organised index of its collection, which is also searchable. The database allocates a unique numerical ID (of the form mp01234) to each sitter (i.e. subject of a portrait), which allows the creation of a link to a list of whatever portraits of a particular person are in the catalogue. (see for example this list for Philip Rea)
In 2006 I created the template {{ NPG name}} to facilitate adding external links from biographical articles to the site's collection, but so far it appears to be used in less than 1,000 articles ... whereas the NPG has 175,000 portraits, and their online catalogue appears to list over 100,000 sitters.
It seems to me that it should be possible to have a bot which at least identified possible matches between sitters in the NPG database and biographies on Wikipedia, and maybe added the links. -- BrownHairedGirl (talk) • ( contribs) 23:32, 31 August 2012 (UTC)
Done. The page is extremely large, if you're having problems viewing it in your browser, I can cut it down to a couple of pages. Some of the guess results are pretty good, whilst others are miles off. There were 57972 sitter names to check, 22613 had exact title matches, with the other 35359 being guessed. Noom talk stalk 16:50, 3 September 2012 (UTC)
Could someone add a task to an existing bot so that it would monitor broken redirects? Wikipedia:Database reports/Broken redirects, updated daily, is a list of all redirects that (as of the update) existed but pointed to deleted pages. I'm imagining the bot looking at that page each time it's updated and marking each broken redirect with a template that would be a modified version of {{ db-redirnone}}. I'd be willing to create said template, which should bear a warning to admins that the history should be checked and the page re-redirected to a suitable target instead of deletion, if possible. I'm thinking of something akin to the F8 tag that Multichill's bot was placing on also-on-Commons images some times ago, which bore a message that was essentially "This image is probably on Commons, but because it was placed by a bot, you must check the Commons image before deleting". Nyttend ( talk) 22:20, 1 September 2012 (UTC)
This is what the script currently does: Get a list of broken redirects from either a dump or Special:AllPages, then verifies that each page is in fact a redirect, and that the target does not exist. If both of those conditions are met, it would mark the redirect for deletion. The modifications I would make is to get a list from the DBR (instead of a dump), and use your template instead of {{ db-r1}}. Lego Kontribs TalkM 04:04, 3 September 2012 (UTC)
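The two conditions reduce to a small check per page. An offline sketch, with page existence modelled as a set of titles (a live bot would query the API, and per the request above the tag should tell admins to check the history and consider re-redirecting before deleting):

```python
import re

# Matches the start of a redirect page and captures the target title.
REDIRECT_RE = re.compile(r"^\s*#REDIRECT\s*\[\[([^\]|#]+)", re.IGNORECASE)

def tag_if_broken(wikitext, existing_titles, tag="{{db-redirnone}}"):
    """If the page is a redirect whose target does not exist,
    prepend the deletion tag; otherwise return the text unchanged."""
    m = REDIRECT_RE.match(wikitext)
    if m and m.group(1).strip() not in existing_titles:
        return tag + "\n" + wikitext
    return wikitext
```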
[perhaps a job for AWB rather than a true bot; if there's a better place to ask, please advise]
I'd like someone, please, to convert lists in Category:Lists of aviation accidents and incidents (such as List of accidents and incidents involving the Vickers Viscount) to use {{ Timeline-event}}, like in this similar edit. Some (like List of accidents and incidents involving the DC-3 in the 1980s) will need the years to be copied from subheadings. Date formatting (DMY vs. MDY) will need to be preserved. There's also scope to convert sections within articles, such as Bristol Britannia#Accidents and incidents. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:50, 4 September 2012 (UTC)
Hi. I wonder if a programmer would be willing to write a bot that would go through NRHP geographic-based list-articles, and where a photo is present in the list-article, check to see if it is present in the corresponding, linked NRHP place article, if it exists. Often there is no individual article yet, the link can just be a red-link. Or vice versa: go through the existing NRHP place articles in one state, and check to see if each corresponding list-article has a photo for it.
Some NRHP editors have been dissatisfied at times that new individual articles created by others did not immediately include available photos. Also there are often photos added to list-articles that don't immediately get added to already-existing individual place articles. This would address that complaint. I think it should be a one-way thing, just bringing pics from the list-articles to the individual NRHP place articles, not the other way around. During September there is a Wikipedia Loves Monuments (WLM) campaign going on which may bring a lot of new photos to the list-articles, by the way.
The system of NRHP geographic list-articles is indexed from List of RHPs. It links to state list-articles which link to county list-articles which in some cases link to city- and neighborhood list-articles. The list-articles are all identified in Category:List-Class National Register of Historic Places articles and should also have state or county or other geographic categories. The individual NRHP place articles all should have the NRHP infobox template, which could be tweaked to indicate something if useful, and also all fall within NRHP categories and county or other geographic categories.
The benefit would be to have a bot that could be run occasionally, and to allow quicker improvement of the individual NRHP articles. Any first reaction on whether this could be done? Have similar bots already been written? -- do ncr am 15:09, 31 August 2012 (UTC)
{{DISPLAYTITLE}}
If the title of a page is "List of Foo bar" and "Foo" has {{ italic title}} on it, can {{DISPLAYTITLE:List of ''Foo'' bar}} be added to it? This would be useful for pages like "List of Whatever TV Show characters". David1217 What I've done 02:06, 4 September 2012 (UTC)
Could we get a bot to remove redlinks that have been in place for, say, six months? The reason is obviously that links which have been there for that long have very little chance of being created, as per Wikipedia:Red link. And of course if the article is created, then the user can plug the link directly into the article (only after its creation). It doesn't have to be six months, necessarily. It can be shorter or longer depending on what the higher-ups say. Let me know if anybody has any questions. Thanks. Lighthead þ 07:35, 8 September 2012 (UTC)
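The unlinking step is easy to sketch; the age check (how long the redlink has existed) is the part that needs page-history queries and is omitted here. With page existence modelled as a set of titles:

```python
import re

# Matches [[Target]] or [[Target|label]].
LINK_RE = re.compile(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]")

def unlink_missing(wikitext, existing_titles):
    """Replace redlinks with their display text, leaving links to
    existing pages untouched."""
    def repl(m):
        target, label = m.group(1), m.group(2)
        if target in existing_titles:
            return m.group(0)
        return label if label else target
    return LINK_RE.sub(repl, wikitext)
```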
I'm pretty new at this stuff, but I wanted to know if it is possible for bots to archive urls. I think it can be very convenient for users. Sorry if this is a stupid proposal. Good day. ComputerJA ( talk) 22:52, 8 September 2012 (UTC)
Great! So if I add an article here it will get archived by a bot in the upcoming days? ComputerJA ( talk) 22:30, 9 September 2012 (UTC)
javascript:window.location.href='http://toolserver.org/~betacommand/cgi-bin/sandbox?page='%20+%20encodeURIComponent(mw.config.get('wgPageName'));
I think there is one available? Will need to walk subcategories. Need not weed out duplicates or sort entries; I could easily do that. Thanks in advance. Churn and change ( talk) 19:38, 11 September 2012 (UTC)
At the idea lab, I made a request for help in finding a way to track how long it's been since the baseball players infobox stats were updated. I was advised to come here. I would appreciate if someone could look over the brief request there, and determine if a bot would be helpful in any way. Automatic Strikeout 21:24, 12 September 2012 (UTC)
{{Infobox MLB player}}, then filtered out players which were "retired" players. I didn't sort by namespace so userspace drafts will show up, however I can easily change that. Obviously this doesn't cover all baseball players, but as a quick example it works. I can easily expand it based on other templates/categories. I'll cross post this to the idea lab. Lego Kontribs TalkM 23:23, 12 September 2012 (UTC)
Yobot can add the banner to all the pages that need it and don't have it. In the old days I've been doing only this, but at some point we ran out of pages. :) -- Magioladitis ( talk) 03:50, 6 July 2012 (UTC)
I generated a list of untagged bio articles and put it at http://toolserver.org/~cbm/untagged.txt . The process I used to create it is described at [1] so that someone else with toolserver access could recreate it later. I tried to save it to a wiki page but the system choked on it :(. There is a 0 or 1 beside each title to say whether the page is in Category:Living people. — Carl ( CBM · talk) 12:29, 7 July 2012 (UTC)
Thanks to Carl for generating the list. How many articles are on that list? It is not particularly urgent, but is there any reason Yobot has to do it rather than another bot? It seems a bit pointless unblocking a bot to do this if the owner is busy doing other stuff. I'd be happy to wait until after Wikimania, but I think Izno is right that this should be a database report that would allow more people to help do this, rather than leaving it all to one person or one bot. Carcharoth ( talk) 01:24, 9 July 2012 (UTC)
Grrrr, thanks a lot guys. I've had Category:Biography articles without listas parameter and Category:Biography articles without living parameter cleaned out for a while. Now you go and make work for me again. Think of the poor schmuck (aka me) that gets to add parameters to the tag.
I do have one request. Could LaraBot go through the list first? LaraBot will add living and stub class to the Biography banner for living biographies. That will help save me some time. I'm currently the only one that tags new living articles, adds living and adds listas, plus I add class and work-groups. Anyone want to volunteer to help out? I use AWB with some scripts by Magioladitis and Kumioko that make things go faster. Bgwhite ( talk) 04:39, 9 July 2012 (UTC)
The bot is running as fast as it gets but I wonder if we should use group runs in the future to reduce time of huge tasks to be finished. -- Magioladitis ( talk) 04:06, 17 July 2012 (UTC)
The New Zealand Electronic Text Centre recently moved domain from http://www.nzetc.org/ to http://nzetc.victoria.ac.nz/ The old links redirect to the new links, no content was removed, and all the links should redirect to the same content in the new domain. There are 2455 links in en Wikipedia (mainly in references) to this site. Some of the links include fragment identifiers pointing to specific places in long pages of text. Would it be possible to change these in an automated or semi-automated fashion? Ideally I'd like to also (a) be able to check that links actually work, putting a list of broken links in my user space or user talk space for me to fix, and/or (b) check for links on foreign wikipedias. See previous discussion of move. I have a conflict of interest with this website. Stuartyeates ( talk) 00:19, 13 July 2012 (UTC)
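Since only the host changed and paths and fragment identifiers carry over, the per-page rewrite is a plain string replacement. A minimal sketch:

```python
OLD = "http://www.nzetc.org/"
NEW = "http://nzetc.victoria.ac.nz/"

def migrate_nzetc(wikitext):
    """Point NZETC links at the new domain; paths and fragment
    identifiers are kept unchanged, since only the host moved."""
    return wikitext.replace(OLD, NEW)
```

The broken-link report in part (a) would need a separate pass that fetches each rewritten URL and records non-200 responses.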
I've made a request at Commons:Commons:Bots/Work requests, section "Category merger", for a bot to merge categories for nearly two thousand images, but the only help I've gotten so far is a pointer to a bot that hasn't been active in six months. Could anyone here write a bot to help there? Nyttend ( talk) 02:45, 17 July 2012 (UTC)
Chzz has been retired since January/February and his bot stopped in May, so would somebody take over this task? mabdul 15:45, 11 July 2012 (UTC)
Wikilivres is a site operated by Wikimedia Canada to host texts and images that are free in Canada but can't be hosted on WMF projects (i.e. Commons requires works to be free both in the US and the source country; due to differing copyright durations and the URAA restoration, many works are now free in their source country but remain copyrighted in the US). It appears that they recently moved from wikilivres.info to wikilivres.ca. The old URLs are not forwarding, and this has left us with a fair number of broken links that could be easily fixed, either by replacing "wikilivres.info" with "wikilivres.ca" or by using the interwikilink prefix wikilivres:. Thanks, cmadler ( talk) 15:51, 18 July 2012 (UTC)
Is someone with a bot able to turn the data at User:Ryan Vesey/Minnesota State Legislators into a table with 4 or 5 cells? Right now the information exists as "Legislator and link", "Term/district/body of legislature", "link to website", "citation". It would be great if that could be turned into a table and a notes cell could be added. Ryan Vesey Review me! 15:57, 20 July 2012 (UTC)
{| class="wikitable"
! Legislator !! District !! Link !! Citation !! Notes
|-
| Bruce W. Anderson || House 1977-82 (District 26A); House 1983-84 (District 28A) || Anderson, Bruce W. "Buzz" || {{cite web|title=Anderson, Bruce W. "Buzz"|url=http://www.leg.state.mn.us/legdb/fulldetail.aspx?ID=10014|work=Legislators Past and Present|publisher=Minnesota Legislative Reference Library|accessdate=19 June 2012}} ||
|}
User:RM bot is currently down, and causing problems for WP:RM. So, a fix, or replacement would be useful. See the discussions at Wikipedia:Village_pump_(technical)#RM_bot_inactive and WT:RM#What's the deal? -- 76.65.131.160 ( talk) 08:58, 23 July 2012 (UTC)
Okay, the Latin music task force was converted to its own WikiProject. When it was still a task force, it used the Template:WikiProject Latin America banner, but now it has its own banner. What I need is for a bot to replace the previous banner with the new banner. If it's possible, carry the assessment from the old banner to the new one, that is, the quality rating of the Latin American articles and the music-importance parameter. Here are the categories that need to be replaced: Category:South American music Category:Central American music Category:Mexican music Category:Cuban music Category:Dominican Republic music Category:Puerto Rican music
In addition to the above, I need a bot to tag these articles as the scope of the project has expanded: Category:Spanish music Category:Portuguese music Category:Cape Verdean music
Thanks! Erick ( talk) 02:52, 24 July 2012 (UTC)
As far as I can tell the majority of the tasks previously done by Rich F's bots have not been picked up by anyone. I'm not going to write a long comment about it because I believe few if any besides me care so there's no reason to waste my time. I just want to say that there are still a lot of tasks that Rich's bots did that still need adopting if anyone feels so inclined. One in particular is the WikiProject Watchlists that Femtobot was creating. Kumioko ( talk) 18:13, 7 July 2012 (UTC)
Kumioko, I just ran Catscan, according to the above instructions. There was an output, all right. It's not in any kind of format that would be usable to the projects. For the targeted audience - the casual lurker or reader at a given project - this is all gobbledygook that doesn't make sense. I don't even know the why or the wherefore of its output. It's just a long list of anything and everything in the project, and no rhyme or reason why it's there. There is nothing clickable on the output, so that right there torpedoes its usefulness. And if you tell your browser to go back to the Catscan page, there's a list (yes, clickable) that just seems to be a random listing of any old article in the project. Big whoop. But to give you an example: in the last week, I've created three different articles for the Hawaii project. None of them showed up on the output. So, if I narrow the search to one of those three articles I actually created, the output is this:
== Subset == {| border='1' ! |- |} ---- ;querytime_sec:1.480544090271 ;total_categories_searched:0 ;query_url:http://toolserver.org/~magnus/catscan_rewrite.php?ns%5B0%5D=1&ns%5B4%5D=1&templates_yes=mahi+beamer&format=wiki
If clicked, that just takes you back to Catscan and nothing useful. Whatever else this tool is, it's no comparison to what Femtobot did for the projects. At least Rich's bots were automated. Even if Catscan were more average-user friendly, it would still take a dedicated individual to run it manually. Do you know any projects that have someone with nothing else to do with their time, who plans on being around and available to the projects indefinitely? Somewhere, I swear I hear Rich Farmbrough singing, "One of these days...you're gonna miss me, honey..." Maile66 ( talk) 13:35, 19 July 2012 (UTC)
<span class="mw-title"><a href="[^<">]*?/wiki/([^<">]*)" with the "HTML Scraper (Advanced Regex)" with Group 1. — Dispenser 13:36, 25 July 2012 (UTC)
Hi, can you please provide the updated number of articles in Wikipedia:WikiProject Chennai and their respective ratings? Thank you so much. Challengethelimits ( talk) 11:06, 24 July 2012 (UTC)
Thank you so much Challengethelimits ( talk) 04:10, 25 July 2012 (UTC)
I think that when drive-by IPs add articles that form the basis for a stub, we put a lot of them off editing because they don't know how to add references or how to see the finished result. So what about a bot (similar to the "this article cites no sources" message) which adds the reference section to the bottom of articles that don't already have one, to make it easier? It's a pain for me to do it without copying and pasting from another article, so a bot would be a big help. Thanks ツ Jenova 20 ( email) 10:45, 25 July 2012 (UTC)
I want to make a bot able to detect vandalism. ( Charlie22712 ( talk) 11:22, 25 July 2012 (UTC))
For those which don't already have it, could you add a section to redirects to List of Latin-script digraphs? That is, Dr (digraph) should redirect to List of Latin-script digraphs#D (just use the first letter of the title of the redirect page, ignoring acute accents on vowels: anything funky has been done manually). There are about 250 of them. Exceptions are variants of the article title: anything which has "digraphs" in the plural.
Also, could rd's to Trigraph (orthography) be redirected to List of Latin-script trigraphs, also with a letter-section link? (Don't worry about matching; there are anchors for each letter.)
Thanks, — kwami ( talk) 20:58, 17 July 2012 (UTC)
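The first-letter rule described above is simple enough to sketch. This is a hypothetical helper (the function names are mine, not from any existing bot), assuming the section anchor is always the bare uppercase first letter:

```python
import unicodedata

def digraph_section(title):
    """Section anchor for a digraph redirect: the first letter of the
    title, uppercased, with any acute accent stripped (the request says
    to ignore accents on vowels; anything funkier is handled manually)."""
    first = unicodedata.normalize("NFD", title[0])[0]  # drop combining marks
    return first.upper()

def digraph_redirect_text(title):
    # Wikitext for the redirect page itself.
    return "#REDIRECT [[List of Latin-script digraphs#%s]]" % digraph_section(title)
```

A real run would also need to skip the "digraphs"-plural exceptions and any page that already redirects correctly.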
Hi, I've seen this: Wikipedia:PEACOCK#Puffery,
and I think that a bot could be implemented to automatically remove some of these words. For example, the word "virtuoso" could be easily removed. This word appears almost exclusively on the first sentence, and the formula is always similar:
Ferdinand David (19 June 1810 – 18 July 1873[1]) was a German virtuoso violinist and composer.
or:
Shlomo Mintz (born October 30, 1957) is an Israeli violin virtuoso and conductor.
In the first example, the word "virtuoso" could simply be removed, whereas in the second, the word should be removed and "violin" turned into "violinist".
What do you think?
Thank you-- Fauban 16:25, 21 July 2012 (UTC)
Well, I've chosen the word "virtuoso" because of the formulaic nature of most sentences where it appears. It's always the first sentence, and it involves the wordings "virtuoso + [instrumentalist]" or "[instrument] + virtuoso".
The bot's actions could be restricted to the first sentence, and only when it's one of these two kinds:
[person's name] ([dates]) [was/is] [a] [nationality... (may not appear)] VIRTUOSO [pianist/violinist...] [other stuff (may not appear)] [.]
[person's name] ([dates]) [was/is] [a] [nationality... (may not appear)] [piano/violin...] VIRTUOSO [other stuff(may not appear)] [.]
By restricting the bot's actions to this, we would avoid 100% of mistakes.-- Fauban 17:34, 21 July 2012 (UTC)
Sorry, but if you check some good articles about undisputed virtuosos (e.g. Franz_Liszt), you will see that the thing is different; and that if only the suggestions I wrote above are followed, these sourced claims would be left untouched.-- Fauban 09:48, 24 July 2012 (UTC)
Hello, everyone.
Category:Good articles without an oldid collects articles which have been tagged as good articles, but do not have oldids in the talk page box linking to the reviewed version of the article. I have been regularly removing articles by going through this list and adding the oldids, but considering that the process is relatively straightforward, I wanted to inquire whether it was possible to automate the process. I'm not sure if this is something that GA bot might do or not.
Here is the process I use:
There are several ways to find the oldid of a GA version and add it to the talk page, but this is the process I use. I figured that this might be possibly automated.
Just a suggestion! Keep up the great work, bot owners! Kind regards, Matt ( talk) 22:01, 29 July 2012 (UTC)
I'd like to get this test-page working: here, which is a list of countries participating in the Olympics this year. I'll then copy and paste it to: here, where one of our users has started doing it manually. You would need to use both country names (in Welsh) and the IOC column which you will find on this page, and redirect them to the existing flags (same Welsh names). Any problems or can we do it? Many thanks. Llywelyn2000 ( talk) 15:03, 29 July 2012 (UTC)
Hello-- I clear a lot of red links, and we could eliminate a substantial number with bot-created redirects for certain articles starting with "The," specifically rivers and battles for starters. For instance:
I think this would be a great bot job. I did some for the major rivers by hand the other day (a dozen redirects fixed about a dozen red links), but it was way too tedious. This is a pretty standard procedure in indexing, and it would make the site more navigable. River seems like a pretty easy one: any article ending in text string "River." There may be false positives, but they are just redirects and would not be visible to most readers. Battles seem a little more complicated, but I suppose the string "battle of" would get most of them. If it goes well, we could also do ships or other things down the road. Let me know your thoughts. Thanks! Jokestress ( talk) 01:10, 30 July 2012 (UTC)
Not done - late.
Can I ask for a filtered template usage (transclusion) overview here? I'd like to check a template's param input (600 transclusions, so AWB would be cumbersome for me and still not secure)
Topic: {{
IPAlink}}
has to-be-deprecated params (awkward names & background). I'd like to have a list of pages that use these params off-regular.
* 2= > or [ (param 2 has closing bracket)
* bracket= > or [ (named param bracket has closing bracket)
* errortext= (named param errortext is used)
* name= (named param name is used)
- DePiep ( talk) 23:31, 20 July 2012 (UTC)
{{#if:{{{2|}}}{{{bracket|}}}{{{errortext|}}}{{{name|}}}|[[Category:IPAlink articles with invalid parameters|{{NAMESPACE}} {{PAGENAME}}]]}} If you want to do that let me know and I can do it in the sandbox and we can test to see if it will work. Kumioko ( talk) 01:02, 26 July 2012 (UTC)
{{\s*IPAlink(.*?)(2|bracket|errortext|name)\s*([\|}{<\n])
Kumioko ( talk) 19:35, 26 July 2012 (UTC)
param1=x, or catch like (param3)x?) AWB cannot catch these 100% (does the regex really catch the input value, or just the input?). That is why I turned to this bot's page. I was only just asking: can some bot do that more easily and systematically (catch & select the values of a param, 100%) and throw me a list? No. Also, the responses being this late - for such a simple question - is disappointing. - DePiep ( talk) 01:10, 28 July 2012 (UTC)
If you look at Category:Olympics stubs, you'll see it contains about 600 articles. About 1/3 of those have "2012" somewhere in their title, (i.e., Canoeing at the 2012 Summer Olympics – Men's slalom C-1, Chad at the 2012 Summer Olympics).
Those articles all contain either {{ Olympics-stub}} or {{ Olympic-stub}} where they should have {{ 2012-Olympic-stub}}.
Is there a bot that can easily update all these articles? I've been doing them by hand slowly, but a bot seems to make a lot more sense.
Alternatively, given that this would likely be a one-time process, is there a simple way someone fairly geeky can babysit a process that does it semi-automatically? If it doesn't involve using Windows, I'm up for it.
Thanks, Dori ☾ Talk ☯ Contribs☽ 23:45, 31 July 2012 (UTC)
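The text change itself is a one-line regex. A minimal sketch (in a real run this would be driven over the category with pywikibot or AWB; gating on "2012" in the title is my reading of the request, not a tested rule):

```python
import re

# Matches {{Olympic-stub}} or {{Olympics-stub}}, tolerating whitespace.
STUB_RE = re.compile(r"\{\{\s*Olympics?-stub\s*\}\}")

def retarget_stub(text, title):
    """Swap in {{2012-Olympic-stub}} on pages whose title mentions 2012;
    leave all other pages untouched."""
    if "2012" not in title:
        return text
    return STUB_RE.sub("{{2012-Olympic-stub}}", text)
```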
Please refer to this TfD section where Template:Gotras of Jats is discussed.
The template is intended to be for the navigation of the entire set of articles contained in the article List of Jat Clans, and applied to all named articles in that list. However, the template both does not contain all the articles in that list and contains articles that are not in that list. For the template to be useful, and the discussion is heading towards the weak consensus that it is useful, it must be 100% in agreement with the list article, and the articles in the list and the template must have the template applied to them.
So the Bot should:
A bot is needed because the number of Gotras (clans) of Jats is likely to be 2,700 in the end, and manual maintenance is next to impossible. A full analysis may design a better procedure than my outline above. Fiddle Faddle ( talk) 14:32, 27 July 2012 (UTC)
Would it be possible for a bot to remove all links to http://soccerdatabase.eu - it appears to be an illegal website which is simply copying content from the now-defunct http://www.playerhistory.com - the owner of the playerhistory website is Polarman ( talk · contribs) and he has said he is launching legal action against soccerdatabase. Regards, Giant Snowman 13:52, 2 August 2012 (UTC)
It was decided in a recent RM to use "X baronets" instead of "X Baronets" in all baronetcy article titles. This would mean renaming all the non-redirect pages in that category to use the lowercase version. I hope this can be done by bot as it involves moving a very large number of pages. Jafeluv ( talk) 09:11, 25 July 2012 (UTC)
I've requested this before, but perhaps I made it too elaborate.
We have a search engine that allows the reader to look up a language with its ISO639-3 code. However, it only works as well as the redirects are maintained, and they're badly out of date: either directing to the wrong page, or in many cases missing altogether.
Create or update rd pages for all piped blue links at Wikipedia:WikiProject Languages/Articles by code:
For each link of the form
[[ABC|xyz]]
a rd should be located at
ISO 639:xyz
and the syntax should be
#REDIRECT [[ABC]]{{R from ISO 639|XYZ}}
For instance, the first link is
[[Ghotuo language|aaa]]
so the rd at
ISO 639:aaa
should read
#REDIRECT [[Ghotuo language]]{{R from ISO 639|AAA}}
(as indeed it does, so this one would be skipped as not requiring any change)
Some of these links will themselves be to redirects, but that can be fixed by one of the automated link patrollers. (Unless you want this bot to follow the rd chain? That might be a better use of server load, as it would avoid changing rd's only to have them changed back for articles which are not at their ISO name.)
It would be nice if the bot would verify that the article at the end of the rd chain contains an infobox listing the ISO code in question, but that starts getting more involved. If it wouldn't be too much trouble, maybe flag the rd with an error category for manual review if it doesn't?
— kwami ( talk) 10:50, 3 August 2012 (UTC)
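The per-link transform spelled out above could be sketched like this (a hypothetical helper; following redirect chains and verifying the infobox code would need the MediaWiki API on top of it):

```python
import re

# Piped links of the form [[Article name|abc]] with a three-letter code.
LINK_RE = re.compile(r"\[\[([^|\]]+)\|([a-z]{3})\]\]")

def iso_redirect(target, code):
    """Redirect page title and wikitext for one ISO 639-3 code,
    per the format given in the request."""
    page = "ISO 639:" + code
    text = "#REDIRECT [[%s]]{{R from ISO 639|%s}}" % (target, code.upper())
    return page, text
```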
Many of the articles in Category:United States Supreme Court cases and its subcategories (all of which I've checked; there are no irrelevant subcategories, such as Category:American military personnel killed in the Gulf War being a subcategory of Category:Morocco) cover individual Supreme Court cases; almost all of these articles are entitled "PARTY1 v. PARTY2". One sometimes sees "PARTY1 v PARTY2" in ordinary writing, and it's easy to omit the punctuation after the "v". Could a bot go around and create redirects for all of these articles, using Virginia v West Virginia as a model? Nyttend ( talk) 21:25, 27 July 2012 (UTC)
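The title transform here is trivial; a sketch (hypothetical helper, replacing only the first " v. " so unusual multi-party titles stay untouched):

```python
def unpunctuated_case_title(title):
    """'PARTY1 v. PARTY2' -> 'PARTY1 v PARTY2', or None if no match."""
    if " v. " not in title:
        return None
    return title.replace(" v. ", " v ", 1)
```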
As the AFC reviewer bears some of the responsibility, can somebody write a bot which monitors the articles listed at AfD (or XfD) and informs the AFC reviewer/accepter? The reviewer is listed in the AFC project template on the talk page in the parameter |reviewer=
. So it's rather a simple task. ;-)
mabdul 11:58, 4 August 2012 (UTC)
RM bot has maintained the WP:Requested moves page for quite a while now, but its owner has stopped editing, and the bot has shut down. We desperately need someone to adopt it and start it up again, as the Requested moves page is currently being manually updated. The source code is at User:RM bot/requestedmoves.php for anyone who is capable of adopting and running it. Thank you.-- Aervanath ( talk) 14:49, 4 August 2012 (UTC)
Could you help me make a tag cloud for this site, adrenalin-css.3dn.ru? Write the code in your reply, that is, the HTML code of the blog that would result. — Preceding unsigned comment added by 178.122.3.140 ( talk) 20:18, 5 August 2012 (UTC)
See the "Question about protection" section of WP:VP/T (which will probably end up at Wikipedia:Village pump (technical)/Archive 101 rather soon) — an admin recently put temporary full protection on an article that had been indefinitely semiprotected, but due to the nature of protection, the page will be completely unprotected at the expiration of full protection. Since (at least right now) software doesn't permit any alternatives, we'll have to go back to the page and add semiprotection after the full protection expires (or cut short the time for full protection), but this requires that someone realise that it's time to semiprotect. What if we had a bot to post a notice at WP:AN to remind admins to protect pages like this? The bot could have a page in its userspace where someone could list pages that would need protection and times; the bot could be programmed always to notify WP:AN a certain number of minutes before the protection was needed. Because I'm asking only for a notification bot, not a protection bot, we wouldn't need the bot to be an admin, and the list page wouldn't need to be protected.
Five days ago, I posted a request at the proposals VP asking for input on this idea; the only response was "I take it as obvious that the best approach would be a change to the software, but this is a temporary measure, until such changes are made. Sounds like a good idea to me." by SPhilbrick. Nyttend ( talk) 00:37, 10 August 2012 (UTC)
I often post at
WP:MFD.
Wikipedia:User pages lets users know that they may not have excessive unrelated content in their user pages. I'm looking for a bot to populate a series of
administration categories entitled
Category:User pages with excessive unrelated content having subcategories
Category:User pages with excessive unrelated content 2010, Category:User pages with excessive unrelated content 2009, 2008, 2007, ..., 2001 (based on the date the page was created), each of which could be a subcategory of the MfD project, such as
Category:Miscellaneous pages for deletion. The bot would look for only those user pages which meet each of the following four criteria:
1. of users who have only posted edits to any user space page (user and user talk),
2. user pages that are at least two years old (user pages that were created in the year 2010 or earlier),
3. of users that have at least 5 edits collectively in all user space pages, and
4. where the user page now has at least 10,000 bytes.
I got the idea from the user page
User:Skverma1949's listing at MfD.
[2] The Skverma1949 page originated in August 2009, the user has only posted edits to user space (three different user space pages of users having similar user names), has more than 5 edits to all user space pages, and the page now has 17,000+ bytes. There are many user pages where the user has one or two edits total, usually to their user page. I'm looking to screen those out, at this time, via the requirement of at least 5 edits collectively in all user space pages, to focus further human screening on each page listed in the categories for
Wikipedia:User pages and
WP:NOTWEBHOST issues. --
Uzma Gamal (
talk) 15:07, 10 August 2012 (UTC)
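The four criteria combine into a single predicate; a minimal sketch (the function name and the 2012 reference year are my assumptions, matching the date of the request):

```python
def flags_user_page(created_year, only_userspace_edits,
                    userspace_edit_count, page_bytes, now_year=2012):
    """True if a user page meets all four screening criteria above."""
    return (only_userspace_edits                 # 1. edits only in user space
            and now_year - created_year >= 2     # 2. created 2010 or earlier
            and userspace_edit_count >= 5        # 3. at least 5 userspace edits
            and page_bytes >= 10000)             # 4. at least 10,000 bytes
```

For instance, the Skverma1949 example (created 2009, userspace-only, 5+ edits, 17,000+ bytes) passes all four checks.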
[[Category:...]]
tag, and then the bot might get reverted (although most of these users are probably long gone) and you'd never know. It's more likely you just want a list of these users/pages, which someone with Toolserver access could probably create for you easily enough.
Anomie
⚔ 20:07, 10 August 2012 (UTC)
It would be useful to us at
WP:GA to have an easy way to keep track of who is reviewing Good articles and who is submitting them. It will potentially help in identifying trends, identifying new or inexperienced reviewers so we can guide them, making it easier to fix problems from poor or bad-faith reviewers, and identifying potential new reviewers. To submit a nomination, someone puts this {{
subst:GAN}}
template on the page. This will list the article at
WP:GAN. To review the article the reviewer will create the review page (it will be in the form Talk:articlename/GA review number, e.g.
Talk:Prussian Homage (painting)/GA1). The more information we could get the better, but even the raw numbers (nominations/reviews) would be useful. Other details that may prove useful would be a link to the articles and/or reviews, how many have failed, passed, or been re-assessed, and the dates the nominations/reviews were conducted.
If a new page could be updated with this information that would be great. The most useful presentation though would be to have it next to new nominations at WP:GA. Something like:
where N=Nominations and R=reviews. That list is currently populated by GA bot ( talk · contribs) so maybe it could pull information from the new page and add it when posting? Any help much appreciated. N.B. Chris (the owner of GA bot) has been notified of this thread. AIRcorn (talk) 09:39, 8 August 2012 (UTC)
Looks like all varieties of palm.com urls have been discontinued. We have >> 100 links at Special:Linksearch/*.palm.com. It would be great if someone could run a detox run through to {{ deadlink}} them all. Thanks. — billinghurst sDrewth 16:22, 11 August 2012 (UTC)
I don't know whether DASHBot still performs this task or not, but according to User:DASHBot/Logs the bot didn't remove any NFCC#9 violations since February 2012. If DASHBot is no longer performing this task, then I think a replacement is needed. -- Toshio Yamaguchi ( tlk− ctb) 08:47, 12 August 2012 (UTC)
Could a friendly bot go through my userspace and add {{ Noindex}} to every subpage wrapped in include only tags? (Some pages are transcluded or substituted). I keep trying to add them, but continually find ones I've missed. Thanks. Ryan Vesey 07:05, 12 August 2012 (UTC)
As per Wikipedia talk:WikiProject Automobiles#Image formatting in articles, we have decided to implement some changes to:
{{ Infobox automobile}} (4332 transclusions)
{{ Infobox automobile platform}} (68 transclusions)
{{ Infobox automobile engine}} (305 transclusions)
However, the scope of these changes is beyond what could reasonably be done manually, so I, on behalf of WikiProject Automobiles, would like to request the necessary assistance.
Before a bot is required, Stepho-wrs will update the templates to include a new "image_file" parameter.
Stage 1:
{{
Infobox automobile}}
only:
{{
Infobox automobile}}
only:
Not done Stage 2:
Regards, OSX ( talk • contributions) 09:29, 14 August 2012 (UTC)
{{
Infobox electric vehicle}}
but that presumably should also be included in the above changes to ensure consistency?
Warren (
talk) 14:39, 14 August 2012 (UTC)
Stage 2 is not needed. I added support to barefilenames to the existing parameter. I also added a tracking category and after I fix all pages in there I'll update the code once more. -- Magioladitis ( talk) 19:38, 14 August 2012 (UTC)
I updated my script to convert to bare filenames and move caption to the correct position. But the script still needs some adjustments to run automatically. Otherwise, I'll have to run it manually for 4,000 pages. -- Magioladitis ( talk) 09:04, 15 August 2012 (UTC)
Pages that still use the old format can be found in Category:Infobox automobile image param needs updating. -- Magioladitis ( talk) 11:53, 15 August 2012 (UTC)
WP:VG could use a bot to automatically archive our deletion page. This page features transclusions from the main deletion form, and keeping up with archiving completed discussions can be a bit much. Even if the bot could only move the discussions to the closed section at the bottom of the page that would be appreciated. Currently the "goal" is to archive discussions moved to the closed sections monthly into Wikipedia:WikiProject Video games/Deletion/2012, but if this can't be automated the latter would be appreciated. Thanks. -- Teancum ( talk) 14:11, 16 August 2012 (UTC)
There are a lot of Olympic athletes whose only link is to the main Olympics page. Can a bot be created that would crawl Category:Competitors at the 2012 Summer Olympics, find pages whose only source is http://www.london2012.com/, and then find the link to the athlete? It might be something that a bot can't do without human assistance. Ryan Vesey 21:24, 15 August 2012 (UTC)
Yes, what we need is something which can search the athlete name, retrieve the URL, replace the main page link, and also read the birth date. It's definitely bot-programmable, but it's a matter of finding somebody to do it.♦ Dr. Blofeld 21:37, 15 August 2012 (UTC)
Hello. I'd like a bot to help with the following task. There's a newly created series of by-year-categories Category:Transport infrastructure by year of completion. In part, these have to be populated manually but here's the part I'd like to have performed by a bot. The category Category:Transport infrastructure completed in XXXX should at a minimum contain the following six subcategories:
So it's a fairly straightforward task: add [[Category:Transport infrastructure completed in XXXX]] to the six corresponding categories. As an extra task: not all transport infrastructure by year categories already exist so the bot would have to create them if need be. They should be created using the following 1992 example.
{{10years|20th|Transport infrastructure completed in|1992|Transport infrastructure by year of completion}}
[[Category:Transport infrastructure by year of completion|1992]]
[[Category:Infrastructure completed in 1992]]
[[Category:1992 in transport|Infrastructure]]
It might make sense to start with the years of the 20th and 21st centuries, as Vegaswikian is still trying to figure out the best way to populate these categories fully. Thanks in advance for any help. Pichpich ( talk) 21:39, 16 August 2012 (UTC)
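Generating the category text per year is mechanical; a sketch following the 1992 example above (the century helper is my own addition):

```python
def century_of(year):
    """Ordinal century name: 1992 -> '20th', 2005 -> '21st'."""
    n = (year - 1) // 100 + 1
    if n % 100 in (11, 12, 13):
        suffix = "th"
    else:
        suffix = {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
    return "%d%s" % (n, suffix)

def category_wikitext(year):
    """Page text for Category:Transport infrastructure completed in <year>."""
    return "\n".join([
        "{{10years|%s|Transport infrastructure completed in|%d|"
        "Transport infrastructure by year of completion}}" % (century_of(year), year),
        "[[Category:Transport infrastructure by year of completion|%d]]" % year,
        "[[Category:Infrastructure completed in %d]]" % year,
        "[[Category:%d in transport|Infrastructure]]" % year,
    ])
```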
I think I stumbled across a big problem that involves all referencing, and probably copyvio, by a blocked user named Billy Hathorn in the entire state of Louisiana. Louisiana is not my usual territory. In recently creating an article there, I found everything created by Billy Hathorn used bare URLs that are now primarily dead links. One of the reasons he got blocked is cut-and-paste copyvio, so that's an additional possible problem there. He made over 100,000 edits, many of them in Louisiana. It's a big mess with the Louisiana articles. Is it possible - and is anyone willing - to create one or more bots to search all articles affiliated with Billy Hathorn in Louisiana? If you need examples, please read This Thread. Kumioko has created a table of Hathorn's edits to help. Maile66 ( talk) 01:10, 17 August 2012 (UTC)
It seems that http://www.findarticles.com now redirects to search.com and archive.org is blocked by robots.txt. Should a bot add {{ dead link}} to all the urls pointing to findarticles? Smallman12q ( talk) 21:07, 18 August 2012 (UTC)
Old ITF links don't work anymore. Here are the old and new links:
-- Stryn ( talk) 15:10, 15 August 2012 (UTC)
It seems that category is currently not used and empty. Bulwersator ( talk) 11:37, 19 August 2012 (UTC)
I recently tried getting good information on a particular concept that doesn't have "straightforward" translations: While the concept itself may be said to exist in several cultures and languages, it is not conceptualised in the same way; Germans conceptualise it with a link to "images" (bildung) while Norwegians conceptualise it with a link to "forming" (dannelse), and the English language has no particular translation (thus the German word is usually used to signify the concept in English). Now, then, if I go to the Norwegian wiki for it, and see that the information is both short and partly erroneous (or at the very least I haven't seen anything that documents the truth of what appears to be a common myth), and I want therefore to check out the other languages, I get redirected to manners in English, and Umgangsformen in German. If I go to the German page for bildung, and want to see whether there is some good information on it in English, I get sent to the page for education rather than the English entry for bildung (which does exist); if someone who speaks both English and a bit of German wants to know what the German wiki says on education, they will be sent to bildung (instead of ausbildung, which is more correct); and if they want to get the Danish sense of bildung from the German article, they'll get sent to the Danish page for Uddannelse (rather than dannelse); etc., etc.
In short, someone, at some time, has made some sort of link somewhere that seems to have taken on a life of its own -- probably through bots -- and that it is now very difficult to untangle, simply because there are so many wrongly linked pages (not to mention the fact that it is highly unlikely that any single individual knows enough languages to know whether the link from the English page on education really links to both the German and the Malay page on education (rather than bildung or something completely different altogether)). However, the solution may be simple: If someone could make a bot that could be prompted to visit a certain page and erase all interwiki links from all the linked wikis (in a short enough time that no "competing" interwiki-linking bot could re-establish them), the linking process could be started anew by users (who are now effectively left helpless against the bots), and hopefully getting it right this time around (possibly also with the added possibility of adding nobot-tags to all the pages to avoid autolinking if the concepts eventually turn out to be too confused). It does, of course, carry a real possibility for misuse, and I don't know enough about how bots work, but perhaps activation could be left up to some moderator?
Der Zeitgeist ( talk) 09:29, 21 August 2012 (UTC)
Hey everyone! So, I have a question that I realized a bot would be best to fix. A few months ago, I tried to tackle the "Requested moves" category backlog, but to no avail. It is simply too large and populates itself too quickly to be adequately tackled by a user. One of the issues that I noticed is that a lot of the time, someone would slap a tag onto the article but not initiate a discussion. As a result, probably half of the tags do not have any reasoning behind them, not only showing a false positive but also making the categories unnecessarily huge. Is there any way that a bot could be coded to rectify this problem? If this is AutoWikiBrowser-doable, I would be willing to do it, but I fear that this is something only a bot can accomplish. Thanks a lot, and I look forward to your suggestions! Kevin Rutherford ( talk) 20:42, 22 August 2012 (UTC)
Hello. This spring Armenian Wikipedia changed its name from "Վիքիփեդիա" to "Վիքիպեդիա". So did its "Wikipedia" namespace. Articles were moved from old namespace to new one, without leaving redirects. That broke interwiki links and as it seems, confused IW bots. Some interwikis were removed from articles, some remain but take to "nonexisting" pages. What I'd like to ask is to look for articles in en:wp containing [[hy:Վիքիփեդիա: and change it to [[hy:Վիքիպեդիա:. This will restore interwikis between English and Armenian wikipedias, and I hope bots will update it in other Wikipedias, after that. Thanks. -- Aleksey Chalabyan a.k.a. Xelgen ( talk) 01:44, 24 August 2012 (UTC)
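Since no article titles change, the fix is a plain prefix replacement; a minimal sketch:

```python
# Old prefix used "Վիքիփեդիա" (with փ); the renamed namespace is "Վիքիպեդիա" (with պ).
OLD_PREFIX = "[[hy:Վիքիփեդիա:"
NEW_PREFIX = "[[hy:Վիքիպեդիա:"

def fix_hy_interwiki(text):
    """Retarget interwiki links to the renamed Armenian project namespace."""
    return text.replace(OLD_PREFIX, NEW_PREFIX)
```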
Is there a bot that removes {{ Orphan}} from pages with more than one incoming link? I've got a couple of articles that are currently orphaned, but will be de-orphaned as I create more articles; however, I might not know when they are de-orphaned. Does a bot currently remove the tag and if not, is there any reason one can't? Ryan Vesey 01:45, 24 August 2012 (UTC)
Please remove {{ Newinfobox}}/{{ New infobox}} from these talk pages, because I have replaced infoboxes with new ones. Thanks. -- Makecat Talk 07:47, 24 August 2012 (UTC)
Please add the navbox {{ John Zorn}} to all existing articles that are linked from the template. -- Marek Koudelka ( talk) 16:30, 26 August 2012 (UTC)
Let's say there's a wikilink to a specific section of an article. Then, let's say that section is renamed. Is there a bot in place for this particular scenario? Said bot should search all of Wikipedia, upon any section being renamed, for any links to that section, and then rename the link to correspond to the new section name.
For example, if all instances of "====Quantum world====" were changed to "====Quantum realm====", the bot would search for any links reading "[[Determinism#Quantum_world]]" and change them to read "[[Determinism#Quantum_realm]]". The bot would scan for edits to existing lines beginning and ending in equals signs. If the entire line were removed, and the bot found a link to the removed sectioning line, it could either rename any links to point to the above section that contained the removed section, or simply rename any links to the main article at large (i.e. "[[Determinism#Quantum_world]]" becomes "[[Determinism]]" if the sectioning line of code were removed).
If such a bot already exists, please post its link here, so that I might scrutinize it further. If, on the other hand, such a bot does not exist (and is not simply inactive at this time), I would be interested in attempting to make it myself. Thank you. JimsMaher ( talk) 17:57, 27 August 2012 (UTC)
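If such a bot were written, the core rewrite could look like this (a hypothetical helper; detecting the heading change from the diff is the hard part this sketch skips):

```python
def retarget_section_links(text, article, old_section, new_section=None):
    """Point [[article#old_section]] links at the new heading, or at the
    article itself when new_section is None (heading was deleted).
    Piped links are handled too, since only the link prefix is replaced."""
    old = "[[%s#%s" % (article, old_section)
    if new_section is None:
        new = "[[%s" % article
    else:
        new = "[[%s#%s" % (article, new_section)
    return text.replace(old, new)
```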
There are several articles with a broken link to the IUCN Red List ( iucnredlist.org) entry. Is anyone able to generate a list of such articles? -- Leyo 22:29, 28 August 2012 (UTC)
I have created the navbox Template:Louisiana Political Museum and Hall of Fame and have inserted it at the bottom of Louisiana Political Museum and Hall of Fame. Can anyone please run a bot for me that will insert this template at the bottom of the existing articles listed on the template? And...if possible..at the same time, remove any Orphan Tags that may exist on those articles, as this navbox will resolve the orphan issue. Thanks for your help. Maile66 ( talk) 14:59, 30 August 2012 (UTC)
Thank you. Maile66 ( talk) 17:12, 30 August 2012 (UTC)
Could a programmer write a bot to go through NRHP place articles and check for availability of an NRHP nomination document at the National Park Service website, and add a reference to ones that are found, if not already included in the article? This would be hugely helpful in updating thousands of NRHP articles, as the National Park Service puts more and more states' nomination documents on-line.
For example, there are 2,543 NRHP-listed places in California, indexed from List of RHPs in CA. There are articles already for about half of those, i think. Few have references yet to the relevant online NRHP nomination documents, because the National Park Service only just made them all available for that state. The bot would draw a reference number from the NRHP infobox in an article, and use that to look up the nomination document at the National Park Service. It would add a reference (such as <ref name=nrhpdoc>{{cite web|url=http://pdfhost.focus.nps.gov/docs/NRHP/Text/REFNUM.pdf |title=National Register of Historic Places Inventory/Nomination: NAME |author= |date= |publisher=National Park Service}} and [http://pdfhost.focus.nps.gov/docs/NRHP/Photos/REFNUM.pdf accompanying photos]</ref> but with REFNUM and NAME filled in by field values from the NRHP infobox in the article) into the article, just above the references section. Such as in this diff adding NRHP nomination doc reference to a California NRHP article
A complication stems from the fact that the National Park Service's website returns a dummy document saying "Not yet digitised" in cases where the real document is not available. We want the reference to be added only when a real document is in fact available at the expected URL.
I expect this is more difficult to program than #bot to bring photos from NRHP list-articles to individual NRHP place articles item above, but it is actually far more important and useful. -- do ncr am 20:13, 31 August 2012 (UTC)
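A sketch of the URL construction and the dummy-document check (the placeholder marker text is an assumption on my part; verify against an actual "Not yet digitised" file before relying on it):

```python
def nrhp_urls(refnum):
    """Nomination text and photos URLs for an NRHP reference number,
    per the pattern in the request above."""
    base = "http://pdfhost.focus.nps.gov/docs/NRHP/"
    return (base + "Text/" + refnum + ".pdf",
            base + "Photos/" + refnum + ".pdf")

def is_real_document(body):
    """True if a fetched body looks like a genuine nomination PDF.
    Non-PDF responses and bodies carrying the placeholder text are
    rejected; the marker below is an assumed guess, not verified."""
    if not body.startswith(b"%PDF"):
        return False
    return b"Not yet digitised" not in body
```

The bot would fetch each URL, run `is_real_document` on the response, and only then insert the reference; the refnum used below is purely illustrative.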
Hi,
Wikimedia India Chapter is organising WLM in India, and hence would like to invite all Indian Wikipedians to participate. Find the invitation message here at my sandbox: User:Karthikndr/sandbox. The message needs to be sent to all the WikiProject India members.
Needs to be delivered by tomorrow. Thanks! -- ♪Karthik♫ ♪Nadar♫ 18:45, 31 August 2012 (UTC)
Following Wikipedia:Categories for discussion/Log/2012 July 22#Category:Film redirects, we need the following changes for every member of Category:Film redirects (currently 80 or so):
Set {{WikiProject Film|class=redirect}} on the talk page of each redirect; note that {{ Film}} may have been used instead of {{ WikiProject Film}}.
Once the category is empty we will deal with it speedily. Thanks. -- Mirokado ( talk) 21:20, 23 August 2012 (UTC)
{{ WikiProject_Film/class}}. Probably better to wait until that support is in place before making these changes, and I will post here once that has happened. -- Mirokado ( talk) 11:39, 28 August 2012 (UTC)
Hi. I was thinking we could create a bot that closes Wikipedia A-class reviews after two supports, and updates the article's talk page to reflect that. TBrandley 02:19, 1 September 2012 (UTC)
Is there a way we can get a bot to remove all transclusions of {{ wikify}} now that the template has been deprecated at TFD? Ten Pound Hammer • ( What did I screw up now?) 18:55, 30 August 2012 (UTC)
I merely closed the discussion. If people want to have a discussion about the most prudent way to remove the template, that sounds like a good thing, regardless of whether the changeover is done manually from the start (replacing each instance with whatever more specific templates are appropriate), or automatically (removing every instance, and leaving a specific hidden category so that every page can be then dealt with manually eventually) or whatever. As long as we have conscientious Wikipedians working on this, I would hope that everything comes to a fair result. - jc37 20:26, 2 September 2012 (UTC)
Is there any way to find every article that has a section titled "Trivia", "Miscellaneous", "Miscellany", or any variation thereof, and automatically tag it with {{ trivia}}? Ten Pound Hammer • ( What did I screw up now?) 00:10, 3 September 2012 (UTC)
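The heading scan itself is straightforward. A minimal Python sketch, restricted to the three heading variants actually named above and assuming {{trivia}} goes directly under the heading (both assumptions):

```python
import re

# Match == Trivia == style headings at any level, case-insensitively.
TRIVIA_HEADINGS = re.compile(
    r"^==+\s*(Trivia|Miscellaneous|Miscellany)\s*==+\s*$",
    re.IGNORECASE | re.MULTILINE,
)

def find_trivia_sections(wikitext: str) -> list:
    """Return the heading words of any trivia-style sections in the page."""
    return TRIVIA_HEADINGS.findall(wikitext)

def tag_trivia(wikitext: str) -> str:
    """Insert {{trivia}} directly under the first matching heading."""
    return TRIVIA_HEADINGS.sub(lambda m: m.group(0) + "\n{{trivia}}", wikitext, count=1)
```

A bot run would still want a human pass afterwards, since "any variation thereof" needs a longer heading list than this.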
The National Portrait Gallery, London has a website with a very well-organised index of its collection, which is also searchable. The database allocates a unique numerical ID (of the form mp01234) to each sitter (i.e. subject of a portrait), which allows the creation of a link to a list of whatever portraits of a particular person are in the catalogue. (See, for example, this list for Philip Rea.)
In 2006 I created the template {{ NPG name}} to facilitate adding external links from biographical articles to the site's collection, but so far it appears to be used in fewer than 1,000 articles ... whereas the NPG has 175,000 portraits, and their online catalogue appears to list over 100,000 sitters.
It seems to me that it should be possible to have a bot which at least identified possible matches between sitters in the NPG database and biographies on Wikipedia, and maybe added the links. -- BrownHairedGirl (talk) • ( contribs) 23:32, 31 August 2012 (UTC)
Done. The page is extremely large; if you're having problems viewing it in your browser, I can cut it down to a couple of pages. Some of the guess results are pretty good, whilst others are miles off. There were 57,972 sitter names to check; 22,613 had exact title matches, with the other 35,359 being guessed. Noom talk stalk 16:50, 3 September 2012 (UTC)
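Not the actual code Noom ran, but the exact-match-then-guess pass could look something like this in Python, using difflib for the guesses (the 0.6 cutoff is an arbitrary choice, which is also roughly why some guesses land miles off):

```python
import difflib

def match_sitters(sitters: list, titles: list) -> dict:
    """Map each sitter name to (best article title, exact-match flag)."""
    title_set = set(titles)
    results = {}
    for sitter in sitters:
        if sitter in title_set:
            results[sitter] = (sitter, True)  # exact title match
        else:
            # Best fuzzy guess, or "" when nothing clears the cutoff.
            guesses = difflib.get_close_matches(sitter, titles, n=1, cutoff=0.6)
            results[sitter] = (guesses[0] if guesses else "", False)
    return results
```

Disambiguated titles like "Philip Rea, 2nd Baron Rea" are exactly the case a plain string ratio handles badly, so the guessed two-thirds would always need human review.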
Could someone add a task to an existing bot so that it would monitor broken redirects? Wikipedia:Database reports/Broken redirects, updated daily, is a list of all redirects that (as of the update) existed but pointed to deleted pages. I'm imagining the bot looking at that page each time it's updated and marking each broken redirect with a template that would be a modified version of {{ db-redirnone}}. I'd be willing to create said template, which should bear a warning to admins that the history should be checked and the page re-redirected to a suitable target instead of deletion, if possible. I'm thinking of something akin to the F8 tag that Multichill's bot was placing on also-on-Commons images some time ago, which bore a message that was essentially "This image is probably on Commons, but because it was placed by a bot, you must check the Commons image before deleting". Nyttend ( talk) 22:20, 1 September 2012 (UTC)
This is what the script currently does: get a list of broken redirects from either a dump or Special:AllPages, then verify that each page is in fact a redirect and that the target does not exist. If both of those conditions are met, it marks the redirect for deletion. The modifications I would make are to get the list from the DBR (instead of a dump) and to use your template instead of {{ db-r1}}. Lego Kontribs TalkM 04:04, 3 September 2012 (UTC)
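The two checks described in that reply reduce to pure logic once the page text and an existence lookup are in hand. A hedged Python sketch, with {{db-redirnone}} standing in for whatever modified template gets created:

```python
import re

# A redirect page starts with #REDIRECT [[Target]] (case-insensitive);
# stop at ] # or | so section links and pipes don't leak into the title.
REDIRECT_RE = re.compile(r"^\s*#REDIRECT\s*\[\[([^\]#|]+)", re.IGNORECASE)

def redirect_target(wikitext: str):
    """Return the redirect target, or None if the page is not a redirect."""
    m = REDIRECT_RE.match(wikitext)
    return m.group(1).strip() if m else None

def tag_if_broken(wikitext: str, page_exists) -> str:
    """Prepend the deletion tag only when both conditions hold:
    the page is a redirect, and its target does not exist."""
    target = redirect_target(wikitext)
    if target is not None and not page_exists(target):
        return "{{db-redirnone}}\n" + wikitext
    return wikitext
```

In a real run, page_exists would be an API call rather than the set lookup used in testing, and the double-check matters because the daily report can be stale.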
[Perhaps a job for AWB rather than a true bot; if there's a better place to ask, please advise.]
I'd like someone, please, to convert lists in Category:Lists of aviation accidents and incidents (such as List of accidents and incidents involving the Vickers Viscount) to use {{ Timeline-event}}, like in this similar edit. Some (like List of accidents and incidents involving the DC-3 in the 1980s) will need the years to be copied from subheadings. Date formatting (DMY vs. MDY) will need to be preserved. There's also scope to convert sections within articles, such as Bristol Britannia#Accidents and incidents. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:50, 4 September 2012 (UTC)
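An AWB-style find-and-replace for one common row shape might look like the sketch below. It is purely illustrative: the real lists vary a lot, and the |date=/|event= parameter names are assumptions about {{Timeline-event}}, not checked against the template. Preserving DMY vs. MDY falls out naturally because the date is copied verbatim:

```python
import re

# Recognise '* 12 March 1950 - text' (DMY) or '* March 12, 1950 - text' (MDY).
ROW_RE = re.compile(
    r"^\*\s*(?P<date>\d{1,2} \w+ \d{4}|\w+ \d{1,2}, \d{4})\s*[-–:]\s*(?P<event>.+)$"
)

def convert_row(line: str) -> str:
    """Rewrite one bulleted row into the (assumed) template call."""
    m = ROW_RE.match(line)
    if not m:
        return line  # leave anything unrecognised for a human
    return "{{Timeline-event|date=%s|event=%s}}" % (m.group("date"), m.group("event"))
```

The lists whose years live in subheadings would need a second pass that carries the current heading's year down into each row before matching.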
Hi. I wonder if a programmer would be willing to write a bot that would go through NRHP geographic-based list-articles, and where a photo is present in the list-article, check to see if it is present in the corresponding, linked NRHP place article, if it exists. Often there is no individual article yet, the link can just be a red-link. Or vice versa: go through the existing NRHP place articles in one state, and check to see if each corresponding list-article has a photo for it.
Some NRHP editors have been dissatisfied at times that new individual articles created by others did not immediately include available photos. Also there are often photos added to list-articles that don't immediately get added to already-existing individual place articles. This would address that complaint. I think it should be a one-way thing, just bringing pics from the list-articles to the individual NRHP place articles, not the other way around. During September there is a Wikipedia Loves Monuments (WLM) campaign going on which may bring a lot of new photos to the list-articles, by the way.
The system of NRHP geographic list-articles is indexed from List of RHPs. It links to state list-articles which link to county list-articles which in some cases link to city- and neighborhood list-articles. The list-articles are all identified in Category:List-Class National Register of Historic Places articles and should also have state or county or other geographic categories. The individual NRHP place articles all should have the NRHP infobox template, which could be tweaked to indicate something if useful, and also all fall within NRHP categories and county or other geographic categories.
The benefit would be to have a bot that could be run occasionally, and to allow quicker improvement of the individual NRHP articles. Any first reaction on whether this could be done? Have similar bots already been written? -- do ncr am 15:09, 31 August 2012 (UTC)
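The one-way direction requested above (list-article photo into the place article, never the reverse) keeps the core logic simple. A minimal sketch, assuming the infobox photo lives in an |image= parameter; extracting the photo from the list-article row is assumed to have happened already:

```python
import re

# The NRHP infobox image parameter; the exact field name is an assumption.
IMAGE_FIELD = re.compile(r"\|\s*image\s*=\s*([^\n|}]+)")

def infobox_image(article_wikitext: str) -> str:
    """Return the infobox image filename, or '' when the field is empty/absent."""
    m = IMAGE_FIELD.search(article_wikitext)
    return m.group(1).strip() if m else ""

def add_photo_if_missing(article_wikitext: str, list_photo: str) -> str:
    """Copy the list-article photo into the infobox only when it has none."""
    if list_photo and not infobox_image(article_wikitext):
        return re.sub(r"(\|\s*image\s*=)", r"\g<1> " + list_photo,
                      article_wikitext, count=1)
    return article_wikitext
```

The "only when it has none" guard is what makes this safe to rerun occasionally, e.g. after a WLM upload surge.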
{{DISPLAYTITLE}}
If the title of a page is "List of Foo bar" and "Foo" has {{ italic title}} on it, can {{DISPLAYTITLE:List of ''Foo'' bar}} be added to it? This would be useful for pages like "List of Whatever TV Show characters". David1217 What I've done 02:06, 4 September 2012 (UTC)
Could we get a bot to remove redlinks that have been tagged for, say, six months? The reason is obviously that links which have been there for that long have very little chance of being created, per Wikipedia:Red link. And of course if the article is later created, the user can plug the link straight back into the article. It doesn't have to be six months, necessarily; it can be shorter or longer, depending on what the higher-ups say. Let me know if anybody has any questions. Thanks. Lighthead þ 07:35, 8 September 2012 (UTC)
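The unlinking step itself, separate from the six-month bookkeeping (which would need the API or the database reports), is mechanically simple. A hedged Python sketch, where the set of long-standing redlink targets is assumed to be supplied from elsewhere:

```python
import re

# [[Target]] or [[Target|display text]] wikilinks.
LINK_RE = re.compile(r"\[\[([^\]|]+)\|?([^\]]*)\]\]")

def unlink_redlinks(wikitext: str, missing_titles: set) -> str:
    """Replace links to known-missing titles with their display text."""
    def repl(m):
        target, label = m.group(1), m.group(2)
        if target in missing_titles:
            return label or target  # keep the display text, drop the link
        return m.group(0)
    return LINK_RE.sub(repl, wikitext)
```

Keeping the display text means nothing visible is lost from the article, and re-adding the brackets later is trivial if the article does get created.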
I'm pretty new at this stuff, but I wanted to know if it is possible for bots to archive URLs. I think it can be very convenient for users. Sorry if this is a stupid proposal. Good day. ComputerJA ( talk) 22:52, 8 September 2012 (UTC)
Great! So if I add an article here it will get archived by a bot in the upcoming days? ComputerJA ( talk) 22:30, 9 September 2012 (UTC)
javascript:window.location.href='http://toolserver.org/~betacommand/cgi-bin/sandbox?page='%20+%20encodeURIComponent(mw.config.get('wgPageName'));
I think there is one available? Will need to walk subcategories. Need not weed out duplicates or sort entries; I could easily do that. Thanks in advance. Churn and change ( talk) 19:38, 11 September 2012 (UTC)
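The subcategory walk is the only non-trivial part, since category trees on Wikipedia can contain cycles. A sketch of the traversal, shown here against plain dicts so it runs standalone; a real run would ask the API (or pywikibot's Category.members) for subcategories and pages instead:

```python
def walk_category(root: str, subcats: dict, pages: dict, seen=None) -> list:
    """Collect pages under root, following subcategories, skipping cycles."""
    if seen is None:
        seen = set()
    if root in seen:
        return []  # already visited: breaks category cycles
    seen.add(root)
    collected = list(pages.get(root, []))
    for sub in subcats.get(root, []):
        collected.extend(walk_category(sub, subcats, pages, seen))
    return collected  # duplicates/sorting left to the requester, as asked
```

Per the request, pages appearing in more than one category are left duplicated and unsorted for the requester to clean up.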
At the idea lab, I made a request for help in finding a way to track how long it's been since the baseball players' infobox stats were updated. I was advised to come here. I would appreciate it if someone could look over the brief request there and determine if a bot would be helpful in any way. Automatic Strikeout 21:24, 12 September 2012 (UTC)
I generated a list of pages transcluding {{ Infobox MLB player}}, then filtered out players which were "retired". I didn't sort by namespace, so userspace drafts will show up; however, I can easily change that. Obviously this doesn't cover all baseball players, but as a quick example it works. I can easily expand it based on other templates/categories. I'll cross-post this to the idea lab. Lego Kontribs TalkM 23:23, 12 September 2012 (UTC)
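A toy version of that filter, operating on page wikitext. Treating a filled |finaldate= field as "retired" is purely an assumption for this sketch, not a claim about how the infobox or the actual list was built:

```python
import re

# Pages transcluding the MLB player infobox.
INFOBOX_RE = re.compile(r"\{\{\s*Infobox MLB player", re.IGNORECASE)

def is_active_mlb_page(wikitext: str) -> bool:
    """Keep pages with the infobox and no recorded final-game date."""
    if not INFOBOX_RE.search(wikitext):
        return False
    m = re.search(r"\|\s*finaldate\s*=\s*([^\n|}]+)", wikitext)
    return m is None  # no final-game date recorded -> presumed active
```

Covering all baseball players would mean repeating this over the other player infoboxes and categories, as the reply above notes.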