This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
{{ Infobox NFLretired}} should be replaced with {{ Infobox NFLactive}}. I could do it myself, but I don't know exactly which fields should be renamed, etc. Can someone do it, or give me a hint on how to do it?
Instructions say: This template is being retired per discussion here. Instead, please use {{ Infobox NFLactive}}, setting the |final_team= and |final_year= fields appropriately and leaving the |current_team= field blank. -- Magioladitis ( talk) 11:18, 7 January 2011 (UTC)
Right now there's a heated discussion at WP:AN over a user involved in WP:NFCC; the whole case is unimportant here, but a key idea that fell out of it so far is that when pages are moved without leaving behind redirects, all non-free images are "broken" per NFCC#10c until the rationales for those images are fixed. If the mover doesn't do this themselves, usually no one does.
The idea behind a bot here would be to identify page moves w/o redirects, whether those pages include NFCC images (as determined by the licenses on the page), and then replace the old article name with the new one in the image rationale. (This may either be a link, or may be the "article" field in a limited number of templates). This would remove many of the problems with #10c enforcement.
(One could argue this bot could be expanded to handle any page move and any links anywhere on WP to that page, but right now this is a much smaller and more focused task.) -- MASEM ( t) 20:13, 11 January 2011 (UTC)
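The core text transform such a bot would perform is small; a sketch of the rationale-retargeting step (the function name and the `|article=` field handling are illustrative — actual rationale templates vary):

```python
import re

def retarget_rationale(file_page_text, old_title, new_title):
    """Point a non-free use rationale at a moved article.

    Handles both the |article= field of rationale templates and plain
    [[wikilinks]] to the old title. A real bot would first read the move
    log to find moves made without leaving a redirect.
    """
    # |article = Old title  ->  |article = New title
    text = re.sub(
        r"(\|\s*article\s*=\s*)" + re.escape(old_title),
        lambda m: m.group(1) + new_title,
        file_page_text,
    )
    # [[Old title]] and [[Old title|label]] links
    text = text.replace("[[" + old_title + "]]", "[[" + new_title + "]]")
    text = text.replace("[[" + old_title + "|", "[[" + new_title + "|")
    return text
```

The lambda replacement (rather than a backreference string) avoids surprises if the new title contains characters that are special in replacement patterns.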
While working with WPUS I have recently found a ton of usernames that have been inactive for long periods of time, some for years. I would like to recommend a bot that would tag the inactive user pages with an inactive tag if the account hasn't made an edit in a year, or maybe even 6 months. Although I have a bot account pending, I do not know how to do something like this, so I thought I would just submit it here. Plus, I thought this would be a good place to solicit comments on the merits of the task itself. -- Kumioko ( talk) 12:33, 11 January 2011 (UTC)
user table. I never did find a good algorithm to identify inactive users. If I were to write it again today, I'd add WikiDashboard graphs [now implemented] and provide a checkbox to move entries to an active section. — Dispenser 15:19, 11 January 2011 (UTC)
Just a quick comment, I would think a Database report would be ideal for something like this. – MuZemike 01:26, 12 January 2011 (UTC)
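Whatever front end is used (tagging bot or database report), the selection logic is the same; a minimal sketch of it, assuming the last-edit timestamps have already been pulled from the user/revision tables:

```python
from datetime import datetime, timedelta

def inactive_users(last_edits, cutoff_days=365, now=None):
    """Return users whose latest edit is older than cutoff_days.

    last_edits: dict mapping username -> datetime of last edit.
    In practice this data would come from a Toolserver query; only the
    selection step is shown here.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=cutoff_days)
    return sorted(u for u, ts in last_edits.items() if ts < cutoff)
```

Switching between the one-year and six-month thresholds discussed above is just a matter of the `cutoff_days` argument.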
I would like a bot that goes through articles that relate to buses and transport and checks everything.
RcsprinterBot 12:01, 16 January 2011 (UTC)
I need a bot that can delete all of its transclusions and delete this template once and for all! Please help? -- Hinata talk 21:03, 15 January 2011 (UTC)
I don't think that this has been suggested before. Jarry1250 has a tool on Toolserver that checks for image existence, so why not have a bot that automatically deletes the {{ req-photo}} template on article talk pages (if there isn't a parameter for a specific type of image) if an image already exists in the article? It would definitely clear up the backlog. Logan Talk Contributions 04:11, 14 January 2011 (UTC)
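The decision step is simple enough to sketch; a rough version of the check-and-remove logic (regexes are illustrative — the real template has redirects and spellings a production bot would need to enumerate):

```python
import re

# Any local file link counts as "the article already has an image".
IMAGE_RE = re.compile(r"\[\[(?:File|Image):", re.IGNORECASE)
# Only match the *bare* template: a parameter usually requests a
# specific kind of image, which an existing image may not satisfy.
REQ_PHOTO_RE = re.compile(r"\{\{\s*req-?photo\s*\}\}\n?", re.IGNORECASE)

def maybe_remove_req_photo(article_text, talk_text):
    """Drop a parameterless photo-request tag once the article has an image."""
    if IMAGE_RE.search(article_text):
        return REQ_PHOTO_RE.sub("", talk_text)
    return talk_text
```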
1_E+sq._km_m² is on most wanted pages as frequently linked. This is probably the result of inappropriate use of {{ Infobox Indian jurisdiction}} and similar templates. In the case of {{ Infobox Indian jurisdiction}}, the following change needs to be made:
As per this diff. I don't know if the problem affects other templates. Taemyr ( talk) 11:50, 4 January 2011 (UTC)
{{#if:{{{area_magnitude|}}} would do the job well enough? Rich Farmbrough, 02:08, 12 January 2011 (UTC).

Since WildBot hasn't made an edit since September 21, 2010 and no longer updates its talk page templates for disambiguation and section links, I would like to see another bot go around and delete the following templates from the talk pages: {{ User:WildBot/m01}}, {{ User:WildBot/m04}}. Each of these templates has more than 5000 transclusions, and they are in many/most cases outdated. Xeworlebi ( talk) 18:45, 18 January 2011 (UTC)
wget http...wildbot.txt; python replace.py -file:wildbot.txt -regex "\{\{(User:WildBot/m01|User:WildBot/msg)\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?" ""
(untested) added to cron. — Dispenser 21:04, 18 January 2011 (UTC)
What exactly is to be done? Remove the tag from the daily generated list, or from all pages? I would prefer the first option, because some people may still be using the tag to identify and fix things. Btw, I am running over the list every 3-4 days. I can do it every day if you think it's necessary. -- Magioladitis ( talk) 01:28, 19 January 2011 (UTC)
Is there a bot, or would it be possible to create one, for On This Day and DYKs to be updated automatically for Portal:United States (and potentially others)? Currently these 2 sections must be manually updated, but I would like to make these 2 sections of Portal:United States as user-friendly and maintenance-free as possible. Any ideas? -- Kumioko ( talk) 19:33, 20 January 2011 (UTC)
Hi. Is anyone interested in programming a bot to insert {{ commons}} or {{ commonscat}} templates in articles when needed, using interwiki links? An example: adding a gallery link from the Spanish article. The bot can use the "External links" section or add the template at the bottom of the article just before the categories. Is there any infobox adding links to Commons? We need to exclude those cases. Thanks. emijrp ( talk) 20:56, 18 January 2011 (UTC)
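The insertion step described above (template at the bottom, just before the categories) might look like this; a sketch only — it skips the interwiki lookup and the infobox-exclusion check mentioned in the request:

```python
import re

def add_commons_template(text, commons_name):
    """Insert {{commons|...}} just before the first category link,
    or append it if the article has no categories yet.

    A real bot would first verify that no Commons link already exists,
    including via an infobox parameter.
    """
    if re.search(r"\{\{\s*commons", text, re.IGNORECASE):
        return text  # a commons/commonscat template is already present
    template = "{{commons|%s}}\n" % commons_name
    m = re.search(r"^\[\[Category:", text, re.MULTILINE)
    if m:
        return text[:m.start()] + template + text[m.start():]
    return text.rstrip("\n") + "\n" + template
```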
Is there a bot that can remove or comment out redlinked files, such as the one that was here? There is currently a database report with over 16,000 pages with this condition. I assume Pywikipediabot with the delinker.py script would be able to do it (similar to CommonsDelinker). Avic ennasis @ 01:53, 20 Shevat 5771 / 25 January 2011 (UTC)
Science has recently released " The Science Hall of Fame", which ranks scientists in terms of impact (as measured through the number of times their names are present in books which can be found in Google Books). It seems to me that Wikipedia could greatly benefit from a comparison with this list. So what I have in mind is basically a bot that fetches the information found in this table, then checks Wikipedia and builds a report of the status of these articles. Something like
Name | Born | Died | milliDarwins | Name | Born | Died | Rating
---|---|---|---|---|---|---|---
Bertrand Russell | 1872 | 1970 | 1500 | Bertrand Russell | 1872 | 1970 | B
Charles Darwin | 1809 | 1882 | 1000 | Charles Darwin | 1809 | 1882 | FA
Albert Einstein | 1879 | 1955 | 878 | Albert Einstein | 1879 | 1956 | A
Sir Fake Name | 1900 | 1950 | 50 | Sir Fake Name | — | — | NA
When our data matches that of Science, use {{ yes|YYYY}}, otherwise {{ no|YYYY}}. I purposefully misreported Einstein's death year just to illustrate what I meant. The bot results would be uploaded and updated daily/weekly/whateverly at something like Wikipedia:Comparison of Wikipedia articles with Science "Science Hall of Fame". This would allow us to track the quality of articles on high-impact scientists and other science-related people, as well as find gaps in our coverage.
Headbomb { talk / contribs / physics / books} 03:34, 20 January 2011 (UTC)
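The per-row comparison described above ({{yes}} when the two sources agree, {{no}} otherwise) is easy to sketch; field names here are illustrative, and the real bot would of course pull `wiki` from article/Persondata parsing:

```python
def compare_row(sci, wiki):
    """Build one wikitable row comparing Science's data with ours.

    sci: dict with 'name', 'born', 'died' from the Science table.
    wiki: dict with the same keys plus 'class' from Wikipedia.
    Cells where the sources agree use {{yes|...}}, otherwise {{no|...}}.
    """
    cells = ["[[%s]]" % sci["name"]]
    for key in ("born", "died"):
        tmpl = "yes" if wiki.get(key) == sci[key] else "no"
        cells.append("{{%s|%s}}" % (tmpl, wiki.get(key) or "—"))
    cells.append(wiki.get("class", "NA"))
    return "|-\n| " + " || ".join(cells)
```

Running it on the Einstein example above would flag the (deliberately) wrong death year while leaving the birth year green.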
{{ no|—}}. This way some categorization work can happen. Headbomb { talk / contribs / physics / books} 23:08, 20 January 2011 (UTC)

About the issue of years of birth/death, we could take the information from the {{ Persondata}} template, if nothing else. עוד מישהו Od Mishehu 08:13, 23 January 2011 (UTC)
Split the list in chunks of 500 and transclude them into a master list? Something like
{|class="wikitable sortable" |- {{/01}} |- {{/02}} |- {{/03}} |- {{/04}} |- {{/05}} |- {{/06}} |- ... |- |}
Headbomb { talk / contribs / physics / books} 21:45, 24 January 2011 (UTC)
As stated here, old files might have to be scanned for file description pages without any license template. I assume this would be feasible, since such files are not in the Category:Wikipedia files by copyright status category tree. -- Leyo 17:34, 12 January 2011 (UTC)
cough -- Chris 09:15, 17 January 2011 (UTC)
Requesting/suggesting a bot be made that checks the number of times an article has had an IP address or new user reverted, and if it is a large number, makes a list of the article and the number of reverts for someone to look over. Also, if any new editor or IP address is blocked due to vandalism, can a bot check to see if any of their contributions haven't been reverted yet? Add those to a list for people to check up on. When I revert someone, I always click to see their contributions and see what other articles they have vandalized as well. Not everyone does that, though. If it's too much of a load to scan through all articles at times, perhaps make it a tool approved users can run at Wikipedia:Requests for page protection, to see long-term problems that keep emerging instead of just the most recent events. Dream Focus 05:34, 27 January 2011 (UTC)
Not sure if the new article alert bot thing would work in this format—it has useful XfD monitoring things that this WikiProject can check on but that doesn't need to be in this bulletin template. Would it be possible to have a bot auto-update this template, or would customizing the AAlertBot output work? / ƒETCH COMMS / 15:54, 27 January 2011 (UTC)
Hi, I am a member of the typo team, but due to school and prior commitments, I do not have much time to spend on WP. I would like to request a bot to assist me with spelling and grammar corrections. Paper fork ♠ 13:22, 22 January 2011 (UTC)
Not a good task for a bot. Logan Talk Contributions 09:21, 28 January 2011 (UTC)
It would be nice if there was a bot to automatically add project tags to the talk pages of appropriate articles. Is there such a bot already in existence? WikiManOne ( talk) 22:34, 26 January 2011 (UTC)
Deferred Logan Talk Contributions 09:15, 28 January 2011 (UTC)
The webmasters at http://www.parliament.uk appear to have decided against permanent URLs, breaking a widely used link to http://www.parliament.uk/commons/lib/research/briefings/snpc-04731.pdf
The document is now at http://www.parliament.uk/documents/commons/lib/research/briefings/snpc-04731.pdf
I can't recall where the tool is to count such links, but I know that I have added dozens of them.
Would any bot owner be kind enough to update the link? -- BrownHairedGirl (talk) • ( contribs) 14:40, 5 February 2011 (UTC)
Since User:WildBot has not been working for a while and User:Josh Parris is no longer active, can another bot be used to clean up the tags that have been left on the talk pages? – Allen4 names 03:16, 28 January 2011 (UTC)
/* SLOW OK */ instead of /* SLOW_OK */, and the query was killed during high replication. I have taken the liberty of updating WildBot's pages and template to reflect the inactive status. — Dispenser 22:02, 31 January 2011 (UTC)
Page seems broken to me today. -- Magioladitis ( talk) 12:58, 6 February 2011 (UTC)
Would it be possible for a bot to replace all instances of {{ oscoor}} with {{ gbmappingsmall}}? Mjroots ( talk) 22:14, 30 January 2011 (UTC)
After several years of using the "unranked taxon" parameters in the {{ taxobox}}, someone's pointed out that a few parameters are inconsistent with the rest of the taxobox. I've spent the last few hours resolving this issue, and I've got a bit left to go before I'm done, but there's one gargantuan mountain standing in my way-- about 26K articles that all (thankfully) share exactly the same problem.
All the articles appearing in the automatically updated Category:Taxoboxes employing both unranked_familia and superfamilia need to have the text unranked_familia replaced with unranked_superfamilia. The text appears only once on each of these pages, so a search-and-replace with no second pass should suffice. There are currently 25,892 pages catalogued under this category. Once this category is emptied out, the task should be terminated.
I appreciate any help you can offer! Thanks! Bob the WikipediaN ( talk • contribs) 23:56, 30 January 2011 (UTC)
python replace.py -cat:Taxoboxes_employing_both_unranked_familia_and_superfamilia "unranked_familia" "unranked_superfamilia"
? Smallman12q ( talk) 00:36, 31 January 2011 (UTC)
So replace unranked_familia with unranked_superfamilia in all cases in that category? Plastikspork ―Œ(talk) 01:01, 31 January 2011 (UTC)

Also, |unranked_familia_authority= would need to be changed to |unranked_superfamilia_authority=. Bob the WikipediaN ( talk • contribs) 06:38, 31 January 2011 (UTC)
super in one or two places -- a change that will have zero effect on the appearance/functionality of the taxoboxes, but once completed, will allow for the final revision that needs to be made to {{ taxobox/core}} in order to normalize the functionality of unranked taxa. Bob the WikipediaN ( talk • contribs) 00:13, 1 February 2011 (UTC)
What about creating another bot that automatically archives URLs at WebCite and adds the archive links to articles? I am aware of User:WebCiteBOT, but I consider WP:LINKROT to be a major threat to Wikipedia that demands a response, and thus I hereby request an efficiently working bot for that purpose. Regards. Toshio Yamaguchi ( talk) 16:59, 1 February 2011 (UTC)
This category has an enormous backlog, and I think the clearing of the backlog could benefit from bot assistance. Would it be possible to code a bot to output possible coordinates (found by a bot search on google maps) into a list, and then have people go through the list to check these and manually add them to articles? The bot might also consider what the province is based on categories to do an even smarter search. (This might have to be worked out on a country-by-country basis.) Calliopejen1 ( talk) 18:42, 2 February 2011 (UTC)
Delta, you don't need an API key for this function. You can search their location database using this URL: http://maps.google.com/maps/api/geocode/json?address=LOCATION&sensor=false. I have code written for this function that you can see here. It's pretty crappy-looking, but it works. I'm going to try and contact Google and see if we can get an exception to their TOS. Tim 1357 talk 21:27, 2 February 2011 (UTC)
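For reference, a minimal sketch of using that endpoint: building the request URL and pulling the coordinates out of the JSON response (response-parsing shown against a canned body; the network call itself and Tim's actual code are not reproduced here):

```python
import json
import urllib.parse

GEOCODE_URL = "http://maps.google.com/maps/api/geocode/json?address=%s&sensor=false"

def geocode_url(address):
    """URL for the geocoder, with the address percent-encoded."""
    return GEOCODE_URL % urllib.parse.quote(address)

def parse_geocode(body):
    """Extract (lat, lng) from a geocoder JSON response, or None on failure."""
    data = json.loads(body)
    if data.get("status") != "OK" or not data.get("results"):
        return None
    loc = data["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]
```

A bot would fetch `geocode_url(location_from_infobox)`, feed the body to `parse_geocode`, and write the pair into the article's coordinate field.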
I am requesting that a bot be created which clears the completed requests at Wikipedia:Requested_articles. For example, User:Jimbo Wales requests that the article Wikipedia be made, because it doesn't exist yet. Then I create that article, but do not delete the request at the project page. What I am requesting is a bot which automatically deletes the request for Wikipedia at that project page. I do not know whether or not a bot that does this already exists, but it could sure be helpful. Unfortunately, this could possibly also eliminate unspecific requests, such as one that already has a Wikipedia page but where the requester is referring to something else. -- 114.250.35.13 ( talk) 08:13, 3 February 2011 (UTC)
:Strongly oppose -- humans are much more effective at creating stubs, and bots have failed numerous times in creating stubs within the WP:TOL WikiProject. If a branch of science with unique names for everything has issues, imagine the issues that would arise in other fields. Also, there is a very nice-sized article at Wikipedia, and there has been for a long time. Bob the WikipediaN ( talk • contribs) 12:58, 3 February 2011 (UTC)
Request a bot which can detect when an article page is a dab page but the talk page associated with it is a redirect, and then fix it by replacing the talkpage redirect with the {{ WikiProject Disambiguation}} template. If it could also tag the talk pages of all dab pages which are missing the template as well that would be even better :) Thanks, DuncanHill ( talk) 15:17, 3 February 2011 (UTC)
(edit conflict) The SQL, so no one actually has to write it, is:
SELECT CONCAT("* [[Talk:", talk.page_title, "]]")
FROM page
JOIN categorylinks ON cl_from=page.page_id
JOIN page AS talk ON talk.page_title=page.page_title AND talk.page_namespace=1
WHERE page.page_namespace=0
AND page.page_is_redirect=0
/* in categories */
AND cl_to IN ("All_disambiguation_pages", "All_set_index_articles")
AND talk.page_is_redirect=1;
4607 rows in set (9.34 sec)
if you're wondering. If you want redirects with tagged talk pages you can use catscan. — Dispenser 18:47, 3 February 2011 (UTC)
Could someone please train a bot to make a one-time sweep for WP:WPVG? Basically, we are hoping that a bot will check all of the articles tagged with Template:WikiProject Video games, make note of all local images that are image-linked on those articles, and then tag the talk page of those articles with Template:WikiProject Video games. The template can automatically tell when it is on a file, and will place that image in a file category. Editors have done this manually for a while, but there are still a lot more to tag. This will help us maintain our images. We may want the bot to run again at some point, but I don't think we need one constantly sweeping. Thanks! ▫ JohnnyMrNinja 22:06, 3 February 2011 (UTC)
Category:Articles needing coordinates includes articles that need to have the relevant coordinates (latitude; longitude) added. Lewis Ainsworth House is listed in a subcategory of Category:Articles needing coordinates. In addition, Lewis Ainsworth House uses a template where an address location is listed. In particular, its {{ Infobox nrhp}} lists "location = 414 E. Chapman Ave<br>[[Orange, California]]." Now, if you add 414 E. Chapman Ave, Orange, California to gpsvisualizer.com, you get 33.787698,-117.849617. So here's the bot idea: have the bot search out all articles listed in Category:All articles needing coordinates that also use {{ Infobox nrhp}} AND have the parameter "location =" filled in. The bot should then get the location info and pass the street address through a program such as gpsvisualizer.com to find the latitude and longitude. Then, take that latitude and longitude and add it to that article's {{ Infobox nrhp}}, and remove the article from Category:Articles needing coordinates. There might be other templates that use address locations and also are missing their geo coordinates. -- Uzma Gamal ( talk) 14:09, 4 February 2011 (UTC)
The external link Brazilian Tourism Portal, present in about 700 articles, is broken. Even the "good" link [6] does not seem to be a good link for the articles (no useful info). Could anyone create a bot to solve this problem? I think the best thing to do is just remove all the links. Caiaffa ( talk) 14:21, 4 February 2011 (UTC)
Has anyone developed a coordinates bot that can check other wikis and, like the interwiki bot, compare them and add or remove the coordinates template? Also, I found a [library for Python http://py-googlemaps.sourceforge.net/] that can use Google Maps coordinates. Reza1615 ( talk) 14:56, 4 February 2011 (UTC)
Hey All, I was wondering if someone could do an assessment job on all the articles in Category:Unassessed Albemarle County articles via a bot. They would just need to match the assessments of the already existing templates. For example, if WP:FOO is Class C with Low importance, WP:ALVA (the WP link for the project connected to this category) would be the same. Could someone do this? - Neutralhomer • Talk • 02:36, 7 February 2011 (UTC) • Go Steelers!
{{ cite doi}} templates should have, as an argument, the doi. Some of these templates were malformed, so they need to be cleaned up.
The full list is
These templates would need to be moved from {{ cite doi/doi:foobar}} to {{ cite doi/foobar}}. Then, the articles linking to {{ cite doi/doi:foobar}} should have their {{ cite doi}} template updated from {{ cite doi|doi:foobar}} to {{ cite doi|foobar}}. When that's done, the {{ cite doi/doi:foobar}} should be tagged as {{ db-g6}} per uncontroversial maintenance, as it would be an unlikely and unused redirect. Headbomb { talk / contribs / physics / books} 04:38, 9 February 2011 (UTC)
hello, I need a bot which archives my user talk page each month (and don't ask me why each month). Thank you. -- ♫Greatorangepumpkin♫ T 15:20, 10 February 2011 (UTC)
Would a bot kindly update Portal:Tropical cyclones/Active tropical cyclones? -- Perseus 8235 17:45, 10 February 2011 (UTC)
I'd like for someone to go through Category:Cite doi templates / Category:Cite hdl templates / Category:Cite jstor templates / Category:Cite pmc templates / Category:Cite pmid templates and build tables akin to
Template:Cite doi/... | author(s) | date | |title= | |url= | |format= | |journal= | |publisher= | |volume= | |issue= | |pages= | |bibcode= | |oclc= | |doi= | |isbn= | |issn= | |pmc= | |pmid= | |id=
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
10.1001/archgenpsychiatry.2007.2 | |last1=Fombonne |first1=E. | 2007 | Thimerosal Disappears but Autism Remains | — | — | Archives of General Psychiatry | — | 65 | 1 | 15 | — | — | 10.1001/archgenpsychiatry.2007.2 | — | — | — | 18180423 | —
10.1001/archgenpsychiatry.2009.30 | |last1=King |first1=B. |last2=Hollander |first2=E. |last3=Sikich |first3=L. |last4=Mccracken |first4=J. |last5=Scahill |first5=L. |last6=Bregman |first6=J. |last7=Donnelly |first7=C. |last8=Anagnostou |first8=E. |last9=Dukes |first9=K. | 2009 | Lack of efficacy of citalopram in children with autism spectrum disorders and high levels of repetitive behavior: citalopram ineffective in children with autism | — | — | Archives of General Psychiatry | — | 66 | 6 | 583–590 | — | — | 10.1001/archgenpsychiatry.2009.30 | — | — | — | 19487623 | —
10.5367/000000000101293149 | |last1=Dumont |first1=R. |last2=Vernier |first2=P. | 2000 | Domestication of yams (Dioscorea cayenensis-rotundata) within the Bariba ethnic group in Benin | — | — | Outlook on Agriculture | — | 29 | — | 137 | — | — | 10.5367/000000000101293149 | — | — | — | — | —
Template:Cite doi/... | author(s) | date | |chapter= | |chapterurl= | editor(s) | |title= | |url= | |format= | |publisher= | |series= | |volume= | |issue= | |pages= | |bibcode= | |oclc= | |doi= | |isbn= | |issn= | |pmc= | |pmid= | |id=
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
10.1002/0471238961.0315131619030818.a01.pub2 | |last=Schrobilgen |first=G. J. |last2=Moran |first2=M. D. | 2003 | Noble-Gas Compounds | — | — | Kirk-Othmer Encyclopedia of Chemical Technology | — | — | John Wiley & Sons | — | — | — | — | — | — | 10.1002/0471238961.0315131619030818.a01.pub2 | — | — | — | — | —
Etc...
These tables could be hosted somewhere at WP:JOURNALS and maybe in parallel on the toolserver? These tables would immensely help with template cleanup/completion. Headbomb { talk / contribs / physics / books} 22:46, 10 February 2011 (UTC)
Several templates created by Citation bot have had their structure rot over time. Some of it was due to sloppy bot edits, some to human mistakes, and some to sloppy human editing. There's generally a dislike for bot edits which do not create actual changes in appearance, but these are all hosted in the template space, with very, very few people watching them (i.e., this would really not annoy a lot of people, and the people likely to be watching these templates would also be the ones to appreciate their tidying up). A "good" template should be formatted in this manner (i.e., in this order):
{{cite journal
|last= |first= |authorlink= |last1= |first1= |author1link= |last2= |first2= |author2link= ... ← Remove empty parameters; if no author-related parameter remains, keep |last1= and |first1=
|author= |authorlink= |author1= |author1link= |author2= |author2link= ... ← Remove empty parameters; if no author-related parameter remains, add |last1= and |first1=
|date= |year= |month= ... ← Remove empty parameters; if no date-related parameter remains, add |year=
|title= ← Remove empty parameters; |title= should always be present
|language= |transtitle= |url= |format= ...
|journal= |series= |volume= |issue= |pages= ... ← Remove |series= if empty; the others should always be present
|publisher= |location= ... ← Remove empty parameters; also remove |location= if |publisher= is empty
|arxiv= |bibcode= |doi= |doi_brokendate= |isbn= |issn= |jfm= |jstor= |lccn= |mr= |oclc= |ol= |osti= |pmc= |pmc-embargo-date= |pmid= |rfc= |ssrn= |zbl= |id= ... ← Remove |asin=, |doi_brokendate=, |isbn=, |issn=, |lccn=, |jfm=, |oclc=, |ol=, |rfc=, |pmc-embargo-date=, and |id= if empty; add |arxiv=, |bibcode=, |doi=, |jstor=, |mr=, |osti=, |pmc=, |pmid=, |ssrn=, and |zbl= if missing
|accessdate= |archiveurl= |archivedate= |laysource= |laysummary= |laydate= ... ← Remove empty parameters, except |accessdate= if |url= is present
|quote= |ref= |separator= |postscript= ... ← Remove empty parameters
<!--UNUSED DATA--> ← All other parameters should be moved here
}}
This would have two great benefits. If the templates are formatted consistently and legibly, newcomers would be much less intimidated by the structure of these templates, and both regulars and newcomers will benefit from the improved readability. Compare [7] and [8], for instance. Plus, with the parameter pruning, you can immediately see what is missing and what should be added, instead of being misled into finding a journal's publisher, or wondering what a "separator" is or what the "series" refers to. This would in turn facilitate and encourage good citation completion/maintenance by humans. A daily run wouldn't be needed for this, but a one-time run over all citations combined with monthly runs would be incredibly helpful here. Once we do {{ cite journal}}, we could move on to the other templates ({{ citation}}, {{ cite book}}, etc...). I've notified Smith609 (who runs Citation bot) for feedback here. Headbomb { talk / contribs / physics / books} 23:42, 10 February 2011 (UTC)
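The reordering-and-pruning pass can be sketched as a small transform; this is only a toy (abbreviated parameter list, naive `|`-splitting that would break on nested templates or piped links inside values — a real bot needs a proper template parser):

```python
import re

# Abbreviated canonical order; the full list would follow the layout
# proposed above, parameter for parameter.
ORDER = ["last1", "first1", "year", "title", "journal", "volume",
         "issue", "pages", "doi", "pmid"]
# Parameters kept even when empty, so gaps are visible to editors.
ALWAYS_KEEP = {"last1", "first1", "year", "title", "journal",
               "volume", "issue", "pages"}

def tidy_cite_journal(wikitext):
    """Reorder {{cite journal}} parameters and prune empty optional ones."""
    m = re.match(r"\{\{\s*cite journal\s*\|(.*)\}\}\s*$", wikitext, re.S)
    if not m:
        return wikitext
    params = {}
    for chunk in m.group(1).split("|"):  # naive: assumes no pipes in values
        if "=" in chunk:
            key, val = chunk.split("=", 1)
            params[key.strip()] = val.strip()
    parts = []
    for key in ORDER:
        val = params.pop(key, "")
        if val or key in ALWAYS_KEEP:
            parts.append("|%s=%s" % (key, val))
    for key, val in params.items():  # unknown data goes last, never lost
        if val:
            parts.append("|%s=%s" % (key, val))
    return "{{cite journal " + " ".join(parts) + "}}"
```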
If the requested move discussion at Talk:New York City Subway succeeds, there will need to be a couple of hundred page moves and several thousand find/replace runs for "New York City Subway". The find/replace runs can be done automatically. If the bot ignores image/interwiki links (but not wikilinks), there shouldn't be any false positives. It's too much for assisted AWB to do, so can someone with a bot take on this task, assuming the discussion results in "move"? — Train2104 ( talk • contribs • count) 16:41, 12 February 2011 (UTC)
Regarding Wikipedia:Requests for feedback/navigation, which is transcluded on WP:FEED,
Could someone possibly make a bot which automatically adds links each month, as I did manually here?
If you need more info, give me a shout. Cheers! Chzz ► 14:10, 13 February 2011 (UTC)
How hard would it be for a bot to tag all pages listed at Wikipedia talk:Requests for comment/Jagged 85/Cleanup1 etc. with a template? — Ruud 15:53, 13 February 2011 (UTC)
{{ Jagged 85 cleanup|subpage=Cleanup1a}}, where subpage is the cleanup list the article appears on? — Ruud 18:25, 13 February 2011 (UTC)

Hi all,
I have the problem described in this post, but am not clear on how to apply a solution: http://en.wikipedia.org/wiki/Wikipedia:Bots/Requests_for_approval/Erik9bot_5
Like the Irish Times, as mentioned in this post, my news website has changed its URL. How do I update the 2,000+ links to my site on Wikipedia?
Apparently this fix was created by someone who has been banned so I can't ask him to explain. "This account is a sock puppet of John254 and has been blocked indefinitely."
Thanks much Michelle — Preceding unsigned comment added by Mnicolosi ( talk • contribs) 22:59, 12 February 2011 (UTC)
Thanks for the help with the signature. Sorry, not quite used to this system. All URLs in Wikipedia that are seattlepi.nwsource.com need to be updated to seattlepi.com. Details on the background behind our URL change are here if you're interested: http://en.wikipedia.org/wiki/Seattle_Post-Intelligencer
Thanks much,
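A host migration like this is essentially a one-line text transform per page; a sketch (it assumes the path structure is unchanged on the new host, which should be spot-checked on a sample of links before a full run):

```python
import re

def migrate_domain(text, old_host="seattlepi.nwsource.com",
                   new_host="seattlepi.com"):
    """Rewrite links from the old host to the new one, keeping the path.

    The lookbehind restricts matches to URLs (text preceded by "//"),
    so prose mentions of the old hostname are left alone.
    """
    return re.sub(r"(?<=//)" + re.escape(old_host), new_host, text)
```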
Can you assist WP:PINOY in changing www.t-macs.com/kiso/local/ (which is unofficial and a dead link) to http://www.census.gov.ph/census2000/index.html which is live and is the official 2000 Philippine Census by the National Statistics Office? Thanks.
See background here:
The help desk referred me to you for assistance.-- Lenticel ( talk) 06:36, 14 February 2011 (UTC)
At User talk:Citation bot#Withdrawn_papers it is noted that on occasion cited papers are updated or withdrawn. A maintenance bot or other tool could follow cited pubmed, doi, or other database identifiers to check for such, then (where appropriate) tag the citation for human attention, possibly amending the citation in the process (e.g. changing |title=Dewey Wins! to |title=Withdrawn:Dewey Wins!, or whatever the database indicates). Martin advises this is beyond Citation bot's scope, so it would need to be a different tool. Given that {{ cite doi}} and its ilk bury information in subpages where it is rarely seen, these should get priority. LeadSongDog come howl! 16:24, 11 February 2011 (UTC)
and LeadSongDog come howl! 16:50, 11 February 2011 (UTC)
|status=withdrawn or |status=superceded in the {{ cite xxx}}/{{ citation}} templates. Headbomb { talk / contribs / physics / books} 04:45, 14 February 2011 (UTC)
I posted something similar a while ago, but I guess it was rather a daunting task, so I'm scaling it down a bit. Previously, I wanted a bot that creates a report of all problems with the books ({{ citation needed}} tags, {{ POV}} tags, ...), but that doesn't look like it'll happen. So I'm scaling the request down to only report what assessment class the articles of a book are.
(Example report: Book:Helium → Talk page report.)
These reports would be done for all Category:Wikipedia books (community books), and updated daily. Of course, if someone feels like taking up the original request, that would also be peachy. WikiProject Wikipedia-Books has been notified of this request. Headbomb { talk / contribs / physics / books} 04:08, 11 February 2011 (UTC)
See User:NoomBot/BookTest for 3-4 examples of book reports. Going to set the bot to append a couple more reports to see if the formatting works for several of them. Also adding more 'problem' templates to detect. Noom talk contribs 18:35, 12 February 2011 (UTC)
Olaf Davis retired last October, but he left the code behind so that anyone who knows how to run Python Bots could replicate what User:Botlaf did. I've just manually gone through the nearly 800 articles that contained the word pubic and fixed 23 that were typos or vandalism, some of which had been up for months. But it is very time consuming to do this manually without Botlaf, and pubic is only one of many reports that Botlaf used to run every week. Please could someone, ideally a Python writer, take over the Botlaf code? It only needs to run weekly. Thanks Ϣere SpielChequers 13:04, 14 February 2011 (UTC)
I've come to the realization that I can't maintain all my bots under my current workload, and as such, would like for some people to take over the task of WP:UAA helperbot. (BRFA: Wikipedia:Bots/Requests for approval/SoxBot 23) It is a clone of HBC's bot, so it shouldn't be too hard to bring up. ( X! · talk) · @841 · 19:10, 16 February 2011 (UTC)
Would it be possible to have a bot automatically add {{ TFA-editnotice}} into the editnotice of today's featured article? I'm not quite sure how easy it would be for a bot to figure out what tomorrow's TFA is, but I suspect somebody can do it, and the template just needs adding once with the appropriate date specified (the template does the rest). Rd232 talk 02:53, 14 February 2011 (UTC)
There are strong feelings that generally, links should only be in the "see also" section if they aren't in the body of the article. (no, it isn't a guideline, there is certainly a lot of wiggle room in it)
Anyhow, has anyone built a bot that identifies and/or removes items that are redundant? It could note them at the top of the talk page (like how disambiguations are/were marked), add a category indicating duplicate items, put a comment or template next to duplicates, or actually remove them.
This seems well-suited to a bot because it's nontrivial for a human to scan the article for the duplicate links. tedder ( talk) 21:47, 17 February 2011 (UTC)
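A rough sketch of the duplicate check (wikitext-level only; links produced by templates would be missed):

```python
import re

# Collect link targets (ignoring piped display text and section anchors) from
# the article body and report any that recur in the "See also" section.
LINK = re.compile(r"\[\[([^\]|#]+)")

def duplicate_see_also(body, see_also):
    body_links = {t.strip() for t in LINK.findall(body)}
    see_links = {t.strip() for t in LINK.findall(see_also)}
    return sorted(see_links & body_links)
```

The bot would split the wikitext at the "See also" heading and feed the two halves to this function.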
I was wondering if I could enlist the help of a bot to do some stub sorting for me for the Washington WikiProject. Each county has its own geographic location stub type (e.g. {{ KingWA-geo-stub}} for Category:King County, Washington). Most articles use {{ Washington-geo-stub}}. What I would like to do is for articles in Foo County, Washington that use {{ Washington-geo-stub}} to change the stub type to {{FooWA-geo-stub}}. Any articles that are a) not located in a county category or b) in multiple county categories should be left with the base template. There are currently 400 transclusions of the base template, and this should be a fairly simple and uncontroversial edit. -- Admrboltz 05:16, 18 February 2011 (UTC)
I am not sure if a page like this already exists, but I am guessing people here would be aware of it. I was thinking it would be pretty useful if, every week or month, a list were published of the top 500 or 1000 most viewed/accessed pages that are neither FA/FL/GA nor A/B-class articles in at least one project. This way, those interested could direct attention to the most visited pages that are not in decent shape yet. Only C-class, starts, stubs, lists, and unassessed pages that are not rated B-class or above in any project should be listed. Alternatively, a top 500 for each of the bottom classes would be good also. Nergaal ( talk) 07:25, 20 February 2011 (UTC)
The Manual of Style reads:
Moreover, dates should not have "th" on them. Am I OK to think that AWB can be set to remove superscript from ordinals? -- Magioladitis ( talk) 20:04, 16 February 2011 (UTC)
Wikipedia:Bots/Requests for approval/Yobot 20. -- Magioladitis ( talk) 00:59, 21 February 2011 (UTC)
A discussion is going on about disabling the "reviews" parameter of {{ Infobox album}}, as consensus has it to move reviews to a separate section. There are hundreds of thousands of articles to have the infobox data moved to {{ album ratings}}, and could takes years to do manually. Thus, it would be very much appreciated if a bot coder could take a look. Thanks, Adabow ( talk · contribs) 03:39, 21 February 2011 (UTC)
All instances of http://www.pmars.imsg.com/ need changing to https://pmars.marad.dot.gov/ . The url path after .gov/ will not need changing. There are about 200 of these and they're all dead links. Thanks. Brad ( talk) 13:58, 18 February 2011 (UTC)
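Since the path after the host is unchanged, the rewrite itself is a plain string substitution; a minimal sketch:

```python
# One-off rewrite: swap the dead host for the new one, keeping the URL path.
OLD = "http://www.pmars.imsg.com/"
NEW = "https://pmars.marad.dot.gov/"

def fix_pmars(text):
    """Replace every occurrence of the dead pmars host in a page's wikitext."""
    return text.replace(OLD, NEW)
```

A bot would run this over the ~200 pages found by an external-link search for the old host.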
Usually, when pages are moved, a redirect is left behind; this redirect clues in the interwiki bots that the page in English Wikipedia still exists, it's just been moved to a new name. However, categories are renamed by creating new categories and deleting the old ones. Should an interwiki bot then start handling the category in another language's Wikipedia, it would decide that we no longer have such a category, and remove the English interwiki (incorrectly!) from the other languages - see here for an example. I think the best solution is to have an interwiki bot look at all new categories (defined as having been created since the beginning of the previous run), and handle their interwiki as the interwiki bots always do - and that would include updating the English name on all the other Wikipedias' interwiki lists. עוד מישהו Od Mishehu 12:06, 22 February 2011 (UTC)
Many articles on settlements in the US contain boilerplate text (presumably bot-generated a long time ago) along the lines of:
In several thousand of these articles, 'married couples' is in a piped link to marriage. This is clearly a worthless link - could a bot unlink these? To be precise, the articles to be covered are those in the intersection of (What links to: marriage) and Category:Populated places in the United States, and its subcategories). Colonies Chris ( talk) 16:48, 23 February 2011 (UTC)
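The unlinking step is a simple regex substitution over the wikitext; a sketch (it handles only direct piped links to [[marriage]]/[[Marriage]], not redirects to it):

```python
import re

# Unlink piped links to "marriage", e.g. [[marriage|married couples]] -> married couples
PIPED_MARRIAGE = re.compile(r"\[\[[Mm]arriage\|([^\]|]+)\]\]")

def unlink_marriage(wikitext):
    """Replace piped marriage links with their display text."""
    return PIPED_MARRIAGE.sub(r"\1", wikitext)
```

The article list would come from intersecting "What links to: marriage" with Category:Populated places in the United States, as described above.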
Is there (or could somebody create) a bot that can go through the "Special:UnusedFiles" page and delete it all? Neocarleen ( talk) 05:46, 17 February 2011 (UTC)
In general, the idea that a bot would delete anything is simply nuts; it opens the door to vandals. In the case of images they could delete them by removing them from articles. Choyoołʼįįhí:Seb az86556 > haneʼ 01:11, 18 February 2011 (UTC)
Hey, can you make me a bot that cleans up spam please?
^^— Preceding unsigned comment added by 64.228.147.57 ( talk • contribs) 02:07, February 25, 2011
The National Census 2010 results [10] have been published. We need infobox updates (population, pop rank, density) for 1st- and 2nd-level divisions. Bogomolov.PL ( talk) 23:22, 25 February 2011 (UTC)
As discussed in these threads, Mobius Bot ( talk · contribs) went berserk last year and its owner has disappeared. Can its functionality be replicated in a new bot, or incorporated into an existing bot? The source is at http://pastebin.com/i2ZYQBRD. Adrian J. Hunter( talk• contribs) 14:40, 24 February 2011 (UTC)
Per some discussion on the EN wiki mailing list about disability access, we have lots of images that need alt text so that blind people and anyone using a text reader can get an idea as to what our images are displaying. There was a suggestion on the mailing list from User:Martijn Hoekstra "Would an automated category Images without alt text be feasible?", alternatively I would have thought that a weekly regenerated list would do the job just as well. It would also make for a very good entry level task for newbies finding their feet. Can someone kind bot writer code it please? Ϣere SpielChequers 14:50, 27 February 2011 (UTC)
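A rough sketch of the detection step over raw wikitext (it does not handle links nested inside captions, or images supplied via templates/infoboxes):

```python
import re

# Find [[File:...]] / [[Image:...]] uses that carry no alt= option.
IMAGE_LINK = re.compile(r"\[\[(?:File|Image):([^\[\]]+)\]\]")

def images_missing_alt(wikitext):
    """Return filenames of images embedded without an alt= parameter."""
    missing = []
    for m in IMAGE_LINK.finditer(wikitext):
        parts = m.group(1).split("|")
        name, options = parts[0].strip(), parts[1:]
        if not any(opt.strip().startswith("alt=") for opt in options):
            missing.append(name)
    return missing
```

A weekly run over a dump could then emit either a worklist page or populate a tracking category, as suggested above.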
For some reason, there are quite a number of people who create their own Wikipedia article. While this is not prohibited per se, it is still strongly discouraged. What most of these autobiographies have in common is an editor who created most (if not all) of the article's content but did not contribute anywhere else (I ran into a couple of those lately). So what I'm suggesting is a bot that scans Category:Living people for articles where more than, say, 90% of the content came from a single-purpose account, and flag them (maybe with {{ COI}}, or something else, or add them to a separate list similar to User:AlexNewArtBot/COISearchResult). -- bender235 ( talk) 18:34, 26 February 2011 (UTC)
SELECT editor_name,
       article_title,
       ROUND(deltas),
       ((edits_to_article / (numberofedits + 0.0)) * 100) AS percent_of_all_edits_to_article,
       edits_to_article AS user_edits_to_article,
       all_edits AS user_editcount,
       ((edits_to_article / (all_edits + 0.0)) * 100) AS percent_of_user_edits
FROM (SELECT DISTINCT CONCAT(CONCAT(main.rev_page, '-'), main.rev_user) AS distinctify,
             main.rev_user_text AS editor_name,
             mainp.page_title AS article_title,
             (SELECT COUNT(*)
              FROM revision AS bla
              WHERE bla.rev_page = mainp.page_id) AS numberofedits,
             mainp.page_id AS pageid,
             (SELECT SUM(IF(ISNULL(prev.rev_len), now.rev_len,
                            IF(now.rev_len > prev.rev_len,
                               now.rev_len - prev.rev_len + 0.0,
                               (-1.0 * (prev.rev_len - now.rev_len))))) AS d
              FROM revision AS now
              LEFT JOIN revision AS prev
                     ON prev.rev_id = now.rev_parent_id
                        AND prev.rev_id != 0
              WHERE now.rev_user = main.rev_user
                AND main.rev_page = now.rev_page) AS deltas,
             (SELECT COUNT(*)
              FROM revision AS s
              WHERE s.rev_user = main.rev_user
                AND s.rev_page = main.rev_page) AS edits_to_article,
             user_editcount AS all_edits
      FROM revision AS main
      JOIN page AS mainp
        ON mainp.page_id = main.rev_page
           AND mainp.page_namespace = 0
      JOIN categorylinks
        ON cl_from = mainp.page_id
           AND cl_to = 'Living_people'
      JOIN user
        ON main.rev_user = user_id
      LEFT JOIN user_groups
             ON main.rev_user = ug_user
                AND ug_group IN ('sysop', 'bot')
      WHERE ISNULL(ug_group)
      LIMIT 5000) AS p
ORDER BY (percent_of_all_edits_to_article + percent_of_user_edits) DESC
LIMIT 100;
editor_name | article_title | round(deltas) | percent_of_all_edits_to_article | user_edits_to_article | user_editcount | percent_of_user_edits |
---|---|---|---|---|---|---|
Ruhe | Rushworth_Kidder | 5905 | 25.0000 | 3 | 3 | 100.0000 |
Lolamangha | Brittany_Tiplady | 2414 | 6.8182 | 3 | 3 | 100.0000 |
Evandrobaron | Paulo_Afonso_Evangelista_Vieira | 44 | 6.2500 | 1 | 1 | 100.0000 |
Jerryhansen | Peter_Hyman | 1886 | 5.8824 | 2 | 2 | 100.0000 |
Viktorbuehler | Rolf_Dobelli | 39560 | 33.3333 | 21 | 29 | 72.4138 |
Mrpuddles | Leslie_Cochran | 18117 | 5.0147 | 17 | 17 | 100.0000 |
Alon.rozen@gmail.com | Eric_Britton | 3492 | 4.8780 | 2 | 2 | 100.0000 |
Zagatt | Barbara_Sukowa | 3327 | 4.3956 | 4 | 4 | 100.0000 |
Mawjj | Carmen_Boullosa | 3793 | 2.6316 | 1 | 1 | 100.0000 |
Habibrahbar | Massy_Tadjedin | 152 | 2.6316 | 1 | 1 | 100.0000 |
Lem | Malcolm_Azania | 1841 | 1.1236 | 1 | 1 | 100.0000 |
Preludes | Alexander_Beyer | 3248 | 1.0000 | 1 | 1 | 100.0000 |
Pnut123 | Paul_Carr_(writer) | 1209 | 0.7576 | 1 | 1 | 100.0000 |
KidRose | Tim_White_(wrestling) | 604 | 0.7042 | 2 | 2 | 100.0000 |
Ajmaher | David_Parnas | 1297 | 0.5917 | 1 | 1 | 100.0000 |
Themadhatter | Tim_Sköld | 2523 | 0.5747 | 4 | 4 | 100.0000 |
Danistheman | Doug_Dohring | 35 | 0.5618 | 1 | 1 | 100.0000 |
Rogersdrums | Steve_Ferrone | 481 | 0.5348 | 1 | 1 | 100.0000 |
LiangY | Keiko_Agena | 516 | 0.3676 | 1 | 1 | 100.0000 |
Nignuk | Paul_Posluszny | 3506 | 0.2475 | 1 | 1 | 100.0000 |
Jennifer B | Ashley_Hartman | 85669 | 10.4651 | 27 | 34 | 79.4118 |
Zfeuer | Robert_Schwentke | 971 | 7.1429 | 3 | 4 | 75.0000 |
Aguecheek | Harald_Schmidt | 20679 | 3.7234 | 7 | 9 | 77.7778 |
Alanhtripp | Alan_Tripp | 2328 | 6.8966 | 2 | 3 | 66.6667 |
Lotzoflaughs | Shian-Li_Tsang | 82217 | 63.3333 | 76 | 759 | 10.0132 |
Realmagic | Paul_W._Draper | 4377 | 1.4815 | 2 | 3 | 66.6667 |
TommyBoy | Assad_Kotaite | 44212 | 66.6667 | 34 | 15552 | 0.2186 |
Thivierr | Sally_Gifford | 105960 | 64.5570 | 51 | 22476 | 0.2269 |
Stilltim | David_P._Buckson | 568996 | 63.1148 | 77 | 21011 | 0.3665 |
TommyBoy | Joseph_M._Watt | 33986 | 61.7021 | 29 | 15552 | 0.1865 |
Badagnani | Matthias_Ziegler | 33357 | 61.3636 | 27 | 136593 | 0.0198 |
TommyBoy | Robert_E._Lavender | 16471 | 61.1111 | 22 | 15552 | 0.1415 |
DickClarkMises | Robert_Higgs | 139999 | 60.0000 | 99 | 9655 | 1.0254 |
Thivierr | Kim_Schraner | 282139 | 60.2041 | 59 | 22476 | 0.2625 |
DickClarkMises | Robert_P._Murphy | 374104 | 56.9231 | 111 | 9655 | 1.1497 |
Christine912 | Sandra_Hess | 39694 | 8.0292 | 11 | 22 | 50.0000 |
TommyBoy | Robert_Poydasheff | 36757 | 56.2500 | 36 | 15552 | 0.2315 |
TommyBoy | Tom_Colbert | 29282 | 56.0000 | 28 | 15552 | 0.1800 |
Xiathorn | Sheetal_Sheth | 1475 | 2.0270 | 3 | 6 | 50.0000 |
Lightning Striking a Viking! | Jessica_Cutler | 1488 | 1.0526 | 3 | 6 | 50.0000 |
Gbrumfiel | Russel_L._Honoré | 3150 | 0.8403 | 2 | 4 | 50.0000 |
Adar | Christopher_John_Boyce | 692 | 0.7407 | 1 | 2 | 50.0000 |
Massgiorgini | Mass_Giorgini | 1534 | 0.7353 | 1 | 2 | 50.0000 |
TommyBoy | Robert_E._Kramek | 43670 | 50.0000 | 31 | 15552 | 0.1993 |
TommyBoy | Joseph_Sinde_Warioba | 40451 | 49.2063 | 31 | 15552 | 0.1993 |
TommyBoy | Charles_Thone | 52003 | 48.1013 | 38 | 15552 | 0.2443 |
TommyBoy | Yvonne_Kauger | 13906 | 48.0000 | 24 | 15552 | 0.1543 |
Mckradio | Michael_C._Keith | 393 | 10.6383 | 5 | 14 | 35.7143 |
TommyBoy | Raymond_Ranjeva | 14022 | 44.7368 | 17 | 15552 | 0.1093 |
Killerdark | Andrew_Kahr | 49988 | 36.9048 | 31 | 446 | 6.9507 |
TommyBoy | Joseph_P._Teasdale | 20888 | 43.1373 | 22 | 15552 | 0.1415 |
TommyBoy | James_S._Gracey | 22007 | 40.0000 | 22 | 15552 | 0.1415 |
Kaiobrien | Beat_Richner | 24183 | 23.9437 | 17 | 105 | 16.1905 |
Darius Dhlomo | Mohamed_Allalou | 6531 | 40.0000 | 14 | 162679 | 0.0086 |
Throwingbolts | Randy_Torres | 1273 | 7.7922 | 6 | 19 | 31.5789 |
TommyBoy | Thomas_P._Salmon | 25726 | 39.1304 | 27 | 15552 | 0.1736 |
TommyBoy | George_Nigh | 89990 | 38.6667 | 58 | 15552 | 0.3729 |
Gziegler | Catherine_Barclay | 335 | 5.5556 | 1 | 3 | 33.3333 |
ReidarM | Dominik_Burkhalter | 11683 | 36.8421 | 7 | 375 | 1.8667 |
Douglasshearer | Guthrie_Govan | 10579 | 0.9709 | 3 | 8 | 37.5000 |
DbA | Timothy_Crouse | 1672 | 5.0000 | 4 | 12 | 33.3333 |
Lumos3 | Stuart_Prebble | 10665 | 37.5000 | 9 | 21769 | 0.0413 |
TommyBoy | David_Hall_(Oklahoma_governor) | 145824 | 36.1963 | 59 | 15552 | 0.3794 |
Dananderson | Susan_Golding | 100963 | 35.3535 | 35 | 2872 | 1.2187 |
Phase1 | Bertil_Wedin | 64312 | 34.8837 | 15 | 975 | 1.5385 |
Gabe boldt | Wolfgang_Becker | 824 | 2.7778 | 1 | 3 | 33.3333 |
Lainay | Koharu_Kusumi | 161 | 0.3215 | 1 | 3 | 33.3333 |
Oddtoddnm | Howard_Morgan | 14819 | 31.2500 | 10 | 476 | 2.1008 |
TommyBoy | Steven_W._Taylor | 59973 | 32.2581 | 40 | 15552 | 0.2572 |
TommyBoy | Robert_Harlan_Henry | 19485 | 31.2500 | 20 | 15552 | 0.1286 |
Jliberty | Jesse_Liberty | 79237 | 23.8462 | 31 | 415 | 7.4699 |
Jacrosse | Eric_Garris | 64853 | 28.9157 | 24 | 1223 | 1.9624 |
YUL89YYZ | Arthur_Mauro | 8579 | 30.7692 | 8 | 83262 | 0.0096 |
TommyBoy | Pieter_Kooijmans | 34059 | 30.5263 | 29 | 15552 | 0.1865 |
Michiko | Michael_Marsh_(journalist) | 14841 | 21.6216 | 8 | 91 | 8.7912 |
Fys | Tag_Taylor | 12149 | 29.4118 | 5 | 14706 | 0.0340 |
Snrub | Cathy_Wilcox | 1152 | 4.3478 | 1 | 4 | 25.0000 |
Craig Currier | Robert_Picardo | 1845 | 0.4651 | 2 | 7 | 28.5714 |
Kegill | Britain_J._Williams | 30275 | 23.9130 | 11 | 229 | 4.8035 |
Jess Cully | Kathy_Leander | 27138 | 28.2609 | 13 | 4986 | 0.2607 |
Phase1 | Thomas_Thurman | 20641 | 27.2727 | 12 | 975 | 1.2308 |
Jdcooper | William_Harjo_LoneFight | 19055 | 28.2609 | 13 | 9998 | 0.1300 |
Julianortega | Reika_Hashimoto | 94748 | 25.9259 | 35 | 1473 | 2.3761 |
Gidonb | Christoph_Meili | 21434 | 27.6190 | 29 | 27653 | 0.1049 |
DantheCowMan | Micah_Ortega | 15136 | 27.2727 | 12 | 11710 | 0.1025 |
Loled | Jacques_Cheminade | 775 | 1.8519 | 1 | 4 | 25.0000 |
Jdcooper | Edward_Lone_Fight | 3639 | 26.6667 | 4 | 9998 | 0.0400 |
Rangerdude | Cragg_Hines | 3724 | 26.3158 | 5 | 3171 | 0.1577 |
WillemJoker | Clive_Merrison | 1285 | 1.2346 | 1 | 4 | 25.0000 |
CrevanReaver | Carol_Lin | 784 | 0.5236 | 1 | 4 | 25.0000 |
Riсky Martin | Pete_Rose,_Jr. | 705 | 0.5051 | 1 | 4 | 25.0000 |
TommyBoy | Walter_Dale_Miller | 19673 | 25.0000 | 19 | 15552 | 0.1222 |
Ted Wilkes | Leonie_Frieda | 15114 | 25.0000 | 7 | 18934 | 0.0370 |
TommyBoy | Ed_Schafer | 135806 | 24.5370 | 53 | 15552 | 0.3408 |
Futuretvman | Irene_Ng | 1921 | 4.8544 | 5 | 25 | 20.0000 |
Bronks | Lisbet_Palme | 10127 | 23.6364 | 13 | 9170 | 0.1418 |
Jokestress | Chris_Brand | 15621 | 22.9630 | 31 | 28533 | 0.1086 |
TommyBoy | William_Scranton | 356381 | 22.7723 | 46 | 15552 | 0.2958 |
percent_of_user_edits can be misleading, because I've seen a couple of times that an SPA creates a bio and then adds that name to all kinds of lists and other articles. They might have created the autobio with 2-3 edits, and then spent their next 10 edits spreading the wikilink. So it might be better to measure something like percentage of contribution (that is, all text a single editor contributed) to a single article. The typical COI-SPA would add 1,000 or more bytes to a single article, and then 30-50 bytes to half a dozen lists, so in that case it would read "75% of user Doe's contributions went to article John Doe". And again, I don't know if that can be programmed, either.
user table. I never did find a good algorithm to identify inactive users. If I were to write it again today, I'd add WikiDashboard graphs [Now implemented] and provide a checkbox to move entries to an active section. — Dispenser 15:19, 11 January 2011 (UTC)
Just a quick comment, I would think a Database report would be ideal for something like this. – MuZemike 01:26, 12 January 2011 (UTC)
I would like a bot that goes through articles relating to buses and transport and checks everything. RcsprinterBot 12:01, 16 January 2011 (UTC)
I need a bot that can delete all of its inclusions and delete this template once and for all! Please help? -- Hinata talk 21:03, 15 January 2011 (UTC)
I don't think that this has been suggested before. Jarry1250 has a tool on Toolserver that checks for image existence, so why not have a bot that automatically deletes the {{ req-photo}} template on article talk pages (if there isn't a parameter for a specific type of image) if an image already exists in the article? It would definitely clear up the backlog. Logan Talk Contributions 04:11, 14 January 2011 (UTC)
1_E+sq._km_m² is on most wanted pages as frequently linked. Probably the result of inappropriate use of {{ Infobox Indian jurisdiction}} and similar templates. In the case of {{ Infobox Indian jurisdiction}}, the following change needs to be made:
As per this diff. I don't know if the problem affects other templates. Taemyr ( talk) 11:50, 4 January 2011 (UTC)
{{#if:{{{area_magnitude|}}} would do the job well enough? Rich Farmbrough, 02:08, 12 January 2011 (UTC).
Since WildBot hasn't made an edit since September 21, 2010 and no longer updates its talk page templates for disambiguation and section links, I would like to see another bot go around and delete the following templates from the talk pages: {{ User:WildBot/m01}} and {{ User:WildBot/m04}}. Each of these templates has more than 5000 transclusions, and they are in many/most cases outdated. Xeworlebi ( talk) 18:45, 18 January 2011 (UTC)
wget http...wildbot.txt; python replace.py -file:wildbot.txt -regex "\{\{(User:WildBot/m01|User:WildBot/msg)\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?" ""
(untested) added to cron. — Dispenser 21:04, 18 January 2011 (UTC)
What exactly is to be done? Remove the tag from the daily generated list or from all pages? I would prefer the first option because some people may still be using the tag to identify and fix things. Btw, I am running on the list every 3-4 days. I can do it every day if you think it is necessary. -- Magioladitis ( talk) 01:28, 19 January 2011 (UTC)
Is there a bot, or would it be possible to create one, for On This Day and DYKs to be updated automatically for Portal:United States (and potentially others)? Currently these 2 sections must be manually updated, but I would like to make these 2 sections of Portal:United States as user friendly and maintenance free as possible. Any ideas? -- Kumioko ( talk) 19:33, 20 January 2011 (UTC)
Hi. Is anyone interested in programming a bot to insert {{ commons}} or {{ commonscat}} templates in articles when needed, using interwiki links? An example: adding a gallery link from the Spanish article. The bot can use the "External links" section or add the template at the bottom of the article just before the categories. Is there any infobox adding links to Commons? We need to exclude those cases. Thanks. emijrp ( talk) 20:56, 18 January 2011 (UTC)
Is there a bot that can remove or comment out redlink files? Such as the one that was here? There is currently a database report with over 16,000 pages with this condition. I assume Pywikipediabot with the delinker.py script would be able to do it (similar to CommonsDelinker). Avicennasis @ 01:53, 20 Shevat 5771 / 25 January 2011 (UTC)
Science has recently released " The Science Hall of Fame", which ranks scientists in terms of impact (as measured through the number of times their names are present in books which can be found in Google Books). It seems to me that Wikipedia could greatly benefit from a comparison with this list. So what I have in mind is basically a bot that fetches the information found in this table, then checks Wikipedia and builds a report of the status of these articles. Something like
Name | Born | Died | milliDarwins | Name | Born | Died | Rating |
---|---|---|---|---|---|---|---|
Bertrand Russell | 1872 | 1970 | 1500 | Bertrand Russell | 1872 | 1970 | B |
Charles Darwin | 1809 | 1882 | 1000 | Charles Darwin | 1809 | 1882 | FA |
Albert Einstein | 1879 | 1955 | 878 | Albert Einstein | 1879 | 1956 | A |
Sir Fake Name | 1900 | 1950 | 50 | Sir Fake Name | — | — | NA |
When our data matches that of Science, use {{ yes|YYYY}}, otherwise {{ no|YYYY}}. I purposefully misreported Einstein's death year just to illustrate what I meant. The bot results would be uploaded and updated daily/weekly/whateverly at something like Wikipedia:Comparison of Wikipedia articles with Science "Science Hall of Fame". This would allow us to track quality of high-impact scientists and other science-related people, as well as find gaps in our coverage. Headbomb { talk / contribs / physics / books} 03:34, 20 January 2011 (UTC)
{{ no|—}}. This way some categorization work can happen. Headbomb { talk / contribs / physics / books} 23:08, 20 January 2011 (UTC)
About the issue of years of birth/death, we could take the information from the {{ Persondata}} template, if nothing else. עוד מישהו Od Mishehu 08:13, 23 January 2011 (UTC)
Split the list in chunks of 500 and transclude them into a master list? Something like
{|class="wikitable sortable" |- {{/01}} |- {{/02}} |- {{/03}} |- {{/04}} |- {{/05}} |- {{/06}} |- ... |- |}
Headbomb { talk / contribs / physics / books} 21:45, 24 January 2011 (UTC)
As stated here old files might have to be scanned for file description pages without any license template. I assume this would be feasible since such files are not in the Category:Wikipedia files by copyright status category tree. -- Leyo 17:34, 12 January 2011 (UTC)
cough -- Chris 09:15, 17 January 2011 (UTC)
Requesting/suggesting a bot be made that checks the number of times an article has had an IP address or new user reverted, and if it is a large number, making a list of the article and the number of reverts for someone to look over. Also, if any new editor or IP address is blocked due to vandalism, can a bot check to see if any of their contributions haven't been reverted yet? Add that to a list for people to check up on. When I revert someone, I always click to see their contributions and see what other articles they have vandalized as well. Not everyone does that though. If it's too much of a load to scan through all articles at times, perhaps a tool approved users can run at Wikipedia:Requests for page protection. See long-term problems that keep emerging, instead of just the most recent events. Dream Focus 05:34, 27 January 2011 (UTC)
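The per-article tallying could be sketched like this, assuming the bot already has (article, edit summary) pairs from recent changes; the revert-summary regex is a guess at common conventions, not an official list:

```python
import re
from collections import Counter

# Count likely reverts per article from (article, edit summary) pairs.
# Edit summaries like "Reverted ...", "rv ...", "rvv", "Undid revision ..." count.
REVERT = re.compile(r"\b(revert(ed)?|rv[vt]?|undid)\b", re.I)

def revert_counts(changes):
    """Tally edits whose summaries look like reverts, keyed by article."""
    counts = Counter()
    for article, summary in changes:
        if REVERT.search(summary):
            counts[article] += 1
    return counts
```

The report would then list the highest-count articles for a human (or WP:RFPP) to review.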
Not sure if the new article alert bot thing would work in this format—it has useful XfD monitoring things that this WikiProject can check on but that doesn't need to be in this bulletin template. Would it be possible to have a bot auto-update this template, or would customizing the AAlertBot output work? / ƒETCH COMMS / 15:54, 27 January 2011 (UTC)
Hi, I am a member of the typo team, but due to school and prior commitments, I do not have much time to spend on WP. I would like to request a bot to assist me with spelling and grammar corrections. Paperfork ♠ 13:22, 22 January 2011 (UTC)
Not a good task for a bot. Logan Talk Contributions 09:21, 28 January 2011 (UTC)
It would be nice if there was a bot to automatically add project tags to the talk pages of appropriate articles. Is there such a bot already in existence? WikiManOne ( talk) 22:34, 26 January 2011 (UTC)
Deferred Logan Talk Contributions 09:15, 28 January 2011 (UTC)
The webmasters at http://www.parliament.uk appear to have decided against permanent URLs, breaking a widely used link to http://www.parliament.uk/commons/lib/research/briefings/snpc-04731.pdf
The document is now at http://www.parliament.uk/documents/commons/lib/research/briefings/snpc-04731.pdf
I can't recall where the tool is to count such links, but I know that I have added dozens of them.
Would any bot owner be kind enough to update the link? -- BrownHairedGirl (talk) • ( contribs) 14:40, 5 February 2011 (UTC)
Since User:WildBot has not been working for a while and User:Josh Parris is no longer active, can another bot be used to clean up the tags that have been left on the talk pages? – Allen4 names 03:16, 28 January 2011 (UTC)
/* SLOW OK */ instead of /* SLOW_OK */, and the query was killed during high replication. I have taken the liberty of updating WildBot's pages and template to reflect the inactive status. — Dispenser 22:02, 31 January 2011 (UTC)
Page seems broken to me today. -- Magioladitis ( talk) 12:58, 6 February 2011 (UTC)
Would it be possible for a bot to replace all instances of {{ oscoor}} with {{ gbmappingsmall}}? Mjroots ( talk) 22:14, 30 January 2011 (UTC)
After several years of using the "unranked taxon" parameters in the {{ taxobox}}, someone's pointed out that a few parameters are inconsistent with the rest of the taxobox. I've spent the last few hours resolving this issue, and I've got a bit left to go before I'm done, but there's one gargantuan mountain standing in my way-- about 26K articles that all (thankfully) share exactly the same problem.
All the articles appearing in the automatically updated Category:Taxoboxes employing both unranked_familia and superfamilia need to have the text unranked_familia replaced with unranked_superfamilia. The text appears only once on each of these pages, so a search-and-replace with no second pass should suffice. There are currently 25,892 pages catalogued under this category. Once this category is emptied out, the task should be terminated.
I appreciate any help you can offer! Thanks! Bob the WikipediaN ( talk • contribs) 23:56, 30 January 2011 (UTC)
python replace.py -cat:Taxoboxes_employing_both_unranked_familia_and_superfamilia "unranked_familia" "unranked_superfamilia"? Smallman12q ( talk) 00:36, 31 January 2011 (UTC)
unranked_familia with unranked_superfamilia in all cases in that category? Plastikspork ―Œ(talk) 01:01, 31 January 2011 (UTC)
|unranked_familia_authority= would need to be changed to |unranked_superfamilia_authority=. Bob the WikipediaN ( talk • contribs) 06:38, 31 January 2011 (UTC)
super in one or two places -- a change that will have zero effect on the appearance/functionality of the taxoboxes, but once completed, will allow for the final revision that needs to be made to {{ taxobox/core}} in order to normalize the functionality of unranked taxa. Bob the WikipediaN ( talk • contribs) 00:13, 1 February 2011 (UTC)
What about creating another bot that automatically archives URLs at WebCite and adds the archive links to articles? I am aware of User:WebCiteBOT, but I consider WP:LINKROT to be a major threat to Wikipedia that calls for a response; thus I hereby request an efficiently working bot for that purpose. Regards. Toshio Yamaguchi ( talk) 16:59, 1 February 2011 (UTC)
This category has an enormous backlog, and I think the clearing of the backlog could benefit from bot assistance. Would it be possible to code a bot to output possible coordinates (found by a bot search on google maps) into a list, and then have people go through the list to check these and manually add them to articles? The bot might also consider what the province is based on categories to do an even smarter search. (This might have to be worked out on a country-by-country basis.) Calliopejen1 ( talk) 18:42, 2 February 2011 (UTC)
Delta, you don't need an API key for this function. You can search their location database using this url: http://maps.google.com/maps/api/geocode/json?address=LOCATION&sensor=false. I have code written for this function that you can see here. Its pretty crappy looking, but it works. I'm going to try and contact google and see if we can get an exception to their TOS. Tim 1357 talk 21:27, 2 February 2011 (UTC)
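Building on Tim's URL, a sketch of the request construction and the JSON field extraction (the response shape follows Google's documented geocoding format; no network call is made here):

```python
import json
from urllib.parse import urlencode

def geocode_url(address):
    """Build the keyless geocoding request URL Tim describes above."""
    return "http://maps.google.com/maps/api/geocode/json?" + urlencode(
        {"address": address, "sensor": "false"})

def extract_latlng(response_json):
    """Pull (lat, lng) out of a geocoding JSON response, or None on failure."""
    data = json.loads(response_json)
    if data.get("status") != "OK":
        return None
    loc = data["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]
```

The bot would fetch geocode_url(address) and, for "OK" responses, feed the coordinates into the article; failures stay in the backlog category.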
I am requesting that a bot be created which clears the completed requests at Wikipedia:Requested_articles. For example, User:Jimbo Wales requests that the article Wikipedia be made, because it doesn't exist yet. Then I create that article, but do not delete the request at the project page. What I am requesting is a bot which automatically deletes the request for Wikipedia at that project page. I do not know whether or not a bot that does this already exists, but it could sure be helpful. Unfortunately, this could possibly also eliminate unspecific requests, such as one that already has a Wikipedia page but where the requester is referring to something else. -- 114.250.35.13 ( talk) 08:13, 3 February 2011 (UTC)
:Strongly oppose -- humans are much more effective at creating stubs, and bots have failed numerous times in creating stubs within the WP:TOL WikiProject. If a branch of science with unique names for everything has issues, imagine the issues that would arise in other fields. Also, there is a very nice-sized article at Wikipedia and has been for a long time. Bob the WikipediaN ( talk • contribs) 12:58, 3 February 2011 (UTC)
Request a bot which can detect when an article page is a dab page but the talk page associated with it is a redirect, and then fix it by replacing the talkpage redirect with the {{ WikiProject Disambiguation}} template. If it could also tag the talk pages of all dab pages which are missing the template as well that would be even better :) Thanks, DuncanHill ( talk) 15:17, 3 February 2011 (UTC)
(edit conflict) The SQL, so no one actually has to write it, is:
SELECT CONCAT("* [[Talk:", talk.page_title, "]]")
FROM page
JOIN categorylinks ON cl_from=page.page_id
JOIN page AS talk ON talk.page_title=page.page_title AND talk.page_namespace=1
WHERE page.page_namespace=0
AND page.page_is_redirect=0
/* in categories */
AND cl_to IN ("All_disambiguation_pages", "All_set_index_articles")
AND talk.page_is_redirect=1;
4607 rows in set (9.34 sec)
if you're wondering. If you want redirects with tagged talk pages you can use catscan. —
Dispenser 18:47, 3 February 2011 (UTC)
Could someone please train a bot to make a one-time sweep for WP:WPVG? Basically, we are hoping that a bot will check all of the articles tagged with Template:WikiProject Video games, make note of all local images that are image-linked on those articles, and then tag the talk pages of those files with Template:WikiProject Video games. The template can automatically tell when it is on a file, and will place that image in a file category. Editors have done this manually for a while, but there are still a lot more to tag. This will help us maintain our images. We may want the bot to run again at some point, but I don't think we need one constantly sweeping. Thanks! ▫ JohnnyMrNinja 22:06, 3 February 2011 (UTC)
Category:Articles needing coordinates includes articles that need to have the relevant coordinates (latitude;longitude) added. Lewis Ainsworth House is listed in a subcategory of Category:Articles needing coordinates. In addition, Lewis Ainsworth House uses a template where an address location is listed. In particular, its {{ Infobox nrhp}} lists "location = 414 E. Chapman Ave<br>[[Orange, California]]". Now, if you enter 414 E. Chapman Ave, Orange, California into gpsvisualizer.com, you get 33.787698,-117.849617. So here's the bot idea: have the bot search out all articles listed in Category:All articles needing coordinates that also use {{ Infobox nrhp}} AND have the parameter "location =" filled in. The bot should then get the location info, pass the street address through a geocoding service such as gpsvisualizer.com to find its latitude and longitude, and add that latitude and longitude to the article's {{ Infobox nrhp}}. Then remove the article from Category:Articles needing coordinates. There might be other templates that use address locations and also are missing their geo coordinates. -- Uzma Gamal ( talk) 14:09, 4 February 2011 (UTC)
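A sketch of the address-extraction step this request needs, assuming the |location= value looks like the Lewis Ainsworth House example above (real template parsing has far more corner cases than this regex handles; the function name is mine):

```python
import re

def extract_nrhp_address(wikitext):
    """Pull the |location= value out of an {{Infobox nrhp}} transclusion
    and flatten it to a plain mailing address suitable for a geocoder."""
    m = re.search(r"\|\s*location\s*=\s*(.*)", wikitext)
    if not m:
        return None
    value = m.group(1).strip()
    value = re.sub(r"<br\s*/?>", ", ", value)                         # line breaks -> commas
    value = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", value)   # unpipe/unlink wiki links
    return value
```

The resulting plain string can then be handed to whatever geocoding service the bot operator settles on.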
The external link Brazilian Tourism Portal, present in about 700 articles, is broken. Even the "good" link [6] does not seem to be useful for the articles (no useful info). Could anyone create a bot to solve this problem? I think the best thing to do is just remove all the links. Caiaffa ( talk) 14:21, 4 February 2011 (UTC)
Has anyone developed an interwiki coordinate bot that can check other wikis and, like the interwiki bots, compare them and add or remove the coordinate template? Also, I found a [http://py-googlemaps.sourceforge.net/ library for Python] that can use Google Maps coordinates. Reza1615 ( talk) 14:56, 4 February 2011 (UTC)
Hey All, I was wondering if someone could do an assessment job on all the articles in Category:Unassessed Albemarle County articles via a bot. They would just need to match the assessments of the already existing templates. For example, if WP:FOO rates an article C-Class with Low importance, WP:ALVA (the WP link for the project connected to this category) would be the same. Could someone do this? - Neutralhomer • Talk • 02:36, 7 February 2011 (UTC) • Go Steelers!
{{ cite doi}} templates should have the doi as an argument. Some of these templates were malformed, so they need to be cleaned up.
The full list is
These templates would need to be moved from {{ cite doi/doi:foobar}} to {{ cite doi/foobar}}. Then, the articles linking to {{ cite doi/doi:foobar}} should have their {{ cite doi}} template updated from {{ cite doi|doi:foobar}} to {{ cite doi|foobar}}. When that's done, the {{ cite doi/doi:foobar}} subpage should be tagged with {{ db-g6}} as uncontroversial maintenance, since it would be an unlikely and unused redirect. Headbomb { talk / contribs / physics / books} 04:38, 9 February 2011 (UTC)
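The mechanical part of this request boils down to two string rewrites; a hedged sketch (function names are mine — the actual bot would additionally perform the page moves, link updates, and {{db-g6}} tagging through the API):

```python
import re

def subpage_move_target(title):
    """Map a malformed subpage title to its cleaned-up name, e.g.
    'Template:Cite doi/doi:10.1000/foo' -> 'Template:Cite doi/10.1000/foo'."""
    return re.sub(r"^(Template:Cite doi/)doi:", r"\1", title)

def fix_cite_doi_calls(wikitext):
    """Rewrite {{cite doi|doi:foobar}} transclusions to {{cite doi|foobar}}."""
    return re.sub(r"(\{\{\s*[Cc]ite doi\s*\|\s*)doi:", r"\1", wikitext)
```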
hello,
I need a bot which archives my user talk page each month (and don't ask me why each month). Thank you.-- ♫Greatorangepumpkin♫ T 15:20, 10 February 2011 (UTC)
Would a bot kindly update Portal:Tropical cyclones/Active tropical cyclones? -- Perseus 8235 17:45, 10 February 2011 (UTC)
I'd like for someone to go through Category:Cite doi templates / Category:Cite hdl templates / Category:Cite jstor templates / Category:Cite pmc templates / Category:Cite pmid templates and build tables akin to:

Template:Cite doi/... | author(s) | date | |title= | |url= | |format= | |journal= | |publisher= | |volume= | |issue= | |pages= | |bibcode= | |oclc= | |doi= | |isbn= | |issn= | |pmc= | |pmid= | |id=
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
10.1001/archgenpsychiatry.2007.2 | |last1=Fombonne |first1=E. | 2007 | Thimerosal Disappears but Autism Remains | — | — | Archives of General Psychiatry | — | 65 | 1 | 15 | — | — | 10.1001/archgenpsychiatry.2007.2 | — | — | — | 18180423 | —
10.1001/archgenpsychiatry.2009.30 | |last1=King |first1=B. |last2=Hollander |first2=E. |last3=Sikich |first3=L. |last4=Mccracken |first4=J. |last5=Scahill |first5=L. |last6=Bregman |first6=J. |last7=Donnelly |first7=C. |last8=Anagnostou |first8=E. |last9=Dukes |first9=K. | 2009 | Lack of efficacy of citalopram in children with autism spectrum disorders and high levels of repetitive behavior: citalopram ineffective in children with autism | — | — | Archives of General Psychiatry | — | 66 | 6 | 583–590 | — | — | 10.1001/archgenpsychiatry.2009.30 | — | — | — | 19487623 | —
10.5367/000000000101293149 | |last1=Dumont |first1=R. |last2=Vernier |first2=P. | 2000 | Domestication of yams (Dioscorea cayenensis-rotundata) within the Bariba ethnic group in Benin | — | — | Outlook on Agriculture | — | 29 | — | 137 | — | — | 10.5367/000000000101293149 | — | — | — | — | —

Template:Cite doi/... | author(s) | date | |chapter= | |chapterurl= | editor(s) | |title= | |url= | |format= | |publisher= | |series= | |volume= | |issue= | |pages= | |bibcode= | |oclc= | |doi= | |isbn= | |issn= | |pmc= | |pmid= | |id=
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
10.1002/0471238961.0315131619030818.a01.pub2 | |last=Schrobilgen |first=G. J. |last2=Moran |first2=M. D. | 2003 | Noble-Gas Compounds | — | — | Kirk-Othmer Encyclopedia of Chemical Technology | — | — | John Wiley & Sons | — | — | — | — | — | — | 10.1002/0471238961.0315131619030818.a01.pub2 | — | — | — | — | —

Etc...
These tables could be hosted somewhere at WP:JOURNALS and maybe in parallel on the toolserver? These tables would immensely help with template cleanup/completion. Headbomb { talk / contribs / physics / books} 22:46, 10 February 2011 (UTC)
Several templates created by Citation bot have had their structure rot over time. Some of it was due to sloppy bot edits, some to human mistakes, and some to sloppy human editing. There's generally a dislike for bot edits which do not create actual changes in appearance, but these are all hosted in the template space, with very few people watching them (aka, this would really not annoy a lot of people, and the people likely to be watching these templates would also be the ones to appreciate their tidying up). A "good" template should be formatted in this manner (aka in this order):
{{cite journal
|last= |first= |authorlink=                      (Remove empty parameters; if no author-related parameter remains, keep |last1= and |first1=)
|last1= |first1= |author1link=
|last2= |first2= |author2link= ...
|author= |authorlink=                            (Remove empty parameters; if no author-related parameter remains, add |last1= and |first1=)
|author1= |author1link=
|author2= |author2link= ...
|date= |year= |month= ...                        (Remove empty parameters; if no date-related parameter remains, add |year=)
|title=                                          (Remove empty parameters; |title= should always be present)
|language= |transtitle= |url= |format= ...
|journal= |series= |volume= |issue= |pages= ...  (Remove |series= if empty; the others should always be present)
|publisher= |location= ...                       (Remove empty parameters; also remove |location= if |publisher= is empty)
|arxiv= |bibcode= |doi= |isbn= |issn= |jfm= |jstor= |lccn= |mr= |oclc= |ol= |osti= |pmc= |pmc-embargo-date= |pmid= |rfc= |ssrn= |zbl= |id= ...
                                                 (Remove |asin=, |doi_brokendate=, |isbn=, |issn=, |lccn=, |jfm=, |oclc=, |ol=, |rfc=, |pmc-embargo-date=, and |id= if empty;
                                                  add |arxiv=, |bibcode=, |doi=, |jstor=, |mr=, |osti=, |pmc=, |pmid=, |doi_brokendate=, |ssrn=, and |zbl= if missing)
|accessdate= |archiveurl= |archivedate= |laysource= |laysummary= |laydate= ...  (Remove empty parameters, except |accessdate= if |url= is present)
|quote= |ref= |separator= |postscript= ...       (Remove empty parameters)
<!--UNUSED DATA-->                               (All other parameters should be moved here)
}}
This would have two great benefits. If the templates are formatted consistently and legibly, newcomers would be much less intimidated by the structure of these templates, and both regulars and newcomers will benefit from the improved readability. Compare [7] and [8] for instance. Plus, with the parameter pruning, you can immediately see what is missing, and what should be added, instead of being misled into finding a journal's publisher or wondering what a "separator" is or what the "series" refers to. This would in turn facilitate and encourage good citation completion/maintenance by humans. A daily run wouldn't be needed for this, but a one-time run over all citations combined with monthly runs would be incredibly helpful here. Once we do {{ cite journal}}, we could move on to the other templates ({{ citation}}, {{ cite book}}, etc...). I've notified Smith609 (who runs Citation bot) for feedback here. Headbomb { talk / contribs / physics / books} 23:42, 10 February 2011 (UTC)
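The reordering step could be sketched like this, assuming the template has already been parsed into name/value pairs (the canonical list below is truncated to a handful of parameters for brevity; this is not Citation bot's actual code):

```python
# Abbreviated canonical parameter order; the full list would follow the
# ordering laid out in the post above.
CANONICAL_ORDER = ["last1", "first1", "last2", "first2", "author", "date",
                   "year", "title", "url", "journal", "volume", "issue",
                   "pages", "doi", "pmid", "isbn", "accessdate"]

def reorder_params(params):
    """Sort a {name: value} dict of template parameters into canonical
    order; unrecognized names sink to the end unchanged."""
    rank = {name: i for i, name in enumerate(CANONICAL_ORDER)}
    return sorted(params.items(),
                  key=lambda kv: rank.get(kv[0], len(CANONICAL_ORDER)))
```

Serializing the sorted pairs back into |name=value lines, plus the empty-parameter pruning rules, would complete the formatter.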
If the requested move discussion at Talk:New York City Subway succeeds, there will need to be a couple of hundred of page moves and several thousands of find/replace runs for "New York City Subway". The find/replace runs can be done automatically. If the bot ignores image/interwiki links (but not wikilinks), there shouldn't be any false positives. It's too much for assisted AWB to do, so can someone with a bot take on this task, assuming the discussion results in "move"? — Train2104 ( talk • contribs • count) 16:41, 12 February 2011 (UTC)
Regarding Wikipedia:Requests for feedback/navigation, which is transcluded on WP:FEED,
Could someone possibly make a bot which automatically adds links each month, as I did manually here?
If you need more info, give me a shout. Cheers! Chzz ► 14:10, 13 February 2011 (UTC)
How hard would it be for a bot to tag all pages listed at Wikipedia talk:Requests for comment/Jagged 85/Cleanup1 etc. with a template? — Ruud 15:53, 13 February 2011 (UTC)
{{ Jagged 85 cleanup|subpage=Cleanup1a}}, where subpage is the cleanup list the article appears on? — Ruud 18:25, 13 February 2011 (UTC)

Hi all,
I have the problem described in this post, but am not clear on how to apply a solution: http://en.wikipedia.org/wiki/Wikipedia:Bots/Requests_for_approval/Erik9bot_5
Like the Irish Times mentioned in that post, my news website has changed its URL. How do I update the 2,000+ links to my site on Wikipedia?
Apparently this fix was created by someone who has been banned so I can't ask him to explain. "This account is a sock puppet of John254 and has been blocked indefinitely."
Thanks much Michelle — Preceding unsigned comment added by Mnicolosi ( talk • contribs) 22:59, 12 February 2011 (UTC)
Thanks for the help with the signature. Sorry, not quite used to this system. All urls in wikipedia that are seattlepi.nwsource.com need to be updated to seattlepi.com. Details on the background behind our url change are here if you're interested: http://en.wikipedia.org/wiki/Seattle_Post-Intelligencer
Thanks much,
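The text rewrite itself is a one-liner; in practice this kind of job is usually handed to a framework tool such as pywikibot's replace.py, but as a hedged sketch of the substitution (host names from the request above, function name mine):

```python
import re

OLD_HOST = "seattlepi.nwsource.com"
NEW_HOST = "seattlepi.com"

def migrate_links(wikitext):
    """Rewrite the old host to the new one, preserving scheme and path."""
    return re.sub(r"(https?://)(?:www\.)?" + re.escape(OLD_HOST),
                  r"\g<1>" + NEW_HOST, wikitext)
```

The bot run would apply this to every page found via an external-link search for the old host.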
Can you assist WP:PINOY in changing www.t-macs.com/kiso/local/ (which is unofficial and a dead link) to http://www.census.gov.ph/census2000/index.html which is live and is the official 2000 Philippine Census by the National Statistics Office? Thanks.
See background here:
The help desk referred me to you for assistance.-- Lenticel ( talk) 06:36, 14 February 2011 (UTC)
At User talk:Citation bot#Withdrawn_papers it is noted that on occasion cited papers are updated or withdrawn. A maintenance bot or other tool could follow cited PubMed, doi, or other database identifiers to check for such changes, then (where appropriate) tag the citation for human attention, possibly amending the citation in the process (e.g. changing |title=Dewey Wins! to |title=Withdrawn: Dewey Wins!, or whatever the database indicates). Martin advises this is beyond Citation bot's scope, so it would need to be a different tool. Given that {{ cite doi}} and its ilk bury information in subpages where it is rarely seen, these should get priority. LeadSongDog come howl! 16:24, 11 February 2011 (UTC)
LeadSongDog come howl! 16:50, 11 February 2011 (UTC)
|status=withdrawn or |status=superseded in the {{ cite xxx}}/{{ citation}} templates. Headbomb { talk / contribs / physics / books} 04:45, 14 February 2011 (UTC)
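A sketch of how such a tool might check one PubMed identifier, using NCBI's E-utilities esummary endpoint (the endpoint is real; the exact JSON field carrying publication types is an assumption here and should be verified against a live record before any bot run):

```python
import urllib.parse

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"

def esummary_url(pmid):
    """URL for an NCBI E-utilities summary of one PubMed record (JSON)."""
    return EUTILS + "?" + urllib.parse.urlencode(
        {"db": "pubmed", "id": pmid, "retmode": "json"})

def looks_withdrawn(summary):
    """Heuristic check of one esummary record dict: PubMed flags
    retractions via its publication-type list (field name assumed)."""
    pubtypes = summary.get("pubtype", [])
    return any("Retract" in t for t in pubtypes)
```

Records that trip the heuristic would be queued for human review rather than edited automatically.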
I posted something similar a while ago, but I guess it was rather a daunting task, so I'm scaling it down a bit. Previously, I wanted a bot that creates a report of all problems with the books ({{ citation needed}} tags, {{ POV}} tags ...), but that doesn't look like it'll happen. So, I'm scaling the request down to only report what assessment class the articles of a book are.
Example: Book:Helium → its talk page report.
These reports would be done for all Category:Wikipedia books (community books), and updated daily. Of course, if someone feels like taking up the original request, that would also be peachy. WikiProject Wikipedia-Books has been notified of this request. Headbomb { talk / contribs / physics / books} 04:08, 11 February 2011 (UTC)
See User:NoomBot/BookTest for 3-4 examples of book reports. Going to set the bot to append a couple more reports to see if the formatting works for several of them. Also adding more 'problem' templates to detect. Noom talk contribs 18:35, 12 February 2011 (UTC)
Olaf Davis retired last October, but he left the code behind so that anyone who knows how to run Python Bots could replicate what User:Botlaf did. I've just manually gone through the nearly 800 articles that contained the word pubic and fixed 23 that were typos or vandalism, some of which had been up for months. But it is very time consuming to do this manually without Botlaf, and pubic is only one of many reports that Botlaf used to run every week. Please could someone, ideally a Python writer, take over the Botlaf code? It only needs to run weekly. Thanks Ϣere SpielChequers 13:04, 14 February 2011 (UTC)
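One way a replacement bot could cut down the manual review load is to flag only occurrences of "pubic" in contexts where "public" is expected; a rough sketch (the follower-word list is my guess at a useful heuristic, not Botlaf's actual logic):

```python
import re

# Words that routinely follow "public"; "pubic" before one of these is
# almost certainly a typo or vandalism.
SUSPECT_FOLLOWERS = {"school", "schools", "library", "domain", "transport",
                     "health", "relations", "holiday", "house", "service"}

def suspicious_pubic_hits(text):
    """Return snippets where 'pubic' precedes a word that normally
    follows 'public' -- candidates for human review."""
    hits = []
    for m in re.finditer(r"\bpubic\s+(\w+)", text, re.IGNORECASE):
        if m.group(1).lower() in SUSPECT_FOLLOWERS:
            hits.append(m.group(0))
    return hits
```

Legitimate anatomical uses ("pubic bone") pass through untouched, so the weekly report stays short.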
I've come to the realization that I can't maintain all my bots under my current workload, and as such, would like for some people to take over the task of WP:UAA helperbot. (BRFA: Wikipedia:Bots/Requests for approval/SoxBot 23) It is a clone of HBC's bot, so it shouldn't be too hard to bring up. ( X! · talk) · @841 · 19:10, 16 February 2011 (UTC)
Would it be possible to have a bot automatically add {{ TFA-editnotice}} into the editnotice of today's featured article? I'm not quite sure how easy it would be for a bot to figure out what tomorrow's TFA is, but I suspect somebody can do it, and the template just needs adding once with the appropriate date specified (the template does the rest). Rd232 talk 02:53, 14 February 2011 (UTC)
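Figuring out tomorrow's TFA blurb page is just date arithmetic, since the blurbs live at a predictable title; a small sketch (assuming the "Wikipedia:Today's featured article/Month D, YYYY" naming convention and the standard per-page editnotice location):

```python
import datetime

def tfa_page_title(date):
    """Title of the TFA blurb page for a given date."""
    return f"Wikipedia:Today's featured article/{date:%B} {date.day}, {date.year}"

def editnotice_title(article_title):
    """Where the per-page editnotice for an article lives."""
    return "Template:Editnotices/Page/" + article_title
```

The bot would fetch tomorrow's blurb page, extract the bolded article link, and place {{TFA-editnotice}} at the corresponding editnotice title.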
There are strong feelings that generally, links should only be in the "see also" section if they aren't in the body of the article. (no, it isn't a guideline, there is certainly a lot of wiggle room in it)
Anyhow, has anyone built a bot that identifies and/or removes items that are redundant? It could note them at the top of the talk page (like how disambiguations are/were marked), it could remove them, it could add a category indicating duplicate items, it could put a comment or template next to duplicates, or it could actually remove them.
This seems well-suited to a bot because it's nontrivial for a human to scan the article for the duplicate links. tedder ( talk) 21:47, 17 February 2011 (UTC)
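Detection could look something like this sketch: split the wikitext at the "See also" heading and compare link targets on either side (it naively treats everything after the heading as the section, and ignores redirects and templates, so a real bot would need more care):

```python
import re

def redundant_see_also(wikitext):
    """List 'See also' link targets that are already linked in the body
    above the section."""
    parts = re.split(r"==+\s*See also\s*==+", wikitext, maxsplit=1)
    if len(parts) < 2:
        return []
    body, see_also = parts
    link = re.compile(r"\[\[([^|\]#]+)")
    body_targets = {t.strip().lower() for t in link.findall(body)}
    return [t.strip() for t in link.findall(see_also)
            if t.strip().lower() in body_targets]
```

Because two links may point at the same page through a redirect, the output is better suited to talk-page notes or a tracking category than to automatic removal.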
I was wondering if I could enlist the help of a bot to do some stub sorting for me for the Washington WikiProject. Each county has its own geographic location stub type (e.g. {{ KingWA-geo-stub}} for Category:King County, Washington). Most articles use {{ Washington-geo-stub}}. What I would like to do is, for articles in Foo County, Washington that use {{ Washington-geo-stub}}, change the stub type to {{ FooWA-geo-stub}}. Any articles that are not located in a county category, or are in multiple county categories, should be left with the base template. There are currently 400 transclusions of the base template, and this should be a fairly simple and uncontroversial edit. -- AdmrBoltz 05:16, 18 February 2011 (UTC)
I am not sure if a page like this already exists, but I am guessing people here would be aware of it. I was thinking it would be pretty useful if, every week or month, a list of the top 500 or 1000 most viewed/accessed pages that are neither FA/FL/GA nor A/B-class in at least one project were generated. This way, those interested in directing attention to the most visited pages that are not yet in decent shape could do so. Only C-class, Start, Stub, list, and unassessed pages that are not rated B-class or above in any project should be listed. Alternatively, a top 500 for each of the bottom classes would be good also. Nergaal ( talk) 07:25, 20 February 2011 (UTC)
The Manual of Style reads:
Moreover, dates should not have "th" on them. Am I OK to think that AWB can be set to remove superscript from ordinals? -- Magioladitis ( talk) 20:04, 16 February 2011 (UTC)
Wikipedia:Bots/Requests for approval/Yobot 20. -- Magioladitis ( talk) 00:59, 21 February 2011 (UTC)
A discussion is going on about disabling the "reviews" parameter of {{ Infobox album}}, as consensus has it to move reviews to a separate section. There are hundreds of thousands of articles that need the infobox data moved to {{ album ratings}}, which could take years to do manually. Thus, it would be very much appreciated if a bot coder could take a look. Thanks, Adabow ( talk · contribs) 03:39, 21 February 2011 (UTC)
All instances of http://www.pmars.imsg.com/ need changing to https://pmars.marad.dot.gov/ . The url path after .gov/ will not need changing. There are about 200 of these and they're all dead links. Thanks. Brad ( talk) 13:58, 18 February 2011 (UTC)
Usually, when pages are moved, a redirect is left behind; this redirect clues in the interwiki bots that the page on the English Wikipedia still exists, it's just been moved to a new name. However, categories are renamed by creating new categories and deleting the old ones. Should an interwiki bot then start handling the category on another language's Wikipedia, it would decide that we no longer have such a category, and remove the English interwiki (incorrectly!) from the other languages - see here for an example. I think the best solution is to have an interwiki bot look at all new categories (defined as having been created since the beginning of the previous run), and handle their interwikis as the interwiki bots always do - and that would include updating the English name on all the other Wikipedias' interwiki lists. עוד מישהו Od Mishehu 12:06, 22 February 2011 (UTC)
Many articles on settlements in the US contain boilerplate text (presumably bot-generated a long time ago) along the lines of:
In several thousand of these articles, 'married couples' is in a piped link to marriage. This is clearly a worthless link - could a bot unlink these? To be precise, the articles to be covered are those in the intersection of (What links to: marriage) and Category:Populated places in the United States, and its subcategories). Colonies Chris ( talk) 16:48, 23 February 2011 (UTC)
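The unlinking itself is a simple substitution; a sketch for the exact piped form described above (case-insensitive on the target's first letter, as wiki links are; function name mine):

```python
import re

def unlink_married_couples(wikitext):
    """Replace the piped [[marriage|married couples]] link with plain text."""
    return re.sub(r"\[\[[Mm]arriage\|(married couples)\]\]", r"\1", wikitext)
```

Restricting the run to the category intersection described above keeps the edit count bounded and uncontroversial.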
Is there (or could somebody create) a bot that can go through the "Special:UnusedFiles" page and delete them all? Neocarleen ( talk) 05:46, 17 February 2011 (UTC)
In general, the idea that a bot would delete anything is simply nuts; it opens the door to vandals. In the case of images they could delete them by removing them from articles. Choyoołʼįįhí:Seb az86556 > haneʼ 01:11, 18 February 2011 (UTC)
Hey, can you make me a bot that cleans up spam please?
^^— Preceding unsigned comment added by 64.228.147.57 ( talk • contribs) 02:07, February 25, 2011
The National Census 2010 results [10] were published. We need infobox updates for population, population rank, and density for first- and second-level divisions. Bogomolov.PL ( talk) 23:22, 25 February 2011 (UTC)
As discussed in these threads, Mobius Bot ( talk · contribs) went berserk last year and its owner has disappeared. Can its functionality be replicated in a new bot, or incorporated into an existing bot? The source is at http://pastebin.com/i2ZYQBRD. Adrian J. Hunter( talk• contribs) 14:40, 24 February 2011 (UTC)
Per some discussion on the EN wiki mailing list about disability access, we have lots of images that need alt text so that blind people and anyone using a text reader can get an idea as to what our images are displaying. There was a suggestion on the mailing list from User:Martijn Hoekstra "Would an automated category Images without alt text be feasible?", alternatively I would have thought that a weekly regenerated list would do the job just as well. It would also make for a very good entry level task for newbies finding their feet. Can someone kind bot writer code it please? Ϣere SpielChequers 14:50, 27 February 2011 (UTC)
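A list generator would mostly need to spot image links lacking an |alt= parameter in page wikitext; a rough sketch (it assumes file links don't nest further bracketed links, which captions sometimes do, so it would miss some cases):

```python
import re

def files_missing_alt(wikitext):
    """Return File:/Image: links on a page that have no |alt= parameter."""
    missing = []
    for m in re.finditer(r"\[\[((?:File|Image):[^\]]*)\]\]", wikitext):
        link = m.group(1)
        if not re.search(r"\|\s*alt\s*=", link):
            missing.append(link.split("|")[0])
    return missing
```

Run over a dump, this would feed either the weekly list or a hidden tracking category, whichever approach gets consensus.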
For some reason, there are quite a number of people who create their own Wikipedia article. While this is not prohibited per se, it is still strongly discouraged. What most of these autobiographies have in common is an editor who created most (if not all) of the article's content but did not contribute anywhere else (I ran into a couple of those lately). So what I'm suggesting is a bot that scans Category:Living people for articles where more than, say, 90% of the content came from a single-purpose account, and flag them (maybe with {{ COI}}, or something else, or add them to a separate list similar to User:AlexNewArtBot/COISearchResult). -- bender235 ( talk) 18:34, 26 February 2011 (UTC)
Extended content:
SELECT editor_name,
article_title,
round(deltas),
( ( edits_to_article / ( numberofedits + 0.0 ) ) * 100 ) AS
percent_of_all_edits_to_article,
edits_to_article AS
user_edits_to_article,
all_edits AS
user_editcount,
( ( edits_to_article / ( all_edits + 0.0 ) ) * 100 ) AS
percent_of_user_edits
FROM (SELECT DISTINCT Concat(Concat(main.rev_page, '-'), main.rev_user) AS
distinctify,
main.rev_user_text AS
editor_name,
mainp.page_title AS
article_title,
(SELECT COUNT(*)
FROM revision AS bla
WHERE bla.rev_page = mainp.page_id) AS
numberofedits,
mainp.page_id AS
pageid,
(SELECT SUM(IF(Isnull(prev.rev_len), NOW.rev_len,
IF(NOW.rev_len > prev.rev_len,
NOW.rev_len - prev.rev_len + 0.0, (
-1.0 * (
prev.rev_len - NOW.rev_len ) )))) AS d
FROM revision AS NOW
LEFT JOIN revision AS prev
ON prev.rev_id = NOW.rev_parent_id
AND prev.rev_id!=0
WHERE NOW.rev_user = main.rev_user
AND main.rev_page = NOW.rev_page) AS
deltas,
(SELECT COUNT(*)
FROM revision AS s
WHERE s.rev_user = main.rev_user
AND s.rev_page = main.rev_page) AS
edits_to_article,
user_editcount AS
all_edits
FROM revision AS main
JOIN page AS mainp
ON mainp.page_id = main.rev_page
AND mainp.page_namespace = 0
JOIN categorylinks
ON cl_from = mainp.page_id
AND cl_to = 'Living_people'
JOIN user
ON main.rev_user = user_id
LEFT JOIN user_groups
ON main.rev_user = ug_user
AND ug_group IN ( 'sysop', 'bot' )
WHERE Isnull(ug_group)
LIMIT 5000) AS p
ORDER BY ( percent_of_all_edits_to_article + percent_of_user_edits ) DESC
LIMIT 100;
+------------------------------+---------------------------------+---------------+---------------------------------+-----------------------+----------------+-----------------------+ | editor_name | article_title | round(deltas) | percent_of_all_edits_to_article | user_edits_to_article | user_editcount | percent_of_user_edits | +------------------------------+---------------------------------+---------------+---------------------------------+-----------------------+----------------+-----------------------+ | Ruhe | Rushworth_Kidder | 5905 | 25.0000 | 3 | 3 | 100.0000 | | Lolamangha | Brittany_Tiplady | 2414 | 6.8182 | 3 | 3 | 100.0000 | | Evandrobaron | Paulo_Afonso_Evangelista_Vieira | 44 | 6.2500 | 1 | 1 | 100.0000 | | Jerryhansen | Peter_Hyman | 1886 | 5.8824 | 2 | 2 | 100.0000 | | Viktorbuehler | Rolf_Dobelli | 39560 | 33.3333 | 21 | 29 | 72.4138 | | Mrpuddles | Leslie_Cochran | 18117 | 5.0147 | 17 | 17 | 100.0000 | | Alon.rozen@gmail.com | Eric_Britton | 3492 | 4.8780 | 2 | 2 | 100.0000 | | Zagatt | Barbara_Sukowa | 3327 | 4.3956 | 4 | 4 | 100.0000 | | Mawjj | Carmen_Boullosa | 3793 | 2.6316 | 1 | 1 | 100.0000 | | Habibrahbar | Massy_Tadjedin | 152 | 2.6316 | 1 | 1 | 100.0000 | | Lem | Malcolm_Azania | 1841 | 1.1236 | 1 | 1 | 100.0000 | | Preludes | Alexander_Beyer | 3248 | 1.0000 | 1 | 1 | 100.0000 | | Pnut123 | Paul_Carr_(writer) | 1209 | 0.7576 | 1 | 1 | 100.0000 | | KidRose | Tim_White_(wrestling) | 604 | 0.7042 | 2 | 2 | 100.0000 | | Ajmaher | David_Parnas | 1297 | 0.5917 | 1 | 1 | 100.0000 | | Themadhatter | Tim_Sköld | 2523 | 0.5747 | 4 | 4 | 100.0000 | | Danistheman | Doug_Dohring | 35 | 0.5618 | 1 | 1 | 100.0000 | | Rogersdrums | Steve_Ferrone | 481 | 0.5348 | 1 | 1 | 100.0000 | | LiangY | Keiko_Agena | 516 | 0.3676 | 1 | 1 | 100.0000 | | Nignuk | Paul_Posluszny | 3506 | 0.2475 | 1 | 1 | 100.0000 | | Jennifer B | Ashley_Hartman | 85669 | 10.4651 | 27 | 34 | 79.4118 | | Zfeuer | Robert_Schwentke | 971 | 7.1429 | 3 | 4 | 75.0000 | | Aguecheek | 
Harald_Schmidt | 20679 | 3.7234 | 7 | 9 | 77.7778 | | Alanhtripp | Alan_Tripp | 2328 | 6.8966 | 2 | 3 | 66.6667 | | Lotzoflaughs | Shian-Li_Tsang | 82217 | 63.3333 | 76 | 759 | 10.0132 | | Realmagic | Paul_W._Draper | 4377 | 1.4815 | 2 | 3 | 66.6667 | | TommyBoy | Assad_Kotaite | 44212 | 66.6667 | 34 | 15552 | 0.2186 | | Thivierr | Sally_Gifford | 105960 | 64.5570 | 51 | 22476 | 0.2269 | | Stilltim | David_P._Buckson | 568996 | 63.1148 | 77 | 21011 | 0.3665 | | TommyBoy | Joseph_M._Watt | 33986 | 61.7021 | 29 | 15552 | 0.1865 | | Badagnani | Matthias_Ziegler | 33357 | 61.3636 | 27 | 136593 | 0.0198 | | TommyBoy | Robert_E._Lavender | 16471 | 61.1111 | 22 | 15552 | 0.1415 | | DickClarkMises | Robert_Higgs | 139999 | 60.0000 | 99 | 9655 | 1.0254 | | Thivierr | Kim_Schraner | 282139 | 60.2041 | 59 | 22476 | 0.2625 | | DickClarkMises | Robert_P._Murphy | 374104 | 56.9231 | 111 | 9655 | 1.1497 | | Christine912 | Sandra_Hess | 39694 | 8.0292 | 11 | 22 | 50.0000 | | TommyBoy | Robert_Poydasheff | 36757 | 56.2500 | 36 | 15552 | 0.2315 | | TommyBoy | Tom_Colbert | 29282 | 56.0000 | 28 | 15552 | 0.1800 | | Xiathorn | Sheetal_Sheth | 1475 | 2.0270 | 3 | 6 | 50.0000 | | Lightning Striking a Viking! 
| Jessica_Cutler | 1488 | 1.0526 | 3 | 6 | 50.0000 | | Gbrumfiel | Russel_L._Honoré | 3150 | 0.8403 | 2 | 4 | 50.0000 | | Adar | Christopher_John_Boyce | 692 | 0.7407 | 1 | 2 | 50.0000 | | Massgiorgini | Mass_Giorgini | 1534 | 0.7353 | 1 | 2 | 50.0000 | | TommyBoy | Robert_E._Kramek | 43670 | 50.0000 | 31 | 15552 | 0.1993 | | TommyBoy | Joseph_Sinde_Warioba | 40451 | 49.2063 | 31 | 15552 | 0.1993 | | TommyBoy | Charles_Thone | 52003 | 48.1013 | 38 | 15552 | 0.2443 | | TommyBoy | Yvonne_Kauger | 13906 | 48.0000 | 24 | 15552 | 0.1543 | | Mckradio | Michael_C._Keith | 393 | 10.6383 | 5 | 14 | 35.7143 | | TommyBoy | Raymond_Ranjeva | 14022 | 44.7368 | 17 | 15552 | 0.1093 | | Killerdark | Andrew_Kahr | 49988 | 36.9048 | 31 | 446 | 6.9507 | | TommyBoy | Joseph_P._Teasdale | 20888 | 43.1373 | 22 | 15552 | 0.1415 | | TommyBoy | James_S._Gracey | 22007 | 40.0000 | 22 | 15552 | 0.1415 | | Kaiobrien | Beat_Richner | 24183 | 23.9437 | 17 | 105 | 16.1905 | | Darius Dhlomo | Mohamed_Allalou | 6531 | 40.0000 | 14 | 162679 | 0.0086 | | Throwingbolts | Randy_Torres | 1273 | 7.7922 | 6 | 19 | 31.5789 | | TommyBoy | Thomas_P._Salmon | 25726 | 39.1304 | 27 | 15552 | 0.1736 | | TommyBoy | George_Nigh | 89990 | 38.6667 | 58 | 15552 | 0.3729 | | Gziegler | Catherine_Barclay | 335 | 5.5556 | 1 | 3 | 33.3333 | | ReidarM | Dominik_Burkhalter | 11683 | 36.8421 | 7 | 375 | 1.8667 | | Douglasshearer | Guthrie_Govan | 10579 | 0.9709 | 3 | 8 | 37.5000 | | DbA | Timothy_Crouse | 1672 | 5.0000 | 4 | 12 | 33.3333 | | Lumos3 | Stuart_Prebble | 10665 | 37.5000 | 9 | 21769 | 0.0413 | | TommyBoy | David_Hall_(Oklahoma_governor) | 145824 | 36.1963 | 59 | 15552 | 0.3794 | | Dananderson | Susan_Golding | 100963 | 35.3535 | 35 | 2872 | 1.2187 | | Phase1 | Bertil_Wedin | 64312 | 34.8837 | 15 | 975 | 1.5385 | | Gabe boldt | Wolfgang_Becker | 824 | 2.7778 | 1 | 3 | 33.3333 | | Lainay | Koharu_Kusumi | 161 | 0.3215 | 1 | 3 | 33.3333 | | Oddtoddnm | Howard_Morgan | 14819 | 31.2500 | 10 | 476 | 2.1008 | | 
TommyBoy | Steven_W._Taylor | 59973 | 32.2581 | 40 | 15552 | 0.2572 | | TommyBoy | Robert_Harlan_Henry | 19485 | 31.2500 | 20 | 15552 | 0.1286 | | Jliberty | Jesse_Liberty | 79237 | 23.8462 | 31 | 415 | 7.4699 | | Jacrosse | Eric_Garris | 64853 | 28.9157 | 24 | 1223 | 1.9624 | | YUL89YYZ | Arthur_Mauro | 8579 | 30.7692 | 8 | 83262 | 0.0096 | | TommyBoy | Pieter_Kooijmans | 34059 | 30.5263 | 29 | 15552 | 0.1865 | | Michiko | Michael_Marsh_(journalist) | 14841 | 21.6216 | 8 | 91 | 8.7912 | | Fys | Tag_Taylor | 12149 | 29.4118 | 5 | 14706 | 0.0340 | | Snrub | Cathy_Wilcox | 1152 | 4.3478 | 1 | 4 | 25.0000 | | Craig Currier | Robert_Picardo | 1845 | 0.4651 | 2 | 7 | 28.5714 | | Kegill | Britain_J._Williams | 30275 | 23.9130 | 11 | 229 | 4.8035 | | Jess Cully | Kathy_Leander | 27138 | 28.2609 | 13 | 4986 | 0.2607 | | Phase1 | Thomas_Thurman | 20641 | 27.2727 | 12 | 975 | 1.2308 | | Jdcooper | William_Harjo_LoneFight | 19055 | 28.2609 | 13 | 9998 | 0.1300 | | Julianortega | Reika_Hashimoto | 94748 | 25.9259 | 35 | 1473 | 2.3761 | | Gidonb | Christoph_Meili | 21434 | 27.6190 | 29 | 27653 | 0.1049 | | DantheCowMan | Micah_Ortega | 15136 | 27.2727 | 12 | 11710 | 0.1025 | | Loled | Jacques_Cheminade | 775 | 1.8519 | 1 | 4 | 25.0000 | | Jdcooper | Edward_Lone_Fight | 3639 | 26.6667 | 4 | 9998 | 0.0400 | | Rangerdude | Cragg_Hines | 3724 | 26.3158 | 5 | 3171 | 0.1577 | | WillemJoker | Clive_Merrison | 1285 | 1.2346 | 1 | 4 | 25.0000 | | CrevanReaver | Carol_Lin | 784 | 0.5236 | 1 | 4 | 25.0000 | | Riсky Martin | Pete_Rose,_Jr. 
| 705 | 0.5051 | 1 | 4 | 25.0000 | | TommyBoy | Walter_Dale_Miller | 19673 | 25.0000 | 19 | 15552 | 0.1222 | | Ted Wilkes | Leonie_Frieda | 15114 | 25.0000 | 7 | 18934 | 0.0370 | | TommyBoy | Ed_Schafer | 135806 | 24.5370 | 53 | 15552 | 0.3408 | | Futuretvman | Irene_Ng | 1921 | 4.8544 | 5 | 25 | 20.0000 | | Bronks | Lisbet_Palme | 10127 | 23.6364 | 13 | 9170 | 0.1418 | | Jokestress | Chris_Brand | 15621 | 22.9630 | 31 | 28533 | 0.1086 | | TommyBoy | William_Scranton | 356381 | 22.7723 | 46 | 15552 | 0.2958 | |
percent_of_user_edits can be misleading, because I've seen a couple of times that an SPA creates a bio and then adds that name to all kinds of lists and other articles. They might have created the autobio with 2-3 edits, and then spent their next 10 edits spreading the wikilink. So it might be better to measure something like percentage of contribution (that is, all text a single editor contributed) to a single article. The typical COI-SPA would add 1,000 or more bytes to a single article, and then 30-50 bytes to half a dozen lists, so in that case it would read "75% of user Doe's contributions went to article John Doe". And again, I don't know if that can be programmed, either.