This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 65 | ← | Archive 69 | Archive 70 | Archive 71 | Archive 72 | Archive 73 | → | Archive 75 |
DNV GL, a large ship register, has changed the link scheme for their online ship register at https://exchange.dnv.com/Exchange/Redirect.aspx and moved it to http://vesselregister.dnvgl.com/vesselregister/vesselregister.html
I updated the {{DNV}} citation template to use the new link scheme, but there are still 120 pages that used bare URLs that are now dead links. I am requesting a bot to look for {{cite web}} templates that link to URLs containing "https://exchange.dnv.com/exchange/main.aspx?" and replace them with {{DNV|vesselid=vesselid|title=title|accessdate=accessdate}}, taking the title from the |title= parameter of the {{cite web}} template. However, if part of the title is between ''s (i.e. in italics), only use that part of the title. For example, Vessel info: ''Freewinds'': Dimensions would become just Freewinds, and ''Mariella'' - Summary would become Mariella. The accessdate should be taken from the |accessdate= parameter of the {{cite web}} template. -- Ahecht (TALK PAGE) 23:22, 11 May 2016 (UTC)
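The italics rule above is the only non-trivial part; a minimal sketch of it, where `dnv_title` is a hypothetical helper name and the regex only handles the simple `''…''` case described in the request:

```python
import re

def dnv_title(cite_title):
    """Return the part of a {{cite web}} |title= wrapped in ''italics'',
    or the whole title if no italic markup is present."""
    m = re.search(r"''(.+?)''", cite_title)
    return m.group(1) if m else cite_title
```

Titles with more than one italic span would need a decision the request doesn't cover; this sketch takes the first.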
Hi, is there a tool (or could one be created) that would report, for all the articles in a category, what are the coordinates appearing in them? This would be useful for editors creating list-articles that will use the {{ GeoGroup}} template, allowing readers to see all the locations in an OSM map, a Google map, or a Bing map. It would also be useful for updating list-articles, because it would allow for comparison to see if coordinates had been added or changed.
For example, there is Category:Dams in Maharashtra, which has multiple sub-categories, in which all or most articles include coordinates.
It would be great if the tool returned something like:
It should work whether the coordinates are in degrees-minutes-seconds (DMS) format or in decimal format, like the item which is the one temple in Draft:List of Hindu temples in Cuttack not having DMS-format coordinates.
If an article included more than one set of coordinates, I suppose it should return all of them.
This question came up at Wikipedia talk:Noticeboard for India-related topics#Getting coordinates and displaying maps for lists of places in India, asked by User:Dharmadhyaksha.
Thank you for your consideration! :) -- Doncram 19:19, 23 April 2016 (UTC)
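As a sketch of the extraction step such a tool would need (not a full parser; nested templates inside {{coord}} are not handled), a bot could pull the raw parameters of every {{coord}} invocation, whether DMS or decimal:

```python
import re

# Captures everything between "{{coord|" (or "{{Coord|") and the closing "}}".
COORD_RE = re.compile(r"\{\{\s*[Cc]oord\s*\|([^}]*)\}\}")

def find_coords(wikitext):
    """Return the raw parameter string of each {{coord}} template on a page;
    works for both DMS ({{coord|20|30|N|85|50|E}}) and decimal forms."""
    return [m.strip() for m in COORD_RE.findall(wikitext)]
```

Returning all matches also covers the "more than one set of coordinates" case mentioned above.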
I need help replacing 343 external links. The webmaster of the IreAtlas Townland Database brought it to my attention that the URL for his website changed from www.seanruad.com (now a spam site) to http://www.thecore.com/seanruad. Using the Internet Archive, I can confirm that the two web addresses formerly hosted the same website (compare the current site with the archived one). According to Special:LinkSearch, there are 343 links currently pointing to the spam website that should be redirected to www.thecore.com/seanruad. For the record, I do not have AWB and don't know how to use it. Thanks for your assistance. Altamel ( talk) 04:52, 17 May 2016 (UTC)
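A sketch of the substitution, assuming every seanruad.com link should simply be re-rooted at the new host with its path preserved (the function name is illustrative):

```python
import re

def fix_seanruad(wikitext):
    """Repoint links from the hijacked www.seanruad.com domain to the
    relocated site, keeping whatever path followed the old domain."""
    return re.sub(r"https?://(?:www\.)?seanruad\.com",
                  "http://www.thecore.com/seanruad", wikitext)
```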
I think it would be cool if we had a bot that would output the number of edits a user has, per request. So say that I want a subpage in my userspace to be automatically updated to show the most accurate edit count number, this bot would do that. And for the sake of uniformity, the subpage could be named "edits" or "editcount" or something along that line.
This would be useful for automatically updating templates in the user space that rely on the edit count of the user, such as the {{service awards}} template.
I think that this could be done by using a tool over at WMF labs, like the User Analysis Tool. -- MorbidEntree - ( Talk to me! (っ◕‿◕)っ♥) 06:34, 6 June 2016 (UTC)
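The raw number is also exposed directly by the MediaWiki API (list=users with usprop=editcount), so the fetch step is cheap; this sketch only builds the request URL rather than making the network call:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def editcount_url(username):
    """Build the MediaWiki API query that returns a user's edit count."""
    return API + "?" + urlencode({
        "action": "query",
        "list": "users",
        "ususers": username,
        "usprop": "editcount",
        "format": "json",
    })
```

The bot would fetch this URL on a schedule and write the number to the user's "editcount" subpage.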
I would like to request the use of a bot to remove all flags from any transclusions of Template:Infobox national football team. Per MOS:FLAG, flags should not be used for purely decorative purposes, and since the nations' names are included anyway, the flags do not aid identification of the nations in question. At the top of each national football team's infobox, a flag is often included next to the country's name; this should be removed, leaving only the country's name in plaintext (no link). At the bottom of each infobox, the team's first match is listed, usually using the {{ fb}} or {{ fb-rt}} templates; once the flags are removed, the opposition's name should remain linked, while the name of the team whose article it is should be in bolded plaintext. Please let me know if I haven't explained this properly; I can provide diffs for how the changes should appear once enacted. – Pee Jay 10:48, 5 February 2016 (UTC)
@ PeeJay2K3: Any progress on this? I seem to remember seeing you start a discussion on this, but I'm having trouble finding it. ~ Rob Talk 14:19, 11 May 2016 (UTC)
Hi, I do not know programming and I want to control a bot. How can I do it? Can someone else create a bot for me which can tell users that their added data needs citations? In fact, a bot with any use would do. Thanks -- VarunFEB2003 ( talk) 08:33, 9 June 2016 (UTC)
Can someone make a bot focusing on fixing and de-spamming nation pages? Even better, can you tell me how to make one? Thanks! -- 91.125.46.171 ( talk) 22:20, 11 June 2016 (UTC)
Does a bot exist that can place a relevant navbox at the end of an article?
JohannSnow ( talk) 23:13, 29 June 2016 (UTC)
Done I did this on my main account using AWB as it was pretty straightforward, with only around 80 pages to check in total. All the navboxes should be added now. Omni Flames ( talk) 11:45, 30 June 2016 (UTC)
I would like to make a request for a bot that refills unformatted references. I think it could be really useful, as a large number of articles have unformatted references that lie bare. BabbaQ ( talk) 18:00, 29 May 2016 (UTC)
I am a Wikipedian in Residence at the Harold B. Lee Library at Brigham Young University. I would like a bot that changes our finding aid external links to HTTPS, so http://findingaid.lib.byu.edu/viewItem/MSS%201115 would become https://findingaid.lib.byu.edu/viewItem/MSS%201115. The reason I want to make this change is so that our analytics can see which Wikipedia subpage a link referral came from. As I understand it and have observed with our data, referrer data is not transmitted when going from an HTTPS page to an HTTP link (see also this Stack Exchange discussion).
I'm aware of the Cosmetic changes policy. I believe changing links to HTTPS is aligned with this WP:VPP. Admittedly for the user, changing the URL will not make much of a difference to them. But if I can get more specific referrer data, I can make a better case for how contributing to Wikipedia benefits my institution, which I believe is also beneficial to Wikipedia in general.
There are also only 290 of these links on Wikipedia--if making a bot for this would take longer than a few hours (I hope not, but I've never made a bot before), it might be more efficient to change by hand or with AWB. Rachel Helps (BYU) ( talk) 17:28, 5 May 2016 (UTC)
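For what it's worth, the replacement itself is a one-line regex, scoped to the findingaid.lib.byu.edu host so nothing else is touched; a sketch (the function name is illustrative):

```python
import re

def upgrade_byu_links(wikitext):
    """Rewrite plain-HTTP links to findingaid.lib.byu.edu as HTTPS."""
    return re.sub(r"http://(findingaid\.lib\.byu\.edu/)", r"https://\1", wikitext)
```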
I would like to request a bot that will record, on a daily basis, the status of various areas requiring administrator attention, whether backlogged or not. It would update a page that will host the data. Each day would add a new line to a table.
Example: (this is not exhaustive for potential things to record)
Date | WP:AIV | WP:UAA | WP:AN3RR | CSD | Active Admins
10 April 2016 | 9 | 87 | 29 | 171 | 559
11 April 2016 | 7 | 95 | 26 | 172 | 555
The last column above is derived from edit summary at [1].
There are plenty of potential areas to list. Initially, I would not want to get bound up in having too many, preferring to get this launched with some minimal set and add later as we can.
Rationale: There have been several discussions, seemingly unending, regarding how many administrators we need to keep the project running. We know the numbers passing RfA have declined. We know that things become backlogged from time to time. We do not have any data showing backlogs over time. This data would be useful to inform discussions on how to best benefit the project with perhaps administrator bots, unbundling of permissions, etc. Without this data, we're guessing. -- Hammersoft ( talk) 14:19, 11 April 2016 (UTC)
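The daily update reduces to appending one formatted row per day; a sketch under the column set from the example table above (the real bot would fill the counts by querying each noticeboard, which is omitted here):

```python
from datetime import date

# Column order follows the example table in the request.
COLUMNS = ["WP:AIV", "WP:UAA", "WP:AN3RR", "CSD", "Active Admins"]

def backlog_row(day, counts):
    """Format one day's backlog snapshot as a wikitable row."""
    cells = " || ".join(str(counts[c]) for c in COLUMNS)
    return "|-\n| {} {} || {}".format(day.day, day.strftime("%B %Y"), cells)
```

Adding a column later only means extending COLUMNS, which fits the "start minimal, add later" plan.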
RfC: BC births and deaths categorization scheme has just been closed on:
(option 5:) Return to earlier guideline-conforming scheme adding "rollup" categories by decade/century
Could we have bot-assistance on realising that? Pinging a few people that may be able to give some assistance:
If I need to be more specific on possible tasks involved, please ask me. -- Francis Schonken ( talk) 17:18, 14 October 2015 (UTC)
mode=pages. For more info see MW:Extension:CategoryTree. So AFAIK this "rollup" code will have to be added manually. As the work cannot be processed by bot, I have listed the CFDs listing the births/deaths categories to be reinstated at WT:WikiProject Years#BC births and deaths categories. – Fayenatic London 13:50, 20 October 2015 (UTC)
I'm not sure but from some comments I deduce this task has been done partially or completely – can someone give an overview whether this is done?
Have any BC births or deaths categories been undeleted that weren't populated before these categories were deleted? (I'd advise against that but have no clue where we are with that). Can someone give an update? -- Francis Schonken ( talk) 03:36, 16 November 2015 (UTC)
I've no clue where we are with this task? Have rollups been added to BC birth and death cats apart from the few examples that came up in the RfC? If not, to me this seems like an excellent job for a bot... any takers? -- Francis Schonken ( talk) 03:36, 16 November 2015 (UTC)
(basically reverting ArmbrustBot's dual upmerge edits)
I still think this is best handled by a bot: going through ArmbrustBot's edits on these BC biography articles one by one (that is, reverting them one by one, from the most recent to the oldest), and (this is the important part) giving a dump of the articles where such reverts are no longer possible (because they have already been done or some other intermediate edit prevented a revert). Then sort out the items on this dump manually. I'd be happy to help sort out manually when presented with such a dump list. -- Francis Schonken ( talk) 03:36, 16 November 2015 (UTC)
@ Francis Schonken: The longer we wait for someone to create a bot to revert another bot's contribs, the greater the proportion that cannot be reverted using rollback or Undo. I've picked up the task again (see above), and gone back past the batch of deaths (40s BC) that Nyttend had fixed. Will you join in again? – Fayenatic London 23:05, 21 January 2016 (UTC)
@ Rhadamante: I noticed that you did some a few months ago – thanks. If you have time to do some more, that would be much appreciated. – Fayenatic London 19:25, 19 April 2016 (UTC)
@ Francis Schonken: you said you'd be happy to help sort out manually when presented with a dump list of pages that can no longer be reverted. Given the passage of time, it is now over 90% of the remaining edits that have to be reverted manually. If you wish I would be happy to convert the list of bot contribs into a list of linked pages; or can you work straight from the contribs (as I have been doing)? – Fayenatic London 22:17, 25 April 2016 (UTC)
Hello, I am requesting a bot that can automatically patrol new pages. Please help me by creating this bot. NepaliKeto62 Talk to me 02:59, 4 July 2016 (UTC)
There should be a bot that, during the uploading process, scans for the two-word phrase "Creative Commons" together with "Noncommercial", "Non-commercial", or "Non commercial" in the namespace (if I am right as to what it is called) of a file and warns the uploader that there is a licensing problem. I have accidentally uploaded such a file that violated copyrights, but I am innocent, and the warning I received did not precisely describe my problem.
I believe that, if we were to add this bot for use on Wikipedia and Wikimedia Commons, we could save thousands of unwanted files from going in the wrong direction and spare many uploaders (such as myself) from being upset.
Gamingforfun365 ( talk) 19:29, 12 July 2016 (UTC)
Not done This is a job for the upload wizard, not a user-operated bot. This, therefore, is the wrong forum. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:11, 19 July 2016 (UTC)
A bot should clean up these 5,500 remnants and left-overs from Template:Persondata. 88.67.113.78 ( talk) 14:14, 13 July 2016 (UTC)
<!-- Metadata: see [[Wikipedia:Persondata]] --> should be removed; you did absolutely nothing regarding this. These code remnants can confuse readers and editors. 88.67.113.78 ( talk) 18:21, 13 July 2016 (UTC)
Doing... -- Magioladitis ( talk) 08:01, 15 July 2016 (UTC)
Done. -- Magioladitis ( talk) 20:34, 20 July 2016 (UTC)
Hi,
The first URL requires a password to get access to the source (and an irritating window hinders other actions) whereas the second one enables the access to the same source without the need of a password. Quite a lot of articles on plant families are concerned. Thanks Bu193 ( talk) 13:45, 14 July 2016 (UTC)
In WP's articles for professional boxing weight classes, the ones with two words do not contain a hyphen. However, the following categories do:
I would like to request for all articles linked to the above categories to have the hyphen removed, as I have already (albeit preemptively) moved the category pages to non-hyphenated ones, without realising I would have to manually edit thousands of articles to remove the hyphens. Mac Dreamstate ( talk) 19:58, 21 July 2016 (UTC)
Many WikiProjects have participant lists. Many of the editors on those lists haven't edited in months, or even years—rendering those lists out-of-date.
This bot would find and update participant lists. Once it found a list, it would remove users who haven't edited Wikipedia for more than three months. The Transhumanist 20:26, 24 November 2015 (UTC)
Can anyone unzip the HTTPS stream of a full day's snapshots [7] [8] [9] from https://dumps.wikimedia.org/other/analytics/ (e.g. pageviews-20160601-[012][0-9]0000.gz) so as to produce the sorted list of the top 300,000 /^en / articles daily in the lowest amount of memory? EllenCT ( talk) 22:21, 5 June 2016 (UTC)
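A sketch of the low-memory part, assuming the standard per-line dump format ("project title count bytes"): gzip-stream each hourly file, keep a single Counter over en titles, and take the top n at the end. The function name and the Counter approach are mine; true daily totals do need per-title aggregation, so memory scales with the number of distinct en titles rather than with the whole multi-project dump:

```python
import gzip
import heapq
from collections import Counter

def top_en_articles(paths, n=300_000):
    """Stream a day's hourly pageview dumps, summing counts for lines in
    the 'en ' project, and return the n most-viewed (title, count) pairs.
    Lines look like: 'en Main_Page 12345 0'."""
    totals = Counter()
    for path in paths:
        with gzip.open(path, "rt", encoding="utf-8", errors="replace") as fh:
            for line in fh:
                if line.startswith("en "):
                    parts = line.split(" ")
                    if len(parts) >= 3 and parts[2].isdigit():
                        totals[parts[1]] += int(parts[2])
    return heapq.nlargest(n, totals.items(), key=lambda kv: kv[1])
```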
An adminbot should delete the pages in Category:Documentation subpages without corresponding pages where the base template has no transclusions per criterion G8, with a few exceptions. If the base template was moved without redirect, the doc page should also be moved without redirect. An example of this is Template:User WPVG2/doc, where the base template was moved to User:Crash Underride/User WPVG2. This doc page should be moved to User:Crash Underride/User WPVG2/doc without redirect. GeoffreyT2000 ( talk) 22:31, 2 June 2016 (UTC)
Hi all
I'm trying to use Treeviews to get information on what are the most viewed articles in Category:Education; unfortunately such large categories just crash my browser, which means I will have to split the query up into at least 50-100 smaller queries.
Would this be possible to do with a bot? Ideally the output would be a spreadsheet of each article title and the number of page views of the article for a 30-, 60- or 90-day period in the recent past. I will use Treeviews if it is the only way, but I'd really love to save myself from half a day of data entry. I imagine this would also be useful for people working with other organisations on other subjects if there was a repeatable process for muggles to follow.
Thanks
John Cummings ( talk) 14:55, 21 April 2016 (UTC)
People, including the literally hundreds of thousands of BLP subjects involved, often prefer diacritic accent marks on article title characters, but very few people know how to type them on standard keyboards, and in many cases they turn URLs into incomprehensible strings of hexadecimal-encoded Unicode.
Has the question of the mass creation of unadorned ASCII-only redirects to article titles with diacritics in them come up before? If so, what was the disposition?
If not, is it a reasonable project to create them? How can their existence be signaled to those who may want to use the more legible URLs? Can a bot be trusted to add the non-accented title name to infoboxes and first paragraph bolded names? EllenCT ( talk) 13:21, 11 May 2016 (UTC)
I could see cases where it might be appropriate to give a diacritic-free version of the name in article, especially where this changes spelling, e.g. Müller / Mueller, and cases where the letters with diacritics are not necessarily fully supported (ŵ - a Welsh letter (and possibly others) was in this state for a long time). We should do so carefully, though. Adam Cuerden ( talk) 15:29, 11 June 2016 (UTC)
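A minimal ASCII-folding sketch via NFKD decomposition; note it only handles the "strip the mark" case (Müller → Muller), not the language-specific transliterations Adam raises (Müller → Mueller), which would need a per-language table:

```python
import unicodedata

def ascii_title(title):
    """Fold a title to ASCII by decomposing characters (NFKD) and dropping
    combining marks; characters with no ASCII decomposition are dropped."""
    return unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
```

A redirect-creating bot would presumably only act when the folded title differs from the original and the redirect does not already exist.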
Would it be possible for a bot to go through articles on footballers and create a list of ones where they are listed as playing for a club in their infobox, but are not in the matching category? For example, Danny Green (footballer, born 1990) is listed as playing for Thurrock F.C. in the infobox, but Category:Thurrock F.C. players is missing from the article. This is a quite frequent occurrence as when a player is transferred, editors may forget to add the category despite updating other parts of the article. Cheers, Number 5 7 14:55, 22 June 2016 (UTC)
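The comparison step could look like this sketch, assuming the club names arrive already extracted from the infobox and using the "<club> players" category-name convention the request describes (the function name is illustrative):

```python
import re

def missing_player_categories(wikitext, clubs):
    """Return the '<club> players' categories that the article's wikitext
    does not yet declare, for each club listed in the infobox."""
    cats = set(re.findall(r"\[\[Category:([^\]|]+)", wikitext))
    return [club for club in clubs if club + " players" not in cats]
```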
We need a bot that will search for all draft outlines in user space and draftspace and move them to the drafts page at the Outlines project. This is mandated by the consensus at Wikipedia talk:WikiProject Outlines/Drafts/Outline of ancient history. — Preceding unsigned comment added by 107.77.230.182 ( talk) 05:32, 23 June 2016 (UTC)
I request a bot that does the following, while preserving the original time as it was in the notice before it was removed:
-- Laber□ T 20:29, 2 May 2016 (UTC)
See Template talk:UnitedStatesCode#Handling usc.7Cch.7Csec.281.29.28A.29.28i.29 type_invocations.2C or a bot to autocorrect.3B transferred codes.3B https for links Sai ¿? ✍ 11:10, 27 June 2016 (UTC)
For the articles found using insource:/== *External Links *==/ please correct the following:
-- Leyo 19:13, 16 July 2016 (UTC)
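Judging from the fixes logged below, the corrections amount to regex replacements over section headings; a sketch for the "External links" family (MOS:HEAD wants sentence case and the plural; the other searches like "Also see" would be analogous):

```python
import re

def fix_external_links_heading(wikitext):
    """Normalise mis-cased or singular 'External links' section headings
    (e.g. ==External Links==, == External link ==) to == External links ==."""
    return re.sub(r"== *external links? *==", "== External links ==",
                  wikitext, flags=re.IGNORECASE)
```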
I fixed insource:/== *External Link *==/ and started fixing insource:/== *External link *==/. -- Magioladitis ( talk) 08:00, 30 July 2016 (UTC)
Done -- Magioladitis ( talk) 11:04, 30 July 2016 (UTC)
I started fixing insource:/== *Source *==/. -- Magioladitis ( talk) 11:11, 30 July 2016 (UTC)
I started fixing insource:/== *Reference *==/. -- Magioladitis ( talk) 20:27, 30 July 2016 (UTC)
I started fixing insource:/== *Also see *==/. -- Magioladitis ( talk) 21:54, 30 July 2016 (UTC)
I started fixing insource:/== *Also See *==/. -- Magioladitis ( talk) 13:01, 31 July 2016 (UTC)
Omni Flames check the extra things I did based on this botreq. I also left a message on my BRFA. -- Magioladitis ( talk) 22:02, 31 July 2016 (UTC)
@ Omni Flames and Magioladitis: I removed {{ Resolved}} as several of the above searches still need to be done. -- Leyo 07:43, 9 August 2016 (UTC)
BRFA filed -- Magioladitis ( talk) 08:48, 9 August 2016 (UTC)
Funnily enough this was one of SmackBot's first tasks. All the best: Rich Farmbrough, 16:22, 13 August 2016 (UTC).
I fixed everything in the database and I'll be running it regularly. -- Magioladitis ( talk) 11:05, 20 August 2016 (UTC)
Some pages with insource:/== *External LInks *==/ found. -- Magioladitis ( talk) 11:09, 20 August 2016 (UTC)
I did a database scan. All fixed. -- Magioladitis ( talk) 08:30, 24 August 2016 (UTC)
I was kindly redirected here by User:Madman. Consulting the thread that I left on his bot page (see /info/en/?search=User_talk:Madman) and following the leads I provided therein, can anyone help out in "resurrecting" these WWW.FPF.PT links? The site has changed its configuration, it seems.
Attentively -- Be Quiet AL ( talk) 18:05, 26 July 2016 (UTC)
http://www.zerozerofootball.com/jogador.php?id=XXX&search=1 with {{Zerozero profile|id=XXX}}? -- samtar talk or stalk 18:10, 26 July 2016 (UTC)
No, sorry, wrong diff; what I was trying to show with that one was that I messaged Madman! The Zerozero situation has already been dealt with totally. The situation I meant is this one ( /info/en/?search=Wikipedia_talk:WikiProject_Football/Archive_103#Portuguese_Football_Federation): for example, Anthony Lopes contains the already revived form, whereas Carlos Manuel Santos Fortes is still "dead". -- Be Quiet AL ( talk) 18:12, 26 July 2016 (UTC)
As far as I know, in some cases (not sure if all) old URL has a different set of numbers attached to it than the new one (for example Simãozinho). So this is bad news, no? -- Be Quiet AL ( talk) 16:16, 27 July 2016 (UTC)
There are 900+ localities in Israel that need a population update (a list can be seen at User:Number 57/sandbox; these are also the names that should be used in the templates). There is a template for population:
| popyear = {{Israel populations|Year}}
| population = {{Israel populations|}}
| population_footnotes={{Israel populations|reference}}
These should be placed in infoboxes of locality articles (except for localities located in the Judea and Samaria area which has a newer population figure).
In addition to that, the first paragraph should also end with:
In {{Israel populations|Year}} it had a population of {{Israel populations|Ramat Hashofet}}.
When all the templates are in place, there will be no need to update the population in every article; instead, the template itself will be updated when a new population figure is published by the Israel Central Bureau of Statistics. I think there is such a bot on the Hebrew Wikipedia, and it would be good to have one here as well, because it will take a lot of time and effort to update every single locality. -- Bolter21 ( talk to me) 15:03, 31 July 2016 (UTC)
Is there currently an active bot that can archive dead links in articles? Otherwise I would request that one be started. It seems that all the existing ones are inactive. BabbaQ ( talk) 11:55, 20 August 2016 (UTC)
I'm sure we've been here before: would anyone be able to run through the 140K articles under Category:Orphaned articles and remove the {{ orphan}} tag from any that have incoming links from other articles that are not redirects? Wikipedia:Orphan#Criteria specified zero incoming links; but a quick trawl through a few tens of articles in the category tree finds disturbingly many which have incoming links. Addbot, BattyBot and Yobot are reputed to work in this area, but the first seems inactive and t'other two are AWB, which might be somewhat overfaced. @ GoingBatty:, @ John of Reading:, @ Kvng: may wish to weigh in, since they may know more than I about this area. thanks -- Tagishsimon (talk) 00:25, 5 September 2016 (UTC)
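The zero-incoming-links test doesn't require parsing anything: the API can be asked for a single mainspace, non-redirect backlink (list=backlinks with blfilterredir=nonredirects). A sketch of the query builder; a non-empty result would mean the {{orphan}} tag can be dropped:

```python
from urllib.parse import urlencode

def orphan_check_url(title):
    """Build an API query for one mainspace, non-redirect incoming link
    to a page."""
    return "https://en.wikipedia.org/w/api.php?" + urlencode({
        "action": "query",
        "list": "backlinks",
        "bltitle": title,
        "blnamespace": 0,
        "blfilterredir": "nonredirects",
        "bllimit": 1,
        "format": "json",
    })
```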
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Currently, the statistical data at http://wikistats.wmflabs.org/wikimedias_wiki.php automatically updates daily. However, the corresponding on-wiki page needs to be updated manually by users, which is undesirable when no one updates it.
Therefore, I suggest updating the data with a bot. The bot needs to copy everything in the source page to the statistics page, while leaving a notice with the words "This page is updated automatically with a bot. Do not manually edit it. ". The bot needs to update the data daily at 00:15, 01:15, 02:15, 03:15, 04:15, 05:15, 06:15, 12:15, 18:15, which will fetch the latest data from the website. Then, the data will be up to date automatically. Wetitpig0 ( talk) 06:17, 5 September 2016 (UTC)
You may ask why I don't create a bot myself. Actually, there are lots of inactive bots. When there are vacant bots, left with no work, why should I still create one? I can just ask those bot creators to edit their bots' code! The lists of inactive bots are here and here. Wetitpig0 ( talk) 09:09, 5 September 2016 (UTC)
Maybe this can be done by one of the currently existing archiving bots, but I'd like help with the following: On some discussion pages, discussions are 'grouped' in themes - WT:SBL and WT:SWL are two typical examples. Both mentioned pages (and some others) have two main sections, one for 'additions' and one for 'removals' (and some other discussion sections), and editors make subsections inside these sections depending on the nature of their request. These are currently manually backed up into archives with the same structure ('additions', 'removals', etc.) as I am not aware of a bot that is capable of handling this. I would like to see this done by bot, archiving sections ## days after last timestamp into archive pages following the same structure. Any ideas? -- Dirk Beetstra T C 06:17, 10 July 2016 (UTC)
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 65 | ← | Archive 69 | Archive 70 | Archive 71 | Archive 72 | Archive 73 | → | Archive 75 |
DNV GL, a large ship register, has changed the link scheme for their online ship register at https://exchange.dnv.com/Exchange/Redirect.aspx and moved it to http://vesselregister.dnvgl.com/vesselregister/vesselregister.html
I updated the {{
DNV}} citation template to use the new link scheme, but there are still
120 pages that used bare URLs that are now dead links. I am requesting a bot to look for {{
cite web}} templates that link to URLs containing "https://exchange.dnv.com/exchange/main.aspx?" and replace them with {{
DNV|vesselid=vesselid|title=title|accessdate=accessdate}}
|title=
parameter of the {{
cite web}} template. However, if part of the title is between ''
s (it's in italic), only use that part of the title. For example: Vessel info: ''Freewinds'': Dimensions
would just be Freewinds
and ''Mariella'' - Summary
would be Mariella
.|accessdate=
parameter of the {{
cite web}} template.--
Ahecht (
TALK
PAGE) 23:22, 11 May 2016 (UTC)
Hi, is there a tool (or could one be created) that would report, for all the articles in a category, what are the coordinates appearing in them? This would be useful for editors creating list-articles that will use the {{ GeoGroup}} template, allowing readers to see all the locations in an OSM map, a Google map, or a Bing map. It would also be useful for updating list-articles, because it would allow for comparison to see if coordinates had been added or changed.
For example, there is Category:Dams in Maharashtra, which has multiple sub-categories, in which all or most articles include coordinates.
It would be great if the tool returned something like:
It should work if the coordinates were in either Degrees-Minutes-Seconds format or in decimal format, like the item
which is the one temple in Draft:List of Hindu temples in Cuttack not having DMS format coordinates.
If an article included more than one set of coordinates, I suppose it should return all of them.
This question came up at Wikipedia talk:Noticeboard for India-related topics#Getting coordinates and displaying maps for lists of places in India, asked by User:Dharmadhyaksha.
Thank you for your consideration! :) -- do ncr am 19:19, 23 April 2016 (UTC)
I need help replacing 343 external links. The webmaster of the IreAtlas Townland Database brought it to my attention that the url for his website changed from www.seanruad.com (now a spam site) to http://www.thecore.com/seanruad. Using the Internet Archive, I can confirm that the two web addresses formerly hosted the same website, compare current with archived site. According to Special:LinkSearch, there are 343 links currently pointing to the spam website that should be redirected to www.thecore.com/seanruad. For the record, I do not have AWB and don't know how to use it. Thanks for your assistance. Altamel ( talk) 04:52, 17 May 2016 (UTC)
I think it would be cool if we had a bot that would output the number of edits a user has, per request. So say that I want a subpage in my userspace to be automatically updated to show the most accurate edit count number, this bot would do that. And for the sake of uniformity, the subpage could be named "edits" or "editcount" or something along that line.
This would be useful for automatically updating templates in the user space that rely on the edit count of the user, such as the {{
service awards}}
template.
I think that this could be done by using a tool over at WMF labs, like the User Analysis Tool. -- MorbidEntree - ( Talk to me! (っ◕‿◕)っ♥) 06:34, 6 June 2016 (UTC)
I would like to request the use of a bot to remove all flags from any transclusions of Template:Infobox national football team. Per MOS:FLAG, flags should not be used for purely decorative purposes, and since the nations' names are included anyway, the flags do not aid identification of the nations in question. At the top of each national football team's infobox, a flag is often included next to the country's name; this should be removed, leaving only the country's name in plaintext (no link). At the bottom of each infobox, the team's first match is listed, usually using the {{ fb}} or {{ fb-rt}} templates; once the flags are removed, the opposition's name should remain linked, while the name of the team whose article it is should be in bolded plaintext. Please let me know if I haven't explained this properly; I can provide diffs for how the changes should appear once enacted. – Pee Jay 10:48, 5 February 2016 (UTC)
@ PeeJay2K3: Any progress on this? I seem to remember seeing you start a discussion on this, but I'm having trouble finding it. ~ Rob Talk 14:19, 11 May 2016 (UTC)
hi, i do not know programming and I want to control a bot. how can I do it? can someone else create a bot for me which can tell users that their added data needs citations? In fact a bot with any use would do. thanks -- VarunFEB2003 ( talk) 08:33, 9 June 2016 (UTC)
Can someone make a bot focusing on fixing and de-spamming nation pages? Even better, can you tell me how to make one?-- 91.125.46.171 ( talk) 22:20, 11 June 2016 (UTC)thanks! 91.125.46.171 ( talk) 22:20, 11 June 2016 (UTC)--
Does a bot exist that can place a relevant navbox at the end of an article?
JohannSnow ( talk) 23:13, 29 June 2016 (UTC)
Done I did this on my main account using AWB as it was pretty straightforward, with only around 80 pages to check in total. All the navboxes should be added now. Omni Flames ( talk) 11:45, 30 June 2016 (UTC)
I would like to make a request for a bot that refills unformatted references. I think that it could be really useful as a large number of articles have unformatted references that lies bare. BabbaQ ( talk) 18:00, 29 May 2016 (UTC)
I am a Wikipedian in Residence at the Harold B. Lee library at Brigham Young University. I would like a bot that changes our finding aid external links to HTTPS. So https://findingaid.lib.byu.edu/viewItem/MSS%201115 would become https://findingaid.lib.byu.edu/viewItem/MSS%201115. The reason I want to make this change is so that our analytics can see what Wikipedia subpage the link referral came from. As I understand it and have observed with our data, referrer data is not transmitted when going to an HTTP link (see also this stack exchange discussion).
I'm aware of the Cosmetic changes policy. I believe changing links to HTTPS is aligned with this WP:VPP. Admittedly for the user, changing the URL will not make much of a difference to them. But if I can get more specific referrer data, I can make a better case for how contributing to Wikipedia benefits my institution, which I believe is also beneficial to Wikipedia in general.
There are also only 290 of these links on Wikipedia--if making a bot for this would take longer than a few hours (I hope not, but I've never made a bot before), it might be more efficient to change by hand or with AWB. Rachel Helps (BYU) ( talk) 17:28, 5 May 2016 (UTC)
I would like to request a bot that will record, on a daily basis, the status of various areas requiring administrator attention, whether backlogged or not. It would update a page that will host the data. Each day would add a new line to a table.
Example: (this is not exhaustive for potential things to record)
{| class="wikitable"
! Date !! WP:AIV !! WP:UAA !! WP:AN3RR !! CSD !! Active Admins
|-
| 10 April 2016 || 9 || 87 || 29 || 171 || 559
|-
| 11 April 2016 || 7 || 95 || 26 || 172 || 555
|}
The last column above is derived from edit summary at [1].
There are plenty of potential areas to list. Initially, I would not want to get bound up in having too many, preferring to get this launched with some minimal set and add later as we can.
Rationale: There have been several discussions, seemingly unending, regarding how many administrators we need to keep the project running. We know the numbers passing RfA have declined. We know that things become backlogged from time to time. We do not have any data showing backlogs over time. This data would be useful to inform discussions on how to best benefit the project with perhaps administrator bots, unbundling of permissions, etc. Without this data, we're guessing. -- Hammersoft ( talk) 14:19, 11 April 2016 (UTC)
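A sketch of how the daily row might be generated; the column set and wikitable layout follow the example above, the function name is hypothetical, and in a real bot the counts would be scraped from the listed noticeboards before appending the row to the data page:

```python
from datetime import date

def backlog_row(day, counts):
    # counts: mapping of board name -> open-item count. The column order is
    # fixed to match the example table; extra boards could be added later.
    cells = " || ".join(str(counts[k]) for k in ("AIV", "UAA", "AN3RR", "CSD", "admins"))
    # One wikitable row, ready to append before the closing "|}" of the table.
    return f"|-\n| {day.strftime('%d %B %Y')} || {cells}"
```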
RfC: BC births and deaths categorization scheme has just been closed on:
(option 5:) Return to earlier guideline-conforming scheme adding "rollup" categories by decade/century
Could we have bot-assistance on realising that? Pinging a few people that may be able to give some assistance:
If I need to be more specific on possible tasks involved, please ask me. -- Francis Schonken ( talk) 17:18, 14 October 2015 (UTC)
mode=pages. For more info see MW:Extension:CategoryTree. So AFAIK this "rollup" code will have to be added manually. As the work cannot be processed by bot, I have listed the CFDs listing the births/deaths categories to be reinstated at WT:WikiProject Years#BC births and deaths categories. – Fayenatic London 13:50, 20 October 2015 (UTC)
I'm not sure, but from some comments I deduce this task has been done partially or completely – can someone give an overview of whether this is done?
Have any BC births or deaths categories been undeleted that weren't populated before these categories were deleted? (I'd advise against that but have no clue where we are with that). Can someone give an update? -- Francis Schonken ( talk) 03:36, 16 November 2015 (UTC)
I've no clue where we are with this task. Have rollups been added to BC birth and death cats, apart from the few examples that came up in the RfC? If not, this seems to me like an excellent job for a bot... any takers? -- Francis Schonken ( talk) 03:36, 16 November 2015 (UTC)
(basically reverting Armbrustbot's dual upmerge edits)
I still think this is best handled by a bot: going through Armbrustbot's edits on these BC biography articles one by one (that is, reverting them one by one, from the most recent to the oldest), and (this is the important part) producing a dump of the articles where such reverts are no longer possible (because they have already been done, or some other intermediate edits prevent a revert). The items on this dump would then be sorted out manually. I'd be happy to help sort them out when presented with such a dump list. -- Francis Schonken ( talk) 03:36, 16 November 2015 (UTC)
@ Francis Schonken: The longer we wait for someone to create a bot to revert another bot's contribs, the greater the proportion that cannot be reverted using rollback or Undo. I've picked up the task again (see above), and gone back past the batch of deaths (40s BC) that Nyttend had fixed. Will you join in again? – Fayenatic London 23:05, 21 January 2016 (UTC)
@ Rhadamante: I noticed that you did some a few months ago – thanks. If you have time to do some more, that would be much appreciated. – Fayenatic London 19:25, 19 April 2016 (UTC)
@ Francis Schonken: you said you'd be happy to help sort out manually when presented with a dump list of pages that can no longer be reverted. Given the passage of time, it is now over 90% of the remaining edits that have to be reverted manually. If you wish I would be happy to convert the list of bot contribs into a list of linked pages; or can you work straight from the contribs (as I have been doing)? – Fayenatic London 22:17, 25 April 2016 (UTC)
Hello, I am requesting a bot that can automatically patrol new pages. Please help me by creating this bot. NepaliKeto62 Talk to me 02:59, 4 July 2016 (UTC)
There should be a bot that, during the uploading process, scans the file's description page (if I am right as to what it is called) for the phrase "Creative Commons" together with "Noncommercial", "Non-commercial", or "Non commercial", and warns the uploader that there is a licensing problem. I accidentally uploaded such a file that violated copyright, in all innocence, and the warning I received did not precisely describe my problem.
I believe that, if we were to add this bot on Wikipedia and Wikimedia Commons, we could stop thousands of unwanted files from going in the wrong direction and keep many uploaders (such as me) from getting upset.
Gamingforfun365 (talk) 19:29, 12 July 2016 (UTC)
Not done This is a job for the upload wizard, not a user-operated bot. This, therefore, is the wrong forum. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:11, 19 July 2016 (UTC)
A bot should clean up these 5,500 remnants and left-overs of Template:Persondata. 88.67.113.78 ( talk) 14:14, 13 July 2016 (UTC)
<!-- Metadata: see [[Wikipedia:Persondata]] -->
should be removed; you did absolutely nothing regarding this. These leftover code fragments can confuse readers and editors.
88.67.113.78 ( talk) 18:21, 13 July 2016 (UTC)
Doing... -- Magioladitis ( talk) 08:01, 15 July 2016 (UTC)
Done. -- Magioladitis ( talk) 20:34, 20 July 2016 (UTC)
Hi,
The first URL requires a password to access the source (and an irritating window hinders other actions), whereas the second one gives access to the same source without the need for a password. Quite a lot of articles on plant families are affected. Thanks Bu193 ( talk) 13:45, 14 July 2016 (UTC)
In WP's articles for professional boxing weight classes, the ones with two words do not contain a hyphen. However, the following categories do:
I would like to request that all articles linked to the above categories have the hyphen removed, as I have already (albeit preemptively) moved the category pages to non-hyphenated ones, without realising I would have to manually edit thousands of articles to remove the hyphens. Mac Dreamstate ( talk) 19:58, 21 July 2016 (UTC)
Many WikiProjects have participant lists. Many of the editors on those lists haven't edited in months, or even years—rendering those lists out-of-date.
This bot would find and update participant lists. Once it found a list, it would remove users who haven't edited Wikipedia for more than three months. The Transhumanist 20:26, 24 November 2015 (UTC)
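The pruning decision itself is simple once each listed user's last contribution timestamp is known (a real bot could fetch this from the MediaWiki API via list=usercontribs). A minimal sketch, with a hypothetical function name and a 30-days-per-month approximation:

```python
from datetime import datetime, timedelta

def stale_participants(last_edits, now, months=3):
    # last_edits: {username: datetime of last contribution}.
    # Returns the usernames to remove from the participant list, i.e.
    # those with no edits in roughly the last `months` months.
    cutoff = now - timedelta(days=30 * months)
    return sorted(u for u, ts in last_edits.items() if ts < cutoff)
```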
Can anyone unzip the HTTPS stream of a full day's snapshots [7] [8] [9] from https://dumps.wikimedia.org/other/analytics/ (e.g. pageviews-20160601-[012[0-9]00000.gz]) so as to produce a sorted list of the top 300,000 /^en / articles daily, using as little memory as possible? EllenCT ( talk) 22:21, 5 June 2016 (UTC)
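One low-memory approach: stream each hourly .gz file line by line (e.g. via gzip.open in text mode), keep a running counter only for /^en / titles, and heap-select the top entries instead of sorting everything. A sketch with a hypothetical function name, operating on an iterable of already-decompressed lines in the dump's "project title count bytes" format:

```python
import heapq
from collections import Counter

def top_en_articles(lines, n=300000):
    # Aggregate counts across a day's hourly files; memory is bounded by
    # the number of distinct en-wiki titles, not the total line count.
    counts = Counter()
    for line in lines:
        if line.startswith("en "):
            parts = line.split(" ")
            if len(parts) >= 3 and parts[2].isdigit():
                counts[parts[1]] += int(parts[2])
    # nlargest avoids a full sort of every title.
    return heapq.nlargest(n, counts.items(), key=lambda kv: kv[1])
```

In practice you would wrap this with `gzip.open(path, "rt")` over each of the 24 hourly files and chain the line iterators.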
An adminbot should delete the pages in Category:Documentation subpages without corresponding pages where the base template has no transclusions per criterion G8, with a few exceptions. If the base template was moved without redirect, the doc page should also be moved without redirect. An example of this is Template:User WPVG2/doc, where the base template was moved to User:Crash Underride/User WPVG2. This doc page should be moved to User:Crash Underride/User WPVG2/doc without redirect. GeoffreyT2000 ( talk) 22:31, 2 June 2016 (UTC)
Hi all
I'm trying to use TreeViews to get information on the most viewed articles in Category:Education; unfortunately, such large categories just crash my browser, which means I would have to split the query into at least 50-100 smaller queries.
Would this be possible to do with a bot? Ideally the output would be a spreadsheet of each article title and the number of page views of the article over a 30-, 60- or 90-day period in the recent past. I will use TreeViews if it is the only way, but I'd really love to save myself half a day of data entry. I imagine this would also be useful for people working with other organisations on other subjects if there were a repeatable process for muggles to follow.
Thanks
John Cummings ( talk) 14:55, 21 April 2016 (UTC)
People, including the literally hundreds of thousands of BLP subjects involved, often prefer diacritic accent marks on article title characters, but very few people know how to type them on standard keyboards, and in many cases they turn URLs into incomprehensible strings of hexadecimal-encoded Unicode.
Has the question of the mass creation of unadorned ASCII-only redirects to article titles with diacritics in them come up before? If so, what was the disposition?
If not, is it a reasonable project to create them? How can their existence be signaled to those who may want to use the more legible URLs? Can a bot be trusted to add the non-accented title name to infoboxes and first paragraph bolded names? EllenCT ( talk) 13:21, 11 May 2016 (UTC)
I could see cases where it might be appropriate to give a diacritic-free version of the name in article, especially where this changes spelling, e.g. Müller / Mueller, and cases where the letters with diacritics are not necessarily fully supported (ŵ - a Welsh letter (and possibly others) was in this state for a long time). We should do so carefully, though. Adam Cuerden ( talk) 15:29, 11 June 2016 (UTC)
Would it be possible for a bot to go through articles on footballers and create a list of ones where they are listed as playing for a club in their infobox, but are not in the matching category? For example, Danny Green (footballer, born 1990) is listed as playing for Thurrock F.C. in the infobox, but Category:Thurrock F.C. players is missing from the article. This is a quite frequent occurrence as when a player is transferred, editors may forget to add the category despite updating other parts of the article. Cheers, Number 5 7 14:55, 22 June 2016 (UTC)
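Once the infobox club names and the article's categories have been parsed (the hard part for a real bot), the comparison itself is trivial. A sketch under the assumption that category names follow the "<Club> players" pattern; all names here are hypothetical:

```python
def missing_player_categories(infobox_clubs, categories):
    # infobox_clubs: club names parsed from the player's infobox,
    #   e.g. ["Thurrock F.C."].
    # categories: the article's current category names.
    # Returns the "<Club> players" categories absent from the article.
    have = set(categories)
    wanted = [f"Category:{club} players" for club in infobox_clubs]
    return [c for c in wanted if c not in have]
```

A bot would likely only report these (as requested) rather than add them blindly, since some clubs use differently named player categories.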
We need a bot that will search for all draft outlines in userspace and draftspace and move them to the drafts page at the Outlines project. This is mandated by the consensus at Wikipedia talk:WikiProject Outlines/Drafts/Outline of ancient history. — Preceding unsigned comment added by 107.77.230.182 ( talk) 05:32, 23 June 2016 (UTC)
I request a bot that does the following, while preserving the original time as it was in the notice before it was removed:
-- Laber□ T 20:29, 2 May 2016 (UTC)
See Template talk:UnitedStatesCode#Handling usc.7Cch.7Csec.281.29.28A.29.28i.29 type_invocations.2C or a bot to autocorrect.3B transferred codes.3B https for links Sai ¿? ✍ 11:10, 27 June 2016 (UTC)
For the articles found using insource:/== *External Links *==/ please correct the following:
-- Leyo 19:13, 16 July 2016 (UTC)
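Judging from the replies below, the corrections are section-heading normalisations such as "External Links" → "External links". A minimal regex sketch of that one case, with hypothetical names and assuming plain wikitext input (the sibling fixes such as "Also see" → "See also" would need their own patterns):

```python
import re

# Match any heading level whose text is a miscapitalised or singular
# variant of "External links", e.g. "==External Links==" or "== External link ==".
HEADING = re.compile(r'^(=+) *External +[Ll][Ii]nks? *(=+)$', re.MULTILINE)

def fix_heading(wikitext):
    # Normalise to the MOS form; already-correct headings are rewritten
    # to themselves, so the pass is idempotent.
    return HEADING.sub(r'\1External links\2', wikitext)
```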
I fixed insource:/== *External Link *==/ and started fixing insource:/== *External link *==/. -- Magioladitis ( talk) 08:00, 30 July 2016 (UTC)
Done -- Magioladitis ( talk) 11:04, 30 July 2016 (UTC)
I started fixing insource:/== *Source *==/. -- Magioladitis ( talk) 11:11, 30 July 2016 (UTC)
I started fixing insource:/== *Reference *==/. -- Magioladitis ( talk) 20:27, 30 July 2016 (UTC)
I started fixing insource:/== *Also see *==/. -- Magioladitis ( talk) 21:54, 30 July 2016 (UTC)
I started fixing insource:/== *Also See *==/. -- Magioladitis ( talk) 13:01, 31 July 2016 (UTC)
Omni Flames check the extra things I did based on this botreq. I also left a message on my BRFA. -- Magioladitis ( talk) 22:02, 31 July 2016 (UTC)
@ Omni Flames and Magioladitis: I removed {{ Resolved}} as several of the above searches still need to be done. -- Leyo 07:43, 9 August 2016 (UTC)
BRFA filed -- Magioladitis ( talk) 08:48, 9 August 2016 (UTC)
Funnily enough this was one of SmackBot's first tasks. All the best: Rich Farmbrough, 16:22, 13 August 2016 (UTC).
I fixed everything in the database and I'll be running it regularly. -- Magioladitis ( talk) 11:05, 20 August 2016 (UTC)
Some pages with insource:/== *External LInks *==/ found. -- Magioladitis ( talk) 11:09, 20 August 2016 (UTC)
I did a database scan. All fixed. -- Magioladitis ( talk) 08:30, 24 August 2016 (UTC)
I was kindly redirected here by User:Madman,
following the thread that I left on his bot page (please see here /info/en/?search=User_talk:Madman) and the leads I provided therein, can anyone help out in "resurrecting" these WWW.FPF.PT links? It seems the site has changed its configuration.
Attentively -- Be Quiet AL ( talk) 18:05, 26 July 2016 (UTC)
http://www.zerozerofootball.com/jogador.php?id=XXX&search=1 with {{Zerozero profile|id=XXX}}? -- samtar talk or stalk 18:10, 26 July 2016 (UTC)
No, sorry, wrong diff; what I was trying to show with that one was that I messaged Madman! The Zerozero situation has already been dealt with completely; the situation I meant is this one ( /info/en/?search=Wikipedia_talk:WikiProject_Football/Archive_103#Portuguese_Football_Federation) – for example, Anthony Lopes contains the already revived form, whereas Carlos Manuel Santos Fortes is still "dead". -- Be Quiet AL ( talk) 18:12, 26 July 2016 (UTC)
As far as I know, in some cases (not sure if all) old URL has a different set of numbers attached to it than the new one (for example Simãozinho). So this is bad news, no? -- Be Quiet AL ( talk) 16:16, 27 July 2016 (UTC)
There are up to 900+ localities in Israel that need a population update (A list can be seen here: User:Number 57/sandbox, these are also the names that should be used in the templates). There is a template for population:
| popyear = {{Israel populations|Year}}
| population = {{Israel populations|}}
| population_footnotes={{Israel populations|reference}}
These should be placed in infoboxes of locality articles (except for localities located in the Judea and Samaria area which has a newer population figure).
In addition, the first paragraph of each article should also end with:
In {{Israel populations|Year}} it had a population of {{Israel populations|Ramat Hashofet}}.
Once all the templates are in place, there will be no need to update the population in every article; instead, the template itself will be updated when a new population figure is published by the Israel Central Bureau of Statistics. I think there is such a bot on the Hebrew Wikipedia and it would be good to have one here as well, because it will take a lot of time and effort to update every single locality.-- Bolter21 ( talk to me) 15:03, 31 July 2016 (UTC)
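A sketch of how a bot might generate the requested infobox lines for one locality. Filling the locality name into the |population= parameter is my reading of the request ("these are also the names that should be used in the templates"), and the function name is hypothetical:

```python
def israel_population_params(locality):
    # Mirrors the three infobox lines from the request above, with the
    # locality name substituted into the |population= template call.
    return (
        "| popyear = {{Israel populations|Year}}\n"
        f"| population = {{{{Israel populations|{locality}}}}}\n"
        "| population_footnotes={{Israel populations|reference}}"
    )
```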
Is there currently an active bot that can archive dead links in articles? Otherwise I would request that one is started. It seems that all the existing ones are inactive. BabbaQ ( talk) 11:55, 20 August 2016 (UTC)
I'm sure we've been here before: would anyone be able to run through the 140K articles under Category:Orphaned articles and remove the {{ orphan}} tag from any that have incoming links from other articles that are not redirects? Wikipedia:Orphan#Criteria specified zero incoming links; but a quick trawl through a few tens of articles in the category tree finds disturbingly many which have incoming links. Addbot, BattyBot and Yobot are reputed to work in this area, but the first seems inactive and t'other two are AWB, which might be somewhat overfaced. @ GoingBatty:, @ John of Reading:, @ Kvng: may wish to weigh in, since they may know more than I about this area. thanks -- Tagishsimon (talk) 00:25, 5 September 2016 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Currently, the statistical data at http://wikistats.wmflabs.org/wikimedias_wiki.php updates automatically every day. However, the corresponding statistics page here must be updated manually, which is undesirable when no one does it.
Therefore, I suggest updating the data with a bot. The bot should copy everything from the source page to the statistics page, while leaving a notice with the words "This page is updated automatically by a bot. Do not edit it manually." The bot should update the data at 00:15, 01:15, 02:15, 03:15, 04:15, 05:15, 06:15, 12:15 and 18:15 each day, fetching the latest data from the website. The data would then stay up to date automatically. Wetitpig0 ( talk) 06:17, 5 September 2016 (UTC)
You may ask why I don't create a bot myself. Actually, there are lots of inactive bots. When there are vacant bots left with no work, why should I still create one? I can just ask those bots' operators to adapt their bot code! The lists of inactive bots are here and here. Wetitpig0 ( talk) 09:09, 5 September 2016 (UTC)
Maybe this can be done by one of the currently existing archiving bots, but I'd like help with the following: on some discussion pages, discussions are 'grouped' in themes – WT:SBL and WT:SWL are two typical examples. Both mentioned pages (and some others) have two main sections, one for 'additions' and one for 'removals' (and some other discussion sections), and editors make subsections inside these sections depending on the nature of their request. These are currently manually backed up into archives with the same structure ('additions', 'removals', etc.), as I am not aware of a bot that is capable of handling this. I would like to see this done by a bot, archiving sections ## days after the last timestamp into archive pages following the same structure. Any ideas? -- Dirk Beetstra T C 06:17, 10 July 2016 (UTC)