This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 60 | ← | Archive 64 | Archive 65 | Archive 66 | Archive 67 | Archive 68 | → | Archive 70 |
I've created new maps for almost every city, village, and township in Ohio, and I'm about halfway done with adding them to their articles. This normally works fine, but I periodically make errors, and it would help if a bot could check all 2200 of these pages for errors. In some cases, I've added the map of one place to the article about a different one; for example, here I copy/pasted one article's map into another article. In other cases, I simply haven't switched all the elements of the description correctly; for example, here I used the correct map, but I made a mistake in the caption, since the city's in Trumbull County, not Mahoning County as suggested by the description.
All maps follow a rigid naming convention: Map of COUNTYNAME County Ohio Highlighting PLACE TYPE.png. "Place" is simply the community or township name, and "Type" is City, Village, or Township (note the capital letter). Likewise, all captions follow the same convention: Location of PLACE in COUNTYNAME County, although "Township" is part of the PLACE section for townships (see the caption for Beaver Township, Mahoning County, Ohio). Given this convention, I expect that the bot can handle the situation easily. I'm imagining the following (collapsed so this request doesn't appear so massive):
Extended content
Final notes: (1) When an article already had a good detailed map, I didn't add the new one. Therefore, if the bot doesn't find "County Ohio Highlighting" at all (this will be the case at Seven Hills, Ohio, for example), it should go to the next page without logging anything at all, because in most or all such cases, there's no problem. (2) Since the project isn't done, I'd appreciate it if you didn't run the full check until I tell you that I'm done. (3) Some municipalities are in multiple counties, so it's possible that the map link would be "wrong"; the bot would find an error when opening Adena, Ohio from {{ Guernsey County, Ohio}}, for example. Coding to avoid this error might require a lot of work, but these situations are rare, so don't worry about it. (4) Finally, since the bot's just logging pages that might be wrong, WP:CONTEXTBOT shouldn't apply. |
Nyttend ( talk) 18:12, 18 August 2015 (UTC)
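The naming and caption conventions described above are rigid enough that the check could be sketched as below. This is only an illustration: the regex and the caption format are taken from the request, but the function names and the exact whitespace handling are assumptions.

```python
import re

# Filenames follow "Map of COUNTYNAME County Ohio Highlighting PLACE TYPE.png".
MAP_RE = re.compile(
    r"Map of (?P<county>.+?) County Ohio Highlighting "
    r"(?P<place>.+?) (?P<type>City|Village|Township)\.png"
)

def expected_caption(filename):
    """Derive the caption a page *should* have from its map filename."""
    m = MAP_RE.fullmatch(filename)
    if m is None:
        return None  # no highlight map on this page: skip, log nothing
    place = m.group("place")
    if m.group("type") == "Township":
        place += " Township"  # "Township" is part of the PLACE section
    return "Location of %s in %s County" % (place, m.group("county"))
```

A logging bot would compare this expected caption against the actual caption on each of the ~2200 pages and record only the mismatches.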
I am proposing a redrafting of the Wikipedia:Naming conventions (ships) guideline, at user:Saberwyn/Proposed ship naming and disambiguation conventions update. One of the major changes follows the outcome of a Request for Comment on the matter of ship article disambiguation. The current form of the proposal is that all ship articles requiring disambiguation will be disambiguated in the form "(yyyy)", where yyyy = year of launch. If the proposed guideline is accepted, over 26,000 ship articles (number determined by transclusions of {{ infobox ship career}} - selected to include articles with infoboxes on specific ships, as opposed to class articles) will need to be checked for compliance with the guideline, and if not compliant, moved to a compliant title. Could a bot be used to check the titles of all ship articles (as determined per number above), check that year of launch is the method of disambiguation, and if not, move the article?
The bot will have to:
A second bot operation (or both passes if the above is not possible) could generate a list of articles that use a civilian ship prefix (which, under the proposal, will also be deprecated as part of an article title in most cases), so that humans can review article titles and move those necessary to a date-disambiguation title. I'm reluctant to suggest using a bot to move this group, as the prefix may be part of the common name for the subject.
So, theoretically (because the proposal may not pass in this form, or at all), is it possible to create/adapt a bot to do this, how difficult would it be, and what technical problems would have to be surmounted? Any opinions on the appropriateness of the proposed method of disambiguation should be directed to the proposal. -- saberwyn 03:45, 13 September 2015 (UTC)
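The compliance check under the proposed "(yyyy)" convention could look something like this sketch. The assumption here is that any parenthesised suffix counts as a disambiguator; a title with no suffix needs no move, and a suffix that is not a four-digit year is flagged.

```python
import re

# Matches titles of the form "Name (disambiguator)".
DISAMBIG_RE = re.compile(r"^.+ \((?P<tag>[^)]+)\)$")

def compliant(title):
    """True if the title either needs no disambiguation or uses (yyyy)."""
    m = DISAMBIG_RE.match(title)
    if m is None:
        return True  # no disambiguator at all: nothing to move
    return re.fullmatch(r"\d{4}", m.group("tag")) is not None
```

Non-compliant titles would then be queued for a move to "Name (year of launch)", with the year read from the infobox as discussed below.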
|ship launched= parameter. I wouldn't use Category:YYYY ship, because it's possible mistakes have been made in applying that category (listing year built instead of year launched, for instance). The parameter is more explicit in what the date means, so it's less likely to contain errors. You will need specific consensus on what the disambiguator should be, as this was not covered in the RfC you linked. For instance, should we use (1936) or (launched 1936)? ~ Rob Talk 16:25, 13 September 2015 (UTC)
Typing a specific date, for example "15 February 2013", into our search box, finds articles with that date in references or maintenance templates, before events which happened on that date. Therefore, for every page like Portal:Current events/2013 February 15, we should have redirects from, at least:
and possibly:
and others. Once the backlog is done, we'd need a maintenance bot to create each new day's set.
We also need to decide what to do for dates where no Portal:Current events page exists (mostly pre-1999). Perhaps redirect to the relevant year? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:02, 8 September 2015 (UTC)
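A maintenance bot for each new day's set could derive the target and the redirect titles like this. The target format comes from the request; the exact list of date formats is an assumption (the request says "at least" a few, "and possibly" others).

```python
import datetime

def date_redirects(d):
    """Return the Portal target and candidate redirect titles for one day."""
    target = "Portal:Current events/%d %s %d" % (d.year, d.strftime("%B"), d.day)
    variants = [
        d.strftime("%d %B %Y").lstrip("0"),          # 15 February 2013
        d.strftime("%B %d, %Y").replace(" 0", " "),  # February 15, 2013
        d.strftime("%Y-%m-%d"),                      # 2013-02-15 (ISO)
    ]
    return target, variants
```

Running this once over the backlog and then daily at 00:00 UTC would keep the redirect set current.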
Discussion seems to have stalled. How can we take this forward? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:07, 15 September 2015 (UTC)
It has been proposed here that an adminbot be used to restore some 400,000 old IP talk pages that were deleted by user MZMcBride. It is to be noted that about 10,000 such pages have already been restored by user:MusikBot per this BRFA. See the BN discussion and the BRFA for further details. 103.6.159.68 ( talk) 19:11, 17 September 2015 (UTC)
I think it'd be alright to try to get a consensus here, as there's no need to fragment the discussion across more pages. I have left a note at WP:VPR and WP:VPT. 103.6.159.88 ( talk) 06:07, 19 September 2015 (UTC)
A bot should fix all remaining instances of T44616. MZMcBride has already fixed some in December 2012; at the time it was Bug 42616. This will make the moves revertible only by administrators. GeoffreyT2000 ( talk) 16:20, 24 September 2015 (UTC)
Requesting bot assistance to tag talkpages of all categories and subcategories within Category:Establishments in Rivers State by year and Category:History of Rivers State by period with {{ NigeriaProject|Rivers State=yes}}. Thanks. Stanleytux ( talk) 11:28, 23 September 2015 (UTC)
{{ WikiProject Nigeria|Rivers State=yes}}. Stanleytux ( talk) 13:10, 23 September 2015 (UTC)
{{WikiProject Nigeria|Rivers State=yes}}. Thanks. Stanleytux ( talk) 13:24, 23 September 2015 (UTC)
Recently I did an edit like this to potentially de-eclipse an image shadowed at Commons: https://en.wikipedia.org/?title=File%3ABohr_model.jpg&type=revision&diff=682440062&oldid=634253010
It occurred to me that this could be done automatically by a bot for most of the images in Category:Wikipedia files that shadow a file on Wikimedia Commons.
If bots are allowed to move files, then the entire process can be automated, apart from the eventual F2 deletion.
For all items identified in the category:
Step 1. Identify the image tagged as {{ ShadowsCommons}}.
Step 1a. Check to see if it is actually shadowing.
Step 2. Rename filename.ext to filename (uploadtimestamp).ext.
Step 3. Remove {{ ShadowsCommons}} from the renamed file.
Step 4. Replace ALL transclusions and non-discussion-page links to the file.
Step 5. Tag the created redirect as F2.
Step 6. Repeat until the category is empty (or the only images remaining are protected generics).
Sfan00 IMG ( talk) 19:20, 23 September 2015 (UTC)
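Step 2 of the list above (deriving the disambiguated local filename from the upload timestamp) could be sketched as follows; the timestamp format shown is just an example.

```python
def shadow_rename(filename, upload_timestamp):
    """Rename 'name.ext' to 'name (uploadtimestamp).ext'."""
    stem, dot, ext = filename.rpartition(".")
    if not dot:  # no extension; unlikely for a file page
        return "%s (%s)" % (filename, upload_timestamp)
    return "%s (%s).%s" % (stem, upload_timestamp, ext)
```

The rest of the workflow (checking that the file really shadows a Commons file, moving the page, retargeting transclusions, tagging the redirect F2) would go through the normal bot framework and is not shown here.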
I request a bot that places a tag on the talk page with a notice that the article in question has completed its week as that week's TAFI selected article, as this is not done consistently today. -- BabbaQ ( talk) 23:40, 26 September 2015 (UTC)
User:Stefan2 has a query here - https://quarry.wmflabs.org/query/950 - which is used to identify files which have the same name on Commons and locally, but which may not be the same media.
Would it be possible for a bot task to run through the list periodically and tag the local files with whatever template CSD F8 uses, unless the local copy is already so tagged or carries a {{ Keep Local}} template? Sfan00 IMG ( talk) 14:52, 3 October 2015 (UTC)
I am requesting that you please accept my bot request. I am a Wikipedia editor and have now created 30 pages, so please give me a bot to check their mistakes and correct them. Thank you. -- Productable Khan ( talk) 16:42, 9 October 2015 (UTC)
Wikipedia has a Citation overkill policy, which says that you do not need "more than a couple" citations to back up a single claim. I think a bot could easily recognize when five or more citations are mixed together (like this: [1][2][3][4][5]) and address the problem by adding a " too many citations" tag after the mix of 5 or more citations (like this: [1][2][3][4][5][too many citations]). I believe it would be very helpful because the bot can make more users aware of this policy and it would be less likely to happen in the future.-- Proud User ( talk) 18:06, 9 October 2015 (UTC)
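Detecting a run of five or more adjacent citations as described above could be done with a regex over the wikitext. This sketch only counts consecutive <ref> tags (both self-closing reuses and full references); the threshold of five comes from the request, everything else is an assumption.

```python
import re

# A "citation marker" is either a self-closing <ref .../> reuse or a
# full <ref ...>...</ref>; flag five or more of them back to back.
OVERKILL_RE = re.compile(r"(?:<ref[^>]*/>|<ref[^>]*>.*?</ref>){5,}", re.S)

def has_overkill(text):
    return OVERKILL_RE.search(text) is not None
```

A tagging bot would insert the cleanup template immediately after each matched run rather than editing the citations themselves.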
An adminbot needs to revert pages for days in 2003 and 2004 to their last non-redirect version and move them to their corresponding Portal:Current events page (e.g. January 1, 2003 to Portal:Current events/2003 January 1). If the corresponding Portal page already exists, a history merge will be performed (this is why an adminbot is needed). If the day page has history only as a redirect, it can simply be deleted (another reason for an adminbot). Some users that have previously done such moves are Fram, AnomieBOT (a bot), and Waldir. GeoffreyT2000 ( talk) 19:31, 24 September 2015 (UTC)
I request a bot that places the GOCE tag on the talk pages of articles that have been through a completed copy-edit by the GOCE project. A GOCEbot, perhaps. -- BabbaQ ( talk) 23:38, 26 September 2015 (UTC)
Alright, let's go with Needs wider discussion. — MusikAnimal talk 16:21, 30 September 2015 (UTC)
A bot needs to remove transclusions of Template:Sort from thousands of pages. The template itself can then be deleted. GeoffreyT2000 ( talk) 00:28, 3 October 2015 (UTC)
Needs wider discussion.
This request concerns the page Wikipedia:Articles for creation/Redirects. As each request on this page is dealt with, it is encapsulated with the code {{afc-c|a}}/{{afc-c|d}} and {{afc-c|b}}. This closes it and places it in a box, now ready for archival. At the moment, we haven't got a bot to move it to the month's archive page which can recognise that the request has been closed. Conventional "archive after x days" bots are unsuitable because some requests can sit on the page for up to a month while they are deliberated over and discussed. There is no average time for requests to be there, so some are done within a day and others may be done a few days or weeks later, depending on what the reviewers do.
Because of this, archival has been done exclusively by hand, picking out the closed requests and cut-pasting them into the archive. Because there is a high turnover of requests, this task is required daily, at about the same time; this means that it can get tiring, and real-life reasons to stop editing can cause a pileup of old requests. Because I know we have bots which can detect things about sections already on pages such as WP:RFPP and WP:AIV, I don't see why one can't be developed and run on this page. Here's the simple breakdown of how the task should run:
Who will take on such a task? Rcsprinter123 (remark) 19:45, 12 October 2015 (UTC)
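The closed-request detection described above (rather than "archive after x days") could be sketched as a regex over the page text: a request is archivable once it is wrapped in {{afc-c|a}} or {{afc-c|d}} ... {{afc-c|b}}. This is only an illustration of the matching step, not the full move-to-archive logic.

```python
import re

# A closed block starts with {{afc-c|a}} (accepted) or {{afc-c|d}}
# (declined) and ends with {{afc-c|b}} (bottom of the box).
CLOSED_RE = re.compile(r"\{\{afc-c\|[ad]\}\}.*?\{\{afc-c\|b\}\}", re.S)

def closed_requests(text):
    """Return every closed request block, ready to cut to the archive."""
    return CLOSED_RE.findall(text)
```

The bot would then remove each matched block from Wikipedia:Articles for creation/Redirects and append it to the current month's archive page.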
The following exchange:
There can be many shared ip notices on talk pages as seen here: https://en.wikipedia.org/?title=User_talk:165.72.200.11&oldid=672279975
This can be confusing and looks bad. Only one is needed, at the bottom. TheMagikCow ( talk) 14:36, 20 July 2015 (UTC)
- I've set up automated archiving for that page. Perhaps a bot could do so for others? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:13, 20 July 2015 (UTC)
- Redundant We already have archiving bots that do a great job of archiving.— cyberpower Chat:Online 20:17, 27 August 2015 (UTC)
seems to have been closed and archived in error; my suggestion was not that bots do the archiving, but that a bot be used to add the instructions for a bot to do so to the affected talk pages. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:32, 15 October 2015 (UTC)
Is there a bot that automatically archives conversations marked with this template? I know Commons has something like this and I imagine we at least had something at some point, but it's hard to find anything about it. Or will a page already configured with User:MiszaBot/config automatically archive {{ Resolved}} sections as if they were User:ClueBot III/ArchiveNow? czar 23:22, 17 October 2015 (UTC)
{{User:ClueBot III/ArchiveThis |archiveprefix=User talk:Citation bot/Archive |format=%%i |maxarchsize=20 |age=960000 |index=no |archivenow={{tl|resolved}},{{tl|Resolved}},{{tl|wontfix}},Fixedin,fixedin,{{tl|notabug}},{{tl|fixed}} }}
Maybe a bot that clicks on links and fixes redirects in Wikipedia links? Possible set-up for fixing redirects: if link "[[X]]" has been redirected to page "[[Y]]", then change the link source code from "[[X]]" to "[[Y|X]]". (This request makes more sense when viewed in the source code.) Thanks!
Badmonkey717 ( talk) 03:40, 18 October 2015 (UTC)
Declined Sometimes a link to the redirect is intentional per [[WP:NOTBROKEN]], as Jonesey95 wrote. -- Magioladitis ( talk) 09:12, 20 October 2015 (UTC)
An adminbot should protect all articles in Category:Articles tagged for copyright problems for an expiry time of 7 days. GeoffreyT2000 ( talk) 22:14, 16 October 2015 (UTC)
I suggest a bot that can remove duplicated citations. If you look at the source code, you can see what I mean by "duplicated citations". Qwertyxp2000 ( talk) 23:41, 6 April 2015 (UTC)
Markup | Renders as |
---|---|
===Without duplicated citations=== Lorem ipsum dolor sit amet, consectetuer adipiscing elit.<ref name="random thingy" group="example ref1">[https://www.google.com Random citation] Google. Retrieved at "random date".</ref> Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..<ref name="random thingy" group="example ref1" /> Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a. ====Dummy refs==== {{reflist|group="example ref1"}} {{tick}} This is acceptable ===With duplicated citations=== Lorem ipsum dolor sit amet, consectetuer adipiscing elit.<ref group="example ref2">[https://www.google.com Random citation] Google. Retrieved at "random date".</ref> Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..<ref group="example ref2">[https://www.google.com Random citation] Google. Retrieved at "random date".</ref> Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a. 
====Dummy refs==== {{reflist|group="example ref2"}} {{cross}} This is not acceptable |
Without duplicated citationsLorem ipsum dolor sit amet, consectetuer adipiscing elit. [example ref1 1] Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus.. [example ref1 1] Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a. Dummy refs
This is acceptable With duplicated citationsLorem ipsum dolor sit amet, consectetuer adipiscing elit. [example ref2 1] Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus.. [example ref2 2] Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a. Dummy refs
This is not acceptable |
<ref>([^\<]+)</ref>.+<ref>\1</ref>
will find about 20,000 candidates. The first 1% or so are listed at User:John of Reading/Sandbox. But remember that the AWB general fixes will only combine duplicate citations if the article already has at least one named reference, to avoid changing the citation style ( AWB documentation). -- John of Reading ( talk) 16:09, 29 June 2015 (UTC)
<ref(.|\n)*?>([^\<]+)<\/ref>.+<ref(.|\n)*?>\2<\/ref>
Sn1per (t) (c) 18:18, 2 July 2015 (UTC)
<ref.*?>([^\<]+)<\/ref>.+<ref.*?>\1<\/ref>
. -- John of Reading ( talk) 18:47, 2 July 2015 (UTC)
<ref([^\>]*)?>([^\<]*)</ref>.*?<ref(?!\1)[^\>]*?>\2</ref>
It should be able to find two refs where at least one has no attributes (i.e. no name=""), or where both have different attributes.
Sn1per (t) (c) 19:15, 2 July 2015 (UTC)
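Beyond finding candidates, the actual fix-up (the AWB behaviour mentioned above of naming the first occurrence and turning later identical refs into reuses) could be sketched like this. The auto-generated names and the restriction to plain unnamed <ref>...</ref> tags are assumptions for the sake of a short example.

```python
import re

def dedupe_refs(text):
    """Name the first copy of each repeated unnamed ref; reuse it afterwards."""
    seen = {}
    counter = [0]

    def repl(m):
        body = m.group(1)
        if body in seen:
            return '<ref name="%s" />' % seen[body]
        counter[0] += 1
        name = "auto%d" % counter[0]
        seen[body] = name
        return '<ref name="%s">%s</ref>' % (name, body)

    return re.sub(r"<ref>([^<]+)</ref>", repl, text)
```

As John of Reading notes, a real run would first check that the article already uses named references, to avoid changing the citation style.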
On behalf of myself and Figureskatingfan, we are looking at the possibility of having a bot assist us in the GA Cup. We held the first competition at the end of 2014/beginning of 2015 and, after its success, we are currently planning a second competition, hoping for it to be an even bigger success. In the first competition, some of the participants expressed their frustration with how the submission process for their Good article reviews was not very efficient. For the upcoming competition, we were wondering if it would be possible to have a bot scan the Good article nomination page for reviews being conducted by participants and add the appropriate review links to a page.
More specifically, the bot would ideally scan the nomination page and, if it found that, say, BenLinus1214 was reviewing an article, add it under the appropriate header.
If anyone is interested in helping us I would be glad to have you on board and will be more than happy to answer any questions!-- Dom497 ( talk) 23:25, 11 May 2015 (UTC)
Doing... For reasons explained here (among others), internet traffic should be encrypted. Recently, Wikimedia decided to use HTTPS by default, which raises the question of why we do not also convert external links to HTTPS (wherever this is an option). For instance, one of the most-linked websites on Wikipedia, the Internet Archive, has actually encouraged HTTPS inbound links since 2013, yet most of the external links on Wikipedia to them still use insecure HTTP. Also, all Google services offer HTTPS access, and Google encourages one to use it, but there are still thousands of links to Google Books, Google News, and YouTube with HTTP. Long story short, what I am asking for is a simple search-and-replace bot, to convert:
http://[wayback.|web.|*]archive.org/
→ https://[wayback.|web.|*]archive.org/
http://[news.|books.|*]google[.com|.co.uk|.ca|...]/
→ https://[news.|books.|*]google[.com|.co.uk|.ca|...]/
youtube
... you get the idea. Is it possible to have this done by a bot? -- bender235 ( talk) 17:43, 27 June 2015 (UTC)
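The search-and-replace could be restricted to a whitelist of hosts known to serve HTTPS, along the lines of this sketch. The host list here is illustrative only, not the full set the request implies, and a production run would need far more care (per the concerns raised below about masking vandalism).

```python
import re

# Upgrade http:// to https:// only for whitelisted hosts (and their
# subdomains); everything else is left untouched.
SAFE_HOSTS = r"(?:[\w.-]*\.)?(?:archive\.org|youtube\.com|google\.(?:com|co\.uk|ca))"
HTTP_RE = re.compile(r"http://(%s)" % SAFE_HOSTS)

def upgrade(text):
    return HTTP_RE.sub(r"https://\1", text)
```

Note that only the scheme changes; the country-specific Google domains are deliberately preserved.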
I pointed out to Bender235 that, over and above altering links from "http:" to "https:", changing from a country-specific address (such as .co.nz) to .com may deny access to some people, as there sometimes appears to be a restriction on access to text in one country but not another. Bender235 wants proof of this; as I have not kept records of it and I make a lot of edits, I will provide one when I come across it, but in the meantime I see no need to change the country domain along with the connection type.
It has been pointed out that this sort of edit can easily mask vandalism (see User talk:Bender235#https), so as it is not a change that needs expediting, that must be weighed in deciding whether this is a suitable candidate for automation (rather than, for example, adding it to a process like AWB to be done when other more specific changes are made). See also User talk:Bender235#AWB; Bender235's AWB access was removed by user:Materialscientist on 2 July 2015 (it has not been restored). When discussing this on Bender235's talk page, Bender235 suggested that the discussion Wikipedia:Village pump (technical)/Archive 138#HTTPS by default was relevant to this and so should probably be included in this conversation.
-- PBS ( talk) 09:50, 13 July 2015 (UTC)
This Catscan query:-
http://tools.wmflabs.org/catscan3/catscan2.php?depth=10&categories=Copy+to+Wikimedia+Commons%0D%0AWikipedia+files+with+the+same+name+on+Wikimedia+Commons&ns[6]=1&sortby=uploaddate&ext_image_data=1&file_usage_data=1
Is there a way for a bot to handle this periodically? Namely, removing the {{ Copy to Wikimedia Commons}} tag, so people aren't confused about what ACTUALLY does need to be reviewed and transferred? Sfan00 IMG ( talk) 10:43, 2 October 2015 (UTC)
I'd like to request a bot to tag all articles, categories, subcategories, and templates under the parent Category:Pakistan with Template:WikiProject Pakistan. It's been a while since bot-assisted WP:PAK tags were added en masse (the last time was in early 2012), and I know that there are hundreds of pages that need tagging. A big thanks and a complimentary barnstar await any bot who could take the initiative. Many thanks, Mar4d ( talk) 02:49, 10 October 2015 (UTC)
Every day, the Template talk:Did you know page is updated by moving the Current nominations level 2 section header to one newer day, and adding a new level 3 section header for articles created/expanded on that day. This task is currently done manually by a human. Examples: September 26, September 25, September 22, September 1.
I think this once-a-day task may be done better by a bot. Note that this is my first bot request, so please notify me if I have made any mistakes. sst flyer 15:34, 26 September 2015 (UTC)
Doing... Seems simple enough. This could run exactly at 00:00 UTC if we want. Happy to implement this, I don't think it will be hard — MusikAnimal talk 04:42, 30 September 2015 (UTC)
( ←) BRFA filed Sorry for the delay, got held up with other technical work — MusikAnimal talk 01:55, 9 October 2015 (UTC)
<noinclude>...</noinclude>. Given we have the list of the nominations at T:TDYK, it shouldn't be terribly difficult for the bot to check each one and, if it has been closed, remove it from the list. How do you feel about automating this process? T:TDYK is a very large page with lots of transclusions and can take quite a while to load at times. If the bot automated removing redundant transclusions to keep the page tidy, it might overall speed things up for us. For performance/efficiency, it would only check entries in "Older nominations". Pinging @ Allen3, SSTflyer, and BlueMoonset: who might be interested — MusikAnimal talk 00:09, 15 October 2015 (UTC)
A bot should replace * * with ''' ''' and _ _ with '' ''. GeoffreyT2000 ( talk) 00:02, 21 October 2015 (UTC)
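The replacement described above could be sketched with two regexes. This is naive by design: it assumes the asterisks and underscores really are stray bold/italic markup hugging the text, and it would need guards in practice (list markers and snake_case identifiers also use these characters).

```python
import re

def to_wikitext(text):
    """Convert *bold* to '''bold''' and _italic_ to ''italic''."""
    text = re.sub(r"\*(\S(?:.*?\S)?)\*", r"'''\1'''", text)
    text = re.sub(r"_(\S(?:.*?\S)?)_", r"''\1''", text)
    return text
```

Because of the false-positive risk, this is the kind of change that probably belongs in a logging or semi-automated pass rather than a fully automatic one.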
There used to be a category (and a bot that forced articles into the category) that kept track of Draft-class articles without an AFC submission banner of any type. I've also seen some lost into the ether because the submit substitution was screwed up somehow. Could a bot create a list of all draft-space articles without a call to Template:AFC submission? Depending on the volume created, this may be worth doing regularly (monthly?) as a backlog at Wikipedia:WikiProject Articles for creation or something. -- Ricky81682 ( talk) 19:53, 26 October 2015 (UTC)
( ←) See this Quarry. Assuming my SQL is right, there are around 1026 draft pages that have not been edited in the past six months. Most of these look like test pages, vandalism, or WP:WEBHOST violations. I even just deleted an attack page. Furthermore, nearly all that I've checked have fewer than 5 edits made to them. I suppose the lack of articles makes sense, as many content creators would have instead found their way into the draftspace via article creation links, which insert an AfC template. Either way it looks like there's a lot of stuff to review here. I can make a tool to interact with this data easier — MusikAnimal talk 05:57, 27 October 2015 (UTC)
Ricky81682 (RE to 21:15, 26 Oct 2015 UTC) I would not bulk-MFD them, as the argument you're using ("that they're stale and haven't been touched") was rejected multiple times for non-AFC draftspace pages. I would strenuously suggest you go round up a consensus at WT:Drafts prior to nominating for MFD. Getting the consensus also has the side benefit of stirring the community up to support your MFD nominations. Once you can satisfy the CSD requirements (Objective, Uncontestable, Frequent, Non-redundant) there'll be a wonderful case for using CSD to vaporize the poor drafts. Hasteur ( talk) 14:26, 27 October 2015 (UTC)
Please keep the changes that I have made. What is the problem? Sir, please do this. — Preceding unsigned comment added by Aamir rodaba ( talk • contribs) 18:46, 19 December 2015 (UTC)
Since I've been the only one active on Template talk:YouTube for the past two months, I am going to claim consensus for my proposed changes to the template. But before I rewrite the template, I need a bot to go to every page using it, and replace the channel parameter with user. Thanks, 117Avenue ( talk) 00:57, 9 November 2015 (UTC)
Please take part in the ongoing discussion at Wikipedia:Village pump (technical)#Reducing the load of WP:TAFI (unofficial manager: Northamerica1000) to make our lives over at WP:TAFI that little bit easier. :) -- Coin945 ( talk) 15:27, 30 October 2015 (UTC)
Is there a way a bot could give out WP:Deletion to Quality Awards ?
Here's what it would have to do:
You can say, on behalf of Cirt and WP:Deletion to Quality Awards.
And also, any way a bot could update the "Hall of Fame" table at Wikipedia:Deletion_to_Quality_Award#Deletion_to_Quality_Award_Hall_of_Fame ?
Thoughts ?
Any help would be most appreciated,
— Cirt ( talk) 05:04, 21 October 2015 (UTC)
(user, article, award_type) tuples is fine, updating the WP:DQUAL list with it is fine, and manually giving out awards from that list is fine, but I'm wary of an automated thing. More on the higher-level merits of the task, I note many of these AfDs were closed with strong, even speedy, keep rationales—I wonder if those should be exempt. — Earwig talk 19:49, 4 November 2015 (UTC)
An adminbot should delete all redirects created by Neelix, many of which are currently at RfD. GeoffreyT2000 ( talk) 17:43, 5 December 2015 (UTC)
Would someone be ever-so-kind as to set up a bot to convert a deprecated parameter? The total number of articles would be about 340, with one edit in each article. The lists are at Template talk:S-rel/oc lists, with the new parameter for each. For example "Change these {{s-rel|oc}} to {{s-rel|chal}}". The discussion was/is at Template talk:S-rel#Introduce two new parameters. tahc chat 03:59, 15 November 2015 (UTC)
Hello. Could I hire a bot to substitute all transclusions of {{ Infobox Country World Championships in Athletics}}, per the outcome of this TfD? Alakzi ( talk) 13:12, 20 June 2015 (UTC)
After the recent update of the Wikipedia:WikiProject Mountains banner ( Template:WikiProject Mountains) to include two new parameters for mountains in the Alps (see discussion here), I would like to update the talk page of every article concerned (all in Category:Mountains of the Alps, no subcategories) by adding:
|alps=yes | alps-importance=
to:
{{WikiProject Mountains | class= | importance= }}
result:
{{WikiProject Mountains | class= | importance= | alps=yes | alps-importance=[same as "importance"] }}
ZachG (Talk) 18:50, 16 November 2015 (UTC)
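The insertion described above, including copying the existing |importance= value into |alps-importance=, could be sketched like this. It assumes a single {{WikiProject Mountains}} banner per talk page and does not handle every formatting variation (multi-line banners, unusual whitespace), which a real run would need to.

```python
import re

def add_alps(banner):
    """Append |alps=yes and |alps-importance= (copied from |importance=)."""
    m = re.search(r"\|\s*importance\s*=\s*([^|}]*)", banner)
    imp = m.group(1).strip() if m else ""
    return banner.rstrip("} ").rstrip() + " | alps=yes | alps-importance=%s }}" % imp
```

General fixes (banner placement, normalisation) would be layered on top, as discussed in the replies.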
{{WikiProject Mountains | class= | importance= | alps=yes | alps-importance= }}
{{WikiProject Mountains | class=stub | importance= | alps=yes | alps-importance= }}
Hazard-SJ I can help with the task. For instance, in this one the WikiProject banner should have been below the other template. This can be done if you enable general fixes in AWB. You should also enable this module to normalise all WikiProject banners and avoid placement problems. -- Magioladitis ( talk) 16:23, 24 November 2015 (UTC)
Hazard-SJ my mistake. I thought you were using AWB. What I can do is to ensure the correct placement of the banners etc. You can do the rest. -- Magioladitis ( talk) 12:32, 28 November 2015 (UTC)
All done here. -- Magioladitis ( talk) 09:43, 29 November 2015 (UTC)
I would like to have a bot named 'KNOWLEDGEBOT'. I want a bot so that I could edit pages more speedily than I can now, and to help everyone here. I hereby accept the bot policy and take all responsibility for the bot; I won't allow it to violate anything, and I will oversee its way of commenting and communication. It won't do any harm or edit too speedily; I will supervise the bot. I request you to create this bot with me as its bot operator. I am responsible for all of its acts, repairs, communication language, etc. I will supervise my bot and it will be in my control. Regards BOTFIGHTER ( talk) 13:57, 3 December 2015 (UTC)
Could someone create a bot that follows a betting method on a roulette website, please? The website is www.csgoskins.net.
- Step 1: Bet 1/1023 of the credits I have on black
- Step 2:
- if I won: bet 1/1023 of the credits I have on red
- if I lost: bet 1/511 of the remaining credits I have on black
- if I lost again: bet 1/255 of the credits I have on black
- if lost again: 1/127 of the credits on black
- if lost again: 1/63 of the credits on black
- if lost again: 1/31 of the credits on black
- if lost again: 1/15 of the credits on black
- if lost again: 1/7 of the credits on black
- if lost again: 1/3 of the credits on black
- if lost again: all of the remaining credits on black
So basically, if I win, restart the method on the other color; if I lose, double the bet on the same color until it wins, then start again on the other color. I have no idea how difficult this kind of bot is to make, since I don't have any programming experience, but I would appreciate it if someone would do it for me. Thank you for the help. — Preceding unsigned comment added by Neate ( talk • contribs) 18:23, 30 December 2015 (UTC)
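For what it's worth, the fraction sequence described above is a ten-step martingale: betting 1/1023, then 1/511, 1/255, ... of the *remaining* credits works out to doubling the absolute stake after every loss. The progression alone can be computed like this (this is just the arithmetic; it says nothing about whether such a bot is a good idea, and martingales do not change the expected loss).

```python
def stakes(bankroll, steps=10):
    """Stake sizes for a martingale on fractions 1/(2**(steps-k) - 1)."""
    out = []
    for k in range(steps):
        stake = bankroll / (2 ** (steps - k) - 1)
        out.append(stake)
        bankroll -= stake  # assume this step lost; bet again from the remainder
    return out
```

Starting from 1023 credits, the stakes come out as 1, 2, 4, ..., 512: each loss doubles the bet, and the final step wagers everything left.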
It has become common practice in album articles to use {{ Start date}} in the {{ Singles}} add-on to {{ Infobox album}}. Per Template:Start date/doc: "The purpose of the {{ start date}} template is to return the date (or date-time) that an event or entity started or was created. It also includes duplicate, machine-readable date (or date-time) in the ISO date format (which is hidden by CSS), for use inside other templates (or table rows) which emit microformats. It should only be used once in each such template and should not be used outside such templates." i.e. {{ Start date}} should only be used in album articles for the album release date, not single release dates. It would be nice to have a bot to clean this up, as this error is currently in who knows how many articles. Chase ( talk | contributions) 16:44, 5 July 2015 (UTC)
Dead links in external-links sections are useless; we provide the links for additional reading, not for citations, so if you can't access them, they're pointless — they always need to be fixed or removed. Could a bot go through Category:All articles with dead external links and record ones with dead links in the EL sections, either adding by a new category (e.g. Category:Articles with dead links in External Links sections, or something of the sort) or listing them on a tracking page? I'm imagining that it opens each page, finds each occurrence of {{ dead link}} or redirects thereto, and records the ones in which one or more of these templates appears below ==External links== (or == External links ==) and above the next set of equals signs. I'm asking that the bot only record these pages, without doing anything else, because fixing or removing these links is a CONTEXTBOT situation. Nyttend ( talk) 01:14, 27 November 2015 (UTC)
Per the discussion (and background) at Wikipedia:Administrators'_noticeboard#Category:AfD_debates_relisted_3_or_more_times, can we get a bot set up to check Category:AfD debates relisted 3 or more times, and remove the category from closed discussions. I had been doing this every few days using AWB, but would prefer to have something automated do it. There was talk of getting an AfD closing script to do it, however not everyone uses the same script, or a script at all. Much obliged. -- kelapstick( bainuu) 21:20, 3 December 2015 (UTC)
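The cleanup step described above could be sketched as a small text transform. This is only an illustration: it assumes closed AfDs can be recognized by the "xfd-closed" CSS class that the closing templates emit, which a real bot operator would want to verify first.

```python
import re

RELIST_CAT = re.compile(
    r"\[\[Category:AfD debates relisted 3 or more times[^\]]*\]\]\n?"
)

def strip_relist_category(wikitext: str) -> str:
    """Remove the relist tracking category, but only from discussions
    that are already closed.  Assumption: closed AfDs carry the
    'xfd-closed' class emitted by the closing template."""
    if "xfd-closed" not in wikitext:
        return wikitext  # still open: leave the category alone
    return RELIST_CAT.sub("", wikitext)
```

A bot would fetch each member of the category, run this, and save only when the text changed.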
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 60 | ← | Archive 64 | Archive 65 | Archive 66 | Archive 67 | Archive 68 | → | Archive 70
I've created new maps for almost every city, village, and township in Ohio, and I'm about halfway done with adding them to their articles. This normally works fine, but I periodically make errors, and it would help if a bot could check all 2200 of these pages for errors. In some cases, I've added the map of one place to the article about a different one; for example, here I copy/pasted one article's map into another article. In other cases, I simply haven't switched all the elements of the description correctly; for example, here I used the correct map, but I made a mistake in the caption, since the city's in Trumbull County, not Mahoning County as suggested by the description.
All maps follow a rigid naming convention: Map of COUNTYNAME County Ohio Highlighting PLACE TYPE.png. "Place" is simply the community or township name, and "Type" is City, Village, or Township (note the capital letter). Likewise, all captions follow the same convention: Location of PLACE in COUNTYNAME County, although "Township" is part of the PLACE section for townships (see the caption for Beaver Township, Mahoning County, Ohio). Given this convention, I expect that the bot can handle the situation easily. I'm imagining the following (collapsed so this request doesn't appear so massive):
Extended content
Final notes: (1) When an article already had a good detailed map, I didn't add the new one. Therefore, if the bot doesn't find "County Ohio Highlighting" at all (this will be the case at Seven Hills, Ohio, for example), it should go to the next page without logging anything at all, because in most or all such cases, there's no problem. (2) Since the project isn't done, I'd appreciate it if you didn't run the full check until I tell you that I'm done. (3) Some municipalities are in multiple counties, so it's possible that the map link would be "wrong"; the bot would find an error when opening Adena, Ohio from {{ Guernsey County, Ohio}}, for example. Coding to avoid this error might require a lot of work, but these situations are rare, so don't worry about it. (4) Finally, since the bot's just logging pages that might be wrong, WP:CONTEXTBOT shouldn't apply.
Nyttend ( talk) 18:12, 18 August 2015 (UTC)
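Given the rigid naming convention described above, the consistency check could be sketched like this (the regex names and return values are my own; the township rule follows the caption convention in the request):

```python
import re

MAP_RE = re.compile(
    r"Map of (?P<county>.+?) County Ohio Highlighting "
    r"(?P<place>.+?) (?P<type>City|Village|Township)\.png"
)
CAPTION_RE = re.compile(r"Location of (?P<place>.+?) in (?P<county>.+?) County")

def check_article(wikitext: str):
    """Return a mismatch description, or None if map and caption agree.
    Townships keep the word 'Township' in the caption's PLACE part."""
    m = MAP_RE.search(wikitext)
    if m is None:
        return None  # no new-style map: nothing to log
    c = CAPTION_RE.search(wikitext)
    if c is None:
        return "map present but no convention-style caption"
    expected_place = m.group("place")
    if m.group("type") == "Township":
        expected_place += " Township"
    if c.group("place") != expected_place or c.group("county") != m.group("county"):
        return "map says %s / %s County, caption says %s / %s County" % (
            m.group("place"), m.group("county"),
            c.group("place"), c.group("county"))
    return None
```

The bot would log only articles where this returns a string, matching the request that it never edit anything itself.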
I am proposing a redrafting of the Wikipedia:Naming conventions (ships) guideline, at user:Saberwyn/Proposed ship naming and disambiguation conventions update. One of the major changes follows the outcome of a Request for Comment on the matter of ship article disambiguation. The current form of the proposal is that all ship articles requiring disambiguation will be disambiguated in the form "(yyyy)", where yyyy = year of launch. If the proposed guideline is accepted, over 26,000 ship articles (number determined by transclusions of {{ infobox ship career}} - selected to include articles with infoboxes on specific ships, as opposed to class articles) will need to be checked for compliance with the guideline, and if not compliant, moved to a compliant title. Could a bot be used to check the titles of all ship articles (as determined per number above), check that year of launch is the method of disambiguation, and if not, move the article?
The bot will have to:
A second bot operation (or both passes if the above is not possible) could generate a list of articles that use a civilian ship prefix (which, under the proposal, will also be deprecated as part of an article title in most cases), so that humans can review article titles and move those necessary to a date-disambiguation title. I'm reluctant to suggest using a bot to move this group, as the prefix may be part of the common name for the subject.
So, theoretically (because the proposal may not pass in this form, or at all), is it possible to create/adapt a bot to do this, how difficult would it be, and what technical problems would have to be surmounted? Any opinions on the appropriateness of the proposed method of disambiguation should be directed to the proposal. -- saberwyn 03:45, 13 September 2015 (UTC)
|ship launched= parameter. I wouldn't use Category:YYYY ship, because it's possible mistakes have been made in applying that category (listing year built instead of year launched, for instance). The parameter is more explicit in what the date means, so it's less likely to contain errors. You will need specific consensus on what the disambiguator should be, as this was not covered in the RfC you linked. For instance, should we use (1936) or (launched 1936)? ~ Rob Talk 16:25, 13 September 2015 (UTC)
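A compliance check along the lines Rob suggests, pulling the year from |ship launched= and comparing it with the title's disambiguator, might look like this (a sketch only; both "(1936)" and "(launched 1936)" disambiguators are accepted, since that choice is still open):

```python
import re

TITLE_DAB = re.compile(r"\((?:launched )?(\d{4})\)$")
LAUNCHED = re.compile(r"\|\s*ship launched\s*=\s*.*?(\d{4})")

def needs_move(title: str, wikitext: str):
    """True if the title's disambiguator disagrees with (or lacks)
    the launch year; None if no launch year can be found, meaning
    the article should be flagged for human review instead."""
    m = LAUNCHED.search(wikitext)
    if m is None:
        return None
    t = TITLE_DAB.search(title)
    return t is None or t.group(1) != m.group(1)
```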
Typing a specific date, for example "15 February 2013", into our search box, finds articles with that date in references or maintenance templates, before events which happened on that date. Therefore, for every page like Portal:Current events/2013 February 15, we should have redirects from, at least:
and possibly:
and others. Once the backlog is done, we'd need a maintenance bot to create each new day's set.
We also need to decide what to do for dates where no Portal:Current events page exists (mostly pre-1999). Perhaps redirect to the relevant year? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:02, 8 September 2015 (UTC)
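The exact lists of redirect formats did not survive in this copy of the discussion. Purely as an illustration, assuming a few common ways a reader might type a date, generating one day's set of redirect titles and their target could look like:

```python
from datetime import date

def portal_target(d: date) -> str:
    # e.g. Portal:Current events/2013 February 15
    return "Portal:Current events/%d %s %d" % (d.year, d.strftime("%B"), d.day)

def redirect_titles(d: date):
    """Candidate redirect titles for one day, in formats a reader
    might type into the search box (an assumed, non-exhaustive list)."""
    month = d.strftime("%B")
    return [
        "%d %s %d" % (d.day, month, d.year),   # 15 February 2013
        "%s %d, %d" % (month, d.day, d.year),  # February 15, 2013
        d.isoformat(),                         # 2013-02-15
    ]
```

A maintenance bot would call this once a day for the new date, and once per historical date for the backlog.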
Discussion seems to have stalled. How can we take this forward? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:07, 15 September 2015 (UTC)
It has been proposed here that an adminbot be used to restore some 400,000 old IP talk pages that were deleted by user MZMcBride. It is to be noted that about 10,000 such pages have already been restored by user:MusikBot per this BRFA. See the BN discussion and the BRFA for further details. 103.6.159.68 ( talk) 19:11, 17 September 2015 (UTC)
I think it'd be alright to try get a consensus here, as there's no need to fragment the discussion to more pages. I have left a note at WP:VPR and WP:VPT. 103.6.159.88 ( talk) 06:07, 19 September 2015 (UTC)
A bot should fix all remaining instances of T44616. MZMcBride has already fixed some in December 2012; at the time it was Bug 42616. This will make the moves revertible only by administrators. GeoffreyT2000 ( talk) 16:20, 24 September 2015 (UTC)
Requesting bot assistance to tag talkpages of all categories and subcategories within Category:Establishments in Rivers State by year and Category:History of Rivers State by period with {{NigeriaProject|Rivers State=yes}}. Thanks. Stanleytux ( talk) 11:28, 23 September 2015 (UTC)
{{WikiProject Nigeria|Rivers State=yes}}. Stanleytux ( talk) 13:10, 23 September 2015 (UTC)
{{WikiProject Nigeria|Rivers State=yes}}. Thanks. Stanleytux ( talk) 13:24, 23 September 2015 (UTC)
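The tagging step itself is a small wikitext transform; a sketch (category traversal via the API is omitted, and the placement rule, banner at the top if missing, is an assumption):

```python
import re

BANNER = "{{WikiProject Nigeria|Rivers State=yes}}"

def tag_talk_page(talk_wikitext: str) -> str:
    """Add the banner, or just the parameter if the banner exists."""
    m = re.search(r"\{\{\s*WikiProject Nigeria\b([^}]*)\}\}", talk_wikitext)
    if m is None:
        return BANNER + "\n" + talk_wikitext
    if "Rivers State" in m.group(1):
        return talk_wikitext  # already tagged
    upgraded = m.group(0)[:-2] + "|Rivers State=yes}}"
    return talk_wikitext.replace(m.group(0), upgraded, 1)
```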
Recently I did an edit like this to potentially de-eclipse an image Shadowed at Commons.
https://en.wikipedia.org/?title=File%3ABohr_model.jpg&type=revision&diff=682440062&oldid=634253010
It occurred to me that this could be done automatically by a bot for most of the images in Category:Wikipedia files that shadow a file on Wikimedia Commons.
If bots are allowed to move files, then the entire process can be automated, apart from the eventual F2 deletion.
For all items identified in the category:
- Step 1: Identify the image tagged as {{ ShadowsCommons}}.
- Step 1a: Check to see if it is actually shadowing.
- Step 2: Rename filename.ext to filename (uploadtimestamp).ext.
- Step 3: Remove {{ ShadowsCommons}} from the renamed file.
- Step 4: Replace ALL transclusions and non-discussion-page links to the file.
- Step 5: Tag the created redirect as F2.
- Step 6: Repeat until the category is empty (or the only images remaining are protected generics).
Sfan00 IMG ( talk) 19:20, 23 September 2015 (UTC)
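Steps 2 and 4 of the scheme above can be sketched as string transforms (the timestamp format and the bare "File:" replacement are simplifying assumptions; "Image:" and localized aliases would need the same treatment):

```python
def disambiguated_name(filename: str, upload_timestamp: str) -> str:
    """Step 2: 'filename.ext' -> 'filename (uploadtimestamp).ext'."""
    stem, dot, ext = filename.rpartition(".")
    return "%s (%s)%s%s" % (stem, upload_timestamp, dot, ext)

def retarget(wikitext: str, old: str, new: str) -> str:
    """Step 4, naively: repoint File: links and transclusions."""
    return wikitext.replace("File:" + old, "File:" + new)
```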
I request a bot that places a tag on the talk page with a notice that the article in question has been through its week as that week's TAFI selected article. This is not done consistently today. -- BabbaQ ( talk) 23:40, 26 September 2015 (UTC)
User:Stefan2 has a query here - https://quarry.wmflabs.org/query/950 - which is used to identify files which have the same name on Commons and locally, but which may not be the same media.
Would it be possible for a bot task to run through the list periodically to tag files locally unless the local copy is already tagged using whatever CSD F8 uses, or there is a {{ Keep Local}} template on the local copy? Sfan00 IMG ( talk) 14:52, 3 October 2015 (UTC)
I am requesting that you please accept my bot request. I am a Wikipedia editor and have now created 30 pages, so please give me a bot to check their mistakes and correct them. Thank you. -- Productable Khan ( talk) 16:42, 9 October 2015 (UTC)
Wikipedia has a Citation overkill policy, which says that you do not need "more than a couple" citations to back up a single claim. I think a bot could easily recognize when five or more citations are mixed together (like this: [1][2][3][4][5]) and address the problem by adding a " too many citations" tag after the mix of 5 or more citations (like this: [1][2][3][4][5][too many citations]). I believe it would be very helpful because the bot can make more users aware of this policy and it would be less likely to happen in the future.-- Proud User ( talk) 18:06, 9 October 2015 (UTC)
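Detecting the pattern could be as simple as a regex pass over the wikitext, counting adjacent <ref> elements. A sketch (it only matches refs with no whitespace between them, and the threshold of five follows the request):

```python
import re

# One <ref> element: either self-closing (a named reuse) or with a body.
ONE_REF = r"<ref[^>/]*(?:/>|>.*?</ref>)"
OVERKILL = re.compile(r"(?:%s){5,}" % ONE_REF, re.DOTALL)

def find_overkill(wikitext: str):
    """Spans where five or more <ref>s sit back to back."""
    return [m.span() for m in OVERKILL.finditer(wikitext)]
```

The bot would insert the inline cleanup tag immediately after each matched span.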
An adminbot needs to revert pages for days in 2003 and 2004 to their last non-redirect version and move them to their corresponding Portal:Current events page (e.g. January 1, 2003 to Portal:Current events/2003 January 1). If the corresponding Portal page already exists, a history merge will be performed (this is why an adminbot is needed). If the day page has history only as a redirect, it can simply be deleted (another reason for an adminbot). Some users that have previously done such moves are Fram, AnomieBOT (a bot), and Waldir. GeoffreyT2000 ( talk) 19:31, 24 September 2015 (UTC)
I request a bot that places the GOCE tag on the talk pages of articles that have been through a completed copy-edit by the GOCE project. A GOCEbot perhaps. -- BabbaQ ( talk) 23:38, 26 September 2015 (UTC)
Alright, let's go with Needs wider discussion. — MusikAnimal talk 16:21, 30 September 2015 (UTC)
A bot needs to remove transclusions of Template:Sort from thousands of pages. The template itself can then be deleted. GeoffreyT2000 ( talk) 00:28, 3 October 2015 (UTC)
Needs wider discussion.
This request concerns the page Wikipedia:Articles for creation/Redirects. As each request on this page is dealt with, it is encapsulated with the code {{afc-c|a}}/{{afc-c|d}} and {{afc-c|b}}. This closes it and places it in a box, now ready for archival. At the moment, we haven't got a bot to move it to the month's archive page which can recognise that the request has been closed. Conventional "archive after x days" bots are unsuitable because some requests can sit on the page for up to a month while they are deliberated over and discussed. There is no average time for requests to be there, so some are done within a day and others may be done a few days or weeks later, depending on what the reviewers do.
Because of this, archival has been done exclusively by hand, picking out the closed requests and cut-pasting them into the archive. Because there is a high turnover of requests, this is a task required daily, at about the same time; this means that it can get tiring, and real-life breaks from editing can cause a pileup of old requests. Because I know we have bots which can detect things about sections already on pages such as WP:RFPP and WP:AIV, I don't see why one can't be developed and run on this page. Here's the simple breakdown of how the task should run:
Who will take on such a task? Rcsprinter123 (remark) 19:45, 12 October 2015 (UTC)
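Recognizing closed requests is mechanical, since each one is wrapped in {{afc-c|a}} or {{afc-c|d}} at the top and {{afc-c|b}} at the bottom. A sketch of the split (the archiving write itself is omitted):

```python
import re

# A closed request: opened with {{afc-c|a}} (accepted) or {{afc-c|d}}
# (declined) and terminated by {{afc-c|b}}.
CLOSED = re.compile(r"\{\{afc-c\|[ad]\}\}.*?\{\{afc-c\|b\}\}", re.DOTALL)

def split_closed(page_text: str):
    """Return (remaining_text, list_of_closed_chunks) ready for
    appending to the month's archive page."""
    closed = CLOSED.findall(page_text)
    remaining = CLOSED.sub("", page_text)
    return remaining, closed
```

Run daily, anything matched is cut from the live page and appended to the archive, leaving open requests untouched however long they sit.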
The following exchange:
There can be many shared ip notices on talk pages as seen here: https://en.wikipedia.org/?title=User_talk:165.72.200.11&oldid=672279975
This can be confusing and looks bad. Only one is needed, at the bottom. TheMagikCow ( talk) 14:36, 20 July 2015 (UTC)
- I've set up automated archiving for that page. Perhaps a bot could do so for others? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:13, 20 July 2015 (UTC)
- Redundant We already have archiving bots that do a great job of archiving.— cyberpower Chat:Online 20:17, 27 August 2015 (UTC)
seems to have been closed and archived in error; my suggestion was not that bots do the archiving; but that a bot be used to add the instructions for a bot to do so to the affected talk pages. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:32, 15 October 2015 (UTC)
Is there a bot that automatically archives conversations marked with this template? I know Commons has something like this and I imagine we at least had something at some point, but it's hard to find anything about it. Or will a page already configured with User:MiszaBot/config automatically archive {{ Resolved}} sections as if they were User:ClueBot III/ArchiveNow? czar 23:22, 17 October 2015 (UTC)
{{User:ClueBot III/ArchiveThis
|archiveprefix=User talk:Citation bot/Archive
|format=%%i
|maxarchsize=20
|age=960000
|index=no
|archivenow={{tl|resolved}},{{tl|Resolved}},{{tl|wontfix}},Fixedin,fixedin,{{tl|notabug}},{{tl|fixed}}
}}
Maybe a bot that follows links and fixes redirects in Wikipedia links? Possible set-up for fixing redirects: if link [[X]] has been redirected to page name Y, then change the link source code from [[X]] to [[Y|X]]. (This request makes more sense when viewed in the source code.) Thanks!
Badmonkey717 ( talk) 03:40, 18 October 2015 (UTC)
Declined Sometimes a link to the redirect is intentional per [[WP:NOTBROKEN]], as Jonesey95 wrote. -- Magioladitis ( talk) 09:12, 20 October 2015 (UTC)
An adminbot should protect all articles in Category:Articles tagged for copyright problems for an expiry time of 7 days. GeoffreyT2000 ( talk) 22:14, 16 October 2015 (UTC)
I suggest a bot that can remove duplicated citations. If you look at the source code, you can see what I mean by "duplicated citations". Qwertyxp2000 ( talk) 23:41, 6 April 2015 (UTC)
Markup:

===Without duplicated citations===
Lorem ipsum dolor sit amet, consectetuer adipiscing elit.<ref name="random thingy" group="example ref1">[https://www.google.com Random citation] Google. Retrieved at "random date".</ref> Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..<ref name="random thingy" group="example ref1" /> Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a.

====Dummy refs====
{{reflist|group="example ref1"}}

{{tick}} This is acceptable

===With duplicated citations===
Lorem ipsum dolor sit amet, consectetuer adipiscing elit.<ref group="example ref2">[https://www.google.com Random citation] Google. Retrieved at "random date".</ref> Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..<ref group="example ref2">[https://www.google.com Random citation] Google. Retrieved at "random date".</ref> Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a.

====Dummy refs====
{{reflist|group="example ref2"}}

{{cross}} This is not acceptable
<ref>([^\<]+)</ref>.+<ref>\1</ref> will find about 20,000 candidates. The first 1% or so are listed at User:John of Reading/Sandbox. But remember that the AWB general fixes will only combine duplicate citations if the article already has at least one named reference, to avoid changing the citation style ( AWB documentation). -- John of Reading ( talk) 16:09, 29 June 2015 (UTC)
<ref(.|\n)*?>([^\<]+)<\/ref>.+<ref(.|\n)*?>\2<\/ref> Sn1per (t) (c) 18:18, 2 July 2015 (UTC)
<ref.*?>([^\<]+)<\/ref>.+<ref.*?>\1<\/ref>. -- John of Reading ( talk) 18:47, 2 July 2015 (UTC)
<ref([^\>]*)?>([^\<]*)</ref>.*?<ref(?!\1)[^\>]*?>\2</ref> It (should) be able to find two refs, where at least one has no attributes, i.e. name="", or if both have different attributes. Sn1per (t) (c) 19:15, 2 July 2015 (UTC)
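Building on the regexes above, the actual merge could be sketched like this. The "autoN" naming scheme is hypothetical, and for simplicity refs that already carry a name= attribute keep their body but are not reused by name, which a production bot would want to handle:

```python
import re
from collections import Counter

DUP = re.compile(r"<ref.*?>([^<]+)</ref>.+?<ref.*?>\1</ref>", re.DOTALL)

def has_duplicate_citation(wikitext: str) -> bool:
    return DUP.search(wikitext) is not None

def merge_duplicates(wikitext: str) -> str:
    """Name the first occurrence of each repeated citation body and
    turn later repeats into self-closing reuses."""
    pat = re.compile(r"<ref(\s[^>/]*)?>([^<]+)</ref>")
    counts = Counter(m.group(2) for m in pat.finditer(wikitext))
    seen = {}
    def repl(m):
        body = m.group(2)
        if counts[body] < 2:
            return m.group(0)  # unique ref: leave untouched
        if body in seen:
            return '<ref name="%s" />' % seen[body]
        name = "auto%d" % (len(seen) + 1)
        seen[body] = name
        return '<ref name="%s">%s</ref>' % (name, body)
    return pat.sub(repl, wikitext)
```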
On behalf of myself and Figureskatingfan, we are looking at the possibility of having a bot assist us in the GA Cup. We held the first competition at the end of 2014/beginning of 2015 and, after its success, we are currently planning a second competition, hoping for it to be an even bigger success. In the first competition, some of the participants expressed their frustration with how inefficient the submission process for their Good article reviews was. For the upcoming competition, we were wondering if it would be possible to have a bot scan the Good article nominations page for reviews being conducted by participants and add the appropriate review links to a page.
More specifically, ideally, the bot would scan the nomination page and, if it saw that (say) BenLinus1214 was reviewing an article, add it under the appropriate header.
If anyone is interested in helping us I would be glad to have you on board and will be more than happy to answer any questions!-- Dom497 ( talk) 23:25, 11 May 2015 (UTC)
Doing... For reasons explained here (among others), internet traffic should be encrypted. Recently, Wikimedia decided to use HTTPS by default, which raises the question of why we do not also convert external links to HTTPS (wherever this is an option). For instance, one of the most-linked websites on Wikipedia, the Internet Archive, has actually encouraged HTTPS inbound links ever since 2013, yet most of the external links on Wikipedia to them still use insecure HTTP. Also, all Google services offer HTTPS access, and Google encourages one to use it, but there are still thousands of links to Google Books, Google News, and YouTube with HTTP. Long story short, what I am asking for is a simple search-and-replace bot, to convert:
- http://[wayback.|web.|*]archive.org/ → https://[wayback.|web.|*]archive.org/
- http://[news.|books.|*]google[.com|.co.uk|.ca|...]/ → https://[news.|books.|*]google[.com|.co.uk|.ca|...]/
- youtube ... you get the idea.
Is it possible to have this done by a bot? -- bender235 ( talk) 17:43, 27 June 2015 (UTC)
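The core of such a search-and-replace is a protocol upgrade restricted to a whitelist of hosts known to support HTTPS. A sketch (the host list is a sample, not the full set the request envisions, and it deliberately leaves country domains untouched):

```python
import re

# Hosts known (per the request) to support HTTPS; an assumed sample list.
SECURE_HOSTS = r"(?:[a-z0-9.-]*\.)?(?:archive\.org|google\.[a-z.]+|youtube\.com)"
HTTP_LINK = re.compile(r"http://(%s/)" % SECURE_HOSTS)

def upgrade_links(wikitext: str) -> str:
    """Rewrite http:// to https:// only for whitelisted hosts."""
    return HTTP_LINK.sub(r"https://\1", wikitext)
```

Everything else, including the country-specific Google domains, passes through unchanged, which sidesteps the access concern raised below.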
I pointed out to Bender235 that, over and above altering links from "http:" to "https:", changing from a country-specific address (such as co.nz) to .com may deny access to some people, as sometimes there appears to be a restriction on access to text in one country but not another. Bender235 wants proof of this; as I have not kept records of it and I make a lot of edits, I will provide one when I come across it, but in the meantime I see no need to change the country domain along with the connection type.
It has been pointed out that this sort of edit can easily mask vandalism (see User talk:Bender235#https), so as it is not a change that needs expediting, that must be weighed when deciding whether this is a suitable candidate for automation (rather than, for example, adding it to a process like AWB to be done when other more specific changes are made). See also User talk:Bender235#AWB; Bender235's AWB access was removed by user:Materialscientist on 2 July 2015 (it has not been restored). When discussing this on Bender235's talk page, Bender235 suggested that the discussion Wikipedia:Village pump (technical)/Archive 138#HTTPS by default was relevant and so should probably be included in this conversation.
-- PBS ( talk) 09:50, 13 July 2015 (UTC)
This Catscan query:-
http://tools.wmflabs.org/catscan3/catscan2.php?depth=10&categories=Copy+to+Wikimedia+Commons%0D%0AWikipedia+files+with+the+same+name+on+Wikimedia+Commons&ns[6]=1&sortby=uploaddate&ext_image_data=1&file_usage_data=1
Is there a way for a bot to handle this periodically? Namely, removing the {{ Copy to Wikimedia Commons}} tag, so people aren't confused about what ACTUALLY does need to be reviewed and transferred? Sfan00 IMG ( talk) 10:43, 2 October 2015 (UTC)
I'd like to request a bot to tag all articles, categories, subcategories, and templates under the parent Category:Pakistan with Template:WikiProject Pakistan. It's been a while since bot-assisted WP:PAK tags were added en masse (the last time was in early 2012), and I know that there are hundreds of pages that need tagging. A big thanks and a complementary barnstar await any bot who could take the initiative. Many thanks, Mar4d ( talk) 02:49, 10 October 2015 (UTC)
Every day, the Template talk:Did you know page is updated by moving the Current nominations level 2 section header to one newer day, and adding a new level 3 section header for articles created/expanded on that day. This task is currently done manually by a human. Examples: September 26, September 25, September 22, September 1.
I think this once-a-day task may be done better by a bot. Note that this is my first bot request, so please notify me if I have made any mistakes. sst flyer 15:34, 26 September 2015 (UTC)
Doing... Seems simple enough. This could run exactly at 00:00 UTC if we want. Happy to implement this, I don't think it will be hard — MusikAnimal talk 04:42, 30 September 2015 (UTC)
( ←) BRFA filed Sorry for the delay, got held up with other technical work — MusikAnimal talk 01:55, 9 October 2015 (UTC)
<noinclude>...</noinclude>. Given we have the list of the nominations at T:TDYK, it shouldn't be terribly difficult for the bot to check each one and, if it has been closed, remove it from the list. How do you feel about automating this process? T:TDYK is a very large page with lots of transclusions and can take quite a while to load at times. If the bot automated removing redundant transclusions to keep the page tidy, it might overall speed things up for us. For performance/efficiency, it would only check entries in "Older nominations". Pinging @ Allen3, SSTflyer, and BlueMoonset: who might be interested — MusikAnimal talk 00:09, 15 October 2015 (UTC)
A bot should replace * * with ''' ''' and _ _ with '' ''. GeoffreyT2000 ( talk) 00:02, 21 October 2015 (UTC)
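A naive pass is easy to sketch, but this is a textbook WP:CONTEXTBOT case: a leading * is a wiki list bullet, and many underscores belong to URLs or identifiers, so any real run would need much stricter guards than this illustration has:

```python
import re

BOLD = re.compile(r"(?<![\w*])\*([^*\n]+)\*(?![\w*])")
ITAL = re.compile(r"(?<![\w_])_([^_\n]+)_(?![\w_])")

def convert_emphasis(text: str) -> str:
    """Naive pass: *word* -> '''word''', _word_ -> ''word''.
    A * with no closing * on the same line (e.g. a list bullet)
    is left alone."""
    return ITAL.sub(r"''\1''", BOLD.sub(r"'''\1'''", text))
```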
There used to be a category (and a bot that forced articles into the category) that kept track of Draft-class articles without an AFC submission banner of any type. I've also seen some lost into the ether because the submit substitution was screwed up somehow. Could a bot create a list of all draft-space articles without a call to template:AFC submission? Depending on the volume created, this may be worth doing regularly (monthly?) as a backlog at Wikipedia:WikiProject Articles for creation or something. -- Ricky81682 ( talk) 19:53, 26 October 2015 (UTC)
( ←) See this Quarry. Assuming my SQL is right, there are around 1026 draft pages that have not been edited in the past six months. Most of these look like test pages, vandalism, or WP:WEBHOST violations. I even just deleted an attack page. Furthermore, nearly all that I've checked have less than 5 edits made to them. I suppose the lack of articles makes sense, as many content creators would have instead found their way into the draftspace via article creation links, which insert an AfC template. Either way it looks like there's a lot stuff to review here. I can make a tool to interact with this data easier — MusikAnimal talk 05:57, 27 October 2015 (UTC)
Ricky81682 (RE to 21:15, 26 Oct 2015 UTC) I would not bulk MFD them as the argument you're using, "that they're stale and haven't been touched", was rejected multiple times for non-AFC draftspace pages. I would strenuously suggest you go round up a consensus at WT:Drafts prior to nominating for MFD. Getting the consensus also has the side benefit of stirring the community up to support your MFD nominations. Once you can satisfy the CSD requirements (Objective, Uncontestable, Frequent, Non-redundant) there'll be a wonderful case for using CSD to vaporize the poor drafts. Hasteur ( talk) 14:26, 27 October 2015 (UTC)
Please keep the changes that I have made. What is your problem? Sir, please do this. — Preceding unsigned comment added by Aamir rodaba ( talk • contribs) 18:46, 19 December 2015 (UTC)
Since I've been the only one active on Template talk:YouTube for the past two months, I am going to claim consensus for my proposed changes to the template. But before I rewrite the template, I need a bot to go to every page using it, and replace the channel parameter with user. Thanks, 117Avenue ( talk) 00:57, 9 November 2015 (UTC)
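The rename is a bounded substitution; a sketch that touches the parameter only inside {{YouTube}} transclusions, so a |channel= in some other template is left alone (whitespace variants like "| channel =" are normalized, an assumption about how strict the bot should be):

```python
import re

def channel_to_user(wikitext: str) -> str:
    """Rename |channel= to |user= inside {{YouTube}} transclusions only."""
    def fix(m):
        return re.sub(r"\|\s*channel\s*=", "|user=", m.group(0))
    return re.sub(r"\{\{\s*YouTube\b[^{}]*\}\}", fix, wikitext)
```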
Please take part in the ongoing discussion at Wikipedia:Village pump (technical)#Reducing the load of WP:TAFI unofficial-manager Northamerica1000, to make our lives over at WP:TAFI that little bit easier. :)-- Coin945 ( talk) 15:27, 30 October 2015 (UTC)
Is there a way a bot could give out WP:Deletion to Quality Awards?
Here's what it would have to do:
You can say, on behalf of Cirt and WP:Deletion to Quality Awards.
And also, any way a bot could update the "Hall of Fame" table at Wikipedia:Deletion_to_Quality_Award#Deletion_to_Quality_Award_Hall_of_Fame ?
Thoughts?
Any help would be most appreciated,
— Cirt ( talk) 05:04, 21 October 2015 (UTC)
(user, article, award_type) tuples is fine, updating the WP:DQUAL list with it is fine, and manually giving out awards from that list is fine, but I'm wary of an automated thing. More on the higher-level merits of the task, I note many of these AfDs were closed with strong, even speedy, keep rationales—I wonder if those should be exempt. — Earwig talk 19:49, 4 November 2015 (UTC)
An adminbot should delete all redirects created by Neelix, many of which are currently at RfD. GeoffreyT2000 ( talk) 17:43, 5 December 2015 (UTC)
Would someone be ever-so-kind as to set up a bot to convert a deprecated parameter. The total number of articles would be about 340, with one edit in each article. The lists are at Template talk:S-rel/oc lists, with the new parameter for each. For example "Change these {{s-rel|oc}} to {{s-rel|chal}}". The discussion was/is at Template talk:S-rel#Introduce two new parameters. tahc chat 03:59, 15 November 2015 (UTC)
Hello. Could I hire a bot to substitute all transclusions of {{ Infobox Country World Championships in Athletics}}, per the outcome of this TfD? Alakzi ( talk) 13:12, 20 June 2015 (UTC)
After the recent update of the Wikipedia:WikiProject Mountains banner ( Template:WikiProject Mountains) to include two new parameters for mountains in the Alps (see discussion here), I would like to update the talk page of every article concerned (all in Category:Mountains of the Alps, no subcategories) by adding:
|alps=yes | alps-importance=
to:
{{WikiProject Mountains | class= | importance= }}
result:
{{WikiProject Mountains | class= | importance= | alps=yes | alps-importance=[same as "importance"] }}
ZachG (Talk) 18:50, 16 November 2015 (UTC)
{{WikiProject Mountains | class= | importance= | alps=yes | alps-importance= }}
{{WikiProject Mountains | class=stub | importance= | alps=yes | alps-importance= }}
Hazard-SJ I can help with the task. For instance in this one the WikiProject banner should have been below the other template. This can be done if you enable general fixes in AWB. You should also enable this module to normalise all wikiproject banners and avoid placement problems. -- Magioladitis ( talk) 16:23, 24 November 2015 (UTC)
Hazard-SJ my mistake. I thought you were using AWB. What I can do is to ensure the correct placement of the banners etc. You can do the rest. -- Magioladitis ( talk) 12:32, 28 November 2015 (UTC)
All done here. -- Magioladitis ( talk) 09:43, 29 November 2015 (UTC)
I had liked to have a bot named 'KNOWLEDGEBOT'. I want a bot so that I could edit pages more speedily than I can and to help everyone here. I hereby accept the bot policy and take all responsibilities of bot I won't allow him to violate anything and see over his way of commenting or communication. It won't do any harm or go on editing too speedily I will supervise the bot and I hereby I accept the bot policy. I request you to create this bot and I as its Bot operator. I am responsible for all of its acts, repairs, communication language etc. I will supervise my bot and it will be in my control. Regards BOTFIGHTER ( talk) 13:57, 3 December 2015 (UTC)
Could someone create a bot that follows a betting method on a roulette website, please? The website is www.csgoskins.net.
- Step 1: bet 1/1023 of the credits I have on black
- Step 2:
  - if I won: bet 1/1023 of the credits I have on red
  - if I lost: bet 1/511 of the remaining credits I have on black
  - if I lost again: bet 1/255 of the credits I have on black
  - if lost again: 1/127 of the credits on black
  - if lost again: 1/63 of the credits on black
  - if lost again: 1/31 of the credits on black
  - if lost again: 1/15 of the credits on black
  - if lost again: 1/7 of the credits on black
  - if lost again: 1/3 of the credits on black
  - if lost again: all of the remaining credits on black
So basically if I win, restart the method on the other color. If I lose, double the bet on the same color until it wins, then start again on the other color. I have no idea how difficult this kind of bot is to build, since I don't have any programming experience, but I would appreciate it if someone would make it for me. Thank you for the help. — Preceding unsigned comment added by Neate ( talk • contribs) 18:23, 30 December 2015 (UTC)
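The stake progression in the steps above can be sketched as a small Python helper. This only models the bet-sizing math (the divisor halves after each loss, so the stake roughly doubles against the shrinking bankroll); the function name is made up, and actually placing bets on the site would be a separate matter entirely.

```python
def next_stake(credits, losses_in_row):
    """Stake for the progression described above: the first bet is
    1/1023 of the bankroll, and each consecutive loss moves to the
    next divisor (1/511, 1/255, ...) until the tenth bet risks
    everything. `credits` is the bankroll remaining at bet time."""
    divisors = [1023, 511, 255, 127, 63, 31, 15, 7, 3, 1]
    # clamp so a long losing streak stays at "all remaining credits"
    return credits / divisors[min(losses_in_row, len(divisors) - 1)]
```

For instance, with a fresh bankroll of 1023 credits the opening stake is 1 credit, and after nine straight losses the tenth stake is the whole remaining balance.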
It has become common practice in album articles to use {{ Start date}} in the {{ Singles}} add-on to {{ Infobox album}}. Per Template:Start date/doc: "This purpose of the {{ start date}} template is to return the date (or date-time) that an event or entity started or was created. It also includes duplicate, machine-readable date (or date-time) in the ISO date format (which is hidden by CSS), for use inside other templates (or table rows) which emit microformats. It should only be used once in each such template and should not be used outside such templates." i.e. {{ Start date}} should only be used in album articles for the album release date, not single release dates. It would be nice to have a bot to clean this up, as this error is currently in who knows how many articles. Chase ( talk | contributions) 16:44, 5 July 2015 (UTC)
Dead links in external-links sections are useless; we provide the links for additional reading, not for citations, so if you can't access them, they're pointless — they always need to be fixed or removed. Could a bot go through Category:All articles with dead external links and record ones with dead links in the EL sections, either adding by a new category (e.g. Category:Articles with dead links in External Links sections, or something of the sort) or listing them on a tracking page? I'm imagining that it opens each page, finds each occurrence of {{ dead link}} or redirects thereto, and records the ones in which one or more of these templates appears below ==External links== (or == External links ==) and above the next set of equals signs. I'm asking that the bot only record these pages, without doing anything else, because fixing or removing these links is a CONTEXTBOT situation. Nyttend ( talk) 01:14, 27 November 2015 (UTC)
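The section-scanning step described above could look roughly like this in Python. A sketch only: the function name is mine, and a real bot would also need to expand the pattern to cover every redirect to {{dead link}}.

```python
import re

def has_dead_link_in_el(wikitext):
    """Return True when a {{dead link}} tag sits in the External links
    section, i.e. after the ==External links== heading (with or
    without inner spaces) and before the next heading."""
    m = re.search(r"^==\s*External links\s*==\s*$", wikitext,
                  re.MULTILINE | re.IGNORECASE)
    if not m:
        return False  # no EL section: nothing to log
    section = wikitext[m.end():]
    nxt = re.search(r"^==", section, re.MULTILINE)
    if nxt:
        section = section[:nxt.start()]
    return bool(re.search(r"\{\{\s*[Dd]ead link", section))
```

Pages where this returns True would go to the tracking category or listing page; dead-link tags elsewhere in the article are ignored, matching the request.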
Per the discussion (and background) at Wikipedia:Administrators'_noticeboard#Category:AfD_debates_relisted_3_or_more_times, can we get a bot set up to check Category:AfD debates relisted 3 or more times, and remove the category from closed discussions. I had been doing this every few days using AWB, but would prefer to have something automated do it. There was talk of getting an AfD closing script to do it, however not everyone uses the same script, or a script at all. Much obliged. -- kelapstick( bainuu) 21:20, 3 December 2015 (UTC)
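The per-page step of such a bot could be sketched as follows, assuming (as the standard {{subst:Afd top}} close produces) that closed discussions are wrapped in a div carrying the xfd-closed class; the function name and the exact detection regex are illustrative.

```python
import re

def remove_relist_category(wikitext):
    """Strip [[Category:AfD debates relisted 3 or more times]] from a
    closed AfD discussion page; open discussions are left untouched."""
    if not re.search(r"xfd-closed|\{\{\s*[Aa]rchive top", wikitext):
        return wikitext  # still open: keep the category in place
    return re.sub(
        r"\[\[Category:AfD debates relisted 3 or more times[^\]]*\]\]\n?",
        "",
        wikitext,
    )
```

Iterating the members of Category:AfD debates relisted 3 or more times and applying this to each would replicate the manual AWB runs described above.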