This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Is it possible to get a report in my userspace for a list of redirects to pages that have a list of episodes (using one of the templates {{Episode list}} or {{Episode list/sublist}}) for which the redirects are episode redirects whose title is the same (or similar - typos, disambiguation, etc.) to the one in the |Title= parameter (of the above templates) and that don't use {{R from television episode}} or {{Television episode redirect handler}}? An example would be And a Trauma in a Pear Tree, which redirects to Law & Order: Special Victims Unit (season 24). Gonnym ( talk) 12:23, 13 March 2023 (UTC)
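If someone picks this up, the per-redirect check could be sketched roughly as below (a minimal Python sketch; exact-title matching only, with the "similar - typos" matching via e.g. difflib left out, and the function name is illustrative):

```python
import re

def is_untagged_episode_redirect(redirect_title, redirect_wikitext, target_wikitext):
    """One row of the requested report: True when the target's episode
    table has a matching |Title= and the redirect carries neither rcat.
    Exact-title match only; 'similar' titles would need e.g. difflib."""
    # All |Title= values from {{Episode list}} / {{Episode list/sublist}} rows.
    titles = [t.strip() for t in
              re.findall(r'\|\s*Title\s*=\s*([^|\n}]+)', target_wikitext)]
    tagged = re.search(
        r'\{\{\s*(?:R from television episode|Television episode redirect handler)',
        redirect_wikitext, re.IGNORECASE)
    return redirect_title in titles and not tagged
```

Generating the actual report would just be this check looped over the incoming redirects of each episode-list page.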
* {{-r|Redirect}} → [[target]]
site.expand_text(text)
Requesting a bot that replaces Template:Unreferenced with Template:BLP unreferenced and replaces Template:More citations needed with Template:BLP sources when applied to BLPs. So far I've been monitoring the former manually through this search, but the latter is too large of a task. Thebiguglyalien ( talk) 01:57, 29 April 2023 (UTC)
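A sketch of the text transformation involved (Python; the decision of whether a page actually is a BLP, e.g. via Category:Living people, would happen before this runs, and all names here are illustrative):

```python
import re

# Template swaps to apply only on pages already identified as BLPs.
SWAPS = {
    'Unreferenced': 'BLP unreferenced',
    'More citations needed': 'BLP sources',
}

def swap_blp_banners(wikitext):
    """Rename maintenance banners in place, keeping any |date= etc. params.
    First letter is matched case-insensitively, as on-wiki."""
    for old, new in SWAPS.items():
        pattern = r'\{\{\s*[%s%s]%s(?=\s*[|}])' % (
            old[0].upper(), old[0].lower(), re.escape(old[1:]))
        wikitext = re.sub(pattern, '{{' + new, wikitext)
    return wikitext
```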
As Buzzfeed News is shutting down and per this ongoing discussion on WP:RSN, I am requesting a big job that would ask the bot to archive all https://buzzfeednews.com citations. This is my first time dealing with WP:BOTREQ and bots in general, but if I understand IABot correctly, I'm not allowed to ask it to modify domains.
This query makes note of 1,965 articles that cite https://buzzfeednews.com. — That Coptic Guy (ping me!) ( talk) ( contribs) 19:03, 20 April 2023 (UTC)
{{ User:ClueBot III/ArchiveNow}} What I need is a bot that can scan this spreadsheet, take the municipality name, find that article (if there is one), then take the coords from the spreadsheet (columns AC and AD) and insert them into the infobox or coord template. A bit more info at this TH discussion. It would also be useful to know which towns didn't come back with a page so we can create redirects to the correct articles. PalauanReich 🗣️ 06:50, 23 April 2023 (UTC)
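A rough sketch of the two steps described above (Python; the 0-indexed positions of columns AC/AD and the infobox handling are assumptions from the request, and page lookup/saving is left out):

```python
import re

def coords_from_row(row):
    """Spreadsheet columns AC and AD are 0-indexed 28 and 29 (assumption);
    returns (municipality name, lat, lon)."""
    return row[0], float(row[28]), float(row[29])

def add_coord_to_infobox(wikitext, lat, lon):
    """Insert a |coordinates= line right after the infobox opening if no
    {{coord}} is present; a naive sketch assuming one simple infobox."""
    if re.search(r'\{\{\s*[Cc]oord[|}]', wikitext):
        return wikitext  # already has coordinates, leave alone
    return re.sub(r'(\{\{\s*Infobox[^\n]*\n)',
                  r'\1| coordinates = {{coord|%s|%s}}\n' % (lat, lon),
                  wikitext, count=1)
```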
Not done for now, with no prejudice against new request after a consensus is formed regarding the reliability of the source in question. —usernamekiran (talk) 23:57, 27 May 2023 (UTC)
The "participate" form over at WP:MolBio doesn't seem to work, and appears to have also stopped at other projects like WP:MED. When the form is filled in, it successfully creates a user subpage ( examples), but I'm guessing that User:Reports_bot is supposed to spot that and add it to the members list. Any ideas on how to get it up and running again? It seems the bot is also meant to keep the WikiProject Directory up to date, so it has a few functions that'd be useful to restart. T.Shafee(Evo&Evo) talk 12:33, 12 April 2023 (UTC)
[username]/WikiProjectCards/[wikiproject name], and copying those pagenames into [wikiproject name]/Members - could a lightweight replacement bot be created to take over those functions? There are quite a few wikiprojects that it affects. T.Shafee(Evo&Evo) talk 03:10, 19 April 2023 (UTC)
WP:WPWX is also in need of a bot that can change colors within timelines on articles. This would only involve substituting in the new color parameters that were approved in a recent RfC. I can provide more specific details on this if anyone is able to do this. Noah Talk 19:36, 25 February 2023 (UTC)
Extended content
Colors = id:canvas value:gray(0.88) id:GP value:red id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression_=_≤38_mph_(≤62_km/h) id:TS value:rgb(0.3,1,1) legend:Tropical_Storm_=_39–73_mph_(63–117_km/h) id:C1 value:rgb(1,1,0.85) legend:Category_1_=_74–95_mph_(118–153_km/h) id:C2 value:rgb(1,0.85,0.55) legend:Category_2_=_96–110_mph_(154–177_km/h) id:C3 value:rgb(1,0.62,0.35) legend:Category_3_=_111–129_mph_(178–208_km/h) id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_130–156_mph_(209–251_km/h) id:C5 value:rgb(0.55,0.46,0.90) legend:Category_5_=_≥157_mph_(≥252_km/h) For all season-related articles with timelines in the categories (may need to check subcategories): Category:Atlantic Ocean meteorological timelines, Category:Atlantic hurricane seasons, Category:Pacific hurricane seasons, and Category:Pacific hurricane meteorological timelines
Colors = id:canvas value:gray(0.88) id:GP value:red id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression_=_≤62_km/h_(≤39_mph) id:TS value:rgb(0.3,1,1) legend:Tropical_Storm_=_62–88_km/h_(39–54_mph) id:ST value:rgb(0.75,1,0.75) legend:Severe_Tropical_Storm_=_89–117_km/h_(55–72_mph) id:STY value:rgb(1,0.85,0.55) legend:Strong_Typhoon_=_118–156_km/h_(73–96_mph) id:VSTY value:rgb(1,0.45,0.54) legend:Very_Strong_Typhoon_=_157–193_km/h_(97–119_mph) id:VITY value:rgb(0.55,0.46,0.90) legend:Violent_Typhoon_=_≥194_km/h_(≥120_mph) In addition, please change any values of TY within the actual part where storms are listed to STY since typhoon is now strong typhoon. The legend name also changes from typhoon to strong typhoon. The VSTY and VITY need to be added into the timelines since those statuses are now being included (the VSTY and VITY are within the coding above). A caveat is that articles dated 1972 or earlier need to use the colors for the Atlantic/Eastern Pacific. For all season-related articles with timelines in the categories (may need to check subcategories): Category:Pacific typhoon seasons
Colors = id:canvas value:gray(0.88) id:GP value:red id:TD value:rgb(0,0.52,0.84) legend:Depression_(31–50_km/h) id:DD value:rgb(0.43,0.76,0.92) legend:Deep_Depression_(51–62_km/h) id:TS value:rgb(0.3,1,1) legend:Cyclonic_Storm_(63–88_km/h) id:ST value:rgb(0.75,1,0.75) legend:Severe_Cyclonic_Storm_(89–117_km/h) id:VS value:rgb(1,0.85,0.55) legend:Very_Severe_Cyclonic_Storm_(118–165_km/h) id:ES value:rgb(1,0.45,0.54) legend:Extremely_Severe_Cyclonic_Storm_(166–220_km/h) id:SU value:rgb(0.55,0.46,0.9) legend:Super_Cyclonic_Storm_(≥221_km/h) For all season-related articles with timelines in the categories (may need to check subcategories): Category:North Indian Ocean cyclone seasons and Category:North Indian Ocean meteorological timelines
Colors = id:canvas value:gray(0.88) id:GP value:red id:ZD value:rgb(0,0.52,0.84) legend:Zone_of_Disturbed_Weather/Tropical_Disturbance_=_≤31_mph_(≤50_km/h) id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression/Subtropical_Depression_=_32–38_mph_(51–62_km/h) id:TS value:rgb(0.30,1,1) legend:Moderate_Tropical_Storm_=_39–54_mph_(63–88_km/h) id:ST value:rgb(0.75,1,0.75) legend:Severe_Tropical_Storm_=_55–73_mph_(89–118_km/h) id:TC value:rgb(1,0.85,0.55) legend:Tropical_Cyclone_=_74–103_mph_(119–166_km/h) id:IT value:rgb(1,0.45,0.54) legend:Intense_Tropical_Cyclone_=_104–133_mph_(167–214_km/h) id:VI value:rgb(0.55,0.46,0.9) legend:Very_Intense_Tropical_Cyclone_=_≥134_mph_(≥215_km/h) For all season-related articles with timelines in the categories (may need to check subcategories): Category:Southwest Indian Ocean meteorological timelines and Category:South-West Indian Ocean cyclone seasons
Colors = id:canvas value:gray(0.88) id:GP value:red id:TL value:rgb(0.43,0.76,0.92) legend:Tropical_Low_=_<63_km/h_(<39_mph) id:C1 value:rgb(0.3,1,1) legend:Category_1_=_63–88_km/h_(39-55_mph) id:C2 value:rgb(0.75,1,0.75) legend:Category_2_=_89–117_km/h_(55-73_mph) id:C3 value:rgb(1,0.85,0.55) legend:Category_3_=_118–159_km/h_(73-99_mph) id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_160–199_km/h_(99-124_mph) id:C5 value:rgb(0.55,0.46,0.9) legend:Category_5_=_≥200_km/h_(≥124_mph) For all season-related articles with timelines in the categories (may need to check subcategories): Category:Australian region cyclone seasons and Category:Australian region meteorological timelines
Colors = id:canvas value:gray(0.88) id:GP value:red id:TDi value:rgb(0,0.52,0.84) legend:Tropical_Disturbance id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression id:C1 value:rgb(0.3,1,1) legend:Category_1_=_63-87_km/h_(39-54_mph) id:C2 value:rgb(0.75,1,0.75) legend:Category_2_=_88-142_km/h_(55-74_mph) id:C3 value:rgb(1,0.85,0.55) legend:Category_3_=_143-158-km/h_(75-98_mph) id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_159–204_km/h_(99–127_mph) id:C5 value:rgb(0.55,0.46,0.9) legend:Category_5_=_≥205_km/h_(≥128_mph) For all season-related articles with timelines in the categories (may need to check subcategories): Category:South Pacific cyclone seasons and Category:South Pacific meteorological timelines
Colors = id:canvas value:gray(0.88) id:C0 value:rgb(0.3,1,1) legend:Category_0_&_N/A_<_1.0_RSI id:C1 value:rgb(1,1,0.85) legend:Category_1_=_1.0–3.0_RSI id:C2 value:rgb(1,0.85,0.55) legend:Category_2_=_3.0–6.0_RSI id:C3 value:rgb(1,0.62,0.35) legend:Category_3_=_6.0–10.0_RSI id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_10.0–18.0_RSI id:C5 value:rgb(0.55,0.46,0.9) legend:Category_5_≥_18.0_RSI Category: Category:North American winters |
taken from archive, got no responses last time
Here are some possible tasks for a bot:
137a ( talk • edits) 13:42, 14 April 2023 (UTC)
Guess I should ping @ JamesR: since he operates AIV helper bot? 137a ( talk • edits) 19:30, 21 April 2023 (UTC)
I'm not sure if this is the right place to propose something like this, but here it is:
There are 1,208 pages (according to this search) that use one exact citation towards this page. The citation is fairly simple: just this: {{cite web}}: CS1 maint: url-status ( link). However, it gives neither authors nor an archived copy of the webpage. This website, part of a project from Carleton University, gives a more detailed suggested citation (under the header "To cite this page"). This includes authors and other broader information. Assuming it is acceptable to use a page's suggested citation, I suggest replacing these bare-bones web citations with the fuller references, along with a link to an archived copy of the webpage (there is one at archive.today [3]), as has been done at Qaba Sorkh (reference number three). Since there are roughly 1,200 articles with the bare-bones citation, this would be nigh-on impossible to do manually. Could a bot replace these simple citations with the fuller ones, such as can be found at Qaba Sorkh? The more detailed citation, modified from the webpage's suggested format, looks like this:
I propose mass-expanding these references to the fuller version, using the information in the webpage's suggested style of citation:
(Thanks to GoingBatty for helping with the technicalities here.)
Basically, it's a find-and-replace job with the goal of expanding citations.
Thank-you, Edward-Woodrow :) [talk] 19:43, 28 May 2023 (UTC)
|display-authors=etal
like this?:
Hello again. I was wondering if a bot could remove BLP tags for people who died three or more years ago, per the Category:Deaths by year category. I stumbled across the BLP tag at Johnny Burke (Canadian singer), despite the subject having died in 2017.
With this search, I found over 1,300 articles tagged as BLPs while skipping over people who died between 2021 and 2023. The reason I am not requesting two years or less is WP:BDP. Thanks! MrLinkinPark333 ( talk) 19:03, 10 May 2023 (UTC)
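The eligibility check and tag removal could look roughly like this (Python sketch; the three-year cutoff follows WP:BDP as described above, banner-shell nesting is ignored, and function names are illustrative):

```python
import re
from datetime import date

CUTOFF_YEARS = 3  # per WP:BDP, leave recent deaths alone

def year_of_death(article_wikitext):
    """Pull the year from a [[Category:YYYY deaths]] link, if present."""
    m = re.search(r'\[\[Category:(\d{4}) deaths[\]|]', article_wikitext)
    return int(m.group(1)) if m else None

def should_remove_blp_tag(article_wikitext, today=None):
    today = today or date.today()
    year = year_of_death(article_wikitext)
    return year is not None and today.year - year >= CUTOFF_YEARS

def remove_blp_banner(talk_wikitext):
    """Strip a bare {{BLP}} banner line; ignores banner-shell nesting."""
    return re.sub(r'\{\{\s*[Bb]LP\s*\}\}\n?', '', talk_wikitext)
```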
Please, fix these links. There are 1,648 of them; I cannot do it manually. Thanks! — Zalán Hári ( talk) 08:49, 30 April 2023 (UTC)
I'll start going over these with AWB. When I'm done I'll post here and ping @ Hári Zalán: Dr vulpes ( 💬 • 📝) 21:32, 12 June 2023 (UTC)
Hello again. I'm looking for a bot to help clear out Category:Redundant infobox title param. Per the category description, the |title parameter can be removed from the infobox if it matches the article name. A lot of these are populated by Infobox comics creator and Infobox comic book title. See for example my removals of the |title parameter at Quino and Captain America (vol. 5). Currently, there are 2,435 of them to go through. Any help would be appreciated. Thanks! MrLinkinPark333 ( talk) 19:03, 4 May 2023 (UTC)
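A minimal sketch of the parameter removal (Python; it drops a |title= only when its value equals the page name, compared case-insensitively, and assumes one infobox per page):

```python
import re

def remove_redundant_title(wikitext, article_name):
    """Drop the first |title= whose value matches the page name
    (ignoring case and surrounding whitespace); sketch only."""
    def repl(m):
        if m.group(1).strip().lower() == article_name.lower():
            return ''          # redundant, remove the whole line
        return m.group(0)      # different value, keep as-is
    return re.sub(r'\|\s*title\s*=\s*([^|\n}]*)\n?', repl, wikitext, count=1)
```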
Wikipedia:Reliable sources/Noticeboard/Archive 405#fmg.ac (Foundation for Medieval Genealogy) reached consensus that fmg.ac/Projects/MedLands is a completely unreliable source, has been repeatedly recognised as such, had to be phased out, and now we've agreed it should be completely purged. 576 references to fmg.ac/Projects/MedLands had been made through Template:Medieval Lands by Charles Cawley, which has just been deleted at Wikipedia:Templates for discussion/Log/2023 May 25#Template:Medieval Lands by Charles Cawley. User:Frietjes was so kind as to let me know that I could check at External links search just how many references we've still got to fmg.ac/Projects/MedLands throughout English Wikipedia. It turns out that number is 1,326, way too many for anyone to go and delete manually. Not all links to the domain fmg.ac need to be purged, just those under the subdomain fmg.ac/Projects/MedLands. Nor should links outside of the mainspace be removed, such as the Talk:, Wikipedia:, User: etc. spaces. Is it possible for a bot to carry this out? Thanks in advance! Nederlandse Leeuw ( talk) 19:20, 2 June 2023 (UTC)
I notice a lot of talk pages without talk headers and was wondering if a bot can be created to do such a task? Lightoil ( talk) 03:15, 7 July 2023 (UTC)
"This template does not need to be placed on every talk page, and should not be indiscriminately added to talk pages using automated editing tools." * Pppery * it has begun... 03:23, 7 July 2023 (UTC)
As per Wikipedia:Village pump (technical)/Archive 206#Categorisation error due to error in user warning template, an error in Template:Uw-username caused the following code to appear:
{{#ifeq:{{NAMESPACENUMBER}}|3|{{#ifeq:{{ROOTPAGENAME}}|{{ROOTPAGENAME:}}[[Category:Pages which use a template in place of a magic word|S{{PAGENAME}}]]|[[Category:Wikipedia usernames with possible policy issues|{{PAGENAME}}]]}}}}
The actual markup that should have appeared is:
{{#ifeq:{{NAMESPACENUMBER}}|3|{{#ifeq:{{ROOTPAGENAME}}|<username at the time of subst>|[[Category:Wikipedia usernames with possible policy issues|{{PAGENAME}}]]}}}}
Can someone run a bot that checks the username of the user talk page at the time of substing this template, and fix the markup accordingly? — CX Zoom[he/him] ( let's talk • { C• X}) 13:22, 6 July 2023 (UTC)
Hello, following the outcome of this RfC from last year, it was decided that men's footballer categories should be created to match the women's categories which had already existed. Therefore, I was wondering if a bot would be able to help create the relevant men's categories necessary, and then to move the appropriate pages to the new categories. I have finished cleaning up/completing the category trees for women, so now only men's footballers directly populate the relevant categories. There are five category trees needing to be adjusted, with each containing a type of footballer category by country/nationality (e.g. Category:Moroccan footballers). Each of these categories should become a container, with the articles in each category then diffused to a men's subcategory (e.g. Category:Moroccan footballers becomes a container, with the articles moved to Category:Moroccan men's footballers). The five category trees are:
The first two category trees will not require any category changes to articles, as the category for each country acts as a container.
Given the large number of articles involved, any help with creating a bot would be really appreciated. Thanks, S.A. Julio ( talk) 17:25, 22 April 2023 (UTC)
Could anyone help with a bot task to merge the duplicate WikiProject banners found on pages listed at Category:Pages using WikiProject banner shell with duplicate banner templates? Gonnym ( talk) 12:39, 13 June 2023 (UTC)
I'd like to request a bot that generates a report somewhere of NPPs and admins who Page Curation "mark as reviewed" (type=pagetriage-curation&subtype=review) 10 or more mainspace non-redirect articles in 2 minutes. This is way too fast to be proper NPP patrolling and these folks will need further scrutiny. Thank you. – Novem Linguae ( talk) 22:29, 20 July 2023 (UTC)
This is my first ever such request, so apologies in advance if this is the wrong venue.
What I'm looking for basically is if an article is already tagged with WikiProject Biography and WikiProject Physics, to then add the bio=yes parameter to the Physics project banner if it's not already present, and s&a-work-group=yes to the Biography project banner if not already present.
Just the batch of existing ones if need be (so it may be better suited for AWB?), but ideally something on-going, though it wouldn't need to run that often.
The existing value could be bio=yes or bio=y; alternatively, detection could use categories. More specifically, just regular articles, not list-class, or categories themselves, or templates, or files, etc. Something like https://petscan.wmflabs.org/?psid=25332108. Happy to include drafts. Plus something to deal with s&a-work-group=yes.
For transparency, the initial discussion was at Wikipedia_talk:WikiProject_Physics#Article_classification_thought. I do not have the ability to code this myself.
Thank you. - Kj cheetham ( talk) 15:10, 23 July 2023 (UTC)
Hello.
Could someone please create a bot editing templates still using the old NFL style/color templates the following way:
{{NFLPrimaryColor| => {{Gridiron primary color|
{{NFLPrimaryColorRaw| => {{Gridiron primary color raw|
{{NFLPrimaryStyle| => {{Gridiron primary style|
{{NFLAltPrimaryColor| => {{Gridiron alt primary color|
{{NFLAltPrimaryStyle| => {{Gridiron alt primary style|
{{NFLAltSecondaryColor| => {{Gridiron alt secondary color|
{{NFLSecondaryColor| => {{Gridiron secondary color|
{{NFLSecondaryColorRaw| => {{Gridiron secondary color raw|
{{NFLTertiaryColorRaw| => {{Gridiron tertiary color raw|
There are thousands of templates (mostly navboxes) that use these template redirects.
Thank you. HandsomeFella ( talk) 20:54, 24 July 2023 (UTC)
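Since these are fixed prefix strings, the whole job reduces to an ordered find-and-replace (Python sketch; the mapping is taken verbatim from the list above):

```python
# The nine renames from the request, each source string including the
# trailing "|" so only template invocations are touched.
RENAMES = {
    '{{NFLPrimaryColor|':      '{{Gridiron primary color|',
    '{{NFLPrimaryColorRaw|':   '{{Gridiron primary color raw|',
    '{{NFLPrimaryStyle|':      '{{Gridiron primary style|',
    '{{NFLAltPrimaryColor|':   '{{Gridiron alt primary color|',
    '{{NFLAltPrimaryStyle|':   '{{Gridiron alt primary style|',
    '{{NFLAltSecondaryColor|': '{{Gridiron alt secondary color|',
    '{{NFLSecondaryColor|':    '{{Gridiron secondary color|',
    '{{NFLSecondaryColorRaw|': '{{Gridiron secondary color raw|',
    '{{NFLTertiaryColorRaw|':  '{{Gridiron tertiary color raw|',
}

def rename_color_templates(wikitext):
    # Longest names first so {{NFLPrimaryColorRaw| is not half-replaced
    # by the {{NFLPrimaryColor| rule.
    for old in sorted(RENAMES, key=len, reverse=True):
        wikitext = wikitext.replace(old, RENAMES[old])
    return wikitext
```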
Hi, I noticed a little inconsistency with railway station articles in Australia and New Zealand.
Context: The typical format for the name of train stations in Australia and New Zealand is Example railway station, otherwise if that name is used more than once around the world, then Example railway station, Administrative Division is used. The administrative divisions used are the full names of the states/territories of Australia, e.g. “Western Australia”, and the country of “New Zealand”. This is typically used for train stations that are in regional areas outside of a major city. However, for metropolitan train stations within the bounds of major cities such as Sydney and Melbourne, then it should be Example railway station, City.
Context ctd: Thus, many articles have redirects with the disambiguators; e.g. the article Example railway station might have redirects from “, New South Wales” and “, Sydney”. Or Example railway station, City might have a redirect from Example railway station, State. Some real examples include Panania railway station, Epping railway station, Sydney and Gloucester railway station, New South Wales. Often, these redirects exist because of page moves, because someone manually created them, and/or because someone created an erroneous redlink to a station when it was incorrectly disambiguated and a redirect was created. But other times, some redirects exist and some don't.
Request: Are there any bots (or possibly one to be created) that could parse the names of a station’s article (or potentially a list of names) that would be manually inputted by the user, then the bot would check what format it is in (e.g. no disambiguator, city disambiguator, administrative division disambiguator, or other), then create redirects as necessary for the other disambiguators?
Request ctd: It should ideally process names in alphabetical order and then sorted by the administrative region it is in, while timestamping/logging each action into a readable list. It should also list the main article’s name, and redirects that already exist. Note that if the railway station is in an “other format” such as Newcastle Interchange, then the bot should just ignore it. It should also check the redirects that already do exist, and add/change the class of the article to “redirect” for the relevant WikiProjects. Possibly, the user could input the city and/or administrative region that the station(s) are in into the bot’s interface.
I am aware of the WP:MASSCREATION policy, but I think because these are redirects and not articles, it should be ok. It'd also help out those who like to search for station articles but add an unnecessary disambiguator, and those who place redlinks.
Would love to hear your thoughts guys, and thanks for making it this far :) Fork99 ( talk) 08:28, 9 June 2023 (UTC)
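The redirect-variant logic above could be sketched like this (Python; creation, logging and WikiProject retagging would sit on top of it, the "other format" skip uses a simple substring test, and the function name is illustrative):

```python
def redirect_variants(article_title, city=None, division=None):
    """List the disambiguated redirect titles that should exist for a
    station article, given its city and administrative division. The base
    name is everything up to 'railway station'; 'other formats' like
    Newcastle Interchange return an empty list and are skipped."""
    marker = ' railway station'
    if marker not in article_title:
        return []  # other format, ignore per the request
    base = article_title.split(marker)[0] + marker
    wanted = set()
    for dab in (city, division):
        if dab:
            wanted.add('%s, %s' % (base, dab))
    wanted.discard(article_title)  # don't recreate the article itself
    return sorted(wanted)
```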
I am requesting that a bot start adding the Top 25 report template to the talk pages of articles appearing in Wikipedia:Top 25 Report. It is an achievement for articles to get enough views to appear on this list, and a template exists. BabbaQ ( talk) 08:52, 27 May 2023 (UTC)
First time I've done this, so hopefully this makes sense.
I would like to request the assistance of a bot in identifying large stub articles within various WikiProjects. The purpose of this task is to highlight articles within the WikiProjects that have a significant character count and may require further attention or improvement.
If it would help to show the logic, I have developed a program that can analyse the character count of articles in designated stub categories associated with each WikiProject. The program uses an algorithm to scan the articles' content and generate a report listing the articles that exceed a certain character limit (e.g., 10,000 characters). The report includes the article's title, the character count, and a link to the article page. Tell me where to show the code and I could do this.
To facilitate the testing and implementation of this program, I propose that the bot initially runs in a user sandbox or designated user space. This will allow for easy monitoring and review of the generated reports by the WikiProject members.
A significant number of stub categories in the WikiProjects have large articles. The program aims to identify such articles with large character counts which may suggest misclassification.
Please let me know if you have any queries. JASpencer ( talk) 08:03, 13 May 2023 (UTC)
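For reference, the core of such a report is small (Python sketch; the 10,000-character limit is the example threshold from the request, and the page-fetching side is left out):

```python
def oversized_stubs(pages, limit=10_000):
    """pages: iterable of (title, wikitext) pairs from a stub category.
    Returns report rows for articles whose character count exceeds the
    limit, sorted largest first, matching the report fields described."""
    rows = [(title, len(text)) for title, text in pages if len(text) > limit]
    rows.sort(key=lambda r: r[1], reverse=True)
    return [{'title': t, 'characters': n, 'link': '[[%s]]' % t}
            for t, n in rows]
```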
(I created something, but rather naively didn't realise that Wikipedia would rightly block it) - what do you mean? — Qwerfjkl talk 11:52, 17 May 2023 (UTC)
"There are quite a few stubs which have lots of text but it doesn't cover the subject adequately." That makes it a Start-class article, though I suspect you might be able to offer some exceptions. Cheers, Nick Moyes ( talk) 14:03, 10 July 2023 (UTC)
There are various lists of Wikipedians by edit count and related stats. One that I'd be interested to see would be Wikipedia:List of Wikipedians by non-automated edit count. I think the presence of such a list might be a small help re Editcountitis, as it'd recognize/incentivize editors whose edit count has not been juiced through tons of (typically low-value) automated edits.
Would anyone be interested in coding a bot to populate and maintain such a page? Courtesy pinging Legoktm and 0xDeadbeef, who run the bot that updates the overall edit count list, in case either of you might be up for it. {{u| Sdkb}} talk 20:48, 19 July 2023 (UTC)
Sounds like we need three lists: total, manual and (semi-)automated. Or possibly two: manual and (semi-)automated. The negative/subtraction name "by non-automated edit count" rather should be a positive name, "by manual edit count". I'm not convinced automated edits are typically low-value; anyway, manual edits often have the same characteristics. Plus it's so hard to tell, since automation can take many forms that are impossible to track. -- Green C 03:58, 20 July 2023 (UTC)
The Cricket Archive ( https://cricketarchive.com) has introduced subscription access to their website. We have about 14,600 articles referencing the site. Could a bot add |url-access=subscription to those that do not have this? Keith D ( talk) 22:09, 25 July 2023 (UTC)
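A per-citation sketch of the change (Python; assumes no nested templates inside the citation, and leaves the archived-URL redundancy question to human judgment):

```python
import re

def add_url_access(citation):
    """Append |url-access=subscription to one {{cite ...}} string that
    links cricketarchive.com and lacks the parameter; naive sketch."""
    if 'cricketarchive.com' not in citation or 'url-access' in citation:
        return citation  # not in scope, or already marked
    return re.sub(r'\}\}$', '|url-access=subscription}}', citation)
```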
|url-access= is a good idea where possible; anyone can do this, including with AWB. If the cite has an archive URL with |url-status=dead (or no |url-status=, same thing), maybe the |url-access= is redundant. -- Green C 04:03, 26 July 2023 (UTC)
Hey folks, I was perusing Category:ATP Tour navigational boxes and realised that a) most of these templates are out of date, and b) this would be a good task for a bot. The ATP site gives this PDF, which contains all of the useful information that could be used for a module. Before I start writing the module, I guess I was wondering if anyone would be able to utilise this information in a way where it could be updated weekly as the stats update. The way I see it:
From there the module can give ranks and any changes from week to week for use in the various templates. If this seems workable, let me know. Thanks. Primefac ( talk) 16:07, 22 July 2023 (UTC)
Deferred
Hopefully everyone knows there is no national flag of "Korea"; there is a DPRK (North Korea) and a South Korea. Unfortunately, several beauty pageant fans have added the nonexistent "Korea" flag to over 80 articles as indicated in the search results above. It should be pretty simple to have a bot rename it to South Korea by removing the name=Korea parameter from the flag template. ☆ Bri ( talk) 22:01, 9 August 2023 (UTC)
The archive date doesn't match on many of User:AShiv1212's articles such as Ahimsa (2023 film). DareshMohan ( talk) 07:31, 13 August 2023 (UTC)
|archive-url=. Or this, which requires maintaining the same date format (iso, mdy, etc). There are probably other things like that. So the first run will be about discovering what the bot needs to do to fix the errors. -- Green C 16:44, 13 August 2023 (UTC)
|archive-url=https://web.archive.org/web/20201028113503/https://www.tvguide.com/news/drag-48227/ |archive-date=October 28, 2022 .. notice the date in the URL is 20201028 and the |archive-date=October 28, 2022 don't match. It should be |archive-date=October 28, 2020 -- Green C 05:39, 14 August 2023 (UTC)
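The mismatch GreenC describes is mechanical enough to sketch (Python; assumes a Wayback-style 14-digit timestamp in |archive-url= and the mdy date format used in this example):

```python
import re
from datetime import datetime

def fix_archive_date(cite):
    """If |archive-url= is a Wayback link, rewrite |archive-date= from the
    timestamp embedded in the URL, keeping mdy format."""
    m = re.search(r'web\.archive\.org/web/(\d{8})\d*/', cite)
    if not m:
        return cite  # not a Wayback archive URL, leave alone
    real = datetime.strptime(m.group(1), '%Y%m%d')
    mdy = '%s %d, %d' % (real.strftime('%B'), real.day, real.year)
    return re.sub(r'(\|archive-date=)[^|}]+', r'\g<1>' + mdy, cite)
```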
|website= parameters, my guess is that they were filling out the templates manually, and setting the |archive-date= equal to the |access-date= for some reason. None of their edit summaries contain links to userscripts. Have you tried asking the user directly? Folly Mox ( talk) 13:26, 14 August 2023 (UTC)
Hi - I want to re-raise this item, since we have users being confused at wikiprojects that use the bot. For example, this conversation occurred because the user created a WikiProjectCards page but it wasn't added by the bot to this page, so they assumed the wikiproject wasn't accepting new members. There are dozens of other users that have attempted to sign up to the wikiproject but not appeared on the participant list. I suspect similar scenarios are happening for the other wikiprojects that use the bot ( WP:MED, WP:WPWIR, etc.). (ping @ Pppery, Harej, and MZMcBride: I realise you're busy but I wanted to make sure you saw this). T.Shafee(Evo&Evo) talk 02:55, 17 August 2023 (UTC)
Hello!
I am a new editor who noticed a large number of uses of US English on British English-tagged pages and I would like to see what can be done with a bot. The first thing I fixated on was use of "percent" over "per cent", but soon noticed others such as "program" over "programme". Is there a way of programming a bot to reform these and similar spellings? Of course it would have to take into account proper names of things and direct quotations. I think it might be advantageous to programme the bot to do a general sweep of the tag and nip any future issues in the bud so words such as "colour" (US: "color"), "enrolment" (US: "enrollment"), "travelled" (US: "traveled"), "hippy" (US: "hippie"), "aluminium" (US: "aluminum"), "gaol" (US: "jail"), "rouble" (US: "ruble"), "aeroplane" (US: "airplane"), "sulphur" (US: "sulfur"), and so on are reformed also. It should not touch words containing -ize/-ise, because both are acceptable in BrE (indeed I saw a sign at my local GP surgery recently that read "sanitize your hands"). Stolitz ( talk) 01:51, 16 August 2023 (UTC)
Manually-assisted bots are acceptable, so long as they include international spell checking (not only country-specific spell checking) and the operator does in fact examine every proposed edit before allowing the bot to make it. WP:AWB or WP:JWB may be useful here, though their users need pre-approval (to deter mass drive-by vandalism) and we normally expect longer experience before granting it. Certes ( talk) 09:17, 16 August 2023 (UTC)
Standard WikiProject tagging request here. I have been slowly working on Category:Green Bay Packers articles needing infoboxes, bringing the total down from about 100 articles to 24 as of this writing. I have been averaging about 3 or 4 a day, so I expect to clear the category in about a week. However, it has been a really long time since I had a bot run through the category tree and tag the relevant article talk pages (see here for the last request). So, for the request, could I have a bot owner run through the following categories and sub-categories, check if an infobox exists on the page, and if not, tag {{WikiProject Green Bay Packers}} with needs-infobox=yes:
Thank you for any help you can provide. « Gonzo fan2007 (talk) @ 21:54, 14 September 2023 (UTC)
Hello, I'd like to request a bot add the template {{ WikiProject Artsakh}} to the talk pages of all articles in this category tree:
I'm not sure how many of them already have the template, but combing through all of them & adding it individually would be extremely tedious. Thanks! Sawyer-mcdonell ( talk) 22:16, 30 September 2023 (UTC)
The IUBMB_EC_number parameter was removed from Template:Infobox enzyme in February 2013 by Boghog (the value is now automatically calculated from another field). At this point there are over 4800 entries in Category:Pages using infobox enzyme with unknown parameters, and all but a dozen or so are the IUBMB_EC_number parameter. I've done parameter removal using AWB before and know that the regex can get tricky on some, but in this case I don't think it would be that bad: the values should be of the form A/B/C/D where A, B, C and D are all numbers. Naraht ( talk) 15:28, 24 August 2023 (UTC)
|IUBMB_EC_number= from Template:Infobox enzyme as I find them. I would be in favor if a bot removed all occurrences. This parameter was necessary before Lua support was available in Wikipedia templates. With Lua, this parameter can be automatically calculated from the mandatory {{EC_number}} parameter. Boghog ( talk) 17:56, 24 August 2023 (UTC)
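The removal itself is essentially a one-line regex (Python sketch; assumes the parameter value contains only digits, dots, slashes and spaces, per the A/B/C/D form noted above):

```python
import re

def drop_iubmb_param(wikitext):
    """Remove |IUBMB_EC_number=... lines from {{Infobox enzyme}}; the
    value is now derived from EC_number, so the whole line goes."""
    return re.sub(r'\|\s*IUBMB_EC_number\s*=\s*[\d./ ]*\n?', '', wikitext)
```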
I wrote a separate article about East Khandesh (EK), so we no longer need EK to redirect to Jalgaon district; I request deletion of all such redirects. Tesla car owner ( talk) 11:57, 4 September 2023 (UTC)
Wikipedia:WikiProject User warnings says:
"To help centralise discussions and keep related topics together, all uw-* template talk pages and WikiProject User warnings project talk pages redirect to [Wikipedia talk:Template index/User talk namespace]"
However, many talk pages still do not do this. It would be quite tedious to have to go through every single template, go to its talk page, check if it redirects to it, and if not create it.
Could anybody create a bot that checks if a Uw- template's talk page redirects to Wikipedia talk:Template index/User talk namespace, and if it doesn't, create it? Millows ( talk) 21:46, 20 July 2023 (UTC)
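The check-and-create logic is tiny (Python sketch; fetching the list of uw- templates and saving pages via the API or Pywikibot is left out):

```python
TARGET = 'Wikipedia talk:Template index/User talk namespace'

def talk_redirect_wikitext():
    """Page text to create at Template talk:Uw-<name> when it is missing."""
    return '#REDIRECT [[%s]]' % TARGET

def needs_redirect(talk_page_text):
    """True when the talk page does not already point at the index
    (an empty string stands in for a nonexistent page here)."""
    return TARGET not in talk_page_text
```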
Template:Gutenberg author takes an |id= that is either a name, e.g. |id=Quick,+Herbert, or a number, e.g. |id=4251. When a name, it goes to a search page like this, then you click through to the author page. When a number, it goes directly to the author page. Ideally one would use the number; it's more accurate and direct. Most cases on Enwiki are the name and would benefit by changing, e.g. Special:Diff/1169939608/1170077601.
A bot could scrape the search page and change the ID to the number. It's only safe when there is a single name in the search result page. The template is in about 11k pages. Potentially one could add support for Wikidata, which might have it, but they didn't always correctly match the authors on Wikidata with the authors on Gutenberg and Wikipedia. Plus the problem of unwatched vandalism. This is a relatively easy and low-controversy bot or AWB project. -- Green C 02:26, 13 August 2023 (UTC)
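The "only safe with a single result" rule could look like this (a sketch; the /ebooks/author/&lt;number&gt; href pattern is an assumption about the search page's markup, not verified here):

```python
import re

# Hypothetical href pattern for author links on the search results page.
AUTHOR_LINK = re.compile(r'href="/ebooks/author/(\d+)"')

def single_author_id(search_html):
    """Return the numeric author id only when the search page links to
    exactly one distinct author; otherwise None (the unsafe case)."""
    ids = set(AUTHOR_LINK.findall(search_html))
    return ids.pop() if len(ids) == 1 else None
```

Anything returning None would be skipped and left for a human.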
I'd like to request a bot that will fix the redirect targets of the redirects listed at User:Gonnym/sandbox/database report with the precise anchor target. A target page that uses the standard Template:Episode list (or Module:Episode list directly) will have anchors of the form "#ep<number>", where the number is the |EpisodeNumber= value of the mentioned template or module call. So, for example, for "His Maker's Name" the link is [[The Zeta Project#ep2]]. Note that some are targets to episode articles (alternative names, spellings, disambiguation, etc.) and do not need to be changed, so the bot should just skip these. Gonnym ( talk) 15:41, 21 August 2023 (UTC)
The bot checks that |Title= matches the redirect's title, and then it gets the value of |EpisodeNumber=. So it would just skip cases like Behold... The Inhumans, because the target doesn't use {{episode list}}. (Likewise with The Real Deal (Agents of S.H.I.E.L.D. episode).) Would the |Title= check ignore disambiguation? That is, would A Bolt From the Blue (Lois & Clark) and A Bolt From the Blue (Lois & Clark episode) both be fixed? Gonnym ( talk) 11:44, 10 September 2023 (UTC)
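The anchor lookup described in this thread could be sketched as below (simplified: it assumes the template call contains no nested braces and that each parameter sits on its own line):

```python
import re

def episode_anchor(target_wikitext, redirect_title):
    """Find an {{Episode list}} (or /sublist) call whose |Title= matches the
    redirect's title and return its '#ep<EpisodeNumber>' anchor, else None."""
    for m in re.finditer(r"\{\{Episode list[^{}]*\}\}", target_wikitext, re.I | re.S):
        call = m.group(0)
        title = re.search(r"\|\s*Title\s*=\s*([^|\n}]+)", call)
        number = re.search(r"\|\s*EpisodeNumber\s*=\s*(\d+)", call)
        if title and number and title.group(1).strip() == redirect_title:
            return "#ep" + number.group(1)
    return None
```

Returning None covers the "skip these" cases mentioned above, such as targets that don't use the template at all.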
Fix references. Java. I want to use a bot on this wiki. Aesthetic of me ( talk) 16:54, 2 September 2023 (UTC)
Hi, I don't know if there's already a bot that can handle this, or if it's something I would need to request a new bot be written for. The situation is that it was announced today that the entertainment news series Entertainment Tonight Canada will be cancelled in a couple of weeks, and apparently the website is disappearing with it. As a person who edits principally in the film, television and music areas, it's obviously a source I've cited a lot (and I mean a lot a lot) in the past several years, so the links are going to need to be archived for salvage purposes.
It's obviously not a task I want to grind through all by myself if I don't absolutely have to, so I wanted to ask if there's a bot that can check for all Wikipedia articles that feature links to the https://etcanada.com/ domain, and ensure that there's an archived copy added to the citation if there isn't already one present yet? Thanks. Bearcat ( talk) 19:36, 27 September 2023 (UTC)
When {{Coord}} templates are at the top of an article, they always break the page previews, making them display with no text. I'd like to request a bot that moves them to the bottom of the first paragraph to remedy this. (Don't know why this happens, but it does get fixed when you move it.) LOOKSQUARE ( 👤️· 🗨️) talk 01:11, 3 September 2023 (UTC)
Links to recent discussions:
Before acting on the above, please check these discussions. – Jonesey95 ( talk) 04:50, 3 September 2023 (UTC)
Currently, User:FireflyBot posts notifications to the talk pages of page creators of AfC Drafts that have not been edited for five months, to warn them of impending deletion ( BRFA). However, this approach of notifying only the page creator is suboptimal, as it does nothing to attract the attention of other editors who may be interested in rescuing the draft. In this discussion, extending the bot task to also post notifications to the draft talk page itself was suggested as a solution. I have asked Firefly about this, but he's been quite busy and probably won't be available to work on it any time soon. He did say, "if someone else is willing to write it I can run it as part of the same task." So I'm asking to see if someone might be. -- Paul_012 ( talk) 20:12, 3 September 2023 (UTC)
I recently discovered that all major WMF announcements go to WP:VPM, because that's the page listed at m:Distribution list/Global message delivery and a few other global MassMessage lists.
I'd like to receive all MassMessages that go to VPM on my talk page. I can't do that by subscribing to the source, because m:Distribution list/Global message delivery only contains noticeboards and announcement venues (not individual user talk pages).
I therefore would like a bot that, whenever User:MediaWiki message delivery posts a message to WP:VPM, submits another Special:MassMessage request copying whatever was put onto VPM and re-announcing it to a new MassMessage list. I have to imagine I'm not the only one who wants to see global announcements on my talk page.
This shouldn't be too hard to do, right? Best, KevinL (aka L235 · t · c) 21:31, 28 August 2023 (UTC)
A bot that can translate over 35 languages. It's pretty cool if you ask me. It would have a giant blue button that says "Chatter Box Mode" that helps with translating. I am pretty good at drawing, so I could even make a logo for the bot. There could even be a chatter box squad, or maybe CBS, to help answer the questions people have about the Mrs Chatter bot. The bot's userpage could have a sort of table with the word HELLO in different languages like Bonjour, Dumelang, Mmolo, Sao bona, Dumela, Elko, and Namaste and even many more DJ Aquah ( talk) 10:03, 14 December 2023 (UTC)
While we currently archive the WP:ITNC page to review past candidates for In The News, we don't have a similar function for the ITN items that are actually added to {{ In The News}}. Would it be possible, given a date range and a target page, for a bot to capture "significant" additions to the template along with the date added? For example, the bot should be able to review this diff, and create a line on the target page with the date of the change, the editor that added it, and the text of the addition (here being " Tharman Shanmugaratnam is elected as the next president of Singapore.")
The end goal would then be to have the bot initially make monthly pages from the start of ITN, and then create a new monthly archive on an ongoing basis.
I think the one constant is that all blurbs as well as RDs added start with a "*" mark, as to distinguish from minor typos or wording corrections or changes in the picture. However, I would rather the bot be overzealous and include false positives. Masem ( t) 13:14, 8 September 2023 (UTC)
So the output would be in blurb, editor name, timestamp format, correct? courtesy ping: Qwerfjkl (I couldn't have solved the diff issue without their help). —usernamekiran (talk) 17:00, 10 September 2023 (UTC)
*[[Bill Richardson]] − added by <username> at <date of addition>, removed by Stephen at 00:12, September 8, 2023.
That's just an example; the wording can be changed. —usernamekiran (talk) 10:44, 11 September 2023 (UTC)
Some lines begin with | ; they come from the "Main page image/ITN" template. In the current version, I have updated the program so that it will skip such lines: the bot now excludes lines beginning with |. I have also updated it so that lines beginning with *[[ are considered recent deaths. Let me know what you think, and what other functionality you would like. —usernamekiran (talk) 13:14, 11 September 2023 (UTC)
Thanks so much for this! It would be nice to add a header for each day, which would help to add some separation. For example ==September 10== and ==September 9==. I also think the "added by [Username] on [date]" part could be set off in a smaller font or otherwise separated from the main blurb, but I'm not sure exactly how that would be best formatted, and even as it currently is I don't particularly mind it. The single most important thing for readability, though: would it be at all possible to detect at least some cases of modification as distinct from addition? I can suggest some approaches. If in the same edit, one non-RD line is removed and another non-RD line is added in the same line number position, it is generally a modification. This is because new blurbs are usually added to the top and old ones are usually removed from the bottom due to staleness, so if the same line is added and removed it's a good indication of an intentional swap. I understand that this may not be a 100% accurate check, but you could also combine it with a second check like making sure the bolded link target is the same. Other approaches like textual similarity metrics based on edit distance or NLP algorithms are probably overkill, but could be attempted. I still think that checking the placement within the document and the bolded link target is good enough to reduce most of the duplication. In the case that you do detect duplication, I would probably try to put it underneath the original version, and have them listed as a chronological progression of edits. Example. 98.170.164.88 ( talk) 01:45, 12 September 2023 (UTC)
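The "same position, same bolded target" heuristic suggested above could be sketched like this (it assumes ITN blurbs bold their featured article as '''[[...]]''', as in the examples in this thread):

```python
import re

def bold_target(blurb):
    """First bolded wikilink target in a blurb, e.g. *'''[[Foo]]''' won."""
    m = re.search(r"'''\[\[([^\]|]+)", blurb)
    return m.group(1) if m else None

def is_modification(removed_line, added_line):
    """Treat a removed/added pair at the same line position as one edit to an
    existing blurb when both bold the same article target."""
    target = bold_target(removed_line)
    return target is not None and target == bold_target(added_line)
```

When this returns True, the pair would be listed under the original blurb as a chronological progression rather than as a fresh addition.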
I'd like to harness the data from afltables.com to automatically update all AFL players at the conclusion of every round. Some of the players are years out of date, which is a shame. The stats box (goals and games) could also be updated. Electricmaster ( talk) 04:23, 12 October 2023 (UTC)
I would be interested to see a bot go through the |URL= parameter of all instances of {{Infobox newspaper}} and {{Infobox organization}} and create redirects, if they do not already exist, from the domain name to the article with the transclusion, tagged with {{R from domain name}}. This task could potentially be expanded to other infoboxes as well, but I think those two are a good place to start, given that they'd help make linking to articles on sources in citations easier. Cheers, {{u|Sdkb}} talk 08:26, 12 October 2023 (UTC)
Have you considered using the |url= parameter of those infoboxes to generate a list, and then proposing a bot that uses said list to make those edits (e.g. changing |website=nytimes.com to |website=The New York Times)? You could also provide the list to Ohconfucius, whose Fix Sources script makes some of those corrections. I'd also suggest asking the citation tool owners to adjust their tools, but some tools aren't maintained. GoingBatty ( talk) 16:28, 12 October 2023 (UTC)
The script already converts |website=nytimes.com to |website=The New York Times, among others, but I would be happy to add more similar conversions that are not already included. Feel free to check here to see which journals are converted. -- Ohc revolution of our times 14:39, 20 October 2023 (UTC)
There are a few hundred cases with |last=Reuters (Reuters insource:/\{\{Cite news[^\}]+last *= *Reuters/i). Correction as in this example would fix the issue. Would anyone be willing to perform this task? Leyo 13:26, 9 November 2023 (UTC)
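If the fix is moving the name from |last= into |agency= (an assumption about what the linked example edit does; Reuters is an agency, not a person), the substitution could be sketched as:

```python
import re

def fix_reuters(wikitext):
    """Rewrite |last=Reuters inside a {{Cite news}} call to |agency=Reuters.
    Simplified: assumes no nested braces inside the citation, and does not
    handle an accompanying |first= parameter."""
    return re.sub(
        r"(\{\{Cite news[^{}]*?)\|\s*last\s*=\s*Reuters\b",
        r"\1|agency=Reuters",
        wikitext,
        flags=re.I | re.S,
    )
```

Citations with additional author parameters would still want a human check.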
On most (about 80%) of the food and drink pages, italics are used inconsistently (I have spent many hours correcting errors of this kind). For example: on a single page a food is sometimes italicised and sometimes not; on one page a food is italicised while on other pages the same food is not; and it almost always happens that following a wikilink from an italicised food name leads to a page where that food is not in italics. I myself struggle to keep reading food and drink pages, and I can only imagine how much confusion this creates for a reader. I would propose having a bot remove all italicised food and drink terms, or, even better, select every existing food term, decide whether it should be italic, and then change everything consistently at once, which is impossible without a bot. I have done my best, but I will announce that I will never again spend time on this problem, as I am in an endless loop. I wonder what the point of italicising a food is if there is zero uniformity. JackkBrown ( talk) 07:07, 10 November 2023 (UTC)
Could a bot please change all instances of “power forward (basketball)” to “power forward” to correct the links to Power forward, which was recently moved? Thank you Rikster2 ( talk) 19:43, 22 November 2023 (UTC)
If there were links of the form [[power forward (basketball)]] without a pipe, it would be appropriate to change those. However, I could not find any in articlespace. GoingBatty ( talk) 19:57, 22 November 2023 (UTC)
The search linksto:"Power forward" -basketball finds wikilinks which might lead to the wrong article and need improving. (There's currently only one result, and it's a false positive.) Changing the links would raise the number of false positives to 50, making any actual errors harder to spot. The qualifier is acting a bit like "(disambiguation)" in an INTDABLINK by marking these links as checked and correct. Certes ( talk) 20:55, 22 November 2023 (UTC)
I would like to request a bot and I would like to name it, the Bad Guy Patrol (BGP) and it would help me restore order on Wikipedia. It will help with vandalism, blocking and cleanups. Harley Quinn on duty ( talk) 14:07, 6 December 2023 (UTC)
Hello Wiki world. I have been on Wikipedia for the past 8 months, making a few contributions. It would be so helpful for me if you could enable the bot flag on my account. Thank you!! EEverest 8848 ( talk) 15:47, 20 November 2023 (UTC)
This is unmanageable. I have tried to do what I can, but considering the sheer number of Italian municipalities and provinces, I wonder if a bot could change all instances of "Province" (with a capital 'P') to "province" (with a lowercase 'p'). I have done what I can; there are really too many corrections for one person. Examples: Province of Caserta; Comitini; Province of Trapani; Province of Udine. Thanks in advance. JackkBrown ( talk) 21:03, 4 November 2023 (UTC)
Sorry if I've missed a past discussion on this suggestion.
Manually setting up archiving bots on talk pages is time-consuming, and there are an enormous number of pages where it hasn't been set up, leading to clogged talk pages. It would be useful for a bot to add one to talk pages that don't have one, with notional parameters (30d, 2 minimum, etc).
I think it would be important for editors to be able to exclude it, too, if consensus was that an archiving bot wasn't wanted for some reason. Cheers. Riposte97 ( talk) 05:56, 25 October 2023 (UTC)
I noticed that a very large number of uses of the template:lang-ku display Latin text incorrectly (non-italics, in Arabic font; see, for example, here in the lead). Since it would take a very long time for anyone to go through and add the parameter script=Latn to make the text display correctly, is there any way that a bot can complete this task? Thanks in advance. Revolution Saga ( talk) 22:00, 28 October 2023 (UTC)
Please see the Template_talk:UCI_team_code#Requested_move_30_October_2023 where there was rough consensus to usurp the ct shortcut. Essentially, replace the 12,000 transclusions to bypass the redirect so that Template:Ct can redirect to Template:Contentious topics. Awesome Aasim 23:08, 9 November 2023 (UTC)
A lot of pages have links to Wiktionary pages in the body text. This is fine, though I think the links are supposed to be like this (interwiki) and not this (external). Would it be possible to create a bot that turns these into the first example? LOOKSQUARE ( 👤️· 🗨️) talk 21:05, 16 November 2023 (UTC)
See Wikipedia:Reliable sources/Noticeboard#Washington Independent. But archiving those links is not quite straightforward. We should probably get rid of any link to the live domain (which is garbage), and we should only use archive.org snapshots that are older than, say, 2016. When there is no older snapshot, the link/reference should be removed entirely. — Alexis Jazz ( talk or ping me) 01:34, 15 November 2023 (UTC)
Looking for a willing bot operator to implement WP:PIQA by migrating quality assessments from WikiProject banners into {{ WikiProject banner shell}}. To be more precise:
* Where all project banners agree, move the |class= parameter into the shell and remove it from the project banners, e.g. [6]
* Where there is no rating, add an empty |class= parameter to encourage editors to add a rating, e.g. [8]
* Where ratings conflict, leave the |class= parameter in place. These will be tracked and reviewed manually.
* If any banner has |living=yes or |blp=yes, then add |blp=yes to {{ WikiProject banner shell}}.
* If there is a |listas=, then move this to {{ WikiProject banner shell}} and remove it from the project banners, e.g. [10]
Thanks in advance — Martin ( MSGJ · talk) 22:18, 15 November 2023 (UTC)
It'd be fine to boldly consolidate duplicate AFC banners on article talk pages. There are two possibilities - actual duplicated banners (in which case one should be removed) and two different banners, which should be merged, to best preserve the review history. Primefac ( talk) 13:48, 16 November 2023 (UTC)
Not sure if this is the right place to ask this, but I noticed that when User:MalnadachBot was procedurally blocked, its tasks 12 and 13 were still marked as active. I don't know if it has finished running through all lint errors on wiki, but task 13 is certainly an ongoing effort. Would we need a replacement bot to pick up these tasks, or do we have existing bots/procedures handling these things? Liu1126 ( talk) 20:17, 19 November 2023 (UTC)
I am in the midst of translating articles from English to Gagana Sāmoa which is a very necessary task given the massive inequality in the available information in each language. My hope is that more Sāmoa users will find Wikipedia to be a more hospitable site and access it to find information in their language. There is a massive disparity not only between English and Gagana Sāmoa but even between other languages and the languages of the Pasifika by and large. I would like to request a bot to help translate these articles as this task is overwhelming and this disparity will only grow given the population, internet access, and specialization of Sāmoa users. Something has to be done otherwise the language will likely go the way of 'Ōlelo Hawaiʻi. IonaPatamea ( talk) 20:57, 20 November 2023 (UTC)
I have seen Pablo Picasso's files on English Wikipedia noting that his works will be transferred to Wikimedia Commons on January 1, 2044, 70 years after his death. However, under Spanish copyright law, the copyright term for Spanish authors who died before December 7, 1987, including Picasso (who died on April 8, 1973), is life plus 80 years; for those who died later it is life plus 70 years. It is unclear whether the copyright expires on his 80th death anniversary or on the January 1 following it, but either way there is no room to have his copyright expire on January 1, 2044: it is either April 8, 2053 or January 1, 2054. I am hoping for a consensus on which of these bolded dates his copyright expires. Here are his works to have the copyright expiration date edited or added (if absent) one by one. Ishagaturo ( talk) 09:07, 23 November 2023 (UTC)
This probably should not be implemented trivially for languages written in the Latin script, but with a few caveats, it seems pretty doable to write a bot that scours articles and, while staying out of appropriate templates, tags text using existing templates like {{lang}} as either being in a specific language, or at least being in some language written in a particular script, e.g. und-Hani or und-Cyrl, as per the obligatory HTML |lang= parameter and ISO 639. If there is und text already tagged, it makes it much easier to see whether 漢字 is lang=ja-Hani or lang=zh-Hant, and also to quickly retag everything en masse.
If we are getting dangerous, I can think of multiple ways to further discriminate between, say, Japanese and Chinese-language text beyond simply checking for strings of CJK ideographs. Remsense留 21:16, 6 December 2023 (UTC)
Don't forget zh-hant and zh-hans, or whatever they're called in the appropriate standards. Folly Mox ( talk) 13:29, 7 December 2023 (UTC)
We have a lot of citations that could be improved using |author-link=, e.g. Special:Diff/1186331321/1186342802. The problem is that it's difficult to match the correct author; it requires a human. Thus I'm wondering if/how this might be automated in certain cases. It doesn't require every case, only those it can match with greater certainty. For example we know, per the above diff, that there is only one Steven Poole; there is no dab page. And we know Steven Poole writes for a publication called Quercus. Thus for any other cite that matches those criteria, it's a good bet that it is the same person, and an |author-link= could be added.
Is this method 100% foolproof? Probably not, but is it at least 99% accurate in matching names? Probably. I think a test run would show how reliable it is. I don't have the time right now but wanted to mention it in case anyone wants to run an experiment, or has other ideas. A dump of CS1|2 citations on enwiki - not including cite web - can be found here. I currently have updates disabled, but can restart if anyone wants. -- Green C 20:27, 22 November 2023 (UTC)
The bot would scan existing citations that have |author-link= and build a 2-column database: "Steven Poole = Quercus". Then it would find all other citations that cite Steven Poole and Quercus, and add the |author-link= if missing. It works backwards from what is known to be true. -- Green C 19:42, 5 December 2023 (UTC)
Can a bot make this kind of change to multiple pages?
Before: {{abcd|ᚠ|ᚡ|ᚢ |ᚣ|ᚤ|ᚥ|ᚦ| ᚧ|ᚨ|ᚩ}}, {{abcd|Ꭰ|Ꭱ|Ꭲ|Ꭳ|Ꭴ}}
After: ᚠᚡᚢ ᚣᚤᚥᚦ ᚧᚨᚩ, {{abcd|Ꭰ|Ꭱ|Ꭲ|Ꭳ|Ꭴ}}
That is, when every parameter matches [ ]?[ᚠ-ᛸ][ ]?, remove the template wrapper and the pipes | but retain the text entered as parameters (including spaces). If any parameter does not match [ ]?[ᚠ-ᛸ][ ]?, leave the call as-is. 172.58.208.108 ( talk) 19:19, 16 December 2023 (UTC)
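A sketch of the requested transformation, using the character class from the request ("abcd" being the placeholder template name used above):

```python
import re

RUNE_PARAM = re.compile(r" ?[ᚠ-ᛸ] ?")  # one runic character, optional spaces

def unwrap_runes(wikitext, name="abcd"):
    """Strip the {{abcd|...}} wrapper and its pipes when every parameter
    matches the pattern, keeping the parameter text (including spaces);
    otherwise leave the call untouched."""
    def repl(m):
        params = m.group(1).split("|")
        if all(RUNE_PARAM.fullmatch(p) for p in params):
            return "".join(params)
        return m.group(0)
    return re.sub(r"\{\{" + re.escape(name) + r"\|([^{}]*)\}\}", repl, wikitext)
```

Calls containing nested templates would not match `[^{}]*` and so would be left alone, which errs on the safe side.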
Good day, can someone make a bot to run through this and append {{SVG-logo}} below the Non-free xxx template and add ==Summary== above the FUR template to files that don't have it? -- Minorax«¦talk¦» 11:22, 7 November 2023 (UTC)
This is my first time posting here, so no idea if this should be done by a bot. The "IPA link" template has another variation, "IPAlink" (notice the missing space). The official name is "IPA link", but I find the "IPAlink" variation is also quite common. This isn't urgent ("IPAlink" redirects to "IPA link"), but would a bot fix this sort of thing? PharyngealImplosive7 ( talk) 18:08, 21 December 2023 (UTC)
I asked at the help desk and I was told to ask here. The Organized crime task force and the Serial killer task force banners recently got added to the banner of Template:WikiProject Crime and Criminal Biography using parameters. The previous banners (as wrappers of the new one with the parameters) were mass substituted. This has left ~6700 (see Category:Unknown-importance Crime-related articles, not counting ones that didn't have an initial basic crime importance) duplicates, that have both the original crime importance and task force importance but split between two duplicate banners.
Is there any bot that can merge the importance values on the pages that have both templates so there aren't so many duplicates (for example, if there are two duplicate banners, one of which has the importance for WP Crime and one of which has the task force importance, combine them)? Of course, the ones that were not initially tagged with the original crime banner will have to be manually tagged, as they don't have the basic importance parameter, but that's fewer than 500, which isn't as bad (compared to the 6700 that already HAVE all the required importance parameters). PARAKANYAA ( talk) 17:56, 11 November 2023 (UTC)
For example, a page currently has:
{{WikiProject Crime and Criminal Biography|serialkiller=yes|serialkiller-imp=low|organizedcrime=yes|organizedcrime-imp=low}} (or with only one of the task force parameters; showing both for the sake of example)
and also {{WikiProject Crime|importance=low}} OR {{WikiProject Criminal Biography|importance=low}} (also called WikiProject Criminal, which iirc has quite a few transclusions).
These should be merged into:
{{WikiProject Crime and Criminal Biography|importance=low|serialkiller=yes|serialkiller-imp=low|organizedcrime=yes|organizedcrime-imp=low}}
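The per-page merge could be sketched as follows (a sketch only; it assumes the combined banner ends with "}}" and does not already carry its own |importance=, and the old banner would then be removed from the talk page):

```python
import re

def merge_importance(combined_banner, old_banner):
    """Copy |importance= from the old {{WikiProject Crime}} / Criminal
    Biography banner into the combined banner."""
    m = re.search(r"\|\s*importance\s*=\s*([^|}]+)", old_banner)
    if not m or re.search(r"\|\s*importance\s*=", combined_banner):
        return combined_banner   # nothing to copy, or already rated
    return combined_banner[:-2].rstrip() + "|importance=" + m.group(1).strip() + "}}"
```

Pages where the two banners disagree on importance would be better left for manual review.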
Hi, I would like to know whether a bot would be able to do this particular task. The task is to replace the existing format with the template, as I did here in my sandbox to explain it better: [12]
The following articles, 2004 Andhra Pradesh Legislative Assembly election and 2009 Andhra Pradesh Legislative Assembly election, require these template changes. Since I am finding this monotonous task quite difficult to do myself, I am looking for help; probably a bot might help, I believe? Any info or help is appreciated. Thank you 456legend ( talk) 05:52, 7 December 2023 (UTC)
Hello, it has come to my attention that template sandboxes X21 to X71 are not automatically cleared by Cyberbot I, which clears template sandboxes X1 to X20 and the main template sandbox. So I think there should be a bot that clears the rest of the template sandboxes. This bot would be called "SandBot", and it would clear template sandboxes X21 to X71 at 00:00 UTC and 12:00 UTC every day, supplementing Cyberbot I. This is only a proposal. RandomWikiPerson_277 talk page or something 19:58, 12 December 2023 (UTC)
Simple idea: monitor the protection log, and any time the protection level is increased but the expiration time is decreased, wait until a few minutes before the expiration, and restore the status quo. If it really is the intention of the protecting admin to leave the page unprotected at expiry, they can leave a keyword like NOFALLBACK or something in the protection summary. An obvious complication would arise if the bot is lagging and some edits slip in before protection can be restored, but that's a minor detail. Yes, I know about the PC trick, but people sometimes forget, and sometimes PC isn't enough. Suffusion of Yellow ( talk) 03:58, 2 December 2023 (UTC)
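The trigger condition might be sketched like this (the level ranking is a simplification; NOFALLBACK is the opt-out keyword proposed above):

```python
from datetime import datetime

# Simplified ordering of protection levels for comparison purposes.
LEVELS = {"autoconfirmed": 1, "extendedconfirmed": 2, "sysop": 3}

def should_restore(prev_level, prev_expiry, new_level, new_expiry, summary):
    """True when protection was raised but shortened, and the admin did not
    opt out via NOFALLBACK in the protection summary."""
    raised = LEVELS.get(new_level, 0) > LEVELS.get(prev_level, 0)
    shortened = new_expiry < prev_expiry
    return raised and shortened and "NOFALLBACK" not in summary
```

The bot would then schedule the restore a few minutes before `new_expiry`, re-checking the log first in case protection was changed again in the meantime.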
Category:Pages using WikiProject Film with unknown parameters, a maintenance category which exists to flag uses of {{ WikiProject Film}} on talk pages that call parameters that don't exist, currently has 4,808 articles in it. After looking at it, and cleaning up the tiny single-digits handful of exceptions that existed anywhere after the letter B, I was able to determine that the remaining contents all relate to an old, long-deprecated practice whereby B-Class articles in that queue were each also tagged with b1=[y/n], b2=[y/n], b3=[y/n], b4=[y/n] and b5=[y/n] for their individual success or failure in meeting each of the five B-Class criteria listed at Wikipedia:WikiProject Film/Assessment. That practice has long since been discontinued, which is why those are landing as unknown parameters now, but with 4,808 articles to deal with, actually cleaning them up is more work than any human editor would ever be inclined to undertake.
Accordingly, I wanted to ask if there's any bot that can be set loose on the task of stripping b#= parameters from the contents of that category. Bearcat ( talk) 17:20, 1 January 2024 (UTC)
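The stripping itself is a one-line substitution; a sketch (it should be applied only to the {{WikiProject Film}} call, not blindly to a whole page, since other text could coincidentally match):

```python
import re

# Match "|b1=..." through "|b5=..." with any non-pipe, non-brace value.
B_PARAM = re.compile(r"\|\s*b[1-5]\s*=\s*[^|}]*")

def strip_b_params(banner_call):
    """Remove the deprecated |b1=..|b5= checklist parameters."""
    return B_PARAM.sub("", banner_call)
```

Note that the pattern requires a digit 1-5 right after "b", so parameters like |blp= are unaffected.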
For the 5832 articles listed at Wikipedia:WikiProject Africa/The 10,000 Challenge, please add |AFR10k=yes to the project banner {{WikiProject Africa}} on the talk page. This adds a note to the banner and also populates Category:Articles created or improved during the WikiProject Africa 10,000 Challenge. Thanks — Martin ( MSGJ · talk) 19:19, 3 January 2024 (UTC)
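For each listed article's talk page, the edit amounts to the following (a sketch; it inserts the parameter right after the template name and skips banners that already carry it):

```python
import re

def add_afr10k(talk_text):
    """Add |AFR10k=yes to an existing {{WikiProject Africa}} banner."""
    if re.search(r"\|\s*AFR10k\s*=", talk_text, re.I):
        return talk_text
    # Lookahead ensures we match the banner itself, not e.g. a longer name
    # that merely starts with "WikiProject Africa".
    return re.sub(r"(\{\{\s*WikiProject Africa)(?=\s*[|}])", r"\1|AFR10k=yes",
                  talk_text, count=1, flags=re.I)
```

Pages without the banner would be left untouched and could be reported for manual tagging.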
Hello,
GNU is the operating system and Linux is one of its kernels; Linux is not itself an operating system. Hence I believe a bot should locate and correct these errors: where Linux is mentioned, it should be changed to GNU/Linux or GNU-Linux. This request is being made for Richard Stallman, who has cancer. Twillisjr ( talk) 16:07, 6 January 2024 (UTC)
I have noticed that many categories, especially content categories, include non-free files without the __NOGALLERY__ magic word, which is against WP:NFCC#9. I'd suggest using a bot to auto-tag such categories, skipping a whitelist of those categories covered by WP:NFEXMP (generally categories concerning reviews of questionable files, such as CAT:FFD, and some maintenance categories that should contain no non-free files). – LaundryPizza03 ( d c̄) 12:01, 14 November 2023 (UTC)
So many categories would need __NOGALLERY__ that it would actually be smarter and less work to disable image showing on all categories by default (without requiring any code per page or bot work) and have a __YESGALLERY__ magic word for the much smaller number of categories that actually should show images. Gonnym ( talk) 12:38, 14 November 2023 (UTC)
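The tagging decision itself is trivial; a sketch (detecting non-free members and checking the NFEXMP whitelist is the real work, omitted here):

```python
def tag_category(cat_wikitext, has_nonfree_member, whitelisted):
    """Prepend __NOGALLERY__ to a category page that contains non-free files,
    unless it is exempt or already tagged."""
    if has_nonfree_member and not whitelisted and "__NOGALLERY__" not in cat_wikitext:
        return "__NOGALLERY__\n" + cat_wikitext
    return cat_wikitext
```

Whether a category member is non-free would be determined from its file description page's license templates.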
Does the task at Wikipedia:Village pump (technical)#Implementation of Template:Refideas editnotice require a bot, or is there another way to accomplish that? You can respond there if you like. BOZ ( talk) 05:42, 14 November 2023 (UTC)
My website runeberg.org just recently moved from http: to https: so it would be nice if someone could update the 11,000 links accordingly. This is not urgent, as everything works fine with automatic redirects, but it would be nice. Thank you. -- LA2 ( talk) 22:33, 17 December 2023 (UTC)
Done -- Green C 16:33, 9 January 2024 (UTC)
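The update amounts to a simple scheme substitution; a sketch of the kind of edit involved:

```python
import re

def https_runeberg(wikitext):
    """Switch runeberg.org links from http: to https:."""
    return re.sub(r"http://runeberg\.org", "https://runeberg.org", wikitext)
```

Since the site redirects automatically, any link a pass like this misses still works; the edit just avoids the redirect hop.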
I just moved Saint Francis University to Saint Francis University (Pennsylvania), because there will be a university also named Saint Francis University in Hong Kong ( Caritas Institute of Higher Education acquires university title, Government of Hong Kong Press Release). The page "Saint Francis University" will be a redirect to University of Saint Francis. Before doing so I need to fix all pages that link to [[Saint Francis University]] and replace it with [[Saint Francis University (Pennsylvania)|Saint Francis University]] (or, if the link is [[Saint Francis University|something else]], just replace the link target, not the description), of which I found hundreds. Is there a bot that can do this task for me? -- Leeyc0 (Talk) 12:04, 9 January 2024 (UTC)
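A sketch of the link rewrite (piped links keep their display text; unpiped links gain a pipe so readers still see the old title):

```python
import re

OLD = "Saint Francis University"
NEW = "Saint Francis University (Pennsylvania)"

def retarget(wikitext):
    """Point [[Saint Francis University]] links at the new title without
    changing what readers see."""
    # Piped links: swap only the target.
    text = re.sub(r"\[\[" + re.escape(OLD) + r"\|", "[[" + NEW + "|", wikitext)
    # Unpiped links: add a pipe so the displayed text stays the same.
    return re.sub(r"\[\[" + re.escape(OLD) + r"\]\]", "[[" + NEW + "|" + OLD + "]]", text)
```

Links already pointing at the (Pennsylvania) title are untouched, since neither pattern matches them.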
There are quite a few external links templates created in recent years (see Category:Social media external link templates), and when used they offer a consistent style and allow for error tracking, among other things. However, there are still quite a lot of external links that don't use these. Sometimes they are bare links, while others have some kind of text with them. Would it be possible for a bot to convert external links in the external links section (links in the body should be ignored, as I'm not sure whether these templates work correctly in the body) to use one of the listed templates at the bottom? Here is an example of an edit with IMDb title.
Templates:
If this is controversial and needs discussion, please point me to where it should be held. Gonnym ( talk) 15:45, 28 December 2023 (UTC)
If a site changes its URL scheme from https://example.com/person/<number assigned to person> to https://example.com/profile/<persons_full_name>, the template cannot be updated in a fashion that will result in a meaningful change, since all we have in the template calls is {{example|<number>}}. On the other hand, I think at least one of the URL bots has the ability to match old to new, so if it sees https://example.com/person/<number> in the text directly it can update to the new code. Either way a bot will need to update everything, but in the latter case (again, assuming it's possible) there is already a bot with that functionality. In other words, an elink bot will notice a change in URL if the URL is in the article, but a user has to notice a dead link if it's in a template. Primefac ( talk) 12:27, 9 January 2024 (UTC)
Consider how many templates like {{official}} might get supported. Don't forget, these templates change, so if a bot supports a template and someone changes it, the bot has to be updated to avoid making errors. And this is just enwiki; there are over 300 Wikipedias, plus hundreds more wikis in other projects. And the underlying code for saving dead links is quite complex to do correctly; only a few programmers have this down, and these are large, complex tools that have taken many years of development. A BOTREQ to make a dead-link fixer for one template doesn't make sense. At best, a bot could convert templates to CS1|2 or square-link, then run the archive bots. -- Green C 16:30, 9 January 2024 (UTC)
Couldn't these templates support |url-status=, |archive-url= and |archive-date=? Gonnym ( talk) 12:50, 10 January 2024 (UTC)
|archive= like {{2006 Commonwealth Games profile}} - this looks like an alternative method in use; most of those templates are sports-related, so it was probably conceived by a few editors at some time. CS1|2 cites use the |archive-url= trio, while external link templates simply use |archive=, which, if it exists, the template renders as a replacement for the URL it would otherwise have rendered. It's going to be template-specific how to best approach this. Anyway, if it's true that Category:External link templates with archive parameter is only for templates that use |archive=, it will be important to have a new category for Category:External link templates with archive-url parameter, so bots and tools can differentiate which parameters to use. --
GreenC 16:58, 10 January 2024 (UTC)

Hey all! For some background here, for TWL users to access Newspapers.com, the library sends them through a proxied domain at https://www-newspapers-com.wikipedialibrary.idm.oclc.org/. This often results in this domain name making its way into the mainspace, which is problematic because it can only be accessed by those with access to TWL.
JPxG has set up a way to replace these links with the unproxied domain using JWB (see more info and an example edit), but I feel like this is an area where a bot could step in.
Citation bot is able to clean these links up automatically (see an example edit), but it has to be triggered manually. These proxy URLs are not automatically placed in a category, which means a human editor would need to assemble a list of pages to be fixed for Citation bot to even look at them. Citation bot also wouldn't deal with these links outside of citations, such as with external links.
It's worth noting that I've previously filed a tangentially similar BRFA, which was denied as Citation bot would be easier to use and give better results. With these links, however, I don't think that's the case, mainly because Citation bot is tedious to trigger on these pages, but also because Citation bot doesn't even touch other proxied URLs, only Newspapers.com.
I'd love to make this happen using Pywikibot, but based on my previous BRFA I wanted to see some thoughts on this being fully automated. This task is already being done semi-automatically through JWB, so I think it might as well be fully automated, potentially expanding to other TWL-proxied sites as well.
(CCing Headbomb for your thorough comments on the previous BRFA—would love to hear your opinion especially.) Bsoyka ( talk) 18:32, 28 December 2023 (UTC)
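The URL rewrite itself could look something like the sketch below. It assumes the EZproxy hyphenation convention (dots in the original hostname become hyphens, original hyphens become double hyphens); ambiguous hostnames would still need human review:

```python
import re

# Sketch of un-proxying a TWL EZproxy URL. The hyphen-decoding rule is an
# assumption based on EZproxy's usual host encoding, not a verified spec.
PROXY = re.compile(r'https?://([A-Za-z0-9-]+)\.wikipedialibrary\.idm\.oclc\.org')

def unproxy(url: str) -> str:
    m = PROXY.match(url)
    if not m:
        return url
    # '--' encodes a literal hyphen; single '-' encodes a dot.
    host = m.group(1).replace('--', '\x00').replace('-', '.').replace('\x00', '-')
    return 'https://' + host + url[m.end():]
```

A bot run would apply this to every matching URL in mainspace, whether inside a citation template or a bare external link.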
insource:wikipedialibrary.idm.oclc.org
which throws about 643 hits in mainspace, I wonder if expanding the scope to other wikipedialibrary domains would be warranted. It seems like there are a lot of links to that proxy.
Jo-Jo Eumerus (talk) 09:55, 29 January 2024 (UTC)
Hi, I asked the following at the Help Desk, and they suggested asking here:
I noticed that there are a ton of pages tagged for needing verification from August 2022. All of the location ones really just need the first of the two notes citations (the one just going to census.gov) removed. Is there a way for someone to mass-fix this?
The note, as it is, is always in the Demographics section as:
"Note: the US Census treats Hispanic/Latino as an ethnic category. This table excludes Latinos from the racial categories and assigns them to a separate category. Hispanics/Latinos can be of any race.<ref>http://www.census.gov {{nonspecific|date=August 2022}}</ref><ref>{{cite web |title=About the Hispanic Population and its Origin |url=https://www.census.gov/topics/population/hispanic-origin/about.html |website=www.census.gov |access-date=18 May 2022}}</ref>"
It is the first of the two that needs to go, because the second has it covered.
To add: on all the pages I have fixed thus far with this error ( see: recent Texas edits), it is the only note on the page, and always attached to a table with racial demographic data.
Thanks in advance! Edenaviv5 ( talk) 16:18, 11 January 2024 (UTC)
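Since the report above says the tagged reference is always the same string, the removal could be a literal replacement rather than a regex. A minimal sketch, assuming the ref text never varies:

```python
# The tagged reference is (per the report above) always this exact string,
# so removal can be a literal replacement; a regex would only be needed if
# the date inside {{nonspecific}} turned out to vary.
BAD_REF = '<ref>http://www.census.gov {{nonspecific|date=August 2022}}</ref>'

def strip_nonspecific_census_ref(wikitext: str) -> str:
    return wikitext.replace(BAD_REF, '')
```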
See this discussion: is there a bot that can assist us with the deletion review process? Jarble ( talk) 19:46, 12 January 2024 (UTC)
Deleting or otherwise removing errors from uncalled references. Geardona ( talk to me?) 20:24, 17 January 2024 (UTC)
Ever since the idea of immediately moving inadequate articles to draftspace emerged as a common alternative to deletion, the amount of time that has had to be invested in cleaning up polluted categories that have draftspace pages in them has gone way up, because the people who do the sandboxing frequently forget to remove or disable the categories in the process — so I wanted to ask if there's any way that a bot can be made to clean up any overlooked stuff.
Since there's already a bot, JJMC89bot, that detects main-to-draft page moves and tags them as {{ Drafts moved from mainspace}}, the easiest thing would probably be to just have that bot automatically disable any categories on the page at the same time as it's tagging it — but when I directly approached that bot's maintainer earlier this year to ask if this could be implemented, they declined on the basis that the bot hadn't already been approved to perform that task, while failing to give me any explanation of why taking the steps necessary to get the bot approved to perform that task was somehow not an option. As an alternative, I then approached the maintainer of DannyS712bot, which catches and disables categories on drafts that are in the active AFC submission queue (which newly sandboxed former articles generally aren't, and thus don't get caught by it), but was basically told to buzz off and talk to JJMC89bot.
So, since I've already been rebuffed by the maintainers of both of the obvious candidate bots, I wanted to ask if there's any other way to either get one of those two bots on the task or make a new bot to go through Category:All content moved from mainspace to draftspace disabling any active categories, so that editors can cut down on the amount of time we have to spend on DRAFTNOCAT cleanup. If possible, such a bot would ideally also do an ifexist check, and outright remove any redlinked categories that don't even exist at all, though just disabling redlinks too would still be preferable to editors having to manually clean up hundreds of categorized drafts at a time — it's just that merely disabling the redlinks creates another load of cleanup work later on when the draft gets approved or moved by its own creator without AFC review or whatever, so killing redlinks right away is preferable to simply deferring them for a second round of future cleanup. Bearcat ( talk) 16:18, 8 November 2023 (UTC)
<nowiki>...</nowiki> or <!--...--> tags to prevent categorization. GoingBatty (talk) 01:59, 8 January 2024 (UTC)
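The disabling step itself is simple text manipulation. A sketch using the colon trick ([[Category:X]] to [[:Category:X]]); the nowiki or comment approaches GoingBatty mentions would work just as well, and the choice is a judgment call:

```python
import re

# One common way to disable categories on a draft: prefix them with a colon,
# turning the categorization into a plain link. Already-disabled categories
# are left alone. Redlink checking (per the request) would need an API call
# and is not sketched here.
CATEGORY = re.compile(r'\[\[\s*(Category\s*:)', re.IGNORECASE)

def disable_categories(wikitext: str) -> str:
    return CATEGORY.sub(r'[[:\1', wikitext)
```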
Hi, I would like to ask if it would be possible to align text to the right in the # of votes and % of votes columns in the table listing over 460 MPs located in the List of Sejm members (2023–2027)#List of members section. The use of {{ Table alignment}} is impossible due to merged cells which help with visual representation. Therefore, before every cell in the mentioned columns, which all contain numerical data, "text-align: right|" should be placed. Cheers! — Antoni12345 ( talk) 23:49, 20 January 2024 (UTC)
Find: || 1 → Replace: || style="text-align: right;" | 1
Find: || 2 → Replace: || style="text-align: right;" | 2
Find: || 0 → Replace: || style="text-align: right;" | 0
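Rather than one find-and-replace per digit, the whole family of substitutions can be a single regex pass. A sketch (the "purely numeric cell" heuristic is mine, not from the request):

```python
import re

# Prefix every purely numeric cell opened with "||" with an inline
# right-align style. Cells that already carry attributes (no digit
# directly after "||") are left alone.
NUM_CELL = re.compile(r'\|\|\s*(\d[\d,.]*)(?=\s*(?:\|\||$))', re.MULTILINE)

def right_align_numeric_cells(row: str) -> str:
    return NUM_CELL.sub(
        lambda m: '|| style="text-align: right;" | ' + m.group(1), row)
```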
"The three most dangerous things in the world are a programmer with a soldering iron, a hardware type with a program patch, and a user with an idea."
— Rick Cook, The Wizardry Consulted
So I have an idea, and...
Is it possible for a bot to find articles that:
If a bot could automatically detect such articles, then I'd like to have it add the {{ underlinked}} template, on a schedule of perhaps a few articles being tagged per hour, to feed the seemingly popular Category:Underlinked articles for the Wikipedia:Growth Team features, without giving a large number of articles to the first editor and then leaving none for anyone else.
I realize that this would require a demonstration of consensus, but I don't want to make the suggestion, get people's hopes up, and then find out that bots can't count the number of words or links in an article. WhatamIdoing ( talk) 22:21, 30 November 2023 (UTC)
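Bots can certainly count words and links in wikitext. A rough sketch of the counting step; the regexes and the definition of "word" are judgment calls, and a production bot would want a real parser such as mwparserfromhell:

```python
import re

# Count outgoing wikilinks (ignoring File/Image/Category links) versus
# rough prose words. The thresholds for "underlinked" would be set by
# whatever consensus the proposal reaches.
LINK = re.compile(r'\[\[(?!(?:File|Image|Category):)[^\]|#]+')

def link_and_word_counts(wikitext: str):
    links = len(LINK.findall(wikitext))
    words = len(re.findall(r"[A-Za-z']+", re.sub(r'\[\[|\]\]', ' ', wikitext)))
    return links, words
```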
{od}
An update and some observations, in case anyone else is interested:
Red Cross). (Pinging Trizek (WMF)) WhatamIdoing ( talk) 04:32, 26 January 2024 (UTC)
Not done for now, with no prejudice against new request after a consensus is formed regarding the reliability of the source in question. —usernamekiran (talk) 23:57, 27 May 2023 (UTC)
The "participate" form over at WP:MolBio doesn't seem to work, and appears to have also stopped at other projects like WP:MED. When the form is filled in, it successfully creates a user subpage ( examples) but I'm guessing that User:Reports_bot is supposed to spot that and add it to the members list. Any ideas on how to get it up and running again? It seems as though the bot is also meant to keep the WikiProject Directory up to date too, so has a few functions that'd be useful to restart. T.Shafee(Evo&Evo) talk 12:33, 12 April 2023 (UTC)
[username]/WikiProjectCards/[wikiproject name], and copying those pagenames into [wikiproject name]/Members. Could a lightweight replacement bot be created to take over those functions? There are quite a few wikiprojects that it affects. T.Shafee(Evo&Evo) talk 03:10, 19 April 2023 (UTC)
WP:WPWX is also in need of a bot that can change colors within timelines on articles. This would only involve substituting in the new color parameters that were approved in a recent RfC. I can provide more specific details on this if anyone is able to do this. Noah Talk 19:36, 25 February 2023 (UTC)
Extended content
Colors =
  id:canvas value:gray(0.88)
  id:GP value:red
  id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression_=_≤38_mph_(≤62_km/h)
  id:TS value:rgb(0.3,1,1) legend:Tropical_Storm_=_39–73_mph_(63–117_km/h)
  id:C1 value:rgb(1,1,0.85) legend:Category_1_=_74–95_mph_(118–153_km/h)
  id:C2 value:rgb(1,0.85,0.55) legend:Category_2_=_96–110_mph_(154–177_km/h)
  id:C3 value:rgb(1,0.62,0.35) legend:Category_3_=_111–129_mph_(178–208_km/h)
  id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_130–156_mph_(209–251_km/h)
  id:C5 value:rgb(0.55,0.46,0.90) legend:Category_5_=_≥157_mph_(≥252_km/h)

For all season-related articles with timelines in the categories (may need to check subcategories): Category:Atlantic Ocean meteorological timelines, Category:Atlantic hurricane seasons, Category:Pacific hurricane seasons, and Category:Pacific hurricane meteorological timelines

Colors =
  id:canvas value:gray(0.88)
  id:GP value:red
  id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression_=_≤62_km/h_(≤39_mph)
  id:TS value:rgb(0.3,1,1) legend:Tropical_Storm_=_62–88_km/h_(39–54_mph)
  id:ST value:rgb(0.75,1,0.75) legend:Severe_Tropical_Storm_=_89–117_km/h_(55–72_mph)
  id:STY value:rgb(1,0.85,0.55) legend:Strong_Typhoon_=_118–156_km/h_(73–96_mph)
  id:VSTY value:rgb(1,0.45,0.54) legend:Very_Strong_Typhoon_=_157–193_km/h_(97–119_mph)
  id:VITY value:rgb(0.55,0.46,0.90) legend:Violent_Typhoon_=_≥194_km/h_(≥120_mph)

In addition, please change any values of TY within the actual part where storms are listed to STY, since typhoon is now strong typhoon. The legend name also changes from typhoon to strong typhoon. The VSTY and VITY need to be added into the timelines since those statuses are now being included (the VSTY and VITY are within the coding above). A caveat is that articles dated 1972 or earlier need to use the colors for the Atlantic/Eastern Pacific. For all season-related articles with timelines in the categories (may need to check subcategories): Category:Pacific typhoon seasons

Colors =
  id:canvas value:gray(0.88)
  id:GP value:red
  id:TD value:rgb(0,0.52,0.84) legend:Depression_(31–50_km/h)
  id:DD value:rgb(0.43,0.76,0.92) legend:Deep_Depression_(51–62_km/h)
  id:TS value:rgb(0.3,1,1) legend:Cyclonic_Storm_(63–88_km/h)
  id:ST value:rgb(0.75,1,0.75) legend:Severe_Cyclonic_Storm_(89–117_km/h)
  id:VS value:rgb(1,0.85,0.55) legend:Very_Severe_Cyclonic_Storm_(118–165_km/h)
  id:ES value:rgb(1,0.45,0.54) legend:Extremely_Severe_Cyclonic_Storm_(166–220_km/h)
  id:SU value:rgb(0.55,0.46,0.9) legend:Super_Cyclonic_Storm_(≥221_km/h)

For all season-related articles with timelines in the categories (may need to check subcategories): Category:North Indian Ocean cyclone seasons and Category:North Indian Ocean meteorological timelines

Colors =
  id:canvas value:gray(0.88)
  id:GP value:red
  id:ZD value:rgb(0,0.52,0.84) legend:Zone_of_Disturbed_Weather/Tropical_Disturbance_=_≤31_mph_(≤50_km/h)
  id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression/Subtropical_Depression_=_32–38_mph_(51–62_km/h)
  id:TS value:rgb(0.30,1,1) legend:Moderate_Tropical_Storm_=_39–54_mph_(63–88_km/h)
  id:ST value:rgb(0.75,1,0.75) legend:Severe_Tropical_Storm_=_55–73_mph_(89–118_km/h)
  id:TC value:rgb(1,0.85,0.55) legend:Tropical_Cyclone_=_74–103_mph_(119–166_km/h)
  id:IT value:rgb(1,0.45,0.54) legend:Intense_Tropical_Cyclone_=_104–133_mph_(167–214_km/h)
  id:VI value:rgb(0.55,0.46,0.9) legend:Very_Intense_Tropical_Cyclone_=_≥134_mph_(≥215_km/h)

For all season-related articles with timelines in the categories (may need to check subcategories): Category:Southwest Indian Ocean meteorological timelines and Category:South-West Indian Ocean cyclone seasons

Colors =
  id:canvas value:gray(0.88)
  id:GP value:red
  id:TL value:rgb(0.43,0.76,0.92) legend:Tropical_Low_=_<63_km/h_(<39_mph)
  id:C1 value:rgb(0.3,1,1) legend:Category_1_=_63–88_km/h_(39-55_mph)
  id:C2 value:rgb(0.75,1,0.75) legend:Category_2_=_89–117_km/h_(55-73_mph)
  id:C3 value:rgb(1,0.85,0.55) legend:Category_3_=_118–159_km/h_(73-99_mph)
  id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_160–199_km/h_(99-124_mph)
  id:C5 value:rgb(0.55,0.46,0.9) legend:Category_5_=_≥200_km/h_(≥124_mph)

For all season-related articles with timelines in the categories (may need to check subcategories): Category:Australian region cyclone seasons and Category:Australian region meteorological timelines

Colors =
  id:canvas value:gray(0.88)
  id:GP value:red
  id:TDi value:rgb(0,0.52,0.84) legend:Tropical_Disturbance
  id:TD value:rgb(0.43,0.76,0.92) legend:Tropical_Depression
  id:C1 value:rgb(0.3,1,1) legend:Category_1_=_63-87_km/h_(39-54_mph)
  id:C2 value:rgb(0.75,1,0.75) legend:Category_2_=_88-142_km/h_(55-74_mph)
  id:C3 value:rgb(1,0.85,0.55) legend:Category_3_=_143-158_km/h_(75-98_mph)
  id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_159–204_km/h_(99–127_mph)
  id:C5 value:rgb(0.55,0.46,0.9) legend:Category_5_=_≥205_km/h_(≥128_mph)

For all season-related articles with timelines in the categories (may need to check subcategories): Category:South Pacific cyclone seasons and Category:South Pacific meteorological timelines

Colors =
  id:canvas value:gray(0.88)
  id:C0 value:rgb(0.3,1,1) legend:Category_0_&_N/A_<_1.0_RSI
  id:C1 value:rgb(1,1,0.85) legend:Category_1_=_1.0–3.0_RSI
  id:C2 value:rgb(1,0.85,0.55) legend:Category_2_=_3.0–6.0_RSI
  id:C3 value:rgb(1,0.62,0.35) legend:Category_3_=_6.0–10.0_RSI
  id:C4 value:rgb(1,0.45,0.54) legend:Category_4_=_10.0–18.0_RSI
  id:C5 value:rgb(0.55,0.46,0.9) legend:Category_5_≥_18.0_RSI

Category: Category:North American winters
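Swapping the approved Colors blocks into each timeline could be done mechanically. A rough sketch, assuming the Colors block ends at the next top-level directive (id: lines are indented); real timeline sources would need checking against this assumption:

```python
import re

# Replace the whole Colors block of an EasyTimeline source with a new
# definition. The end-of-block heuristic is the next line that starts
# with a word character (a top-level "Key = " directive).
def swap_colors(timeline_src: str, new_colors: str) -> str:
    return re.sub(r'(?ms)^Colors\s*=.*?(?=^\w)',
                  new_colors.rstrip() + '\n', timeline_src, count=1)
```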
Foo
Bar
Foobar
taken from archive, got no responses last time
Here are some possible tasks for a bot:
137a ( talk • edits) 13:42, 14 April 2023 (UTC)
Guess I should ping @ JamesR: since he operates AIV helper bot? 137a ( talk • edits) 19:30, 21 April 2023 (UTC)
I'm not sure if this is the right place to propose something like this, but here it is:
There are 1,208 pages (according to this search) that use one exact citation to this page. The citation is fairly simple, just this: {{cite web}}: CS1 maint: url-status (link).
However, it gives neither authors nor an archived copy of the webpage. This website itself, part of a project from Carleton University, gives a more detailed suggested citation (under the header "To cite this page"). This includes authors and other broader information. Assuming it is acceptable to use a page's suggested citation, I suggest replacing these bare-bones web citations with the fuller references, along with a link to an archived copy of the webpage (there is one at archive.today [3]), as has been done at Qaba Sorkh (reference number three). Since there are roughly 1200 articles with the bare-bones citation, this would be nigh-on impossible to do manually. Could a bot replace these simple citations with the fuller ones, such as can be found at Qaba Sorkh? The more detailed citation, modified from the webpage's suggested format, looks like this:
I propose mass-expanding these references to fuller version, using the information in the webpage's suggested style of citation:
(Thanks to GoingBatty for helping with the technicalities here.)
Basically, it's a find-and-replace job with the goal of expanding citations.
Thank-you, Edward-Woodrow :) [talk] 19:43, 28 May 2023 (UTC)
|display-authors=etal
like this?:
Hello again. I was wondering if a bot could remove BLP tags for people who died three or more years ago, per the Category:Deaths by year category. I stumbled across the BLP tag at Johnny Burke (Canadian singer), who died in 2017.
With this search, I found over 1,300 articles tagged as BLPs, while skipping over people who died between 2021 and 2023. The reason why I am not requesting two years or less is because of WP:BDP. Thanks! MrLinkinPark333 ( talk) 19:03, 10 May 2023 (UTC)
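The eligibility check here reduces to reading the "<year> deaths" category. A sketch, with the three-year cutoff taken from the request above (WP:BDP's exact window is a policy question, not handled here):

```python
import re

# A page is a candidate for BLP-tag removal if one of its categories is
# "<year> deaths" and that year is at least three years in the past.
DEATHS = re.compile(r'Category:(\d{4}) deaths')

def blp_tag_removable(categories, current_year):
    for cat in categories:
        m = DEATHS.search(cat)
        if m and current_year - int(m.group(1)) >= 3:
            return True
    return False
```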
Please, fix these links. There are 1,648 of them; I cannot do it manually. Thanks! — Zalán Hári ( talk) 08:49, 30 April 2023 (UTC)
I'll start going over these with AWB. When I'm done I'll post here and ping @ Hári Zalán: Dr vulpes ( 💬 • 📝) 21:32, 12 June 2023 (UTC)
Hello again. I'm looking for a bot to help clear out Category:Redundant infobox title param. Per the category description, the |title parameter can be removed from the infobox if it matches the article name. A lot of these are populated by Infobox comics creator and Infobox comic book title. See for example my removals of the |title parameter at Quino and Captain America (vol. 5). Currently, there are 2,435 of them to go through. Any help would be appreciated. Thanks! MrLinkinPark333 ( talk) 19:03, 4 May 2023 (UTC)
Wikipedia:Reliable sources/Noticeboard/Archive 405#fmg.ac (Foundation for Medieval Genealogy) reached consensus that fmg.ac/Projects/MedLands is a completely unreliable source, has been repeatedly recognised as such, had to be phased out, and now we've agreed it should be completely purged. 576 references to fmg.ac/Projects/MedLands had been made through Template:Medieval Lands by Charles Cawley, which has just been deleted at Wikipedia:Templates for discussion/Log/2023 May 25#Template:Medieval Lands by Charles Cawley. User:Frietjes was so kind as to let me know that I could check at External links search just how many references we've still got to fmg.ac/Projects/MedLands throughout English Wikipedia. It turns out that number is 1,326, way too many for anyone to go and delete manually. Not all links to the domain fmg.ac need to be purged, just those under the subdomain fmg.ac/Projects/MedLands. Nor should links outside of the mainspace be removed, such as the Talk:, Wikipedia:, User: etc. spaces. Is it possible for a bot to carry this out? Thanks in advance! Nederlandse Leeuw ( talk) 19:20, 2 June 2023 (UTC)
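A minimal sketch of the text-level purge. It only handles refs and bracketed links whose body contains the MedLands path; named refs reused elsewhere, prose left dangling after removal, and the mainspace-only restriction would all need handling in the actual bot:

```python
import re

# Drop <ref>...</ref> elements whose body links to fmg.ac/Projects/MedLands,
# then any remaining bracketed external links to it. Deliberately naive:
# ref bodies containing "<" (e.g. nested tags) are not matched.
MEDLANDS_REF = re.compile(
    r'<ref[^>/]*>[^<]*fmg\.ac/Projects/MedLands[^<]*</ref>', re.IGNORECASE)
MEDLANDS_LINK = re.compile(
    r'\[https?://(?:www\.)?fmg\.ac/Projects/MedLands[^\]]*\]', re.IGNORECASE)

def purge_medlands(wikitext: str) -> str:
    return MEDLANDS_LINK.sub('', MEDLANDS_REF.sub('', wikitext))
```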
I notice a lot of talk pages without talk headers and was wondering if a bot can be created to do such a task? Lightoil ( talk) 03:15, 7 July 2023 (UTC)
"This template does not need to be placed on every talk page, and should not be indiscriminately added to talk pages using automated editing tools." * Pppery * it has begun... 03:23, 7 July 2023 (UTC)
As per Wikipedia:Village pump (technical)/Archive 206#Categorisation error due to error in user warning template, an error in Template:Uw-username caused the following code to appear:
{{#ifeq:{{NAMESPACENUMBER}}|3|{{#ifeq:{{ROOTPAGENAME}}|{{ROOTPAGENAME:}}[[Category:Pages which use a template in place of a magic word|S{{PAGENAME}}]]|[[Category:Wikipedia usernames with possible policy issues|{{PAGENAME}}]]}}}}
The actual markup that should have appeared is:
{{#ifeq:{{NAMESPACENUMBER}}|3|{{#ifeq:{{ROOTPAGENAME}}|<username at the time of subst>|[[Category:Wikipedia usernames with possible policy issues|{{PAGENAME}}]]}}}}
Can someone run a bot that checks the username of the user talk page at the time of substing this template, and fix the markup accordingly? — CX Zoom[he/him] ( let's talk • { C• X}) 13:22, 6 July 2023 (UTC)
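Since the broken chunk is a fixed string, the per-page repair could be a literal replacement once the username (the talk page's root name at subst time) is known. A sketch; feeding in the correct username per page is the part a real bot has to get right:

```python
# The broken markup left by the faulty template, verbatim.
BROKEN = ('{{#ifeq:{{NAMESPACENUMBER}}|3|{{#ifeq:{{ROOTPAGENAME}}|{{ROOTPAGENAME:}}'
          '[[Category:Pages which use a template in place of a magic word|S{{PAGENAME}}]]'
          '|[[Category:Wikipedia usernames with possible policy issues|{{PAGENAME}}]]}}}}')

def fix_uw_username(wikitext: str, username: str) -> str:
    # Rebuild the intended markup with the username spliced in where
    # {{ROOTPAGENAME:}} failed to expand.
    fixed = ('{{#ifeq:{{NAMESPACENUMBER}}|3|{{#ifeq:{{ROOTPAGENAME}}|' + username +
             '|[[Category:Wikipedia usernames with possible policy issues|{{PAGENAME}}]]}}}}')
    return wikitext.replace(BROKEN, fixed)
```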
{{ User:ClueBot III/ArchiveNow}}
Hello, following the outcome of this RfC from last year, it was decided that men's footballer categories should be created to match the women's categories which had already existed. Therefore, I was wondering if a bot would be able to help create the relevant men's categories necessary, and then to move the appropriate pages to the new categories. I have finished cleaning up/completing the category trees for women, so now only men's footballers directly populate the relevant categories. There are five category trees needing to be adjusted, with each containing a type of footballer category by country/nationality (e.g. Category:Moroccan footballers). Each of these categories should become a container, with the articles in each category then diffused to a men's subcategory (e.g. Category:Moroccan footballers becomes a container, with the articles moved to Category:Moroccan men's footballers). The five category trees are:
The first two category trees will not require any category changes to articles, as the category for each country acts as a container.
Given the large number of articles involved, any help with creating a bot would be really appreciated. Thanks, S.A. Julio ( talk) 17:25, 22 April 2023 (UTC)
{{ User:ClueBot III/ArchiveNow}} Could anyone help with a bot task to merge the duplicate WikiProject banners found on pages listed at Category:Pages using WikiProject banner shell with duplicate banner templates? Gonnym ( talk) 12:39, 13 June 2023 (UTC)
{{ User:ClueBot III/ArchiveNow}} I'd like to request a bot that generates a report somewhere of NPPs and admins who Page Curation "mark as reviewed" (type=pagetriage-curation&subtype=review) 10 or more mainspace non-redirect articles in 2 minutes. This is way too fast to be proper NPP patrolling and these folks will need further scrutiny. Thank you. – Novem Linguae ( talk) 22:29, 20 July 2023 (UTC)
{{ User:ClueBot III/ArchiveNow}}
This is my first ever such request, so apologies in advance if this is the wrong venue.
What I'm looking for basically is if an article is already tagged with WikiProject Biography and WikiProject Physics, to then add the bio=yes parameter to the Physics project banner if it's not already present, and s&a-work-group=yes to the Biography project banner if not already present.
Just the batch of existing ones if need be (so may be better suited for AWB?), but ideally something on-going, though doesn't need running that often.
Given values could be bio=yes or bio=y, with detection using categories. More specifically, just regular articles, not list-class, or categories themselves, or templates, or files, etc. Something like https://petscan.wmflabs.org/?psid=25332108. Happy to include drafts. Plus something to deal with s&a-work-group=yes.
For transparency, the initial discussion was at Wikipedia_talk:WikiProject_Physics#Article_classification_thought. I do not have the ability to code this myself.
Thank you. - Kj cheetham ( talk) 15:10, 23 July 2023 (UTC)
{{ User:ClueBot III/ArchiveNow}} Hello.
Could somone please create a bot editing templates still using the old NFL style/color templates the following way:
{{NFLPrimaryColor|
=> {{Gridiron primary color|
{{NFLPrimaryColorRaw|
=> {{Gridiron primary color raw|
{{NFLPrimaryStyle|
=> {{Gridiron primary style|
{{NFLAltPrimaryColor|
=> {{Gridiron alt primary color|
{{NFLAltPrimaryStyle|
=> {{Gridiron alt primary style|
{{NFLAltSecondaryColor|
=> {{Gridiron alt secondary color|
{{NFLSecondaryColor|
=> {{Gridiron secondary color|
{{NFLSecondaryColorRaw|
=> {{Gridiron secondary color raw|
{{NFLTertiaryColorRaw|
=> {{Gridiron tertiary color raw|
There are thousands of templates (mostly navboxes) that use these template redirects.
Thank you. HandsomeFella ( talk) 20:54, 24 July 2023 (UTC)
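Since every pattern in the list above ends at the template's first pipe, the renames are plain textual substitutions with no prefix ambiguity (e.g. {{NFLPrimaryColor| cannot match inside {{NFLPrimaryColorRaw|). A sketch of the per-page edit:

```python
# Mapping taken directly from the rename list in the request above.
RENAMES = {
    '{{NFLPrimaryColor|': '{{Gridiron primary color|',
    '{{NFLPrimaryColorRaw|': '{{Gridiron primary color raw|',
    '{{NFLPrimaryStyle|': '{{Gridiron primary style|',
    '{{NFLAltPrimaryColor|': '{{Gridiron alt primary color|',
    '{{NFLAltPrimaryStyle|': '{{Gridiron alt primary style|',
    '{{NFLAltSecondaryColor|': '{{Gridiron alt secondary color|',
    '{{NFLSecondaryColor|': '{{Gridiron secondary color|',
    '{{NFLSecondaryColorRaw|': '{{Gridiron secondary color raw|',
    '{{NFLTertiaryColorRaw|': '{{Gridiron tertiary color raw|',
}

def rename_nfl_templates(wikitext: str) -> str:
    for old, new in RENAMES.items():
        wikitext = wikitext.replace(old, new)
    return wikitext
```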
Hi, I noticed a little inconsistency with railway station articles in Australia and New Zealand.
Context: The typical format for the name of train stations in Australia and New Zealand is Example railway station, otherwise if that name is used more than once around the world, then Example railway station, Administrative Division is used. The administrative divisions used are the full names of the states/territories of Australia, e.g. “Western Australia”, and the country of “New Zealand”. This is typically used for train stations that are in regional areas outside of a major city. However, for metropolitan train stations within the bounds of major cities such as Sydney and Melbourne, then it should be Example railway station, City.
Context ctd: Thus, many articles have redirects with the disambiguators, e.g. the article Example railway station might have redirects from “, New South Wales” and “, Sydney”. Or Example railway station, City might have a redirect from Example railway station, State. Some real examples of articles include Panania railway station, Epping railway station, Sydney and Gloucester railway station, New South Wales. Often, these redirects exist because of page moves, someone manually created it and/or someone created an erroneous redlink to a station when it was incorrectly disambiguated, and a redirect was created. But, other times, some redirects exist, some don’t.
Request: Are there any bots (or possibly one to be created) that could parse the names of a station’s article (or potentially a list of names) that would be manually inputted by the user, then the bot would check what format it is in (e.g. no disambiguator, city disambiguator, administrative division disambiguator, or other), then create redirects as necessary for the other disambiguators?
Request ctd: It should ideally process names in alphabetical order and then sorted by the administrative region it is in, while timestamping/logging each action into a readable list. It should also list the main article’s name, and redirects that already exist. Note that if the railway station is in an “other format” such as Newcastle Interchange, then the bot should just ignore it. It should also check the redirects that already do exist, and add/change the class of the article to “redirect” for the relevant WikiProjects. Possibly, the user could input the city and/or administrative region that the station(s) are in into the bot’s interface.
I am aware of the WP:MASSCREATION policy, but I think because these are redirects and not articles, it should be OK. It'd also help out those who like to search for station articles but instead add an unnecessary disambiguator, and those who place redlinks.
Would love to hear your thoughts guys, and thanks for making it this far :) Fork99 ( talk) 08:28, 9 June 2023 (UTC)
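The variant-generation step Fork99 describes could look like this. It is only a sketch: which title is the article and which are redirects depends on the disambiguation actually in use, and the "other format" test here is a crude suffix check:

```python
# Given a station's base name plus its city and administrative division,
# list the titles that should all resolve to the same article. Names not
# in "X railway station" format (e.g. Newcastle Interchange) are skipped.
def station_title_variants(base, city, division):
    if not base.endswith('railway station'):
        return []
    return [base, f'{base}, {city}', f'{base}, {division}']
```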
I am requesting that a bot start adding the Top 25 report template to the talk pages of articles appearing in Wikipedia:Top 25 Report. It is an achievement for articles to get enough views to appear on this list, and a template exists. BabbaQ ( talk) 08:52, 27 May 2023 (UTC)
First time I've done this, so hopefully this makes sense.
I would like to request the assistance of a bot in identifying large stub articles within various WikiProjects. The purpose of this task is to highlight articles within the WikiProjects that have a significant character count and may require further attention or improvement.
If it would help to show the logic, I have developed a program that can analyse the character count of articles in designated stub categories associated with each WikiProject. The program uses an algorithm to scan the articles' content and generate a report listing the articles that exceed a certain character limit (e.g., 10,000 characters). The report includes the article's title, the character count, and a link to the article page. Tell me where to show the code and I could do this.
To facilitate the testing and implementation of this program, I propose that the bot initially runs in a user sandbox or designated user space. This will allow for easy monitoring and review of the generated reports by the WikiProject members.
A significant number of stub categories in the WikiProjects have large articles. The program aims to identify such articles with large character counts which may suggest misclassification.
Please let me know if you have any queries. JASpencer ( talk) 08:03, 13 May 2023 (UTC)
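The core of the report JASpencer describes reduces to a filter over (title, size) pairs; everything else is fetching the stub category members from the API. A sketch, with the 10,000-character cutoff taken from the example limit in the request:

```python
# pages: iterable of (title, character_count) pairs for a stub category.
# Returns the over-limit articles, sorted by title, for the report.
def oversized_stubs(pages, limit=10000):
    return sorted((t, n) for t, n in pages if n > limit)
```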
(I created something, but rather naively didn't realise that Wikipedia would rightly block it)- what do you mean? — Qwerfjkl talk 11:52, 17 May 2023 (UTC)
"There are a quite a few stubs which have lots of text but it doesn't cover the subject adequately"That makes it a Start-class article, though I suspect you might be able to offer some exceptions. Cheers, Nick Moyes ( talk) 14:03, 10 July 2023 (UTC)
There are various lists of Wikipedians by edit count and related stats. One that I'd be interested to see would be Wikipedia:List of Wikipedians by non-automated edit count. I think the presence of such a list might be a small help re Editcountitis, as it'd recognize/incentivize editors whose edit count has not been juiced through tons of (typically low-value) automated edits.
Would anyone be interested in coding a bot to populate and maintain such a page? Courtesy pinging Legoktm and 0xDeadbeef, who run the bot that updates the overall edit count list, in case either of you might be up for it. {{u| Sdkb}} talk 20:48, 19 July 2023 (UTC)
Sounds like we need three lists: total, manual and (semi-)automated. Or possibly two: manual and (semi-)automated. The negative/subtraction name "by non-automated edit count" rather should be a positive name "by manual edit count". I'm not convinced automated edits are typically low-value, anyway, manual edits often have the same characteristics. Plus it's so hard to tell since automation can take many forms that are impossible to track. -- Green C 03:58, 20 July 2023 (UTC)
The Cricket Archive (https://cricketarchive.com) have introduced subscription access to their website. We have about 14,600 articles referencing the site. Could a bot add |url-access=subscription to those that do not have it? Keith D (talk) 22:09, 25 July 2023 (UTC)
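The edit itself is mechanical; a hedged sketch of the per-template transformation (it assumes a standard {{cite ...}} template with no nested templates, which an AWB or bot run would apply to each matching citation on a page):

```python
import re

def add_url_access(template: str) -> str:
    """Append |url-access=subscription to a {{cite ...}} template when it
    cites cricketarchive.com and does not already carry the parameter."""
    if "cricketarchive.com" not in template:
        return template
    if re.search(r"\|\s*url-access\s*=", template):
        return template  # already flagged; nothing to do
    # Drop the closing braces, append the parameter, close again.
    return template[:-2].rstrip() + " |url-access=subscription}}"
```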
Adding |url-access= is a good idea where possible; anyone can do this, including with AWB. If the cite has an archive URL with |url-status=dead (or no |url-status=, same thing), maybe the |url-access= is redundant. -- GreenC 04:03, 26 July 2023 (UTC)

Hey folks, I was perusing Category:ATP Tour navigational boxes and realised that a) most of these templates are out of date, and b) this would be a good task for a bot. The ATP site gives this PDF, which contains all of the useful information that could be used for a module. Before I start writing the module, I guess I was wondering if anyone would be able to utilise this information in a way where it could be updated weekly as the stats update. The way I see it:
From there the module can give ranks and any changes from week to week for use in the various templates. If this seems workable, let me know. Thanks. Primefac ( talk) 16:07, 22 July 2023 (UTC)
Deferred
Hopefully everyone knows there is no national flag of "Korea"; there is a DPRK (North Korea) and a South Korea. Unfortunately, several beauty pageant fans have added the nonexistent "Korea" flag to over 80 articles as indicated in the search results above. It should be pretty simple to have a bot rename it to South Korea by removing the name=Korea parameter from the flag template. ☆ Bri (talk) 22:01, 9 August 2023 (UTC)
The archive date doesn't match on many of User:AShiv1212's articles such as Ahimsa (2023 film). DareshMohan ( talk) 07:31, 13 August 2023 (UTC)
One issue is the mismatched |archive-url=. Another is this, which requires maintaining the same date format (iso, mdy, etc). There are probably other things like that. So the first run will be about discovering what the bot needs to do to fix the errors. -- GreenC 16:44, 13 August 2023 (UTC)

For example, in |archive-url=https://web.archive.org/web/20201028113503/https://www.tvguide.com/news/drag-48227/ |archive-date=October 28, 2022 .. notice the date in the URL is 20201028 and the |archive-date=October 28, 2022 don't match. It should be |archive-date=October 28, 2020 -- GreenC 05:39, 14 August 2023 (UTC)
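The mismatch described here can be detected mechanically: the first eight digits of a Wayback Machine timestamp encode the snapshot date, which can be compared against the stated |archive-date=. A sketch (the set of accepted date formats is an assumption):

```python
import re
from datetime import datetime

def archive_date_mismatch(archive_url: str, archive_date: str):
    """Return the ISO date implied by the URL when it disagrees with the
    stated |archive-date=, else None (no problem found, or nothing checkable)."""
    m = re.search(r"web\.archive\.org/web/(\d{8})", archive_url)
    if not m:
        return None  # not a Wayback URL; nothing to check
    url_date = datetime.strptime(m.group(1), "%Y%m%d").date()
    # The stated date may appear in several formats (mdy, dmy, ISO); try each.
    for fmt in ("%B %d, %Y", "%d %B %Y", "%Y-%m-%d"):
        try:
            stated = datetime.strptime(archive_date.strip(), fmt).date()
            break
        except ValueError:
            continue
    else:
        return None  # unparseable date: better flagged for manual review
    return None if stated == url_date else url_date.isoformat()
```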
Judging by the |website= parameters, my guess is that they were filling out the templates manually, and setting the |archive-date= equal to the |access-date= for some reason. None of their edit summaries contain links to userscripts. Have you tried asking the user directly? Folly Mox (talk) 13:26, 14 August 2023 (UTC)
Hi - I want to re-raise this item, since we have users making incorrect assumptions about wikiprojects that use the bot. For example, this conversation occurred because the user created a WikiProjectCards page but it wasn't added by the bot to this page, so they assumed the wikiproject wasn't accepting new members. There are dozens of other users that have attempted to sign up to the wikiproject but not appeared on the participant list. I suspect similar scenarios are happening for the other wikiprojects that use the bot ( WP:MED, WP:WPWIR etc). (ping @ Pppery, Harej, and MZMcBride: I realise you're busy but I wanted to make sure you saw this). T.Shafee(Evo&Evo) talk 02:55, 17 August 2023 (UTC)
Hello!
I am a new editor who noticed a large number of uses of US English on British English-tagged pages and I would like to see what can be done with a bot. The first thing I fixated on was use of "percent" over "per cent", but soon noticed others such as "program" over "programme". Is there a way of programming a bot to reform these and similar spellings? Of course it would have to take into account proper names of things and direct quotations. I think it might be advantageous to programme the bot to do a general sweep of the tag and nip any future issues in the bud so words such as "colour" (US: "color"), "enrolment" (US: "enrollment"), "travelled" (US: "traveled"), "hippy" (US: "hippie"), "aluminium" (US: "aluminum"), "gaol" (US: "jail"), "rouble" (US: "ruble"), "aeroplane" (US: "airplane"), "sulphur" (US: "sulfur"), and so on are reformed also. It should not touch words containing -ize/-ise, because both are acceptable in BrE (indeed I saw a sign at my local GP surgery recently that read "sanitize your hands"). Stolitz ( talk) 01:51, 16 August 2023 (UTC)
Manually-assisted bots are acceptable, so long as they include international spell checking (not only country-specific spell checking) and the operator does in fact examine every proposed edit before allowing the bot to make it. WP:AWB or WP:JWB may be useful here, though their users need pre-approval (to deter mass drive-by vandalism) and we normally expect longer experience before granting it. Certes ( talk) 09:17, 16 August 2023 (UTC)
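For illustration, a very cautious version of such a respelling pass might skip anything inside double quotes entirely, so direct quotations are never altered. The word list here is a tiny sample from the request above; proper nouns and capitalised forms would still need human review of every proposed edit, as noted.

```python
import re

# Sample respellings from the request; a real run would use a vetted list.
RESPELL = {"percent": "per cent", "color": "colour", "airplane": "aeroplane"}

def respell_bre(text: str) -> str:
    # Split out double-quoted spans so direct quotations are never altered.
    parts = re.split(r'("[^"]*")', text)
    for i, part in enumerate(parts):
        if part.startswith('"'):
            continue  # inside a quotation: leave untouched
        for us, br in RESPELL.items():
            part = re.sub(rf"\b{us}\b", br, part)
        parts[i] = part
    return "".join(parts)
```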
Standard WikiProject tagging request here. I have been slowly working on Category:Green Bay Packers articles needing infoboxes, bringing the total down from about 100 articles to 24 as of this writing. I have been averaging about 3 or 4 a day, so I expect to clear the category in about a week. However, it has been a really long time since I had a bot run through the category tree and tag the relevant article talk pages (see here for the last request). So, for the request, could I have a bot owner run through the following categories and sub-categories, check if an infobox exists on the page, and if not, tag {{WikiProject Green Bay Packers}} with needs-infobox=yes:
Thank you for any help you can provide. « Gonzo fan2007 (talk) @ 21:54, 14 September 2023 (UTC)
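A sketch of the two per-page checks such a run would need: whether the article already has an infobox, and how to set the banner parameter on the talk page. The infobox regex is deliberately rough, the banner name is taken from the request, and walking the category tree would be done with a framework such as Pywikibot:

```python
import re

# Rough presence test for an infobox in article wikitext.
INFOBOX_RE = re.compile(r"\{\{\s*[Ii]nfobox[ _]")

def needs_infobox(article_wikitext: str) -> bool:
    return INFOBOX_RE.search(article_wikitext) is None

def tag_banner(talk_wikitext: str) -> str:
    """Add |needs-infobox=yes to the project banner unless already present."""
    return re.sub(
        r"(\{\{\s*WikiProject Green Bay Packers)(?![^}]*needs-infobox)",
        r"\1|needs-infobox=yes",
        talk_wikitext,
    )
```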
Hello, I'd like to request a bot add the template {{ WikiProject Artsakh}} to the talk pages of all articles in this category tree:
I'm not sure how many of them already have the template, but combing through all of them & adding it individually would be extremely tedious. Thanks! Sawyer-mcdonell ( talk) 22:16, 30 September 2023 (UTC)
The IUBMB_EC_number parameter was removed from Template:Infobox enzyme in February 2013 by Boghog (the value is now automatically calculated from another field). At this point there are over 4800 entries in Category:Pages using infobox enzyme with unknown parameters, and all but a dozen or so are the IUBMB_EC_number parameter. I've done parameter removal using AWB before and know that the regex can get tricky on some, but in this case I don't think it would be that bad; the values should be of the form A/B/C/D where A, B, C and D are all numbers. Naraht ( talk) 15:28, 24 August 2023 (UTC)
I have been removing |IUBMB_EC_number= from Template:Infobox enzyme as I find them. I would be in favor of a bot removing all occurrences. This parameter was necessary before Lua support was available in Wikipedia templates. With Lua, this parameter can automatically be calculated from the mandatory {{EC_number}} parameter. Boghog (talk) 17:56, 24 August 2023 (UTC)
I wrote a separate article about East Khandesh (EK); we no longer need EK to redirect to Jalgaon district, so I request deletion of all such redirects. Tesla car owner ( talk) 11:57, 4 September 2023 (UTC)
Wikipedia:WikiProject User warnings says:
"To help centralise discussions and keep related topics together, all uw-* template talk pages and WikiProject User warnings project talk pages redirect to [Wikipedia talk:Template index/User talk namespace]"
However, many talk pages still do not do this. It would be quite tedious to go through every single template, visit its talk page, check whether it redirects there, and create it if not.
Could anybody create a bot that checks whether a uw- template's talk page redirects to Wikipedia talk:Template index/User talk namespace and, if it doesn't, creates it? Millows ( talk) 21:46, 20 July 2023 (UTC)
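The page text the bot would save, and the check for an already-correct talk page, are both simple; a sketch (the existence check and the actual saves would go through Pywikibot):

```python
TARGET = "Wikipedia talk:Template index/User talk namespace"

def redirect_text(target: str = TARGET) -> str:
    """Wikitext to save when the uw- template's talk page is missing."""
    return f"#REDIRECT [[{target}]]"

def is_correct_redirect(talk_wikitext: str, target: str = TARGET) -> bool:
    """True when the page already redirects to the central talk page."""
    lines = talk_wikitext.strip().splitlines()
    first = lines[0] if lines else ""
    return first.upper().startswith("#REDIRECT") and target in first
```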
Template:Gutenberg author takes an |id= that is either a name, e.g. |id=Quick,+Herbert, or a number, e.g. |id=4251. When a name, it goes to a search page like this, then you click through to the author page. When a number, it goes directly to the author page. Ideally one would use the number; it's more accurate and direct. Most cases on Enwiki use the name and would benefit from changing, e.g. Special:Diff/1169939608/1170077601. A bot could scrape the search page and change the ID to the number. It's only safe when there is a single name in the search result page. The template is in about 11k pages. Potentially one could add support for Wikidata, which might have it, but they didn't always correctly match the authors on Wikidata with the authors on Gutenberg and Wikipedia. Plus the problem of unwatched vandalism. This is a relatively easy and low-controversy bot or AWB project. -- GreenC 02:26, 13 August 2023 (UTC)
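The scraping step might look like this. The href format (`/ebooks/author/<number>`) is an assumption about Project Gutenberg's search-results markup, and the function deliberately returns nothing unless exactly one distinct author ID appears on the page, per the "only safe when there is a single name" rule:

```python
import re

def safe_author_id(search_html: str):
    """Return the author ID only when exactly one distinct ID is present;
    None means the result page is ambiguous or empty, so skip it."""
    ids = set(re.findall(r'href="/ebooks/author/(\d+)"', search_html))
    return ids.pop() if len(ids) == 1 else None
```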
I'd like to request a bot that will fix the redirect targets of the redirects listed at User:Gonnym/sandbox/database report with the precise anchor target. A target page that uses the standard Template:Episode list (or Module:Episode list directly) will have anchors of the form "#ep<number>", where the number is the |EpisodeNumber= value (of the mentioned template or module call). So, for example, for "His Maker's Name" the link is [[The Zeta Project#ep2]]. Note that some are targets to episode articles (alternative names, spelling, disambiguation, etc.) and do not need to be changed, so the bot should just skip these. Gonnym (talk) 15:41, 21 August 2023 (UTC)
The bot would check that |Title= matches the redirect's title, and then it gets the value of |EpisodeNumber=. So it would just skip cases like Behold... The Inhumans because the target doesn't use {{episode list}}. (Likewise with The Real Deal (Agents of S.H.I.E.L.D. episode).)

Would the |Title= check ignore disambiguation? That is, would A Bolt From the Blue (Lois & Clark) and A Bolt From the Blue (Lois & Clark episode) both be fixed? Gonnym (talk) 11:44, 10 September 2023 (UTC)
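A sketch of the matching step discussed in this thread: find the {{Episode list}} row whose |Title= matches the redirect (with any parenthetical disambiguator stripped) and build the "ep<number>" anchor from its |EpisodeNumber=. The row-matching regex is simplified and would need hardening for real wikitext:

```python
import re

def episode_anchor(page_wikitext: str, redirect_title: str):
    """Return 'ep<N>' for the matching episode row, or None to skip."""
    base = re.sub(r"\s*\([^)]*\)$", "", redirect_title)  # strip disambiguator
    row_re = re.compile(r"\{\{\s*Episode list(?:/sublist)?\b.*?\n\s*\}\}",
                        re.DOTALL | re.IGNORECASE)
    param_re = re.compile(r"\|\s*(\w+)\s*=\s*([^|\n]*)")
    for row in row_re.finditer(page_wikitext):
        params = {k: v.strip() for k, v in param_re.findall(row.group(0))}
        if params.get("Title") == base:
            num = params.get("EpisodeNumber", "")
            if num:
                return f"ep{num}"
    return None  # no matching row: leave this redirect alone
```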
Fix references. Java. I want to use a bot on this wiki. Aesthetic of me ( talk) 16:54, 2 September 2023 (UTC)
Hi, I don't know if there's already a bot that can handle this, or if it's something I would need to request a new bot be written for, but the situation is that it was announced today that the entertainment news series Entertainment Tonight Canada will be cancelled in a couple of weeks, and apparently the website is disappearing with it — but as a person who edits principally in the film, television and music areas, it's obviously a source I've cited a lot (and I mean a lot a lot) in the past several years, so the links are going to need to be archived for salvage purposes.
It's obviously not a task I want to grind through all by myself if I don't absolutely have to, so I wanted to ask if there's a bot that can check for all Wikipedia articles that feature links to the https://etcanada.com/ domain, and ensure that there's an archived copy added to the citation if there isn't already one present yet? Thanks. Bearcat ( talk) 19:36, 27 September 2023 (UTC)
When {{Coord}} templates are at the top of an article, they always break the page previews, making them display with no text. I'd like to request a bot that moves them to the bottom of the first paragraph to remedy this. (Don't know why this happens, but it does get fixed when you move it.) LOOKSQUARE ( 👤️· 🗨️) talk 01:11, 3 September 2023 (UTC)
Links to recent discussions:
Before acting on the above, please check these discussions. – Jonesey95 ( talk) 04:50, 3 September 2023 (UTC)
Currently, User:FireflyBot posts notifications to the talk pages of page creators of AfC Drafts that have not been edited for five months, to warn them of impending deletion ( BRFA). However, this approach of notifying only the page creator is suboptimal, as it does nothing to attract the attention of other editors who may be interested in rescuing the draft. In this discussion, extending the bot task to also post notifications to the draft talk page itself was suggested as a solution. I have asked Firefly about this, but he's been quite busy and probably won't be available to work on it any time soon. He did say, "if someone else is willing to write it I can run it as part of the same task." So I'm asking to see if someone might be. -- Paul_012 ( talk) 20:12, 3 September 2023 (UTC)
I recently discovered that all major WMF announcements go to WP:VPM, because that's the page listed at m:Distribution list/Global message delivery and a few other global MassMessage lists.
I'd like to receive all MassMessages that go to VPM on my talk page. I can't do that by subscribing to the source, because m:Distribution list/Global message delivery only contains noticeboards and announcement venues (not individual user talk pages).
I therefore would like a bot that, whenever User:MediaWiki message delivery posts a message to WP:VPM, the bot submits another Special:MassMessage request that copies whatever was put onto VPM and re-announces that onto a new MassMessage list. I have to imagine I'm not the only one who wants to see global announcements on my talk page.
This shouldn't be too hard to do, right? Best, KevinL (aka L235 · t · c) 21:31, 28 August 2023 (UTC)
A bot that can translate over 35 languages. It's pretty cool if you ask me. It would have a giant blue button that says "Chatter Box Mode" that helps translating. I am pretty good at drawing so I could even make a logo for the Bot. There could even be a chatter box squad or maybe CBS to help answer the questions people have about the Mrs Chatter bot. The bot's userpage could have like a sort of table with the word HELLO in different languages like Bonjour, Dumelang, Mmolo, Sao bona, Dumela, Elko, and Namaste and even many more DJ Aquah ( talk) 10:03, 14 December 2023 (UTC)
While we currently archive the WP:ITNC page to review past candidates for In The News, we don't have a similar function for the ITN items that are actually added to {{In The News}}. Is it possible, given a date range and a target page, for a bot to capture "significant" additions to the template along with the date added? So for example, the bot should be able to review this diff, and create a line on the target page with the date of the change, the editor that added it, and the text of the addition (here being " Tharman Shanmugaratnam is elected as the next president of Singapore.")
The end goal would then be to have the bot initially make monthly pages from the start of ITN, and then on a monthly basis create a new monthly archive.
I think the one constant is that all blurbs as well as RDs added start with a "*" mark, as to distinguish from minor typos or wording corrections or changes in the picture. However, I would rather the bot be overzealous and include false positives. Masem ( t) 13:14, 8 September 2023 (UTC)
So the archive would be in blurb, editor name, timestamp format, correct? courtesy ping: Qwerfjkl (I couldn't have solved the diff issue without their help). —usernamekiran (talk) 17:00, 10 September 2023 (UTC)
*[[Bill Richardson]] − added by <username> at <date of addition>, removed by Stephen at 00:12, September 8, 2023.
That's just an example; the wording can be changed. —usernamekiran (talk) 10:44, 11 September 2023 (UTC)
Some lines begin with |. They are from the "Main page image/ITN" template. In the current version, I have updated the program so that it will skip such lines. In this version, the bot excludes the lines beginning with |. I have also updated it so that lines beginning with *[[ are considered as recent deaths. Let me know what you think, and what other functionality you would like. —usernamekiran (talk) 13:14, 11 September 2023 (UTC)
Thanks so much for this! It would be nice to add a header for each day, which would help to add some separation. For example ==September 10== and ==September 9==. I also think the "added by [Username] on [date]" part could be set off in a smaller font or otherwise separated from the main blurb, but I'm not sure exactly how that would be best formatted, and even as it currently is I don't particularly mind it. The single most important thing for readability, though: would it be at all possible to detect at least some cases of modification as distinct from addition? I can suggest some approaches. If in the same edit, one non-RD line is removed and another non-RD line is added in the same line number position, it is generally a modification. This is because new blurbs are usually added to the top and old ones are usually removed from the bottom due to staleness, so if the same line is added and removed it's a good indication of an intentional swap. I understand that this may not be a 100% accurate check, but you could also combine it with a second check like making sure the bolded link target is the same. Other approaches like textual similarity metrics based on edit distance or NLP algorithms are probably overkill, but could be attempted. I still think that checking the placement within the document and the bolded link target is good enough to reduce most of the duplication. In the case that you do detect duplication, I would probably try to put it underneath the original version, and have them listed as a chronological progression of edits. Example. 98.170.164.88 ( talk) 01:45, 12 September 2023 (UTC)
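The modification-detection heuristic suggested above (same line position in the diff plus same bolded link target) could be sketched like this; how the removed/added lines and their positions are pulled out of the diff is a separate step:

```python
import re

BOLD_LINK = re.compile(r"'''\[\[([^\]|]+)")  # target of the bolded wikilink

def bold_target(blurb: str):
    m = BOLD_LINK.search(blurb)
    return m.group(1) if m else None

def is_modification(removed: str, added: str,
                    removed_pos: int, added_pos: int) -> bool:
    """True when one blurb was swapped for an edited version of itself."""
    if removed_pos != added_pos:
        return False  # new blurbs go on top, stale ones come off the bottom
    t_old, t_new = bold_target(removed), bold_target(added)
    return t_old is not None and t_old == t_new
```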
I'd like to harness the data from afltables.com to automatically update all AFL players at the conclusion of every round. Some of the players are years out of date, which is a shame. The stats box (goals and games) could also be updated. Electricmaster ( talk) 04:23, 12 October 2023 (UTC)
I would be interested to see a bot go through the |URL= parameter of all instances of {{Infobox newspaper}} and {{Infobox organization}} and create redirects, if they do not already exist, from the domain name to the article with the transclusion, tagged with {{R from domain name}}. This task could potentially be expanded to other infoboxes as well, but I think those two are a good place to start given that they'd help make linking to articles on sources in citations easier. Cheers, {{u|Sdkb}} talk 08:26, 12 October 2023 (UTC)
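The redirect-creation step might look like the following; the output format ({{R from domain name}} after the #REDIRECT) follows the request, while folding a leading www. into the bare domain is an assumption:

```python
from urllib.parse import urlparse

def domain_redirect(url: str, article: str):
    """Return (redirect_title, redirect_wikitext), or None if no domain found."""
    netloc = urlparse(url if "//" in url else "//" + url).netloc.lower()
    domain = netloc.removeprefix("www.")  # assumption: fold www. away
    if not domain:
        return None
    text = f"#REDIRECT [[{article}]]\n\n{{{{R from domain name}}}}"
    return domain, text
```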
Have you considered extracting the |url= parameter of those infoboxes to generate a list, and then proposing a bot that uses said list to make those edits (e.g. changing |website=nytimes.com to |website=The New York Times)? You could also provide the list to Ohconfucius, whose Fix Sources script makes some of those corrections. I'd also suggest asking the citation tool owners to adjust their tools, but some tools aren't maintained. GoingBatty (talk) 16:28, 12 October 2023 (UTC)
My script already converts |website=nytimes.com to |website=The New York Times, among others, but I would be happy to add more similar conversions that are not already included. Feel free to check here to see which journals are converted. -- Ohc revolution of our times 14:39, 20 October 2023 (UTC)
There are a few hundred cases with |last=Reuters (Reuters insource:/\{\{Cite news[^\}]+last *= *Reuters/i). Correction as in this example would fix the issue. Would anyone be willing to perform this task? Leyo 13:26, 9 November 2023 (UTC)
On most (about 80%) of the food and drink pages, italics are used improperly (I have spent many hours correcting errors of this kind). For example: a food is often put in italics several times on a page and left unitalicised elsewhere on the same page; on one page a food is italicised and on others the same food is not; and it almost always happens that following a wikilink from an italicised food leads to a page where that food is not in italics. I myself struggle to keep reading food and drink pages, and I don't want to imagine how much confusion is created in the mind of a reader. I would propose having a bot act by removing all italicised food and drink terms, or, even better, selecting every existing food, deciding whether to italicise it or not, and then, again through the bot, changing everything at the same time, which is impossible without a bot. I have done my best, but I will announce that I will never again spend time on this problem, as I am in an endless loop. I wonder what the point of italicising a food is if there is zero uniformity. JackkBrown ( talk) 07:07, 10 November 2023 (UTC)
Could a bot please change all instances of “power forward (basketball)” to “power forward” to correct the links to Power forward, which was recently moved? Thank you Rikster2 ( talk) 19:43, 22 November 2023 (UTC)
If there were wikilinks of the form [[power forward (basketball)]] without a pipe, it would be appropriate to change those. However, I could not find any in articlespace. GoingBatty (talk) 19:57, 22 November 2023 (UTC)
linksto:"Power forward" -basketball
finds wikilinks which might lead to the wrong article and need improving. (There's currently only one result, and it's a false positive.) Changing the links would raise the number of false positives to 50, making any actual errors harder to spot. The qualifier is acting a bit like "(disambiguation)" in an
INTDABLINK by marking these links as checked and correct.
Certes (
talk) 20:55, 22 November 2023 (UTC)
I would like to request a bot and I would like to name it, the Bad Guy Patrol (BGP) and it would help me restore order on Wikipedia. It will help with vandalism, blocking and cleanups. Harley Quinn on duty ( talk) 14:07, 6 December 2023 (UTC)
Hello Wiki world. I have been on Wikipedia for the past 8 months, making a few contributions. It would be so helpful for me if you could enable the bot on my account. Thank you!! EEverest 8848 ( talk) 15:47, 20 November 2023 (UTC)
This is unmanageable. I have tried to do what I can, but considering the infinity of Italian municipalities and provinces, I wonder if a bot could be asked to change all instances of "Province" (with a capital 'P') to "province" (with a lowercase 'p'). I have done what I can; more than that I cannot, as there are really too many corrections for one person. Examples: Province of Caserta; Comitini; Province of Trapani; Province of Udine. Thanks in advance. JackkBrown ( talk) 21:03, 4 November 2023 (UTC)
Sorry if I've missed a past discussion on this suggestion.
Manually setting up archiving bots on talk pages is time-consuming, and there are an enormous number of pages where it hasn't been set up, leading to clogged talk pages. It would be useful for a bot to add one to talk pages that don't have one, with notional parameters (30d, 2 minimum, etc.).
I think it would be important for editors to be able to exclude it, too, if consensus was that an archiving bot wasn't wanted for some reason. Cheers. Riposte97 ( talk) 05:56, 25 October 2023 (UTC)
I noticed that a very large number of uses of the template:lang-ku display Latin text incorrectly (non-italics, in Arabic font; see, for example, here in the lead). Since it would take a very long time for anyone to go through and add the parameter script=Latn to make the text display correctly, is there any way that a bot can complete this task? Thanks in advance. Revolution Saga ( talk) 22:00, 28 October 2023 (UTC)
Please see the Template_talk:UCI_team_code#Requested_move_30_October_2023 where there was rough consensus to usurp the ct shortcut. Essentially, replace the 12,000 transclusions to bypass the redirect so that Template:Ct can redirect to Template:Contentious topics. Awesome Aasim 23:08, 9 November 2023 (UTC)
A lot of pages have links to Wiktionary pages in the body text. This is fine, though I think the links are supposed to be like this (interwiki) and not this (external). Would it be possible to create a bot that turns these into the first example? LOOKSQUARE ( 👤️· 🗨️) talk 21:05, 16 November 2023 (UTC)
See Wikipedia:Reliable sources/Noticeboard#Washington Independent. But archiving those links is not quite straightforward. We should probably get rid of any link to the live domain (which is garbage) and we should only use archive.org snapshots that are older than, say, 2016. When there is no older snapshot, the link/reference should be removed entirely. — Alexis Jazz (talk or ping me) 01:34, 15 November 2023 (UTC)
Looking for a willing bot operator to implement WP:PIQA by migrating quality assessments from WikiProject banners into {{WikiProject banner shell}}. To be more precise:
* Where the banners agree, move the |class= parameter to {{WikiProject banner shell}} and remove from project banners, e.g. [6]
* Where no rating is present, add an empty |class= parameter to encourage editors to add a rating, e.g. [8]
* Some banners will disagree on the |class= parameter. These will be tracked and reviewed manually.
* If any banner has |living=yes or |blp=yes then add |blp=yes to {{WikiProject banner shell}}.
* If any banner has |listas= then move this to {{WikiProject banner shell}} and remove from project banners, e.g. [10]
Thanks in advance — Martin ( MSGJ · talk) 22:18, 15 November 2023 (UTC)
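The per-page decision among the migration rules above can be sketched as a small function (parsing the banners and performing the actual edits are separate steps; the rule numbering in the comments refers to the list in this request):

```python
def shell_class_action(banner_classes):
    """banner_classes: the |class= values found on a page's project banners."""
    values = {c.strip().lower() for c in banner_classes if c and c.strip()}
    if not values:
        return ("add-empty", None)       # no rating: add an empty |class=
    if len(values) == 1:
        return ("move", values.pop())    # agreement: consolidate into the shell
    return ("review", sorted(values))    # conflicting: track and review manually
```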
it'd be fine to boldly consolidate duplicate AFC banners on article talk pages. There are two possibilities - actual duplicated banners (in which case one should be removed) and two different banners, which should be merged, to best save the review history. Primefac ( talk) 13:48, 16 November 2023 (UTC)
Not sure if this is the right place to ask this, but I noticed that when User:MalnadachBot was procedurally blocked, its tasks 12 and 13 were still marked as active. I don't know if it has finished running through all lint errors on wiki, but task 13 is certainly an ongoing effort. Would we need a replacement bot to pick up these tasks, or do we have existing bots/procedures handling these things? Liu1126 ( talk) 20:17, 19 November 2023 (UTC)
I am in the midst of translating articles from English to Gagana Sāmoa which is a very necessary task given the massive inequality in the available information in each language. My hope is that more Sāmoa users will find Wikipedia to be a more hospitable site and access it to find information in their language. There is a massive disparity not only between English and Gagana Sāmoa but even between other languages and the languages of the Pasifika by and large. I would like to request a bot to help translate these articles as this task is overwhelming and this disparity will only grow given the population, internet access, and specialization of Sāmoa users. Something has to be done otherwise the language will likely go the way of 'Ōlelo Hawaiʻi. IonaPatamea ( talk) 20:57, 20 November 2023 (UTC)
I have seen Pablo Picasso's files on English Wikipedia noting that his works will be transferred to Wikimedia Commons on January 1, 2044, 70 years after his death. However, under Spanish copyright law, the copyright term for Spanish authors who died before December 7, 1987, including Picasso (who died on April 8, 1973), is life plus 80 years; for those who died later it is life plus 70 years, though it is unclear whether the copyright expires on the 80th anniversary of his death or on the January 1 following it. So for sure, there is no room for his copyright to expire on January 1, 2044. It's either April 8, 2053 or January 1, 2054. I am hoping for a consensus on which of these bolded dates his copyright expires. Here are his works, to have the copyright expiration date edited or added (if not present) one by one. Ishagaturo ( talk) 09:07, 23 November 2023 (UTC)
This probably should not be implemented trivially for languages written in the Latin script, but with a few caveats, it seems pretty doable to write a bot that scours articles and, while staying out of appropriate templates, tags text using existing templates like {{lang}} as either being in a specific language, or at least being in some language written in a particular script, e.g. und-Hani or und-Cyrl, as per the obligatory HTML |lang= parameter and ISO 639. If there is und text already tagged, it makes it much easier to see whether 漢字 is lang=ja-Hani or lang=zh-Hant, and also to quickly retag everything en masse. If we are getting dangerous, I can think of multiple ways to further discriminate between, say, Japanese and Chinese-language text beyond simple checking for strings of CJK ideographs. Remsense 留 21:16, 6 December 2023 (UTC)
There are also zh-hant and zh-hans, or whatever they're called in the appropriate standards. Folly Mox (talk) 13:29, 7 December 2023 (UTC)
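A minimal sketch of the script-detection idea in this thread: count characters in a few Unicode ranges and emit an und-<Script> tag when the language itself is unknown. The ranges shown are a small illustrative subset, not a complete table:

```python
RANGES = {
    "Hani": [(0x4E00, 0x9FFF), (0x3400, 0x4DBF)],  # CJK unified ideographs
    "Cyrl": [(0x0400, 0x04FF)],                     # Cyrillic
    "Latn": [(0x0041, 0x005A), (0x0061, 0x007A)],   # basic Latin letters
}

def guess_script(text: str):
    """Return an und-<Script> tag for untagged text, or None when no
    script in the table matches any character."""
    counts = {name: 0 for name in RANGES}
    for ch in text:
        cp = ord(ch)
        for name, blocks in RANGES.items():
            if any(lo <= cp <= hi for lo, hi in blocks):
                counts[name] += 1
    best = max(counts, key=counts.get)
    return f"und-{best}" if counts[best] else None
```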
We have a lot of citations that could be improved using |author-link=, e.g. Special:Diff/1186331321/1186342802. The problem is that it's difficult to match the correct author; it requires a human. Thus I am wondering if/how this might be automated in certain cases. It doesn't require every case, only those it can match with greater certainty. For example we know, per the above diff, there is only one Steven Poole; there is no dab page. And we know Steven Poole writes for a publication called Quercus. Thus for any other cites that match those criteria, it is a good bet that it is the same person, and an |author-link= could be added. Is this method 100% foolproof? Probably not, but is it at least 99% accurate in matching names? Probably. I think a test run would show how reliable it is. I don't have the time right now but wanted to mention it in case anyone wants to run an experiment. Or has other ideas. A dump of CS1|2 citations on enwiki - not including cite web - can be found here. I currently have updates disabled, but can restart if anyone wants. -- GreenC 20:27, 22 November 2023 (UTC)
The bot could scan citations that already have an |author-link= and build a 2-column database: "Steven Poole = Quercus". Then find all other citations that cite Steven Poole and Quercus, and add the |author-link= if missing. It works backwards from what is known to be true. -- GreenC 19:42, 5 December 2023 (UTC)
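The two-pass approach GreenC outlines could be sketched like this, with citations represented as plain dicts (extracting the fields out of {{cite ...}} templates is a separate parsing step):

```python
def build_table(cites):
    """Map (author, work) -> author-link from citations that already have one."""
    table = {}
    for c in cites:
        if c.get("author-link"):
            table[(c.get("author"), c.get("work"))] = c["author-link"]
    return table

def fill_links(cites, table):
    """Fill missing |author-link= values in place; return how many were added."""
    filled = 0
    for c in cites:
        key = (c.get("author"), c.get("work"))
        if not c.get("author-link") and key in table:
            c["author-link"] = table[key]
            filled += 1
    return filled
```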
Can a bot make this kind of change to multiple pages?

Before: {{abcd|ᚠ|ᚡ|ᚢ |ᚣ|ᚤ|ᚥ|ᚦ| ᚧ|ᚨ|ᚩ}}, {{abcd|Ꭰ|Ꭱ|Ꭲ|Ꭳ|Ꭴ}}
After: ᚠᚡᚢ ᚣᚤᚥᚦ ᚧᚨᚩ, {{abcd|Ꭰ|Ꭱ|Ꭲ|Ꭳ|Ꭴ}}

That is:
* Match [ ]?[ᚠ-ᛸ][ ]? in each parameter.
* Remove the braces and each |, but retain the text entered as parameters (including spaces).
* If any parameter doesn't match ([ ]?[ᚠ-ᛸ][ ]?), leave it as-is.
172.58.208.108 ( talk) 19:19, 16 December 2023 (UTC)
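A sketch of the requested transformation, applying the rule above that the template is unwrapped only when every parameter matches the runic pattern (the template name abcd is taken from the example):

```python
import re

PARAM_OK = re.compile(r"^ ?[ᚠ-ᛸ] ?$")  # the pattern given in the request

def unwrap_runic(wikitext: str) -> str:
    def repl(m):
        params = m.group(1).split("|")
        if params and all(PARAM_OK.match(p) for p in params):
            # Drop the braces and pipes; keep parameter text, spaces included.
            return "".join(params)
        return m.group(0)  # a non-matching parameter: leave the template alone
    return re.sub(r"\{\{abcd\|([^{}]*)\}\}", repl, wikitext)
```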
Good day, can someone make a bot to run through this and append {{SVG-logo}} below the Non-free xxx template and add ==Summary== above the FUR template to files that don't have it? -- Minorax«¦talk¦» 11:22, 7 November 2023 (UTC)
This is my first time posting here, so no idea if this should be done by a bot. So, the "IPAlink" template has another variation "IPA link" (notice the space). The official representation is "IPA link" but I find the "IPAlink" variation also is quite predominant. This isn't urgent ("IPAlink" redirects to "IPA link"), but would a bot fix this sort of thing? PharyngealImplosive7 ( talk) 18:08, 21 December 2023 (UTC)
I asked at the help desk and I was told to ask here. The Organized crime task force and the Serial killer task force banners recently got added to the banner of Template:WikiProject Crime and Criminal Biography using parameters. The previous banners (as wrappers of the new one with the parameters) were mass substituted. This has left ~6700 (see Category:Unknown-importance Crime-related articles, not counting ones that didn't have an initial basic crime importance) duplicates, that have both the original crime importance and task force importance but split between two duplicate banners.
Is there any bot that can merge the importance values on the pages that have both templates so there aren't so many duplicates (for example, if there are two duplicate banners, one of which has the importance for WP Crime and one of which has the task force importance, combine them)? Of course, the ones that were not initially tagged with the original crime banners will have to be manually tagged, as they don't have the basic importance parameter, but that's less than 500, which isn't as bad (compared to the 6700 that already HAVE all the required importance parameters). PARAKANYAA ( talk) 17:56, 11 November 2023 (UTC)
{{
WikiProject Crime and Criminal Biography|serialkiller=yes|serialkiller-imp=low|organizedcrime=yes|organizedcrime-imp=low}}
(or having 1 or the other task forc parameters, just showing both for sake of example){{
WikiProject Crime|importance=low}}
OR {{
WikiProject Criminal Biography|importance=low}}
(also called WikiProject Criminal which iirc has quite a few transclusions){{
WikiProject Crime and Criminal Biography|importance=low|serialkiller=yes|serialkiller-imp=low|organizedcrime=yes|organizedcrime-imp=low}}
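For what it's worth, the merge described above is mechanical enough to sketch. The following is a regex-only illustration that handles exactly the simple form shown in the examples (one old banner carrying only |importance=, one combined banner); a real bot would want a proper wikitext parser such as mwparserfromhell to cope with extra parameters, whitespace, and nesting:

```python
import re

# Matches the old single-project banner carrying only |importance=,
# as in the examples above. Anything more complicated needs a parser.
OLD_BANNER = re.compile(
    r"\{\{\s*WikiProject (?:Crime|Criminal Biography)\s*"
    r"\|\s*importance\s*=\s*([^|}]+?)\s*\}\}\n?"
)
NEW_BANNER = re.compile(r"(\{\{\s*WikiProject Crime and Criminal Biography)")

def merge_banners(text: str) -> str:
    """Move |importance= from the old duplicate banner into the combined one."""
    m = OLD_BANNER.search(text)
    if not m:
        return text                              # nothing to merge
    importance = m.group(1)
    text = OLD_BANNER.sub("", text, count=1)     # drop the duplicate banner
    # splice the importance into the combined banner
    return NEW_BANNER.sub(rf"\1|importance={importance}", text, count=1)
```

Pages where the combined banner already has an |importance= of its own would need conflict handling that this sketch deliberately omits.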
Hi, I would like to know whether a bot would be able to do this particular task. The task is to replace the existing format with the template, as I did here in my sandbox to explain it better: [12]
The following articles, 2004 Andhra Pradesh Legislative Assembly election and 2009 Andhra Pradesh Legislative Assembly election, require these template changes. Since I find this monotonous task quite difficult to do myself, I am looking for help; a bot might be able to do it, I believe. Any info or help is appreciated. Thank you 456legend ( talk) 05:52, 7 December 2023 (UTC)
Hello, it has come to my attention that template sandboxes X21 to X71 are not automatically cleared by Cyberbot I, which clears template sandboxes X1 to X20 and the main template sandbox. So I think there should be a bot that clears the rest of the template sandboxes. This bot would be called "SandBot", and it would clear template sandboxes X21 to X71 at 00:00 UTC and 12:00 UTC every day, supplementing Cyberbot I. This is only a proposed bot I had the idea to create. RandomWikiPerson_277 talk page or something 19:58, 12 December 2023 (UTC)
Simple idea: monitor the protection log, and any time the protection level is increased, but the expiration time is decreased, wait until a few minutes before the expiration, and restore the status quo. If it really is the intention of the protecting admin to leave the page unprotected at expiry, they can leave a keyword like NOFALLBACK
or something in the protection summary. An obvious complication would arise if the bot is lagging, and some edits slip in before protection can be restored, but that's a minor detail. Yes, I know about the PC trick, but people sometimes forget, and sometimes PC isn't enough.
Suffusion of Yellow (
talk) 03:58, 2 December 2023 (UTC)
Category:Pages using WikiProject Film with unknown parameters, a maintenance category which exists to flag cases where a use of {{ WikiProject Film}} on a talk page calls parameters that don't exist, currently has 4,808 articles in it — and after looking at it and cleaning up the tiny single-digit handful of exceptions that existed anywhere after the letter B, I was able to determine that the remaining contents all relate to an old, long-deprecated practice whereby B-Class articles in that queue were each also tagged with b1=[y/n], b2=[y/n], b3=[y/n], b4=[y/n] and b5=[y/n] for their individual success or failure in meeting each of the five B-Class criteria listed at Wikipedia:WikiProject Film/Assessment. That practice has long since been deprecated, which is why those are landing as unknown parameters now — but with 4,808 articles to deal with, actually cleaning them up is more work than any human editor would ever be inclined to undertake.
Accordingly, I wanted to ask if there's any bot that can be set loose on the task of stripping b#= parameters from the contents of that category. Bearcat ( talk) 17:20, 1 January 2024 (UTC)
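The stripping itself is a one-line substitution; the bot work is all in iterating the category. A regex illustration of the substitution (a production bot would parse the banner properly rather than regex the whole talk page):

```python
import re

# Deprecated B-Class checklist parameters |b1= through |b5=,
# with optional whitespace around the name and value.
B_PARAM = re.compile(r"\|\s*b[1-5]\s*=\s*[^|}]*")

def strip_b_params(banner: str) -> str:
    """Remove the deprecated b1=..b5= parameters from a banner call."""
    return B_PARAM.sub("", banner)
```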
For the 5832 articles listed at
Wikipedia:WikiProject Africa/The 10,000 Challenge please add |AFR10k=yes
to the project banner {{
WikiProject Africa}} on the talk page. This adds a note to the banner and also populates
Category:Articles created or improved during the WikiProject Africa 10,000 Challenge. Thanks — Martin (
MSGJ ·
talk) 19:19, 3 January 2024 (UTC)
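The edit itself is a simple parameter insertion; something along these lines per talk page (regex sketch only — it assumes the banner contains no nested templates, which a real bot should check for):

```python
import re

# Match {{WikiProject Africa ...}} up to its closing braces,
# assuming no nested templates inside the banner call.
AFRICA = re.compile(r"(\{\{\s*WikiProject Africa\b[^{}]*)(\}\})")

def tag_afr10k(text: str) -> str:
    """Add |AFR10k=yes to the WikiProject Africa banner if not present."""
    if "AFR10k" in text:
        return text          # already tagged
    return AFRICA.sub(r"\1|AFR10k=yes\2", text, count=1)
```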
Hello,
GNU is the operating system and Linux is one of its kernels; Linux is not an operating system. Hence, I believe a bot should locate and correct these errors: where Linux is mentioned, it should be changed to GNU/Linux or GNU-Linux. This request is being made for Richard Stallman, who has cancer. Twillisjr ( talk) 16:07, 6 January 2024 (UTC)
I have noticed that many categories, especially content categories, include
non-free files without the __NOGALLERY__
magic word, which is against
WP:NFCC#9. I'd suggest using a bot to auto-tag such categories, skipping a whitelist for those categories covered by
WP:NFEXMP (generally those categories concerning reviews of questionable files, such as
CAT:FFD, and some maintenance categories that should contain no non-free files). –
LaundryPizza03 (
d
c̄) 12:01, 14 November 2023 (UTC)
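The tagging step itself is trivial once the category list is assembled (finding which categories contain non-free files, and honoring the WP:NFEXMP whitelist, is the real work). A minimal sketch of the edit:

```python
def ensure_nogallery(cat_text: str) -> str:
    """Prepend __NOGALLERY__ to a category page that lacks it."""
    if "__NOGALLERY__" in cat_text:
        return cat_text      # already tagged, nothing to do
    return "__NOGALLERY__\n" + cat_text
```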
__NOGALLERY__
and it would be actually smarter and less work to disable image showing on all categories by default (without requiring any code per page or bot work) and have a __YESGALLERY__
magic word for the far fewer categories that could actually show images.
Gonnym (
talk) 12:38, 14 November 2023 (UTC)
Does the task at Wikipedia:Village pump (technical)#Implementation of Template:Refideas editnotice require a bot, or is there another way to accomplish that? You can respond there if you like. BOZ ( talk) 05:42, 14 November 2023 (UTC)
My website runeberg.org just recently moved from http: to https: so it would be nice if someone could update the 11,000 links accordingly. This is not urgent, as everything works fine with automatic redirects, but it would be nice. Thank you. -- LA2 ( talk) 22:33, 17 December 2023 (UTC)
Done -- Green C 16:33, 9 January 2024 (UTC)
I just moved
Saint Francis University to
Saint Francis University (Pennsylvania), because there will be a university also named Saint Francis University in Hong Kong (
Caritas Institute of Higher Education acquires university title, Government of Hong Kong Press Release). The page "Saint Francis University" will be a redirect to
University of Saint Francis. Before doing so I need to fix all pages with link to [[Saint Francis University]]
and replace it with [[Saint Francis University (Pennsylvania)|Saint Francis University]]
(or, if the link is [[Saint Francis University|something else]]
, just replace the link target itself, not the description), of which I found hundreds. Is there a bot that can do this task for me? --
Leeyc0
(Talk) 12:04, 9 January 2024 (UTC)
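The two replacements described above (piped links keep their display text; unpiped links gain a pipe preserving the old title) can be sketched like this:

```python
import re

OLD = "Saint Francis University"
NEW = "Saint Francis University (Pennsylvania)"

def fix_links(text: str) -> str:
    """Retarget [[Saint Francis University]] links per the request above."""
    # piped: [[Saint Francis University|label]] -> new target, same label
    text = re.sub(r"\[\[" + re.escape(OLD) + r"\|", "[[" + NEW + "|", text)
    # unpiped: [[Saint Francis University]] -> piped link keeping old display text
    text = re.sub(r"\[\[" + re.escape(OLD) + r"\]\]",
                  "[[" + NEW + "|" + OLD + "]]", text)
    return text
```

(In practice this is exactly the kind of job AWB/JWB handles well, since each edit should still be eyeballed for links that genuinely mean the Hong Kong institution.)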
There are quite a few external links templates created in recent years (see Category:Social media external link templates), and when used they offer a consistent style and allow for error tracking, among other things. However, there are still quite a lot of external links that don't use these. Sometimes they are bare links, while others have some kind of text with them. Would it be possible for a bot to convert external links in the external links section (links in the body should be ignored, as I'm not sure whether these templates work correctly in the body) to use one of the listed templates at the bottom? Here is an example of an edit with IMDb title.
Templates:
If this is controversial and needs discussion, please point me to where it should be held. Gonnym ( talk) 15:45, 28 December 2023 (UTC)
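As a concrete illustration of one such conversion, here is a sketch for the IMDb case mentioned above. It assumes {{IMDb title}} takes the numeric ID (without the "tt" prefix) as its first parameter and the display title as its second; each template in the category would need its own URL pattern:

```python
import re

# Bracketed external link to an IMDb title page, with a label.
IMDB = re.compile(r"\[https?://(?:www\.)?imdb\.com/title/tt(\d+)/?\s+([^\]]+)\]")

def to_imdb_template(text: str) -> str:
    """Convert [https://www.imdb.com/title/ttNNN Label] to {{IMDb title|NNN|Label}}."""
    return IMDB.sub(r"{{IMDb title|\1|\2}}", text)
```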
https://example.com/person/<number assigned to person>
to
https://example.com/profile/<persons_full_name>
, the template cannot be updated in a fashion that will result in a meaningful change, since all we have on the template calls is {{example|<number>}}
. On the other hand, I think at least one of the URL bots has the ability to match old to new, so if it sees
https://example.com/person/<number>
in the text directly, it can update to the new code. Either way a bot will need to update everything, but in the latter case (again, assuming it's possible) there is already a bot with that functionality. In other words, an elink bot will notice a change in URL if the URL is in the article, but a user has to notice a dead link if it's in a template.
Primefac (
talk) 12:27, 9 January 2024 (UTC)
{{
official}}
might get supported. Don't forget, these templates change, so if a bot supports a template and someone changes it, the bot has to be updated to avoid making errors. And this is just enwiki; there are over 300 Wikipedias, plus hundreds more wikis in other projects. And the underlying code for saving dead links correctly is quite complex; only a few programmers have this down, and these are large, complex tools that have taken many years of development. A BOTREQ to make a dead link fixer for one template doesn't make sense. At best, a bot could convert templates to CS1|2 or square-link, then run the archive bots. --
Green
C 16:30, 9 January 2024 (UTC)
|url-status=
, |archive-url=
and |archive-date=
?
Gonnym (
talk) 12:50, 10 January 2024 (UTC)
|archive=
like {{
2006 Commonwealth Games profile}}
- this looks like an alternative method in use; most of those templates are sports-related, so it was probably conceived by a few editors at some point. Citation templates use the |archive-url=
trio, while external link templates simply use |archive=
, which, if it exists, the template renders as a replacement for the URL it would otherwise have rendered. It's going to be template-specific how best to approach this. Anyway, if it's true
Category:External link templates with archive parameter is only for templates that use |archive=
, it will be important to have a new category for
Category:External link templates with archive-url parameter, so bots and tools can differentiate which parameters to use. --
Green
C 16:58, 10 January 2024 (UTC)
Hey all! For some background here, for TWL users to access Newspapers.com, the library sends them through a proxied domain at https://www-newspapers-com.wikipedialibrary.idm.oclc.org/. This often results in the domain name making its way into the mainspace, which is problematic because it can only be accessed by those with access to TWL.
JPxG has set up a way to replace these links with the unproxied domain using JWB (see more info and an example edit), but I feel like this is an area where a bot could step in.
Citation bot is able to clean these links up automatically (see an example edit), but it has to be triggered manually. These proxy URLs are not automatically placed in a category, which means a human editor would need to assemble a list of pages to be fixed for Citation bot to even look at them. Citation bot also wouldn't deal with these links outside of citations, such as with external links.
It's worth noting that I've previously filed a tangentially similar BRFA, which was denied as Citation bot would be easier to use and give better results. With these links, however, I don't think that's the case, mainly because Citation bot is tedious to trigger on these pages, but also because Citation bot doesn't even touch other proxied URLs, only Newspapers.com.
I'd love to make this happen using Pywikibot, but based on my previous BRFA I wanted to see some thoughts on this being fully automated. This task is already being done semi-automatically through JWB, so I think it might as well be fully automated, potentially expanding to other TWL-proxied sites.
(CCing Headbomb for your thorough comments on the previous BRFA—would love to hear your opinion especially.) Bsoyka ( talk) 18:32, 28 December 2023 (UTC)
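The URL rewrite itself is straightforward if the proxy follows the pattern visible in the example above, where each "." of the real hostname is encoded as "-" (www.newspapers.com ↔ www-newspapers-com.wikipedialibrary.idm.oclc.org). A sketch under that assumption; hostnames that legitimately contain hyphens would need a smarter mapping, so treat this as illustrative only:

```python
import re

# Any host proxied through the TWL EZproxy domain.
PROXY = re.compile(r"https?://([\w-]+)\.wikipedialibrary\.idm\.oclc\.org")

def unproxy(url: str) -> str:
    """Rewrite a TWL-proxied URL back to the (assumed) original host."""
    return PROXY.sub(lambda m: "https://" + m.group(1).replace("-", "."), url)
```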
insource:wikipedialibrary.idm.oclc.org
which returns about 643 hits in mainspace, I wonder if expanding the scope to other wikipedialibrary domains would be warranted. It seems like there are a lot of links to that proxy.
Jo-Jo Eumerus (
talk) 09:55, 29 January 2024 (UTC)
Hi, I asked the following at the Help Desk, and they suggested asking here:
I noticed that there are a ton of pages tagged for needing verification from August 2022. All of the location ones really just need the first of the two notes citations (the one just going to census.gov) removed. Is there a way for someone to mass-fix this?
The note, as it is, is always in the Demographics section as:
"Note: the US Census treats Hispanic/Latino as an ethnic category. This table excludes Latinos from the racial categories and assigns them to a separate category. Hispanics/Latinos can be of any race.<ref>http://www.census.gov {{nonspecific|date=August 2022}}</ref><ref>{{cite web |title=About the Hispanic Population and its Origin |url=https://www.census.gov/topics/population/hispanic-origin/about.html |website=www.census.gov |access-date=18 May 2022}}</ref>"
It is the first of the two that needs to go, because the second has it covered.
To add: on all the pages I have fixed thus far with this error ( see: recent Texas edits), it is the only note on the page, and always attached to a table with racial demographic data.
Thanks in advance! Edenaviv5 ( talk) 16:18, 11 January 2024 (UTC)
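Since the offending ref is near-identical across pages, the removal can be sketched as a single substitution (matching the {{nonspecific}} date loosely, since the month may vary):

```python
import re

# The bare census.gov ref described above, including its
# {{nonspecific}} tag; the second, specific citation is left alone.
BAD_REF = re.compile(
    r"<ref>http://www\.census\.gov\s*\{\{nonspecific\|date=[^}]*\}\}</ref>"
)

def drop_bad_ref(text: str) -> str:
    """Remove the nonspecific census.gov ref, keeping the cite-web ref."""
    return BAD_REF.sub("", text)
```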
See this discussion: is there a bot that can assist us with the deletion review process? Jarble ( talk) 19:46, 12 January 2024 (UTC)
Deleting or otherwise removing errors from uncalled references. Geardona ( talk to me?) 20:24, 17 January 2024 (UTC)
Ever since the idea of immediately moving inadequate articles to draftspace emerged as a common alternative to deletion, the amount of time that has had to be invested in cleaning up polluted categories that have draftspace pages in them has gone way up, because the people who do the sandboxing frequently forget to remove or disable the categories in the process — so I wanted to ask if there's any way that a bot can be made to clean up any overlooked stuff.
Since there's already a bot, JJMC89bot, that detects main-to-draft page moves and tags them as {{ Drafts moved from mainspace}}, the easiest thing would probably be to just have that bot automatically disable any categories on the page at the same time as it's tagging it — but when I directly approached that bot's maintainer earlier this year to ask if this could be implemented, they declined on the basis that the bot hadn't already been approved to perform that task, while failing to give me any explanation of why taking the steps necessary to get the bot approved to perform that task was somehow not an option. As an alternative, I then approached the maintainer of DannyS712bot, which catches and disables categories on drafts that are in the active AFC submission queue (which newly sandboxed former articles generally aren't, and thus don't get caught by it), but was basically told to buzz off and talk to JJMC89bot.
So, since I've already been rebuffed by the maintainers of both of the obvious candidate bots, I wanted to ask if there's any other way to either get one of those two bots on the task or make a new bot to go through Category:All content moved from mainspace to draftspace disabling any active categories, so that editors can cut down on the amount of time we have to spend on DRAFTNOCAT cleanup. If possible, such a bot would ideally also do an ifexist check, and outright remove any redlinked categories that don't even exist at all, though just disabling redlinks too would still be preferable to editors having to manually clean up hundreds of categorized drafts at a time — it's just that merely disabling the redlinks creates another load of cleanup work later on when the draft gets approved or moved by its own creator without AFC review or whatever, so killing redlinks right away is preferable to simply deferring them for a second round of future cleanup. Bearcat ( talk) 16:18, 8 November 2023 (UTC)
<nowiki>...</nowiki>
or <!--...-->
tags to prevent categorization.
GoingBatty (
talk) 01:59, 8 January 2024 (UTC)
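For illustration, the colon-trick variant of disabling ([[Category:X]] → [[:Category:X]]) is a one-line substitution, and it is naturally idempotent; the nowiki/comment approaches mentioned above would be equally easy to emit:

```python
import re

# Category links not already disabled with a leading colon.
CAT = re.compile(r"\[\[\s*(Category\s*:)", re.IGNORECASE)

def disable_categories(text: str) -> str:
    """Disable categorization by inserting a colon before 'Category:'."""
    return CAT.sub(r"[[:\1", text)
```

The ifexist check for redlinked categories would sit on top of this, querying each category title before deciding whether to disable or remove it outright.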
Hi, I would like to ask if it would be possible to align text to the right in the # of votes and % of votes columns in the table listing over 460 MPs located in the List of Sejm members (2023–2027)#List of members section. The use of {{ Table alignment}} is impossible due to merged cells which help with visual representation. Therefore, before every cell in the mentioned columns, which all contain numerical data, "style="text-align: right;" |" should be placed. Cheers! — Antoni12345 ( talk) 23:49, 20 January 2024 (UTC)
Find: || 1 → Replace: || style="text-align: right;" | 1
Find: || 2 → Replace: || style="text-align: right;" | 2
Find: || 0 → Replace: || style="text-align: right;" | 0
"The three most dangerous things in the world are a programmer with a soldering iron, a hardware type with a program patch, and a user with an idea."
— Rick Cook, The Wizardry Consulted
So I have an idea, and...
Is it possible for a bot to find articles that:
If a bot could automatically detect such articles, then I'd like to have it add the {{ underlinked}} template, on a schedule of perhaps a few articles being tagged per hour, to feed the seemingly popular Category:Underlinked articles for the Wikipedia:Growth Team features, without giving a large number of articles to the first editor and then leaving none for anyone else.
I realize that this would require a demonstration of consensus, but I don't want to make the suggestion, get people's hopes up, and then find out that bots can't count the number of words or links in an article. WhatamIdoing ( talk) 22:21, 30 November 2023 (UTC)
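To the narrow question of feasibility: yes, bots can count words and links in wikitext. A crude sketch of the test — the 100-word minimum and the link-density threshold here are invented placeholders, not proposed criteria; the real numbers would come out of the consensus discussion:

```python
import re

LINK = re.compile(r"\[\[[^\]]+\]\]")

def is_underlinked(wikitext: str, min_words: int = 100,
                   max_ratio: float = 0.005) -> bool:
    """Rough heuristic: long enough article, very few wikilinks per word."""
    links = len(LINK.findall(wikitext))
    # strip link brackets so link labels still count as words
    words = len(re.sub(r"\[\[|\]\]", " ", wikitext).split())
    return words >= min_words and links / max(words, 1) < max_ratio
```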
{{od}}
An update and some observations, in case anyone else is interested:
Red Cross
). (Pinging
Trizek (WMF))
WhatamIdoing ( talk) 04:32, 26 January 2024 (UTC)