This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 10 | ← | Archive 14 | Archive 15 | Archive 16 | Archive 17 | Archive 18 | Archive 19 |
See User_talk:Bot1058#Talk:Basarabka_and_Moldovanka. The bot has been creating redirects in the talk namespace. Most are benign but just utterly useless blank pages, while several others are actually pages that have been properly deleted per WP:G8. The bot's task listing shows no tasks that would indicate that it should be creating pages. I'm loathe to pblock the bot from the talk: namespace as it actually does useful tasks there, but there's no reason why this should be creating pages, especially ones that have been G8'd. Hog Farm Talk 23:43, 9 October 2021 (UTC)
function purgeCache in it! Which I've never used until now. I'm gonna run some trials to check out how well it works for making null edits. That would be easier than adding a new argument nocreate to function edit. – wbm1058 ( talk) 16:36, 10 October 2021 (UTC)
forcelinkupdate updates the links table entries for the current page (e.g. category membership), but doesn't queue a reparse of all pages transcluding the purged page. The operation without either of those parameters just updates the cached HTML without updating links tables. Anomie ⚔ 00:26, 11 October 2021 (UTC)
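To make the distinction concrete, the three purge variants can be sketched as API parameter sets. This is a hypothetical helper (not any bot's actual code); the parameter names follow the action=purge API:

```python
def purge_params(titles, forcelinkupdate=False, forcerecursivelinkupdate=False):
    """Build parameters for an action=purge API POST.

    - plain purge: only invalidates the page's cached HTML
    - forcelinkupdate: also updates the page's own links tables
    - forcerecursivelinkupdate: also re-queues pages transcluding it
    """
    params = {
        'action': 'purge',
        'format': 'json',
        'titles': '|'.join(titles),
    }
    if forcelinkupdate:
        params['forcelinkupdate'] = '1'
    if forcerecursivelinkupdate:
        params['forcerecursivelinkupdate'] = '1'
    return params

# e.g. POST purge_params(['Some page'], forcelinkupdate=True)
# to /w/api.php with an authenticated session to make a "null edit"-like update
```

A plain purge is enough to refresh stale HTML; forcelinkupdate is what approximates a null edit for things like category membership.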
Worst-case solution: for some time I blocked COIBot from creating pages in mainspace (which it should not do, but did) until I figured out why it sometimes did that, by creating an edit filter for it. That both resolves the problem and logs that the problem occurred while you debug/test/patch. -- Dirk Beetstra T C 12:03, 10 October 2021 (UTC)
Last week, two bots that I check on, AdminStatsBot and BernsteinBot, both went off schedule. I checked to see if there was replication lag, but there wasn't. BernsteinBot started updating again, but irregularly, and it didn't return to the regular schedule it had previously maintained with no problems. I posted a note to each bot operator on their user talk pages but neither has been active here in over a month.
But I'm posting here just to see if there was some change or update that would cause both bots to stop updating. Other bots I work with such as AnomeBotIII and SDZeroBot didn't have problems so I'm not sure what is up. Thanks for any information you can provide. Liz Read! Talk! 04:47, 12 October 2021 (UTC)
Does anyone know why certain edits from bot accounts show up under Special:RecentChanges when the non-bot filter is selected? Winston ( talk) 00:56, 18 October 2021 (UTC)
Template:Row numbers recently went through an RfD due to its inability to display (at all) on the mobile app. I modified it to work correctly on the app again, but this requires changes to the articles. I wrote a pywikibot script to assist with this in semi-automated fashion, putting the edited articles on the clipboard and opening Firefox at the right page to paste and review. The script works pretty well, after some initial difficulties with finding precisely when to escape equals signs. See [2] for example edits - a couple of the first edits broke ref tags in a way I didn't initially spot, but after handling them specially in the script, all later edits have been correct.
However, verifying all the articles individually is slow and there are about 100 left. Would this sort of thing be acceptable to run unattended? I'll open a proper BRFA if so. The core part of the current script (which doesn't save anything) is below, based on scripts/basic.py in pywikibot.
import re
import subprocess
import urllib.parse

from pywikibot.bot import (
    AutomaticTWSummaryBot,
    ConfigParserBot,
    ExistingPageBot,
    NoRedirectPageBot,
    SingleSiteBot,
)


class RowNumberingBot(
    SingleSiteBot,
    ConfigParserBot,
    ExistingPageBot,
    NoRedirectPageBot,
    AutomaticTWSummaryBot,
):

    summary_key = 'basic-changing'

    def __init__(self, generator, **kwargs) -> None:
        """
        Initializer.

        @param generator: the page generator that determines on which pages
            to work
        @type generator: generator
        """
        # Add your own options to the bot and set their defaults
        # -always option is predefined by BaseBot class
        self.available_options.update({
            'replace': False,  # delete old text and write the new text
            'summary': None,  # your own bot summary
            'text': 'Test',  # add this text from option. 'Test' is default
            'top': False,  # append text on top of the page
        })

        # call initializer of the super class
        super().__init__(site=True, **kwargs)

        # assign the generator to the bot
        self.generator = generator
        self.regex = re.compile('(?s){{\\s*[Rr]ow (numbers|counter|indexer)\\s*\\|(?:1=)?\\s*(<nowiki>(.*?)</nowiki>)')

    def treat_page(self) -> None:
        text = self.current_page.text
        while m := self.regex.search(text):
            # Replace the <nowiki>...</nowiki> span with the escaped content
            text = text[:m.start(2)] + escape_equals(m.group(3)) + ' ' + text[m.end(2):]
        if text == self.current_page.text:
            print(f"Skipping {self.current_page.title()}.")
            return
        # Put the new wikitext on the clipboard and open the edit form for review
        subprocess.run('xclip -i -sel c'.split(), input=text.encode())
        subprocess.run(['firefox', self.current_page.full_url() + '?action=edit&summary='
                        + urllib.parse.quote_plus(self.opt.summary)])
        input("Press enter for next page...")
self_closing_ref_regex = re.compile(
    r'''<ref( +[^= <>]+ *= *("[^"]*"|'[^']*'|[^ "'>]*))* *\/$''')


def escape_equals(s):
    """
    Escape equals signs in string s with {{=}} unless they are already within
    double braces or a tag.
    """
    n_brace = 0
    n_square = 0
    b = 0
    ref = 0
    out = ''
    for i, ch in enumerate(s):
        if ch == '{':
            if n_brace < 0:
                n_brace = 1
            else:
                n_brace += 1
        elif ch == '}':
            if n_brace > 0:
                n_brace = -1
            else:
                n_brace -= 1
        elif ch == '[':
            if n_square < 0:
                n_square = 1
            else:
                n_square += 1
        elif ch == ']':
            if n_square > 0:
                n_square = -1
            else:
                n_square -= 1
        # It seems that ref tags are special.
        elif s[i:i + 4] == '<ref':
            ref += 1
            assert ref == 1, s[:i] + '\n\nFAILED\n\n' + s[i:] + '\n\n'
        elif s[i:i + 5] == '</ref' or (
                ch == '>' and self_closing_ref_regex.search(s[:i])):
            ref -= 1
            assert ref == 0, s[:i] + '\n\nFAILED\n\n' + s[i:] + '\n\n'
        else:
            n_brace, n_square = (0, 0)
        if n_brace == 2 or n_square == 2:
            b += 1
            n_brace, n_square = (0, 0)
        elif n_brace == -2 or n_square == -2:
            b -= 1
            n_brace, n_square = (0, 0)
        assert b >= 0
        if ch == '=' and b == 0 and ref == 0:
            out += '{{=}}'
        else:
            out += ch
    assert ref == 0 and b == 0, \
        f"{n_brace} {n_square} {ref} {b}"
    return out
User:GKFX talk 21:42, 18 October 2021 (UTC)
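The full state machine above is hard to eyeball, so here is the core rule in isolation: a top-level = in the template argument must become {{=}}, while an = nested inside another {{...}} must be left alone. This is a much-simplified sketch for illustration only (it deliberately ignores wikilinks and ref tags, unlike the real script):

```python
def escape_equals_simple(s):
    """Escape top-level '=' with {{=}}; leave '=' inside nested {{...}} alone.

    Simplified illustration only -- ignores wikilinks and <ref> tags.
    """
    depth = 0
    out = []
    i = 0
    while i < len(s):
        if s.startswith('{{', i):
            depth += 1
            out.append('{{')
            i += 2
        elif s.startswith('}}', i) and depth > 0:
            depth -= 1
            out.append('}}')
            i += 2
        elif s[i] == '=' and depth == 0:
            # Unescaped '=' at top level would be parsed as a named parameter
            out.append('{{=}}')
            i += 1
        else:
            out.append(s[i])
            i += 1
    return ''.join(out)
```

So for input like a=b {{cite|url=y}} only the first equals sign gets escaped, which is the behaviour the full function implements with its extra bracket and ref-tag tracking.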
What non-talk page discussion pages are there besides those in the Project namespace and the Template:Did you know nominations? This is relevant for bots working on discussion pages such as IndentBot. Winston ( talk) 03:14, 23 October 2021 (UTC)
mw.config.get('wgExtraSignatureNamespaces'), which will be removed soon, and via mw.Title.wantSignaturesNamespace(). I bet it can also be retrieved through the API, but I can't find it right now. Nardog ( talk) 03:35, 23 October 2021 (UTC)
For example, this edit? How is this a "high priority" edit? Archives should usually be off limits to further editing. In this case, it's changing history, even if correcting "errors" of the time. Jason Quinn ( talk) 10:20, 23 October 2021 (UTC)
Hello all (BAG, botops, etc), there is a new template {{ Last N contributions}}, shortcut {{lnc}}, which allows for easy linking to a group of contributions for a user. For example, I can easily link to these 10 contributions going backwards from midnight last night. I think this will be extremely useful for BRFAs and giving trial diffs. The first point of this post is to notify about the template; the second is to ask a question: should the template be linked in {{ BotTrial}}, specifically as provide a link, or is that just overkill? Is there a better way to let botops know about this new tool? Primefac ( talk) 14:43, 23 October 2021 (UTC)
* {{lnc|monkbot|25|2021-10-06}} → these 25 contributions; does not work, returns 25 edits from 2020-12-31
* {{lnc|monkbot|25|20211006}} → these 25 contributions; does not work, returns 25 edits from 2021-09-25; the edits of interest were made on 2021-10-06, so the September list is counterintuitive and not helpful
* {{{2}}} optional? If blank, the template substs in today's date? Don't know if that is possible.

{{lnc|monkbot|25|20211006235959}}
(which gives this), because technically speaking your example is for 20211006000000 (midnight on the 6th). As far as the first point goes, that's a reasonable issue, and I'll see about allowing multiple date formats (though the main issue becomes one of "time").
Primefac (
talk) 15:20, 23 October 2021 (UTC)
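The quirk being described is that a date-only timestamp is read as 00:00:00 on that date, so listing contributions backwards from it skips the day's own edits. A hypothetical normalization (illustrative only; the padding rule is an assumption, not the template's real code) would pad date-only input to the end of the day:

```python
def normalize_ts(ts):
    """Pad a YYYYMMDD timestamp to YYYYMMDDHHMMSS at 23:59:59, so a
    "backwards from here" listing includes edits made on that date.

    Timestamps that already carry a time component pass through unchanged.
    (Hypothetical helper, not the template's actual implementation.)
    """
    if len(ts) == 8 and ts.isdigit():
        return ts + '235959'
    return ts
```

With this rule, entering the trial's last-edit date would behave the way editors expect without having to type the 235959 suffix by hand.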
{{BotTrialComplete}} to a BRFA, the date of the trial's last edit is the date that I want to enter; not the date + 235959, because that is how I (and, I dare say, most editors) will think about |ts=. To accomplish that easily, perhaps use &start= instead of &offset=? Here is the link currently produced by {{lnc|monkbot|25|20211006}}:
<span class="plainlinks">[//en.wikipedia.org/?title=Special:Contributions&offset=2021-10-06&limit=25&target=monkbot&namespace=all these 25 contributions]</span>
&offset= to &start=:
<span class="plainlinks">[//en.wikipedia.org/?title=Special:Contributions&start=2021-10-06&limit=25&target=monkbot&namespace=all these 25 contributions]</span>
&offset= only when |ts= has hour/minute/second precision?

{{lnc|monkbot|25|20211007}} instead of {{lnc|monkbot|25|20211006235959}}.

{{Last N contributions|Jimbo Wales|10|20030724}} would not give you contributions from July 24th, but from July 23rd or earlier. Using two different parameters with different behaviors under different conditions would just complicate things.

I see T.seppelt doesn't feel they have time to make it comply w/ current guidelines; is there an alternative today? – SJ + 18:07, 28 October 2021 (UTC)
I distinctly remember that several years ago a bot ran that automatically applied {{ ShadowsCommons}} tags to Wikipedia files that had a non-identical file on Commons with the same name. Is it just a figment of my imagination, or did it actually exist? Jo-Jo Eumerus ( talk) 16:28, 25 October 2021 (UTC)
Hello, Bot group,
I rely on DumbBOT which updates the PROD lists but since September, the updating schedule has shifted several times. I have left inquiries about this on DumbBOT's talk page but today, after another updating change, I went to the user page of the bot operator, Tizio, and saw that he hasn't logged in for over a year. I don't want the bot disabled but it seems like it should have an active operator who still works on Wikipedia. In fact, I don't understand how the updating schedule keeps shifting if the bot operator has been absent so long! Any chance that someone else could take over as operator? Thanks for any assistance you can offer. Liz Read! Talk! 00:37, 24 November 2021 (UTC)
"I rarely log in lately. In case of problems with the bot email me." So perhaps we'll get a reply. I agree that bot operators should be active, and would urge an RfC on adding this to policy ~ TheresNoTime ( to explain!) 00:48, 24 November 2021 (UTC)
:P
). Clearly it didn't happen! I entirely agree that bot operators should be responsive to concerns/bug reports/etc. I'm less sure about them being active in terms of some minimum edit requirement - as long as they respond to bot issues, that'd be good enough for me. (Which reminds me, I owe Liz a reply about the G13-warning task...)
firefly (
t ·
c ) 08:39, 24 November 2021 (UTC)
Even double redirect-fixing bots, like humans, like to procrastinate. At Special:DoubleRedirects, there is a list of hundreds of double redirects that have not been fixed for several days. Could this be considered "bot procrastination", then? GeoffreyT2000 ( talk) 15:00, 29 November 2021 (UTC)
Further instructions

Follow the instructions on mw:Manual:Pywikibot/PAWS to create a server and set everything up, then open a terminal. Type pwb.py redirect.py double, and then review each change it suggests. Some example edits I just did:
― Qwerfjkl talk 17:26, 30 November 2021 (UTC)
@ Xaosflux and Ymblanter: In reference to this discussion, my bot was approved for another task that requires template-editor permission. After the issues with the last request, I completely forgot to request it again when the task was approved, and just realized now why edits haven't been going through. -- Ahecht ( TALK PAGE) 17:18, 6 December 2021 (UTC)
In Special:NewPagesFeed, filtering for redirects and sorting by oldest brings up hundreds of month-old redirects from November 9. Some of them are recently-redirected articles and similar things with complicated page histories, but others (like this one at American Chemical Manufacturing and Mining Company) have only had one revision that whole time. I see that DannyS712 bot III's Task 66 was approved to patrol redirects automatically (following this RfC) -- indeed, the bot's log shows it patrolled some new redirects just a few minutes ago, and its to-do list is empty -- so what's going on here? jp× g 23:38, 7 December 2021 (UTC)
Never mind, I am a fool -- in the bot's source code it fetches a whitelist from pageid 62534307, which is Wikipedia:New pages patrol/Redirect whitelist. jp× g 23:52, 7 December 2021 (UTC)
Bots Newsletter, December 2021

Welcome to the eighth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Maintainers disappeared to parts unknown... bots awakening from the slumber of æons... hundreds of thousands of short descriptions... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots. Our last issue was in August 2019, so there's quite a bit of catching up to do. Due to the vast quantity of things that have happened, the next few issues will only cover a few months at a time. This month, we'll go from September 2019 through the end of the year. I won't bore you with further introductions — instead, I'll bore you with a newsletter about bots.

Overall
September 2019
October 2019
November 2019
December 2019
In the next issue of Bots Newsletter:
These questions will be answered — and new questions raised — by the January 2022 Bots Newsletter. Tune in, or miss out! Signing off... jp× g 04:29, 10 December 2021 (UTC) (You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
On 3 July 2020, XLinkBot overwrote a redirect with the content of its target article, effectively creating a fork! Here's the diff: [3] - Waysidesc ( talk) 18:32, 2 December 2021 (UTC)
XLinkBot reverted content for no apparent reason. Not the first time I have seen (on my watchlist) XLinkBot reverting content incorrectly. -- Green C 18:42, 2 December 2021 (UTC)
Robert McClenon posted on the Teahouse ( permalink to thread) about an action of Yapperbot ( talk · contribs · deleted contribs · logs · filter log · block user · block log), whose maintainer Naypta has not edited since August 2020. While the issue they mentioned could be closed ("working as intended"), they mention that a bot should not run without supervision even if it works fine. I think that is correct; as WP:BOTCOMM requires bot maintainers to promptly reply to any concern about their bot operation, they should be active on the project or at least responsive to pings/talk page messages.
Are we really going to block Yapperbot in the absence of a malfunction? Yapperbot is only one example; I imagine that there are quite a few bots that still run without a human at the other end of the leash. The question is what to do to enforce BOTCOMM.
I suspect the actual current enforcement scheme is that such "zombie" bots are left alone until they misbehave badly enough, at which point they are blocked, though no example comes to my mind (even though I have been reading this noticeboard for a few years). I also suspect that zombie bots would get blocked at the first sign of mischief, whereas bots with a responsive maintainer (who promises a fix is on the way) would be cut more slack. That is probably not what the letter of the policy says, but it is reasonable. Tigraan Click here for my talk page ("private" contact) 13:09, 7 December 2021 (UTC)
An extreme case of this issue was Cydebot, which for about 15 years processed categories which WP:CFD had agreed to merge, rename, or delete. Cydebot did about 6.8 million edits in that time.
In the latter years, the bot owner @ Cyde was around very rarely. That didn't matter while the bot did great work, but it became an issue when changes in bot functionality were needed. So a new bot was created, and when it was ready to roll, Cydebot was blocked to avoid clashes between two bots.
Cyde had done a magnificent job in creating a bot which ran error-free on such a huge scale for so long (average 1,200 edits per day for 15 years). There was no question of blocking the bot just 'cos Cyde wasn't around for chatter.
So I agree with Primefac: if a bot is acting as intended, I say let it run until a) its use is no longer required, or b) it starts messing up. -- BrownHairedGirl (talk) • ( contribs) 01:52, 16 December 2021 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Yesterday, I sought outside views on Wikipedia:Bots/Requests for approval/BHGbot 9, after failing to reach agreement with Headbomb and ProcrastinatingReader. See this request [4], at WP:Bots/Requests_for_approval/BHGbot_9#Other_views_sought
Unfortunately, instead of leaving it to previously-uninvolved BAG members, Headbomb decided to respond aggressively,
[5] accusing me of being a dumb-as-a-brick-no-context-mindless-automaton
, and falsely accusing me of making a personal judgement about the validity of a ref which had been correctly tagged by another editor 4 years ago.
Headbomb has now declined
[6] the BRFA, bizarrely accusing me of taking a general
WP:BATTLEGROUND mentality
-- even though it was Headbomb who responded with a BATTLEGROUND mentality to a reasoned request.
When I had specifically sought the views of BAG members other than PR & Headbomb, it was thoroughly inappropriate of Headbomb to close the BRFA before those outside views had been given.
Please can an uninvolved BAG member either reopen this BRFA, or review it? BrownHairedGirl (talk) • ( contribs) 17:04, 16 December 2021 (UTC)
<ref>...</ref>
tags. Even if the link is otherwise bare, the misplaced tag causes the bot to not recognise it as bare.<ref>https://www.imdb.com/title/tt0108956/technical?ref_=tt_dt_spec {{better source needed|date=October 2017}}</ref>
, and treat that as a bare URL, and retain the {{
Cleanup bare URLs}} tag.<ref>...</ref>
tags. That's 0.77%.

"dumb-as-a-brick-no-context-mindless-automaton" mischaracterizes what Headbomb said. He is characterizing the bot as such an automaton, not the proposed bot operator. I've only just gotten to this line, so there may be other merit to the OP. -- Izno ( talk) 19:10, 16 December 2021 (UTC)
"dumb-as-brick". It includes (in step 5) a deep layer of sophistication which I think is superfluous, and which no other commenter at BRFA wanted, but which I added at Headbomb's request.
If it was easy to code for these exceptions, I would have sighed and done the coding, and accepted a few un-needed skips. However, it's actually quite a complex bit of coding (a monster regex to include a dozen template names and all their many redirects), which will be error-prone and not transparent.
<ref[^>]*?>\s*\[?\s*https?:[^>< \|\[\]]+\s*\]?\s*<\s*/\s*ref, this can be done by replacing the bolded part: \s* → ({{.*}}|\s)*. (Not tested; possibly there is some greedy-match issue etc., but even as a regex newbie I feel this can be done.) Surely this would cause false positives (i.e. non-bare URLs that are considered bare), but I speculate it would still cause very few cases to not be detagged by the bot.

action=parse, for example) and look for /(?<!href=")https?:\/\//. Of course, that would require a little more coding knowledge than just plugging a regex into AWB. --
Ahecht (
TALK PAGE)

<ref>...</ref> tags is that one template inside the ref tags is a valid reason for not treating the URL as bare for these purposes: {{ Dead link}}. That is because it would be absurd to ask editors to fill a link which is dead.

"absurd to ask editors to fill a link which is dead." ProcrastinatingReader ( talk) 21:34, 17 December 2021 (UTC)
"Don't insert tags that are similar or redundant" and "If an article has many problems, tag only the highest priority issues". I don't see the tags as redundant. See the text at WP:BAREURL, which says: "Note how much more information is available. Even if the link no longer works, one can see that it previously linked to a web page containing some technical discussion revolving around a specific Nikon firmware update that might be obtainable through other means." A reference about a dead link can still be filled, and its being a bare URL is a separate problem from the deadness of the link. You fix dead links by adding parameters such as an archive URL/archive date, or by replacing the ref altogether. You fix bare URLs by adding information about the reference. I suppose a person fixing one or the other would take care of both issues, but that doesn't make them the same issue. That's the way I saw it, anyway; I understand you might see it differently or may think this is a wrong way of seeing it, which is why I left the option for another BAG member (who may see things differently) to assess the BRFA. ProcrastinatingReader ( talk) 10:05, 22 December 2021 (UTC)
"A reference about a dead link can still be filled." But then you go on to explain (correctly) that the actions are to replace it or archive it ... at which point it is no longer dead. But so long as it is dead, it cannot be filled.
{{ dead link}}, or other templates, might be markers of non-bare URLs that you might want to match but that my approach would ignore. Unless you have evidence that it would cause more than 10% false negatives, I would not care. The current situation is 100% false negatives, because you're not getting BAG approval as-is. And removing 90% of wrong tags would already be quite a feat IMO. Tigraan Click here for my talk page ("private" contact) 09:02, 20 December 2021 (UTC)
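The regex tweak suggested earlier in this thread can be sanity-checked offline. This is an untested sketch (the bot's real pattern may differ); it moves the template alternation to the slot after the URL, which is where the misplaced tag sits in the IMDb example from this discussion:

```python
import re

# Original-style pattern: only whitespace allowed around the URL inside <ref>...</ref>
original = re.compile(r'<ref[^>]*?>\s*\[?\s*https?:[^>< \|\[\]]+\s*\]?\s*<\s*/\s*ref')

# Tweaked: also allow {{...}} templates between the URL and the closing tag
tweaked = re.compile(r'<ref[^>]*?>\s*\[?\s*https?:[^>< \|\[\]]+({{.*?}}|\s)*\]?\s*<\s*/\s*ref')

# The example ref from this thread: a bare URL with a misplaced cleanup template
ref = ('<ref>https://www.imdb.com/title/tt0108956/technical?ref_=tt_dt_spec '
      '{{better source needed|date=October 2017}}</ref>')
```

The original pattern stops matching at the template, so the ref is not recognised as bare; the tweaked pattern skips over it and matches.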
tagging only by the priority issues; it says that "If an article has many problems, tag only the highest priority issues" (my emphasis added). Do we need an entire banner for a singular bare URL? Probably not, especially if the inline template will suffice, but if it's the only issue on the page there's no harm in having it. But if there is a bare URL, and there is already the maintenance tag, then the tag should not be removed until the bare URL is fixed.
Their purposes are to foster improvement of the encyclopedia by alerting editors to changes that need to be made
<ref>https://www.example.com/title {{some misplaced cleanup tag}}</ref>
this seems like it should be a fairly uncontroversial task (remove a cleanup banner that is no longer needed) - something that I would expect an editor would do unceremoniously; it also seems fairly low volume.
"things got heated and Headbomb closed the discussion before that could happen." Actually, things were very calm and amicable until my civil request for outside views was met with aggression by Headbomb. The heat and the denial of outside views were entirely Headbomb's doing. BrownHairedGirl (talk) • ( contribs) 13:37, 23 December 2021 (UTC)
"degrad[ing] the bot", I call "compromising to get stuff done".
"thinking [my] argument is ironclad." My substantive objection is that nobody even tried to address the WP:CLEANUPTAG guidance until Primefac's reply above, which I have only just seen 'cos I wasn't pinged.
"dumb as a brick." A brick doesn't have 171 lines of code, so the only purpose of that comment was to insult and belittle me. Headbomb then closed the BRFA before anyone else could comment, doing so with a false assertion that I was engaging in battleground conduct; and because the discussion was closed with that false assertion, I could not reply to it. Using disparaging language in response to a civil and reasonable request, and then accusing your target of misconduct, is a particularly vicious form of bullying, which tries to invert reality. It is especially inappropriate when the request was for others to review Headbomb's comments, and Headbomb should have recused themself to allow others to comment.
A few things
I have no grounds for recusal, and my closure stands. Either resubmit the task without the problematic bit, or carry out an RFC establishing consensus to carry out the removal of {{ Cleanup bare URLs}} tags when bare url remains in an article (including under what conditions it would be appropriate for the removal). Headbomb { t · c · p · b} 12:15, 23 December 2021 (UTC)
"All bots are dumb-as-a-brick, and cannot determine proper context." If that is genuinely your view, then logically you should decline all bots. Since you don't actually want to ban all bots, your comment has no relevance to the decision to be made, and its only purpose was to insult and attack me.
"what you want to do is have a bot that ignores context." Utterly false; it has 171 lines of code to determine context. I am astonished that you have made such an absurd assertion.
"that this is your view that these tags are misplaced." No, it is not ... and this is far from the first time in this episode that you have falsely claimed that something is my personal view.
"this is your view that these tags are misplaced. You can demonstrate that your view is shared by the community with an RFC" ????
As such, there is (currently at least) no consensus for your bot.
Hello! Does anyone know of a bot that tags unused fair use files for delayed speedy deletion under criterion F5? I'm going through the list of them right now, and some have been unused for a while — for example, File:"Mookajjiya Kanasugalu", Film Poster.jpg, which hasn't been used in Mookajjiya Kanasugalu (film) for more than seven weeks. (If there isn't a bot, I think I'd like to make one.) Tol ( talk | contribs) @ 00:18, 16 December 2021 (UTC)
As a followup, the bot is updated so that it is using Quarry once again (and only relying on the on-wiki database report as a backup). Hopefully that will resolve the issue. -- B ( talk) 23:23, 25 December 2021 (UTC)
A special page for double redirects contains a few hundred double redirects for user scripts which my bot can't process due to lacking permission. Maybe someone with sysop rights can process them? -- Emaus ( talk) 13:03, 1 January 2022 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
The Rlink2 Bot is adding ghostarchive links to NYTimes citations with the edit summary "Adding archives to assist with Wikipedia:Verifiability, WP:SOURCEACCESS". Was it approved for that purpose? Is ghostarchive an approved archive? The bot is not adding "url-status=live" to the citations.
Examples
https://en.wikipedia.org/?title=European_Union&curid=9317&diff=1063238337&oldid=1063219228
https://en.wikipedia.org/?title=Cocaine&curid=7701&diff=1063234152&oldid=1062962495
More at Special:Contributions/Rlink2
cc Primefac, TheSandDoctor -- Whywhenwhohow ( talk) 05:50, 2 January 2022 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Old threads at WP:ORN are not getting archived by the assigned bot User:Lowercase sigmabot III. In particular, the section Wikipedia:No_original_research/Noticeboard#Yahwism#Torah_appears_to_be_OR from September 5 is still at the top of the page. – LaundryPizza03 ( d c̄) 06:52, 11 January 2022 (UTC)
@ ProcrastinatingReader and TheSandDoctor: Recently approved TolBot 10 created Austin Vedder
1. Redirecting to Austin A. Vedder, which is another redirect (from a misspelling, no less). The bot should not create double redirects.
2. Inserted {{DEFAULTSORT:Vedder, Austin}} inside the {{ Redirect category shell}} sandwich. This is not a redirect category template, so it should be placed outside of the sandwich.
3. Placed {{ R from short name}} inside the sandwich. This is not a valid short name, so this template should not have been put there; rather {{ R from misspelling}} would be the appropriate template to put there.
— wbm1058 ( talk) 02:20, 15 January 2022 (UTC)
You are invited to join the discussion at Template talk:Bot § Handles bots with multiple tasks poorly. {{u| Sdkb}} talk 07:35, 2 January 2022 (UTC)
Page watchers may be interested in WP:ANI#Rlink2. Izno ( talk) 23:24, 18 January 2022 (UTC)
Would a BAGger please look at Wikipedia talk:Bots/Requests for approval/Fluxbot 6 and advise? Thank you! — xaosflux Talk 16:59, 19 January 2022 (UTC)
Bots Newsletter, January 2022

Welcome to the ninth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Vicious bot-on-bot edit warring... superseded tasks... policy proposals... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots. After a long hiatus between August 2019 and December 2021, there's quite a bit of ground to cover. Due to the vastness, I decided in December to split the coverage up into a few installments that covered six months each. Some people thought this was a good idea, since covering an entire year in a single issue would make it unmanageably large. Others thought this was stupid, since they were getting talk page messages about crap from almost three years ago. Ultimately, the question of whether each issue covers six months or a year is only relevant for a couple more of them, and then the problem will be behind us forever. Of course, you can also look on the bright side – we are making progress, and this issue will only be about crap from almost two years ago. Today we will pick up where we left off in December, and go through the first half of 2020.

Overall

January 2020
February 2020
March 2020
April 2020
May 2020
June 2020
Conclusion
These questions will be answered — and new questions raised — by the February 2022 Bots Newsletter. Tune in, or miss out! Signing off... jp× g 23:22, 31 January 2022 (UTC) (You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
Lots of categories named "NPOV disputes from Month Year" have been moved to the "Wikipedia neutral point of view disputes from Month Year" name by JJMC89 bot III as a result of Wikipedia:Categories for discussion/Log/2021 December 23#Category:NPOV disputes. Since then, most of them have been deleted by Fastily due to an automatic G6 notice for empty monthly categories. After that, some of them have been recreated by AnomieBOT and then the recreated categories have again been moved by JJMC89 bot III and deleted by Fastily. To prevent an endless cycle of creation by AnomieBOT, moving by JJMC89 bot III, and deletion by Fastily, all of the "Wikipedia neutral point of view disputes from Month Year" categories that were deleted by Fastily should be undeleted, any recreated category by AnomieBOT should be history-merged with the old history at the new name, and the corresponding templates should be modified to prevent the categories from being empty again. Also, both JJMC89 bot III and AnomieBOT should be temporarily disabled until all this is sorted out. GeoffreyT2000 ( talk) 21:16, 19 January 2022 (UTC)
see here. Bot is not respecting {{ bots}}. In a hurry right now can't type anything else — GMX (on the go!) 21:15, 26 January 2022 (UTC)
I disabled the automatic running of the MFD archiver script. I have some new code which should be ready to go, but I'll run it manually for a few days before automating it. There are no MFDs needing archiving right now so I'll give it a shot tomorrow and lift the partial blocks then. Legoktm ( talk) 07:19, 4 February 2022 (UTC)
It looks like we have a few bots that are inactive by definition (both account and operator haven't edited in 2 years) according to the Majavah report. ( Majavah, if there's a listed operator, could you get their contribs or something and add that to the report too? Maybe even a column to indicate mutual activity, e.g. {{ cross}} where neither is active and a {{ check}} where both are. This was painful. :)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
All operators notified on their talk pages. — xaosflux Talk 14:22, 1 February 2022 (UTC)
And a few more in a month or so:
We should consider asking the active bot operators whether all the rest of the bot accounts that haven't edited in a long while (pick a number; 5 years seems fine generally) still need a bot flag. These particularly stand out...:
These ones seem like low-hanging fruit, but I think the rest should be queried as well. Izno ( talk) 05:30, 1 February 2022 (UTC)
{{Template:User bot owner|Acebot}}
. There are probably userboxes or templates for the bot's page too. These templates also populate categories such as
Category:Wikipedia bot operators. –
Novem Linguae (
talk) 11:43, 1 February 2022 (UTC)

I recently posted on the help desk ( Wikipedia:Help desk#Confusion on bot policy regarding semi-automated editing) where I explained my confusion on how one should approach/implement semi-automated edits. You may read the section I linked if you wish to. In a nutshell, I would like to semi-automate edits from a script I am running. I do not wish to create a bot account and would like to have the edits be on my main account (once I get more comfortable with the API I may consider expanding the script and may look into bot creation).
My requirements seem simple. However, according to
mw:API:Edit I require a CSRF token, but apparently I can't just lazily do mw.user.tokens.get( 'csrfToken' )
and call it a day cause
that does not work. As the
code samples demonstrate, there is a 4-step process. Fine, but I'm confused about the second step (POST with lgname+lgpassword+lgtoken), because those parameters are to be obtained from
Special:BotPasswords, which states "Make sure you are logged into your bot's account, and not the owner's account, before creating a bot password", and as per
WP:SEMIAUTOMATED, "A bot account should not be used for assisted editing, unless the task has been through a BRFA". So, I should not use a bot account, but I still need a 'bot password' from an account which it seems to imply cannot be my own?
Side note: MediaWiki JS sample code works fine. What I do not like about this, however, is that it needs to be done in a JS console on the wiki. I'd much prefer to have a script running in a terminal (i.e. using NodeJS/Python).
Side side note: Psst, while I have you nerds here, can someone point me to some docs explaining how I can achieve the functionality reFill achieves by sending you to a page that shows a diff with some changes made without making those changes? My script updates statistics; since I am semi-automating it, I would love for the script to run, and then show a diff so that I can visually confirm what it has done and then just publish the changes. Satricious ( talk) 16:09, 20 February 2022 (UTC)
function goToShowChangesScreen(titleWithNamespaceAndUnderscores, wikicode, editSummary) {
let titleEncoded = encodeURIComponent(titleWithNamespaceAndUnderscores);
let wgServer = mw.config.get('wgServer');
let wgScriptPath = mw.config.get('wgScriptPath');
let baseURL = wgServer + wgScriptPath + '/';
// https://stackoverflow.com/a/12464290/3480193
$(`<form action="${baseURL}index.php?title=${titleEncoded}&action=submit" method="POST"/>`)
.append($('<input type="hidden" name="wpTextbox1">').val(wikicode))
.append($('<input type="hidden" name="wpSummary">').val(editSummary))
.append($('<input type="hidden" name="mode">').val('preview'))
.append($('<input type="hidden" name="wpDiff">').val('Show changes'))
.append($('<input type="hidden" name="wpUltimateParam">').val('1'))
.appendTo($(document.body)) //it has to be added somewhere into the <body>
.submit();
}
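On the main question above: Special:BotPasswords creates credentials for whatever account you are logged into, including a main account; the warning about logging into the bot's account only applies when a separate bot account exists. The four-step flow described at mw:API:Edit can then be sketched with nothing but the standard library (an untested sketch; error handling and parameters like maxlag are omitted):

```python
import json
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

API = "https://en.wikipedia.org/w/api.php"

def make_opener():
    """One opener shared across calls so the login cookies persist."""
    return urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))

def api_post(opener, params):
    """POST to the action API and decode the JSON reply."""
    data = urllib.parse.urlencode({**params, "format": "json"}).encode()
    with opener.open(API, data) as resp:
        return json.load(resp)

def edit_with_bot_password(username, password, title, text, summary):
    """Steps: (1) fetch a login token, (2) action=login with the
    bot-password credentials (lgname looks like 'YourUser@scriptname'),
    (3) fetch a CSRF token, (4) action=edit."""
    op = make_opener()
    token = api_post(op, {"action": "query", "meta": "tokens",
                          "type": "login"})["query"]["tokens"]["logintoken"]
    api_post(op, {"action": "login", "lgname": username,
                  "lgpassword": password, "lgtoken": token})
    csrf = api_post(op, {"action": "query",
                         "meta": "tokens"})["query"]["tokens"]["csrftoken"]
    return api_post(op, {"action": "edit", "title": title, "text": text,
                         "summary": summary, "token": csrf})
```

Run from a terminal with your "User@scriptname" bot-password pair; no separate bot account is required for this flow.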
Hello. I am not sure if this is the correct venue. If this can't be solved here, kindly let me know where I should go.
I currently have the AWB bot on enwiki,
User:KiranBOT. It adds wikiproject banners on talkpage (simplest task, I think). In short: I need to create a fully automated/toolforge bot.
Prelude: Around 20 days ago, I got a bot flag on mrwiki (AWB). In less than 20 days (around 7-8 runs), it racked up more than 10k edits there ( mr:special:contributions/KiranBOT). Because of the syntax of the Marathi language and its word rules (not grammar rules), there are many uncontroversial find-and-replace tasks. But there are fewer than 10 active/regular editors, so such tasks have piled up.
To the point: On mrwiki, I would like to run a simple bot — but one with continuous editing, like DumbBOT. A few hours ago, I created an account on wikitech/toolforge and requested membership. But I am still not sure how and where to upload the bot's code. I want to code it in C#. The bot will obviously be discussed/vetted on mrwiki, along with the keywords to be replaced (I have created a rudimentary list at mr:User:Usernamekiran/typos). Any help/guidance will be appreciated a lot. —usernamekiran • sign the guestbook • (talk) 23:38, 31 December 2021 (UTC)
So I could transfer files using github, and also created files using mono on putty/CLI. But I couldn't execute the bot. First I went with C#, then Python, but neither worked. I have lots of material in .NET to study/refer to, like the DotNetWikiBot framework, the source code of AWB, and some other programs mentioned at mediawiki. All I need is a little guidance on how to compile and run it on toolforge. Your help will be appreciated a lot. Also pinging @ Mz7, JPxG, and ST47: —usernamekiran • sign the guestbook • (talk) 15:44, 18 January 2022 (UTC)
replace foo bar -search:"insource:\"foo\" "
, or am I missing something? ―
Qwerfjkl
talk 20:31, 17 February 2022 (UTC)
Here is the code:
import pywikibot

# retrieve the page
site = pywikibot.Site()
page = pywikibot.Page(site, u"user:usernamekiran/typos")
# edit the page
text = page.text
page.text = text.replace("abcusernamekiran", "xyz")
# save the page
page.save(u"experimental edit with modified script")
Thanks, —usernamekiran • sign the guestbook • (talk) 09:54, 19 February 2022 (UTC)
import pywikibot
import re

# retrieve the pages
site = pywikibot.Site()
pages = site.search("intitle:\"foo\"", total=5, namespaces=0)
for page in pages:
    text = page.text
    # edit the page
    text = text.replace("abcusernamekiran", "xyz")
    # or using the re module:
    # text = re.sub("abcusernamekiran", "xyz", text)
    # save the page
    page.text = text
    page.save(u"experimental edit with modified script")
Continuation of the discussion at Wikipedia:Bots/Noticeboard/Archive_16#Template_Editor_permission_for_bot.
User:MusikBot II has template protected
Module:Transclusion count/data/C so
User:Ahechtbot can no longer update it. Either the bot will need Template Editor permissions, or
User:MusikBot II/TemplateProtector/config will need to be edited to exclude the subpages of
Module:Transclusion count/data. --
Ahecht (
TALK
PAGE) 20:18, 1 April 2022 (UTC)
Last week, two bots that I check on, AdminStatsBot and BernsteinBot, both went off schedule. I checked to see if there was a lag which there wasn't. BernsteinBot started updating but irregularly and it didn't return the regular schedule it had previously maintained with no problems. I posted a note to each bot operator on their user talk pages but neither have been active here in over a month.
But I'm posting here just to see if there was some change or update that would cause both bots to stop updating. Other bots I work with such as AnomeBotIII and SDZeroBot didn't have problems so I'm not sure what is up. Thanks for any information you can provide. Liz Read! Talk! 04:47, 12 October 2021 (UTC)
Does anyone know why certain edits from bot accounts show up under Special:RecentChanges when the non-bot filter is selected? Winston ( talk) 00:56, 18 October 2021 (UTC)
Template:Row numbers recently went through an RfD due to its inability to display (at all) on the mobile app. I modified it to work correctly on the app again, but this needs changes made to the articles. I wrote a pywikibot script to assist with this in semi-automated fashion, putting the edited articles on the clipboard and opening Firefox at the right page to paste and review. The script works pretty well, after some initial difficulties with finding precisely when to escape equals signs. See [2] for example edits - a couple of the first broke ref tags in a way I didn't initially spot, but after handling them specially in the script, all later edits have been correct.
However, verifying all the articles individually is slow and there are about 100 left. Would this sort of thing be acceptable to run unattended? I'll open a proper BRFA if so. The core part of the current script (which doesn't save anything) is below, based on scripts/basic.py in pywikibot.
import re
import subprocess
import urllib.parse
from pywikibot.bot import (
    AutomaticTWSummaryBot, ConfigParserBot, ExistingPageBot,
    NoRedirectPageBot, SingleSiteBot)

class RowNumberingBot(
    SingleSiteBot,
    ConfigParserBot,
    ExistingPageBot,
    NoRedirectPageBot,
    AutomaticTWSummaryBot,
):
    summary_key = 'basic-changing'

    def __init__(self, generator, **kwargs) -> None:
        """
        Initializer.

        @param generator: the page generator that determines on which pages
            to work
        @type generator: generator
        """
        # Add your own options to the bot and set their defaults
        # -always option is predefined by BaseBot class
        self.available_options.update({
            'replace': False,  # delete old text and write the new text
            'summary': None,   # your own bot summary
            'text': 'Test',    # add this text from option. 'Test' is default
            'top': False,      # append text on top of the page
        })
        # call initializer of the super class
        super().__init__(site=True, **kwargs)
        # assign the generator to the bot
        self.generator = generator
        self.regex = re.compile('(?s){{\\s*[Rr]ow (numbers|counter|indexer)\\s*\\|(?:1=)?\\s*(<nowiki>(.*?)</nowiki>)')

    def treat_page(self) -> None:
        text = self.current_page.text
        # Replace each <nowiki>...</nowiki> block (group 2) with its
        # contents (group 3), with equals signs escaped.
        while m := self.regex.search(text):
            text = text[:m.start(2)] + escape_equals(m.group(3)) + ' ' + text[m.end(2):]
        if text == self.current_page.text:
            print(f"Skipping {self.current_page.title()}.")
            return
        # Put the new wikitext on the clipboard and open the edit form for review.
        subprocess.run('xclip -i -sel c'.split(), input=text.encode())
        subprocess.run(['firefox', self.current_page.full_url() + '?action=edit&summary='
                        + urllib.parse.quote_plus(self.opt.summary)])
        input("Press enter for next page...")
self_closing_ref_regex = re.compile(r'''<ref( +[^= <>]+ *= *("[^"]*"|'[^']*'|[^ "'>]*))* *\/$''')

def escape_equals(s):
    """
    Escape equals signs in string s with {{=}} unless they are already within
    double braces or a tag.
    """
    n_brace = 0
    n_square = 0
    b = 0
    ref = 0
    out = ''
    for i, ch in enumerate(s):
        if ch == '{':
            if n_brace < 0:
                n_brace = 1
            else:
                n_brace += 1
        elif ch == '}':
            if n_brace > 0:
                n_brace = -1
            else:
                n_brace -= 1
        elif ch == '[':
            if n_square < 0:
                n_square = 1
            else:
                n_square += 1
        elif ch == ']':
            if n_square > 0:
                n_square = -1
            else:
                n_square -= 1
        # It seems that ref tags are special.
        elif s[i:i+4] == '<ref':
            ref += 1
            assert ref == 1, s[:i] + '\n\nFAILED\n\n' + s[i:] + '\n\n'
        elif s[i:i+5] == '</ref' or (ch == '>' and self_closing_ref_regex.search(s[:i])):
            ref -= 1
            assert ref == 0, s[:i] + '\n\nFAILED\n\n' + s[i:] + '\n\n'
        else:
            n_brace, n_square = (0, 0)
        if n_brace == 2 or n_square == 2:
            b += 1
            n_brace, n_square = (0, 0)
        elif n_brace == -2 or n_square == -2:
            b -= 1
            n_brace, n_square = (0, 0)
        assert b >= 0
        if ch == '=' and b == 0 and ref == 0:
            out += '{{=}}'
        else:
            out += ch
    assert ref == 0 and b == 0, f"{n_brace} {n_square} {ref} {b}"
    return out
User:GKFX talk 21:42, 18 October 2021 (UTC)
What non-talk page discussion pages are there besides those in the Project namespace and the Template:Did you know nominations? This is relevant for bots working on discussion pages such as IndentBot. Winston ( talk) 03:14, 23 October 2021 (UTC)
mw.config.get('wgExtraSignatureNamespaces')
, which
will be removed soon, and via
mw.Title.wantSignaturesNamespace(). I bet it can also be retrieved through API, but I can't find it right now.
Nardog (
talk) 03:35, 23 October 2021 (UTC)
For example, this edit? How is this a "high priority" edit? Archives should usually be off limits to further editing. In this case, it's changing history even if correcting "errors" at the time. Jason Quinn ( talk) 10:20, 23 October 2021 (UTC)
Hello all (BAG, botops, etc), there is a new template {{
Last N contributions}}, shortcut {{lnc}}
, which allows for easy linking to a group of contributions for a user. For example, I can easily link to
these 10 contributions going backwards from midnight last night. I think this will be extremely useful for BRFAs and giving trial diffs. The first point of this post is to notify about the template, the second is to ask a question: should the template be linked in {{
BotTrial}}, specifically as
provide a link, or is that just overkill? Is there a better way to let botops know about this new tool?
Primefac (
talk) 14:43, 23 October 2021 (UTC)
{{lnc|monkbot|25|2021-10-06}}
→
these 25 contributions; does not work, returns 25 edits from 2020-12-31
{{lnc|monkbot|25|20211006}}
→
these 25 contributions; does not work, returns 25 edits from 2021-09-25; the edits of interest were made on 2021-10-06 so the September list is counterintuitive and not helpful
{{{2}}} optional? If blank, the template substs in today's date? Don't know if that is possible.
{{lnc|monkbot|25|20211006235959}}
(which gives
this), because technically speaking your example is for 20211006000000 (midnight on the 6th). As far as the first point goes, that's a reasonable issue, and I'll see about allowing multiple date formats (though the main issue becomes one of "time").
Primefac (
talk) 15:20, 23 October 2021 (UTC)
{{
BotTrialComplete}}
to a BRFA, the date of the trial's last edit is the date that I want to enter; not the date + 235959 because that is how I (and I dare say, most editors will) think about |ts=
. To accomplish that easily, perhaps use &start=
instead of &offset=
? Here is the link currently produced by {{lnc|monkbot|25|20211006}}
:
<span class="plainlinks">[//en.wikipedia.org/?title=Special:Contributions&offset=2021-10-06&limit=25&target=monkbot&namespace=all these 25 contributions]</span>
&offset=
to &start=
:
<span class="plainlinks">[//en.wikipedia.org/?title=Special:Contributions&start=2021-10-06&limit=25&target=monkbot&namespace=all these 25 contributions]</span>
&offset=
only when |ts=
has hour/minute/second precision?{{lnc|monkbot|25|20211007}}
instead of {{lnc|monkbot|25|20211006235959}}
.
{{Last N contributions|Jimbo Wales|10|20030724}} would not give you contributions from July 24th, but from July 23rd or earlier. Using two different parameters with different behaviors under different conditions would just complicate things.
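The date arithmetic behind this can be sketched with a hypothetical helper (not part of the template): since &offset= is an exclusive upper bound, passing midnight of the following day captures the whole target day, which is what using 20211007 instead of 20211006235959 accomplishes.

```python
from datetime import datetime, timedelta

def contribs_offset(yyyymmdd):
    """Hypothetical helper: convert a plain date into the &offset=
    timestamp that includes every edit made on that day. Since offset is
    an exclusive upper bound, midnight of the *following* day is passed
    instead of 235959 of the same day."""
    day = datetime.strptime(yyyymmdd, "%Y%m%d")
    return (day + timedelta(days=1)).strftime("%Y%m%d%H%M%S")

# contribs_offset("20211006") -> "20211007000000"
```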
would not give you contributions from the July 24th, but from July 23rd or earlier. Using two different parameters with different behaviors under different conditions would just complicates things.I see T.seppelt doesn't feel they have time to make it comply w/ current guidelines, is there an alternative today? – SJ + 18:07, 28 October 2021 (UTC)
I distinctly remember that several years ago a bot ran that automatically applied {{ ShadowsCommons}} tags to Wikipedia files that had a non-identical file on Commons with the same name. Is it just a figment of my imagination, or did it actually exist? Jo-Jo Eumerus ( talk) 16:28, 25 October 2021 (UTC)
Hello, Bot group,
I rely on DumbBOT which updates the PROD lists but since September, the updating schedule has shifted several times. I have left inquiries about this on DumbBOT's talk page but today, after another updating change, I went to the user page of the bot operator, Tizio, and saw that he hasn't logged in for over a year. I don't want the bot disabled but it seems like it should have an active operator who still works on Wikipedia. In fact, I don't understand how the updating schedule keeps shifting if the bot operator has been absent so long! Any chance that someone else could take over as operator? Thanks for any assistance you can offer. Liz Read! Talk! 00:37, 24 November 2021 (UTC)
"I rarely log in lately. In case of problems with the bot email me." So perhaps we'll get a reply. I agree that bot operators should be active, and would urge an RfC on adding this to policy ~ TheresNoTime ( to explain!) 00:48, 24 November 2021 (UTC)
:P
). Clearly it didn't happen! I entirely agree that bot operators should be responsive to concerns/bug reports/etc. I'm less sure about them being active in terms of some minimum edit requirement - as long as they respond to bot issues, that'd be good enough for me. (Which reminds me, I owe Liz a reply about the G13-warning task...)
firefly (
t ·
c ) 08:39, 24 November 2021 (UTC)
Even double redirect-fixing bots, like humans, like to procrastinate. At Special:DoubleRedirects, there is a list of hundreds of double redirects that have not been fixed for several days. Could this be considered "bot procrastination", then? GeoffreyT2000 ( talk) 15:00, 29 November 2021 (UTC)
Further instructions
Follow the instructions on mw:Manual:Pywikibot/PAWS to create a server and set everything up, then open a terminal. Type pwb.py redirect.py double, and then review each change it suggests. Some example edits I just did:
― Qwerfjkl talk 17:26, 30 November 2021 (UTC)
@
Xaosflux and
Ymblanter: In reference to
this discussion, my bot was approved for
another task that requires template-editor permission. After the issues with the last request, I completely forgot to request it again when the task was approved, and just realized now why edits haven't been going through. --
Ahecht (
TALK
PAGE) 17:18, 6 December 2021 (UTC)
In
Special:NewPagesFeed, filtering for redirects and sorting by oldest brings up hundreds of month-old redirects from November 9. Some of them are recently-redirected articles and similar things with complicated page histories, but others (like
this one at
American Chemical Manufacturing and Mining Company) have only had one revision that whole time. I see that
DannyS712 bot III's Task 66 was approved to patrol redirects automatically (following
this RfC)-- indeed,
the bot's log shows it patrolled some new redirects just a few minutes ago, and its
to-do list is empty -- so what's going on here?
jp×
g 23:38, 7 December 2021 (UTC)
Never mind, I am a fool -- in the bot's source code it fetches a whitelist from pageid 62534307, which is Wikipedia:New pages patrol/Redirect whitelist. jp× g 23:52, 7 December 2021 (UTC)
Bots Newsletter, December 2021
Welcome to the eighth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Maintainers disappeared to parts unknown... bots awakening from the slumber of æons... hundreds of thousands of short descriptions... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots. Our last issue was in August 2019, so there's quite a bit of catching up to do. Due to the vast quantity of things that have happened, the next few issues will only cover a few months at a time. This month, we'll go from September 2019 through the end of the year. I won't bore you with further introductions — instead, I'll bore you with a newsletter about bots.
Overall
September 2019
October 2019
November 2019
December 2019
In the next issue of Bots Newsletter:
These questions will be answered — and new questions raised — by the January 2022 Bots Newsletter. Tune in, or miss out! Signing off... jp× g 04:29, 10 December 2021 (UTC) (You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.)
On 2020 July 3, XLinkBot has overwritten a redirect with the content of its target article, effectively creating a fork! Here's the diff: [3] - Waysidesc ( talk) 18:32, 2 December 2021 (UTC)
XLinkBot reverted content for no apparent reason. Not the first time I have seen (on my watchlist) XLinkBot reverting content incorrectly. -- Green C 18:42, 2 December 2021 (UTC)
Robert McClenon posted on the Teahouse ( permalink to thread) about an action of Yapperbot ( talk · contribs · deleted contribs · logs · filter log · block user · block log), whose maintainer Naypta has not edited since August 2020. While the issue they mentioned could be closed ("working as intended"), they mention that a bot should not run without supervision even if it works fine. I think that is correct; as WP:BOTCOMM requires bot maintainers to promptly reply to any concern about their bot operation, they should be active on the project or at least responsive to pings/talk page messages.
Are we really going to block Yapperbot in the absence of a malfunction? Yapperbot is only one example; I imagine there are quite a few bots that still run without a human at the other end of the leash. The question is what to do to enforce BOTCOMM.
I suspect the actual current enforcement scheme is that such "zombie" bots are left alone until they misbehave badly enough, at which point they are blocked, though no example comes to my mind (even though I have been reading this noticeboard for a few years). I also suspect that zombie bots would get blocked at the first sign of mischief, whereas bots with a responsive maintainer (who promises a fix is on the way) would be cut more slack. That is probably not what the letter of the policy says, but it is reasonable. Tigraan Click here for my talk page ("private" contact) 13:09, 7 December 2021 (UTC)
An extreme case of this issue was Cydebot, which for about 15 years processed categories which WP:CFD had agreed to merge, rename, or delete. Cydebot did about 6.8 million edits in that time.
In the latter years, the bot owner @ Cyde was around very rarely. That didn't matter while the bot did great work, but it became an issue when changes in bot functionality were needed. So a new bot was created, and when it was ready to roll, Cydebot was blocked to avoid clashes between two bots.
Cyde had done a magnificent job in creating a bot which ran error-free on such a huge scale for so long (average 1,200 edits per day for 15 years). There was no question of blocking the bot just 'cos Cyde wasn't around for chatter.
So I agree with Primefac: if a bot is acting as intended, I say let it run until a) its use is no longer required, or b) it starts messing up
. --
BrownHairedGirl
(talk) • (
contribs) 01:52, 16 December 2021 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Yesterday, I sought outside views on Wikipedia:Bots/Requests for approval/BHGbot 9, after failing to reach agreement with Headbomb and ProcrastinatingReader. See this request [4], at WP:Bots/Requests_for_approval/BHGbot_9#Other_views_sought
Unfortunately, instead of leaving it to previously-uninvolved BAG members, Headbomb decided to respond aggressively,
[5] accusing me of being a dumb-as-a-brick-no-context-mindless-automaton
, and falsely accusing me of making a personal judgement about the validity of a ref which had been correctly tagged by another editor 4 years ago.
Headbomb has now declined
[6] the BRFA, bizarrely accusing me of taking a general
WP:BATTLEGROUND mentality
-- even though it was Headbomb who responded with a BATTLEGROUND mentality to a reasoned request.
When I had specifically sought the views of BAG members other than PR & Headbomb, it was thoroughly inappropriate of Headbomb to close the BRFA before those outside views had been given.
Please can an uninvolved BAG member either reopen this BRFA, or review it? BrownHairedGirl (talk) • ( contribs) 17:04, 16 December 2021 (UTC)
<ref>...</ref>
tags. Even if the link is otherwise bare, the misplaced tag causes the bot to not recognise it as bare.<ref>https://www.imdb.com/title/tt0108956/technical?ref_=tt_dt_spec {{better source needed|date=October 2017}}</ref>
, and treat that as a bare URL, and retain the {{
Cleanup bare URLs}} tag.<ref>...</ref>
tags. That's 0.77%.dumb-as-a-brick-no-context-mindless-automatonmischaracterizes what Headbomb said. He is characterizing the bot as such an automaton, not the proposed bot operator. I've only just gotten to this line, so there may be other merit to the OP. -- Izno ( talk) 19:10, 16 December 2021 (UTC)
dumb-as-brick". It includes (in step 5) a deep layer of sophistication which I think is superfluous and which no other commenter at BRFA wanted, but which I added at Headbomb's request.
If it was easy to code for these exceptions, I would have sighed and done the coding, and accepted a few un-needed skips. However, it's actually quite a complex bit of coding (a monster regex to include a dozen template names and all their many redirects), which will be error-prone and not transparent.
<ref[^>]*?>\s*\[?\s*https?:[^>< \|\[\]]+\s*\]?\s*<\s*/\s*ref
, this can be done by replacing the bolded part: \s*
→ ({{.*}}|\s)*
. (Not tested, possibly there is some greedy match issue etc. etc. but even as a regex newbie I feel this can be done). Surely this would cause false positives (i.e. non-bare URLs that are considered bare), but I speculate it would still cause very few cases to not be detagged by the bot.action=parse
, for example) and look for /(?<!href=")https?:\/\//
. Of course, that would require a little more coding knowledge that just plugging a regex into AWB. --
Ahecht (
TALK
<ref>...</ref>
tags is that one template inside the ref tags is a valid reason for not treating the URL as bare for the purposes of {{
Dead link}}. That is because it would be absurd to ask editors to fill a link which is dead.absurd to ask editors to fill a link which is dead. ProcrastinatingReader ( talk) 21:34, 17 December 2021 (UTC)
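For what it's worth, the regex tweak suggested above (widening the trailing \s* to ({{.*}}|\s)*) can be spot-checked in a few lines against the IMDb example from this thread. This is only a sketch; as cautioned above, the greedy .* inside {{.*}} may need tightening (e.g. to [^}]*) before running on whole articles.

```python
import re

# The bare-URL-in-ref pattern from the discussion, with the trailing \s*
# widened to ({{.*}}|\s)* so a ref that is "bare URL + cleanup template"
# still counts as bare.
BARE_REF = re.compile(
    r'<ref[^>]*?>\s*\[?\s*https?:[^>< \|\[\]]+({{.*}}|\s)*\]?\s*<\s*/\s*ref')

bare = '<ref>https://www.imdb.com/title/tt0108956/technical?ref_=tt_dt_spec</ref>'
tagged = ('<ref>https://www.imdb.com/title/tt0108956/technical?ref_=tt_dt_spec'
          ' {{better source needed|date=October 2017}}</ref>')
filled = '<ref>{{cite web |url=https://example.com/ |title=Example}}</ref>'

assert BARE_REF.search(bare)        # plain bare URL: matches
assert BARE_REF.search(tagged)      # bare URL plus trailing template: matches
assert not BARE_REF.search(filled)  # filled citation: no match
```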
"Don't insert tags that are similar or redundant" and "If an article has many problems, tag only the highest priority issues".I don't see the tags as redundant. See text on WP:BAREURL which says
Note how much more information is available. Even if the link no longer works, one can see that it previously linked to a web page containing some technical discussion revolving around a specific Nikon firmware update that might be obtainable through other means. A reference about a dead link can still be filled, and it being a bare URL is a separate problem from the deadness of the link. You fix dead links by adding parameters such as an archive URL/archive date, or replacing the ref altogether. You fix bare URLs by adding information about the reference. I suppose a person fixing one or the other would take care of both issues, but that doesn't make them the same issue. That's the way I saw it anyway, I understand you might see it differently or may think this is a wrong way of seeing it, which is why I left the option for another BAG member (who may see things differently) to assess the BRFA. ProcrastinatingReader ( talk) 10:05, 22 December 2021 (UTC)
"A reference about a dead link can still be filled." But then you go on to explain (correctly) that the actions are to replace it or archive ... at which point it is no longer dead. But so long as it is dead, it cannot be filled.
{{
dead link}}
, or other templates, might be markers of non-bare URLs that you might want to match but that my approach would ignore. Unless you have evidence that it would cause more than 10% of false negatives, I would not care. The current situation is 100% of false negatives, because you’re not getting BAG approval as-is. And removing 90% of wrong tags would already be quite a feat IMO.
Tigraan
Click here for my talk page ("private" contact) 09:02, 20 December 2021 (UTC)
tagging only by the priority issues, it says that
"If an article has many problems, tag only the highest priority issues" (my emphasis added). Do we need an entire banner for a singular bare URL? Probably not, especially if the inline template will suffice, but if it's the only issue on the page there's no harm in having it. But if there is a bare URL, and there is already the maintenance tag, then the tag should not be removed until the bare URL is fixed.
Their purposes are to foster improvement of the encyclopedia by alerting editors to changes that need to be made
<ref>https://www.example.com/title {{some misplaced cleanup tag}}</ref>
this seems like it should be a fairly uncontroversial task (remove a cleanup banner that is no longer needed) - something that I would expect an editor would do unceremoniously; it also seems fairly low volume.
"things got heated and Headbomb closed the discussion before that could happen." Actually, things were very calm and amicable until my civil request for outside views was met with aggression by Headbomb. The heat and the denial of outside views were entirely Headbomb's doing. BrownHairedGirl (talk) • ( contribs) 13:37, 23 December 2021 (UTC)
degrad[ing] the bot, I call "compromising to get stuff done".
thinking [my] argument is ironclad. My substantive objection is that nobody even tried to address the WP:CLEANUPTAG guidance until Primefac's reply above, which I have only just seen 'cos I wan't pinged.
"dumb as a brick". A brick doesn't have 171 lines of code, so the only purpose of that comment was to insult and belittle me. Headbomb then closed the BRFA before anyone else could comment, doing so with a false assertion that I was engaging in battleground conduct; and because the discussion was closed with that false assertion, I could not reply to it. Using disparaging language in response to a civil and reasonable request, and then accusing your target of misconduct, is a particularly vicious form of bullying, which tries to invert reality. It is especially inappropriate when the request was for others to review Headbomb's comments; Headbomb should have recused themself to allow others to comment.
A few things
I have no grounds for recusal, and my closure stands. Either resubmit the task without the problematic bit, or carry out an RFC establishing consensus to carry out the removal of {{ Cleanup bare URLs}} tags when bare url remains in an article (including under what conditions it would be appropriate for the removal). Headbomb { t · c · p · b} 12:15, 23 December 2021 (UTC)
"All bots are dumb-as-a-brick, and cannot determine proper context." If that is genuinely your view, then logically you should decline all bots. Since you don't actually want to ban all bots, your comment has no relevance to the decision to be made, and its only purpose was to insult and attack me.
what you want to do is have a bot that ignores context. Utterly false; it has 171 lines of code to determine context. I am astonished that you have made such an absurd assertion.
that this is your view that these tags are misplaced. No, it is not ... and this is far from the first time in this episode that you have falsely claimed that something is my personal view.
this is your view that these tags are misplaced. You can demonstrate that your view is shared by the community with an RFC????
As such, there is (currently at least) no consensus for your bot.
Hello! Does anyone know of a bot that tags unused fair use files for delayed speedy deletion under criterion F5? I'm going through the list of them right now, and some have been unused for a while — for example, File:"Mookajjiya Kanasugalu", Film Poster.jpg, which hasn't been used in Mookajjiya Kanasugalu (film) for more than seven weeks. (If there isn't a bot, I think I'd like to make one.) Tol ( talk | contribs) @ 00:18, 16 December 2021 (UTC)
As a followup, the bot is updated so that it is using Quarry once again (and only relying on the on-wiki database report as a backup). Hopefully that will resolve the issue. -- B ( talk) 23:23, 25 December 2021 (UTC)
A special page for double redirects contains a few hundred double redirects for users' scripts which my bot can't process because it lacks permission. Maybe someone with sysop rights can process them? -- Emaus ( talk) 13:03, 1 January 2022 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
The Rlink2 Bot is adding ghostarchive links to NYTimes citations with the edit summary "Adding archives to assist with Wikipedia:Verifiability, WP:SOURCEACCESS". Was it approved for that purpose? Is ghostarchive an approved archive? The bot is not adding "url-status=live" to the citations.
Examples
https://en.wikipedia.org/?title=European_Union&curid=9317&diff=1063238337&oldid=1063219228
https://en.wikipedia.org/?title=Cocaine&curid=7701&diff=1063234152&oldid=1062962495
More at /info/en/?search=Special:Contributions/Rlink2
cc Primefac, TheSandDoctor -- Whywhenwhohow ( talk) 05:50, 2 January 2022 (UTC)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Old threads at WP:ORN are not getting archived by the assigned bot User:Lowercase sigmabot III. In particular, the section Wikipedia:No_original_research/Noticeboard#Yahwism#Torah_appears_to_be_OR from September 5 is still at the top of the page. – LaundryPizza03 ( d c̄) 06:52, 11 January 2022 (UTC)
@ ProcrastinatingReader and TheSandDoctor: Recently approved TolBot 10 created Austin Vedder
1. Redirecting to Austin A. Vedder, which is another redirect (from a misspelling, no less). The bot should not create double redirects.
2. Inserted {{DEFAULTSORT:Vedder, Austin}} inside the {{ Redirect category shell}} sandwich. This is not a redirect category template so it should be placed outside of the sandwich.
3. Placed {{ R from short name}} inside the sandwich. This is not a valid short name, so this template should not have been put there; rather {{ R from misspelling}} would be the appropriate template to put there.
— wbm1058 ( talk) 02:20, 15 January 2022 (UTC)
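For reference, a sketch of the layout the points above describe (the target and rcat here are illustrative, not what the bot should necessarily have written): only redirect category templates belong inside the {{ Redirect category shell}} sandwich, and the DEFAULTSORT sits outside it:

```wikitext
#REDIRECT [[Target article]]

{{Redirect category shell|
{{R from misspelling}}
}}
{{DEFAULTSORT:Vedder, Austin}}
```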
You are invited to join the discussion at Template talk:Bot § Handles bots with multiple tasks poorly. {{u| Sdkb}} talk 07:35, 2 January 2022 (UTC)
Page watchers may be interested in WP:ANI#Rlink2. Izno ( talk) 23:24, 18 January 2022 (UTC)
Would a BAGger please look at Wikipedia talk:Bots/Requests for approval/Fluxbot 6 and advise? Thank you! — xaosflux Talk 16:59, 19 January 2022 (UTC)
Bots Newsletter, January 2022 | ||
---|---|---|
Welcome to the ninth issue of the English Wikipedia's Bots Newsletter, your source for all things bot. Vicious bot-on-bot edit warring... superseded tasks... policy proposals... these stories, and more, are brought to you by Wikipedia's most distinguished newsletter about bots. After a long hiatus between August 2019 and December 2021, there's quite a bit of ground to cover. Due to the vastness, I decided in December to split the coverage up into a few installments that covered six months each. Some people thought this was a good idea, since covering an entire year in a single issue would make it unmanageably large. Others thought this was stupid, since they were getting talk page messages about crap from almost three years ago. Ultimately, the question of whether each issue covers six months or a year is only relevant for a couple more of them, and then the problem will be behind us forever. Of course, you can also look on the bright side – we are making progress, and this issue will only be about crap from almost two years ago. Today we will pick up where we left off in December, and go through the first half of 2020.
Overall
January 2020
February 2020
March 2020
April 2020
May 2020
June 2020
Conclusion
These questions will be answered — and new questions raised — by the February 2022 Bots Newsletter. Tune in, or miss out! Signing off... jp× g 23:22, 31 January 2022 (UTC) (You can subscribe or unsubscribe from future newsletters by adding or removing your name from this list.) |
Lots of categories named "NPOV disputes from Month Year" have been moved to the "Wikipedia neutral point of view disputes from Month Year" name by JJMC89 bot III as a result of Wikipedia:Categories for discussion/Log/2021 December 23#Category:NPOV disputes. Since then, most of them have been deleted by Fastily due to an automatic G6 notice for empty monthly categories. After that, some of them have been recreated by AnomieBOT and then the recreated categories have again been moved by JJMC89 bot III and deleted by Fastily. To prevent an endless cycle of creation by AnomieBOT, moving by JJMC89 bot III, and deletion by Fastily, all of the "Wikipedia neutral point of view disputes from Month Year" categories that were deleted by Fastily should be undeleted, any recreated category by AnomieBOT should be history-merged with the old history at the new name, and the corresponding templates should be modified to prevent the categories from being empty again. Also, both JJMC89 bot III and AnomieBOT should be temporarily disabled until all this is sorted out. GeoffreyT2000 ( talk) 21:16, 19 January 2022 (UTC)
see here. Bot is not respecting {{ bots}}. In a hurry right now can't type anything else — GMX (on the go!) 21:15, 26 January 2022 (UTC)
I disabled the automatic running of the MFD archiver script. I have some new code which should be ready to go, but I'll run it manually for a few days before automating it. There are no MFDs needing archiving right now so I'll give it a shot tomorrow and lift the partial blocks then. Legoktm ( talk) 07:19, 4 February 2022 (UTC)
It looks like we have a few inactive bots by definition (both account and operator haven't edited in 2 years) according to the Majavah report. ( Majavah, if there's a listed operator, could you get their contribs or something and add that to the report too? Maybe even a column to indicate mutual activity e.g. {{ cross}} where neither are and a {{ check}} where both are. This was painful. :)
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
All operators notified on their talk pages. — xaosflux Talk 14:22, 1 February 2022 (UTC)
And a few more in a month or so:
We should consider asking the active bot operators whether all the rest of the bot accounts that haven't edited in a long while (pick a number; 5 years seems fine generally) still need a bot flag. These particularly stand out...:
These ones seem like low-hanging fruit, but I think the rest should be queried as well. Izno ( talk) 05:30, 1 February 2022 (UTC)
{{Template:User bot owner|Acebot}}
. There's probably userboxes or templates for the bot's page too. These templates also populate categories such as
Category:Wikipedia bot operators. –
Novem Linguae (
talk) 11:43, 1 February 2022 (UTC)
I recently posted on the help desk ( Wikipedia:Help desk#Confusion on bot policy regarding semi-automated editing) where I explained my confusion about how one should approach/implement semi-automated edits. You may read the section I linked if you wish to. In a nutshell, I would like to semi-automate edits from a script I am running. I do not wish to create a bot account and would like to have the edits be on my main account (once I get more comfortable with the API I may consider expanding the script and may look into bot creation).
My requirements seem simple. However, according to
mw:API:Edit I require a CSRF token, but apparently I can't just lazily do mw.user.tokens.get( 'csrfToken' )
and call it a day because
that does not work. As the
code samples demonstrate, there is a 4-step process. Fine, but I'm confused about the second step (POST with lgname+lgpassword+lgtoken), because those parameters are to be obtained from
Special:BotPasswords, which states "Make sure you are logged into your bot's account, and not the owner's account, before creating a bot password", and as per
WP:SEMIAUTOMATED, "A bot account should not be used for assisted editing, unless the task has been through a BRFA". So, I should not use a bot account, but I still need a 'bot password' from an account which it seems to imply cannot be my own?
Side note: The MediaWiki JS sample code works fine. What I do not like about this, however, is that it needs to be done in a JS console on the wiki. I'd much prefer to have a script running in a terminal (i.e. using Node.js/Python).
Side side note: Psst, while I have you nerds here, can someone point me to some docs explaining how I can achieve the functionality reFill achieves by sending you to a page that shows a diff with some changes made, without actually making those changes. My script updates statistics; since I am semi-automating it, I would love for the script to run and then show a diff so that I can visually confirm what it has done and then just publish the changes. Satricious ( talk) 16:09, 20 February 2022 (UTC)
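For what it's worth, a bot password can also be created on your own account; the Special:BotPasswords warning you quoted only matters if you operate a separate bot account, and edits made with a main-account bot password are attributed to your main account. Here is a minimal standard-library sketch of the four-step flow from mw:API:Edit (the username, password, and page values are placeholders you would fill in):

```python
import json
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

API = "https://en.wikipedia.org/w/api.php"  # the target wiki's api.php

def extract_token(response, token_type):
    """Pull a token of the given type out of an action=query&meta=tokens reply."""
    return response["query"]["tokens"][token_type + "token"]

def api_post(opener, params):
    """POST form-encoded params to the API and decode the JSON reply."""
    params = dict(params, format="json")
    data = urllib.parse.urlencode(params).encode()
    with opener.open(API, data) as resp:
        return json.load(resp)

def edit_page(username, bot_password, title, text, summary):
    # a cookie jar keeps the login session alive across the four steps
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))
    # 1. fetch a login token
    r = api_post(opener, {"action": "query", "meta": "tokens", "type": "login"})
    login_token = extract_token(r, "login")
    # 2. log in with the BotPassword credentials (lgname is "YourUser@label")
    api_post(opener, {"action": "login", "lgname": username,
                      "lgpassword": bot_password, "lgtoken": login_token})
    # 3. now that the session is logged in, fetch a CSRF token
    r = api_post(opener, {"action": "query", "meta": "tokens", "type": "csrf"})
    csrf_token = extract_token(r, "csrf")
    # 4. make the edit
    return api_post(opener, {"action": "edit", "title": title, "text": text,
                             "summary": summary, "token": csrf_token})
```

This is just a sketch of the documented flow, not a drop-in tool; for anything beyond a quick experiment a maintained client library handles retries, maxlag, and edit conflicts for you.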
function goToShowChangesScreen(titleWithNamespaceAndUnderscores, wikicode, editSummary) {
let titleEncoded = encodeURIComponent(titleWithNamespaceAndUnderscores);
let wgServer = mw.config.get('wgServer');
let wgScriptPath = mw.config.get('wgScriptPath');
let baseURL = wgServer + wgScriptPath + '/';
// https://stackoverflow.com/a/12464290/3480193
$(`<form action="${baseURL}index.php?title=${titleEncoded}&action=submit" method="POST"/>`)
.append($('<input type="hidden" name="wpTextbox1">').val(wikicode))
.append($('<input type="hidden" name="wpSummary">').val(editSummary))
.append($('<input type="hidden" name="mode">').val('preview'))
.append($('<input type="hidden" name="wpDiff">').val('Show changes'))
.append($('<input type="hidden" name="wpUltimateParam">').val('1'))
.appendTo($(document.body)) //it has to be added somewhere into the <body>
.submit();
}
Hello. I am not sure if this is the correct venue. If this can't be solved here, kindly let me know where I should go.
I currently have the AWB bot on enwiki,
User:KiranBOT. It adds WikiProject banners on talk pages (the simplest task, I think). In short: I need to create a fully automated/Toolforge bot.
Prelude: Around 20 days ago, I got the bot flag on mrwiki (AWB). Within less than 20 days (in around 7-8 runs), it racked up more than 10k edits there ( mr:special:contributions/KiranBOT). Because of the syntax of the Marathi language, and word rules (not grammar rules), there are many uncontroversial find-and-replace tasks. But there are fewer than 10 active/regular editors, so such tasks have piled up.
To the point: On mrwiki, I would like to run a simple bot — but one that edits continuously, like DumbBOT. A few hours ago, I created an account on wikitech/Toolforge and requested membership. But I am still not sure how and where to upload the bot's code. I want to code it in C#. The bot will obviously be discussed/vetted on mrwiki, along with the keywords to be replaced (I have created a rudimentary list at mr:User:Usernamekiran/typos). Any help/guidance will be appreciated a lot. —usernamekiran • sign the guestbook • (talk) 23:38, 31 December 2021 (UTC)
So I could transfer files using GitHub, and also created files using mono on PuTTY/CLI. But I couldn't execute the bot. First I went with C#, then Python, but neither worked. I have lots of material in .NET to study/refer to, like the DotNetWikiBot framework, the source code of AWB, and some other programs mentioned at mediawiki. All I need is a little guidance on how to compile and run it on Toolforge. Your help will be appreciated a lot. Also pinging @ Mz7, JPxG, and ST47: —usernamekiran • sign the guestbook • (talk) 15:44, 18 January 2022 (UTC)
replace foo bar -search:"insource:\"foo\" "
, or am I missing something? ―
Qwerfjkl
talk 20:31, 17 February 2022 (UTC)
Here is the code:
import pywikibot

# retrieve the page
site = pywikibot.Site()
page = pywikibot.Page(site, u"user:usernamekiran/typos")

# edit the page text
page.text = page.text.replace("abcusernamekiran", "xyz")

# save the page
page.save(u"experimental edit with modified script")
Thanks, —usernamekiran • sign the guestbook • (talk) 09:54, 19 February 2022 (UTC)
import pywikibot
import re  # only needed for the commented-out re.sub variant

# retrieve up to 5 mainspace pages matching the search
site = pywikibot.Site()
pages = site.search("intitle:\"foo\"", total=5, namespaces=0)
for page in pages:
    # edit the page text (the replacement must be assigned back to page.text,
    # otherwise page.save() writes the unchanged text)
    page.text = page.text.replace("abcusernamekiran", "xyz")
    # or using the re module:
    # page.text = re.sub("abcusernamekiran", "xyz", page.text)
    # save the page
    page.save(u"experimental edit with modified script")
Continuation of the discussion at Wikipedia:Bots/Noticeboard/Archive_16#Template_Editor_permission_for_bot.
User:MusikBot II has template protected
Module:Transclusion count/data/C so
User:Ahechtbot can no longer update it. Either the bot will need Template Editor permissions, or
User:MusikBot II/TemplateProtector/config will need to be edited to exclude the subpages of
Module:Transclusion count/data. --
Ahecht (
TALK
PAGE) 20:18, 1 April 2022 (UTC)