Operator: DemonDays64 ( talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 01:22, Saturday, November 23, 2019 ( UTC)
Function overview: This bot will convert applicable links to HTTPS with JavaScript Wiki Browser, operating on large lists of pages that I give it, using a list of RegExes.
Automatic, Supervised, or Manual: Automatic
Programming language(s): No language; it uses JavaScript Wiki Browser (JWB), which has functionality to let bots automatically save edits. It also uses WP:JWB Annotated, a fork of JWB made by the user Colin M that can find pages matching a certain RegEx, to generate lists of pages. Lists that I generate with the fork's search tools will be pasted into JWB, as JWB is the tool that will receive updates, while JWB Annotated is just a fork of one day's version of the tool.
Source code available: At User:DemonDays64 Bot/JWB-settings.js (the bot is not approved for JWB yet, so that page has not been made). It will include the URLs listed at User:DemonDays64/Bot/Links table that still have pages left to edit.
Links to relevant discussions (where appropriate):
Edit period(s): When I turn on my computer and input a list of pages for it to edit. A few hours of disjointed operation per day.
Estimated number of pages affected: Estimated at >35 per minute, but variable; it depends on the number of skipped pages, and articles with much to change, or large articles, take a lot longer to load and edit. The rate could be decreased in the JWB settings if it is too high. For the current set of links I am requesting to be allowed to edit, the number is approximately the total of the numbers listed in the first table at User:DemonDays64/Bot/Links table: somewhere around 100,000 pages over time. The number can be decreased if I use a different set of links.
Namespace(s): Mainspace/Articles
Exclusion compliant: Yes
Function details: Uses a variation of the RegEx "http:\/\/w?w?w?.?example\.com". Specifically, for sites that I determine to be HTTPS-secured, it uses the Regex ((?<=(?<!\?)url ?= ?)|(?<=\[)|(?<=<ref>))(http:\/\/)?(www\.)?example\.com(?!\.), finding that string and inserting "https://example.com" in the part that matches (with the lookbehind included). It will only affect the http:// or http://www. or the www., and not the url= (or variation) or the square bracket, through the power of Regex lookbehinds (see a list of cases it will catch or not catch, and some more details, at User:DemonDays64/Bot/Regex Example). The last part, a negative lookahead, stops it from catching things like example.com.au.
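The lookbehind-based rule described above can be sketched as follows, using the placeholder site example.com. This is a minimal illustration, not the bot's actual JWB settings; the helper name "upgrade" is invented for this sketch.

```javascript
// The three lookbehinds anchor the match after "url=" (not preceded by "?"),
// after "[", or after "<ref>" without consuming them, so only the URL itself
// is replaced; the trailing (?!\.) skips domains like example.com.au.
const pattern = /((?<=(?<!\?)url ?= ?)|(?<=\[)|(?<=<ref>))(http:\/\/)?(www\.)?example\.com(?!\.)/g;

function upgrade(wikitext) {
  return wikitext.replace(pattern, "https://example.com");
}

console.log(upgrade("[http://www.example.com/page Title]"));
// → "[https://example.com/page Title]"
console.log(upgrade("|url=http://example.com |title=X"));
// → "|url=https://example.com |title=X"
// Left alone: "?url=" (archive parameter) and example.com.au (lookahead)
console.log(upgrade("?url=http://example.com and [example.com.au]"));
// → unchanged
```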
It will skip pages with the {{bots}} template. It was originally going to ignore unparsed content with the built-in JWB functionality, to avoid editing text inside <nowiki> tags and similar cases. UPDATE: it will not have this option on, because it stops templates from being edited, and the cases it will edit are so limited (and it is in mainspace only) that avoiding text in nowiki tags and comments is unnecessary; there should be no edge cases that this affects. See User:DemonDays64 Bot/JWB-settings.js for the bot's current settings. The {{bots}} exclusion functionality was tested on User:DemonDays64/Bot/Exclusion test.
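For illustration, the {{bots}}/{{nobots}} exclusion check that JWB provides built in might look roughly like the sketch below. This is a simplified, hypothetical helper (the real templates also support allow= and other parameters, which this omits), not the tool's actual code.

```javascript
// Simplified exclusion-compliance check: {{nobots}} blocks every bot,
// and {{bots|deny=a,b,...}} blocks the listed bots (or all via "all").
function allowedToEdit(wikitext, botName) {
  if (/\{\{\s*nobots\s*\}\}/i.test(wikitext)) return false;
  const deny = wikitext.match(/\{\{\s*bots\s*\|\s*deny\s*=([^}]*)\}\}/i);
  if (deny) {
    const names = deny[1].split(",").map(n => n.trim().toLowerCase());
    return !names.includes("all") && !names.includes(botName.toLowerCase());
  }
  return true;
}
```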
Can someone please take a look at this? I'd really like some input, and hope to get to the trial stage sometime soon. Thanks!
(Signature is very appropriate here LOL) | DemonDays64 | Tell me if I'm doing something wrong :P 05:03, 30 November 2019 (UTC) reply
DemonDays64, archive URLs take many forms; see WP:WEBARCHIVES for known archive URL schemes to skip. Generally, if the lookbehind contains "/http" or "?url=http", it should be skipped. -- Green C 22:15, 3 December 2019 (UTC) reply
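The skip rule suggested here could be sketched as a predicate on the text preceding a candidate match: if it ends in "/" or a url= query parameter, the link is embedded inside an archive-service URL and should be left alone. The shouldSkip helper below is an invented name for this illustration, not part of the bot's settings.

```javascript
// Returns true when the candidate link at position `start` is preceded by
// text that marks it as part of an archive URL (e.g. web.archive.org/web/...
// or a ?url=/&url= parameter), per WP:WEBARCHIVES.
function shouldSkip(text, start) {
  const before = text.slice(0, start);
  return /(\/|[?&]url=)$/.test(before);
}

const archived = "https://web.archive.org/web/2019/http://example.com";
shouldSkip(archived, archived.indexOf("http://example.com")); // true
shouldSkip("[http://example.com]", 1); // false
```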
GreenC, made it not do the url= thing for ?url=; that was the only thing I found on that page that would be caught. It won't catch &url=example.com, either, using another negative lookbehind.
Just to note that I granted the request for extended confirmed rights by the bot operator for this bot account; as per the administrator instructions, legitimate alternative accounts can be granted extended confirmed if their main account is extended confirmed. Furthermore, I could not see anything in WP:BOTPOL which did not allow this. I reviewed the bot's edits from the trial run, and unless I missed something, they all seemed to be error-free, so I felt that there was very little (or even no) risk in granting the rights. There was a need for the bot to have the rights according to the bot operator (the bot operator expressed that not being able to edit extended confirmed protected pages was an obstacle to running the trial). If any of the BAG members or other administrators don't feel that the bot should have extended confirmed until it's approved and/or got the bot flag, then feel free to revert my granting of the rights (or, if you cannot, ask me to do it by means of a ping). Dreamy Jazz 🎷 talk to me | my contributions 00:59, 6 December 2019 (UTC) reply
Trial complete. List of all the trial edits. — Preceding unsigned comment added by DemonDays64 ( talk • contribs) 01:55, 6 December 2019 (UTC) reply
{{ BAG assistance needed}} I have one point of clarification I'd like to be certain about—am I free to add new entries to the list of links to edit, or do I need a new bot request if I want to add any new ones? I keep finding more sites I could add, and am not sure if I have to bother the approvals group with the many potential requests to expand what the bot does, or if I can just add more links to the settings without another approval. Thanks! DemonDays64 | Tell me if I'm doing something wrong :P 02:02, 6 December 2019 (UTC) reply