This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
In this edit the bot for some reason linked to the wrong book (U.S. Coast Pilot for part of the Atlantic coast) rather than the applicable U.S. Coast Pilot volume for Alaska. I don't know what went wrong, and I've emended the error; but I thought you might want to know in case the bot's coding needs revision. Deor ( talk) 20:25, 11 March 2020 (UTC)
The Technical Barnstar
Thanks for making Cyberbot! dibbydib 💬/ ✏ 03:20, 12 March 2020 (UTC)
Hi Cyberpower, in this InternetArchiveBot edit [1] your bot performed a number of strange actions that I would like to report. When you find the time, can you please investigate and fix the bot accordingly?
Thanks and greetings. -- Matthiaspaul ( talk) 17:34, 13 March 2020 (UTC)
Hi, Cyber. I have create protected Isaac Kwaku Fokuo Jr indefinitely per a request at WP:RFPP. My protection shows clearly in the log, but your bot says I haven't done it. [2] Can you see what the problem is, because I can't? Bishonen | tålk 13:53, 11 March 2020 (UTC).
Hi. From my understanding, external links to official sites should always lead to a live, working website. If an external link (which is not used as a reference) is permanently down, this usually indicates that the link needs to be updated or removed. I don't quite see the benefit in piping such links to an archived version. See this edit (from 2017), for example. In this case, this was a false positive, and has been reverted. If the site had been permanently down, though, I think it would have been preferable for the bot to alert human editors to identify a new, correct link, instead of archiving it.
I previously filed this on Phabricator, but you closed it as not a bug. Maybe the issue warrants further discussion, though? -- Paul_012 ( talk) 21:21, 14 March 2020 (UTC)
Hello, I am the new owner of the R10.net forum site, and when I tried to create a wiki page I was told that the domain name is forbidden. My research indicates this is due to content created by a user in the hosting category. ( here) How can I resolve this situation and create my own company wiki page? Can you help me? — Preceding unsigned comment added by 31.223.2.136 ( talk) 18:39, 16 March 2020 (UTC)
Hi there Cyberpower678,
How are you doing? I am new to submitting articles for deletion, and I accidentally nominated a deletion discussion page rather than the actual page. Could you be of any assistance, or would you be able to direct me to someone who manages this section of Wikipedia?
/info/en/?search=Wikipedia:Articles_for_deletion/United_Macedonian_Diaspora
Kind Regards,
James — Preceding unsigned comment added by Jamesrichards12345 ( talk • contribs) 16:07, 16 March 2020 (UTC)
Good evening. Just wanted to make you aware of this edit by the bot. I was able to clean everything up, and what I'm trying to do is a very edge case that's unlikely to happen again, at least not very often. The Squirrel Conspiracy ( talk) 06:33, 17 March 2020 (UTC)
Hi Cyberpower678. As InternetArchiveBot produces too many false positives, I've blocked it on alswiki. We have tried for months to report all the bugs, and it seems that nothing changes in the edits. I'm sorry, but imho it creates more work for us than it saves. Best regards. -- Holder ( talk) 12:29, 18 March 2020 (UTC)
I have run IABot several times on Black Death, and each time it says it has rescued 76 links and modified the page, but nothing has happened. Can you advise, please? Dudley Miles ( talk) 15:50, 17 March 2020 (UTC)
Hello cyberpower, this is probably my third time writing to you and waiting for a reply, but you haven't replied.
When I try to prompt the bot to analyse multiple pages on trwiki, I get this error, which makes the bot unusable on the Turkish Wikipedia. Could you please do the required setup so we can use the bot? Thanks.-- Yagizhan49 ( talk) 18:54, 16 March 2020 (UTC)
Hi Cyberpower678 :)
Best regards ~ ToBeFree ( talk) 20:42, 18 March 2020 (UTC)
Please see this edit for how to fix it, when you get a chance. Thanks. – Jonesey95 ( talk) 03:35, 19 March 2020 (UTC)
Hello, Cyber! I hope you're having a good time in 2020! The IA Bot has been deactivated on SqWiki for more than 2 months now, maybe since I gave you that bug report. Is this intentional on your part? Its management interface says it is active, but it hasn't made a single edit since October 2019. - Klein Muçi ( talk) 11:16, 7 January 2020 (UTC)
Hello, Cyber! How's it going? It's been over a month that I've been waiting for a response about some issues with IABot, so I guess you're busy this time of year. I hope you haven't forgotten about what I've written above. I also wanted to talk with you about another problem I'm having these days with the bot. What is happening on pages like this, and how do I stop it? It's been happening on quite a few pages now. - Klein Muçi ( talk) 00:27, 14 February 2020 (UTC)
That strange thing with talk page messages repeating themselves ad infinitum is happening again in a lot of pages. See here:
This is getting out of hand fast unfortunately. :/ - Klein Muçi ( talk) 03:34, 20 March 2020 (UTC)
Hi! I just found out that the Olympics section, and maybe other parts, of sports-reference.com are going to shut down due to a server move; this was planned for March 1, 2020, but has not happened yet. This site is used as the sole source for a huge number of articles on Wikipedia, and having it go down would be bad if there are not extensive archives of it.
Would you be able to run the IABot to make sure there are archives on Wikipedia for every page with links to this site? I tried running a job with ID 5510, but it is taking a really long time, and there are many thousands of links to it. Also, I don't have a great way to get a complete list of links to it. Thank you! DemonDays64 ( talk) 20:13, 19 March 2020 (UTC) (please ping on reply)
Hi. InternetArchiveBot is making hundreds of edits like this saying Sports Ref is a dead URL. The site isn't dead (yet); it's just no longer being updated. The URLs are fine. Apologies if this is covered elsewhere. Thanks. Lugnuts Fire Walk with Me 17:58, 19 March 2020 (UTC)
I'm upgrading the Cantonese Wikipedia copy of Module:Citation/CS1 to mirror the latest English Wikipedia edition, and saw a bunch of "dead-url" parameter errors popping up. I remember that before the migration I got a lot of "url-status" errors - I presume that's the new equivalent parameter. Does IABot have functionality to do that migration in wikitext automatically? Deryck C. 23:26, 14 March 2020 (UTC)
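For anyone scripting this themselves, the parameter migration described above can be sketched roughly as follows. This is a hypothetical illustration, not IABot's actual code; the mapping assumed here (dead-url=yes becomes url-status=dead, dead-url=no becomes url-status=live) follows the English Wikipedia CS1 convention.

```python
import re

# Assumed mapping from old |dead-url= values to new |url-status= values.
_MAP = {"yes": "dead", "y": "dead", "no": "live", "n": "live"}

def migrate_dead_url(wikitext: str) -> str:
    def repl(m):
        status = _MAP.get(m.group(2).strip().lower())
        if status is None:
            return m.group(0)  # unrecognised value: leave it for a human
        return f"{m.group(1)}url-status={status}{m.group(3)}"
    # Matches |dead-url=... or |deadurl=..., preserving surrounding spacing.
    pattern = r"(\|\s*)dead-?url\s*=\s*([^|}]*?)(\s*)(?=[|}])"
    return re.sub(pattern, repl, wikitext)

cite = "{{cite web |url=http://example.com |dead-url=yes |archive-url=x}}"
print(migrate_dead_url(cite))
# {{cite web |url=http://example.com |url-status=dead |archive-url=x}}
```

A bot run would apply this to each page's wikitext and save only when the text actually changed.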
Sorry for the vagueness.
I was editing /info/en/?search=Shino_Yamanaka and saw that the reference (what is now reference 1) was tagged as failing verification. If you go to the archived version of the link ( https://web.archive.org/web/20120909201808/http://www.london2012.com/athlete/yamanaka-shino-1020806/) it takes you to the right page. If you remove the Wayback Machine part and just go to http://www.london2012.com/athlete/yamanaka-shino-1020806 instead, it takes you to a different page - possibly because the www.london2012.com website has been merged into the main olympics.org website. It's an easily fixed problem - just use the Wayback Machine version of the URL in both the url and archive-url fields - but I thought it would be worth mentioning that it occurs.
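For illustration, the workaround described above would leave the citation looking something like this (a hypothetical sketch; the title here is an assumption, and the archive date is read off the snapshot timestamp in the link):

```wikitext
{{cite web
 |title        = Shino Yamanaka
 |url          = https://web.archive.org/web/20120909201808/http://www.london2012.com/athlete/yamanaka-shino-1020806/
 |archive-url  = https://web.archive.org/web/20120909201808/http://www.london2012.com/athlete/yamanaka-shino-1020806/
 |archive-date = 9 September 2012
}}
```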
Red Fiona ( talk) 20:59, 24 March 2020 (UTC)
Hi. I am trying to use the IABot Management Interface to add archive links to all references on the page Second Amendment sanctuary. However, when I try to run it on that page with the "Add archives to all non-dead references (Optional)" option enabled, I get an error telling me the page is too big and to submit a bot job instead. I did that; however, there was no option to add archives to all non-dead references, and the bot did not do that: it only added archives to dead links. How can I accomplish this task in the bot job interface? Thanks! Terrorist96 ( talk) 18:36, 3 March 2020 (UTC)
Hi! I asked a few weeks ago, but my question ended up in the archive. How do we get the bot to work on da.wiki? Do we need someone to operate it locally, or how does it work?
We are discussing the bot on da:Wikipedia:Landsbybrønden/Brug archive.org, and we had an old discussion in da:Wikipedia:Landsbybrønden/Automatisk bot-kildearkivering (it died because no one knew how to proceed). I hope you can help us get started. -- MGA73 ( talk) 13:30, 22 March 2020 (UTC)
Hi - sorry to bug you with this; I'm guessing it's not your problem, but can you tell me what might have gone wrong with how I started this AfD nomination? I got an unexpected result when I first attempted to create the page, and had to make a subsequent edit to post my reason for nomination ( page history). I then noticed your bot message there. Note: the page is not appearing on my (refreshed) watchlist, despite showing as watched. I also had to manually put a notice on the article's talk page. Thanks in advance for any ideas. Eric talk 15:52, 25 March 2020 (UTC)
[5]. 83.219.136.158 ( talk) 18:25, 25 March 2020 (UTC)
Un-archived discussion with new comment:
Cyberpower678: You deleted the following discussion without taking action. The last time, you ultimately did solve the problem, and I'm confident you will solve this one also. Please do not remove this discussion from your talk page without at least acknowledging. I don't want to go through a long cycle of dredging it up from the page history over and over, like last time. (I have edited my previous comment slightly, mainly inserting several missing right parentheses.) — Anomalocaris ( talk) 05:14, 24 October 2019 (UTC)
You are messing up italics and titles in Book talk articles.
{{book report|Number 1's (Destiny's Child album)|''|GA|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]'') and title not displaying where needed; should be (and I fixed it to) {{book report|Number 1's (Destiny's Child album)|''Number 1's''|GA|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]
{{book report|Number 1's (Destiny's Child album)|''|GA|chapter=Album|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]'') and title not displaying where needed; should be (and I fixed it to) {{book report|Number 1's (Destiny's Child album)|''Number 1's''|GA|chapter=Album|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]
{{book report|''Stargate: The Ark of Truth|The Ark of Truth''|Unassessed|problems=* Page does not exist.''); should be (and I fixed it and you reverted and I fixed it again to) {{book report|Stargate: The Ark of Truth|''The Ark of Truth''|Unassessed|problems=* Page does not exist.
{{book report|''Stargate: Continuum|Continuum''|Unassessed|problems=|non-free=}}''); should be (and I fixed it and you reverted and I fixed it again to) {{book report|Stargate: Continuum|''Continuum''|Unassessed|problems=|non-free=}}
{{book report|''Stargate: Revolution|Revolution''|Unassessed|problems=|non-free=}}''); should be (and I fixed it and you reverted and I fixed it again to) {{book report|Stargate: Revolution|''Revolution''|Unassessed|problems=|non-free=}}
{{book report|Tak to chodí|''Tak to chodí'']] <small>by [[Michal Horáček (lyricist)|Start|chapter=Compilations|problems=|non-free=* [[:File:Tak to chodi front.jpg]]<small>; should be (and I fixed it and you reverted) {{book report|Tak to chodí|''Tak to chodí'']] <small>by [[Michal Horáček (lyricist)</small>|Start|chapter=Compilations|problems=|non-free=* [[:File:Tak to chodi front.jpg]]
{{book report|''Wieland der Schmied (libretto)|Wieland der Schmied''|Unassessed|problems=* Page does not exist.''); should be (and I fixed it and you reverted) {{book report|Wieland der Schmied (libretto)|''Wieland der Schmied''|Unassessed|problems=* Page does not exist.
{{WBOOKS|class=book}}{{book report start|<big>'''Wiki How To</big>'''|The basics of Wikipedia}}; should be {{WBOOKS|class=book}}{{book report start|<big>'''Wiki How To'''</big>|The basics of Wikipedia}}
And similar errors that I fixed that you haven't reverted yet in:
And similar errors that I haven't fixed at Lint errors: Missing end tag in the Book talk namespace. Please deal with these errors, or at least point out how to generate the list of "source code" pages and how to edit them to make the problem go away. — Anomalocaris ( talk) 23:32, 15 October 2019 (UTC)
Please find a moment to investigate this. — Anomalocaris ( talk) 09:52, 16 February 2020 (UTC) and again Anomalocaris ( talk) 05:27, 29 March 2020 (UTC)
Rather than clogging up the thread about ReFill - how am I living dangerously? Giant Snowman 17:52, 30 March 2020 (UTC)
Hello C. As part of your work on reFill, I wonder if you can address the concerns mentioned here: User talk:Zhaofeng Li/reFill#'deadurl' parameter. Since I learned about this situation I try to switch to the correct format for the dead-url field, but I still forget at times. Other editors using reFill may not know about this at all, so it would be great if they didn't have to worry about it. I don't know if this fits into your current work with the changeover you are implementing, or if it can be part of a later upgrade, but I did want you to be aware of it. Best regards. MarnetteD| Talk 19:35, 30 March 2020 (UTC)
I have taken the drastic step of blocking InternetArchiveBot on the Cantonese Wikipedia. After some investigation between User:Dabao qian and me, it seems that the majority of edits made by IABot since its reboot on 27 March have been false positives.
The main issues are:
Since IABot insists on doubling up the webarchive template after other editors have undone the duplication across multiple articles, I have shut down IABot by blocking for now. Is there some setting that I got wrong in the management interface, or was there an error in your recent deployment? Deryck C. 15:05, 29 March 2020 (UTC)
@ Cyberpower678: I'm baffled now. I've let it run for a while, and it recognises {{ cite web}} on a bit under half of all pages it edited. The error rate is still too high, so I have reblocked it for now. An analysis of the roughly two dozen edits it has done:
Also - false positive: [25] (bottom edit; top edit is correct)
-- Deryck C. 21:26, 30 March 2020 (UTC)
Daw5423, you will need to define a couple of archive templates. Enwiki uses {{ Webarchive}}. If you give me the syntax of your templates, I'll turn it into the syntax you need to give the bot to define them.— CYBERPOWER ( Chat) 17:48, 26 December 2019 (UTC)
Hi, it's been a while since the message was archived. You might have missed my question about how to define the templates for the bot. If I complete the bot setup, I can write a guide for private wiki use, because there are US departments that are interested in setting up the bot for their wikis. daw5423 ( talk) 19:55, 31 March 2020 (UTC)
News and updates for administrators from the past month (March 2020).
Arbcom RfC regarding on-wiki harassment: a draft RfC has been posted at Wikipedia:Arbitration Committee/Anti-harassment RfC (Draft) and is not yet open to comments from the community. Interested editors can comment on the RfC itself on its talk page.
Re this edit: the bot switched from several Google Books links, which in most cases are preferred, to those found at archive.org. Several of the new links get error results as well. Is it possible to let active editors decide where to link? Sometimes we do use archive.org, but not as a rule, and usually only in cases where a publication can't be found via Google Books. I would appreciate it greatly if the editors who are most familiar with the publications were allowed to make the decision. Thanx for your efforts. Cheers. -- Gwillhickers ( talk) 19:40, 29 March 2020 (UTC)
Gwillhickers, I made some changes. When you revert the bot, it will learn from and respect that, and not restore the IA links; this was a design oversight, not intentional. It took me a while to understand what you were saying about 'forbidding Google links', but now that I see your predicament it's understandable, because the bot was acting like a dictator. If it continues to fight your editorial decisions, let me know ASAP; I now understand the situation. The bot has mostly completed its work: it converted less than 10% of the installed Google Books links, and it is now only running on newly added links, which results in fewer than 100 changes a week versus thousands of new Google links added during the same period. Google the elephant, IA the mouse. -- Green C 04:52, 31 March 2020 (UTC)
Hello, Cyber! :)
Please, take a look at this page. It is the same link over and over again. What should I do (if I can do anything) in cases like this so I don't bother you in the future? - Klein Muçi ( talk) 10:40, 3 April 2020 (UTC)
Dear cyberpower678, with this edit, InternetArchiveBot changed the homepage of a municipality into an Internet Archive link. However, when I click on the link given in the External links section of the previous version, the municipality's homepage is displayed. Perhaps the solution to this puzzle is that http://www.gemeinde-melsbach.de/ is actually dead, but my browser (Firefox Quantum 68.6.1esr (64-bit)) automatically selects https://www.gemeinde-melsbach.de/ instead? If so, can the bot be enhanced to handle such cases? (I apologize if I'm completely wrong with my idea.) -- Cyfal ( talk) 18:44, 4 April 2020 (UTC)
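The enhancement suggested above can be sketched as follows: before declaring an http:// link dead, also try the https:// variant, since many sites now answer only over TLS. This is a hypothetical illustration of the idea, not IABot's actual logic; the is_alive checker is a stand-in for whatever liveness test the bot really uses.

```python
from urllib.parse import urlsplit, urlunsplit

def https_variant(url: str) -> str:
    # Return the same URL with the scheme upgraded from http to https.
    parts = urlsplit(url)
    if parts.scheme != "http":
        return url
    return urlunsplit(("https",) + tuple(parts)[1:])

def check_with_https_fallback(url: str, is_alive) -> str:
    """Return 'live', 'live-via-https', or 'dead'."""
    if is_alive(url):
        return "live"
    alt = https_variant(url)
    if alt != url and is_alive(alt):
        # The site moved to TLS: suggest updating the link, not archiving it.
        return "live-via-https"
    return "dead"

# Example with a fake checker that only accepts https URLs:
alive = lambda u: u.startswith("https://")
print(check_with_https_fallback("http://www.gemeinde-melsbach.de/", alive))
# live-via-https
```

A bot using this pattern could then propose a link update rather than an archive substitution when the https variant responds.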
Hello! As you have been inactive on Swedish Wikipedia for over one year, I have now removed your rollback rights according to local policy. Rasmus 28 ( talk) 11:22, 5 April 2020 (UTC)
Hello, Cyber, I took out the advertising material from Tarik Freitekh; it was the only change that caused the delete request. This page has been there for over 10 years with no issue. — Preceding unsigned comment added by 172.8.146.192 ( talk) 06:14, 6 April 2020 (UTC)
Hi. Your bot here is replacing pages viewable through Google Books with Archive.org links that require registration and allow only limited-time perusal (14 days). I am not sure this is an improvement, as Google Books pages are viewable by almost anyone, almost anytime, and allow for easy reading of the references by going directly to the referenced page. Can you disable this kind of replacement, which I am afraid is not in the interest of the Wikipedia reader? पाटलिपुत्र Pat (talk) 07:16, 6 April 2020 (UTC)
Every single block listed at User:Cyberbot I/Requests for unblock report is listed as having expired 50 years ago. Beyond that, rangeblocked IPs aren't even considered. You may wish to look at the "bkip" parameter available in the API. -- Amanda (aka DQ) 08:35, 7 April 2020 (UTC)
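For context, the MediaWiki API's list=blocks module accepts a bkip parameter that matches both single-IP blocks and rangeblocks covering the given IP, which is what the report above is missing. A minimal sketch of building such a query (URL construction only; fetching and parsing are left out, and the prop list shown is just one reasonable choice):

```python
from urllib.parse import urlencode

def blocks_query_url(ip: str, api: str = "https://en.wikipedia.org/w/api.php") -> str:
    # list=blocks with bkip returns blocks affecting this IP,
    # including rangeblocks that plain username lookups would miss.
    params = {
        "action": "query",
        "list": "blocks",
        "bkip": ip,
        "bkprop": "id|user|by|timestamp|expiry|reason",
        "format": "json",
    }
    return api + "?" + urlencode(params)

print(blocks_query_url("192.0.2.55"))
```

The returned JSON's query.blocks array carries the real expiry timestamps, which would also fix the "expired 50 years ago" display.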
1. I recently submitted a job for dead links on Wikidata, and about 150 links were rescued. It seems like the bot doesn't regularly scan Wikidata; should I submit more jobs to make it rescue links?
2. In addition to rescuing links, on Wikidata we have "deprecated rank", which can be used to mark broken links; it prevents them from showing up in queries and templates. Can your bot be used to make edits like this? If you don't have time for this, is there a way to use the Internet Archive platform to compile lists of the broken links, so I can do the edits with my bot? I currently use curl to detect broken links, and I understand there are better ways to detect them, especially those that time out, etc. Thanks, Uziel302 ( talk) 05:12, 8 April 2020 (UTC)
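One of the "better ways" than plain curl is a checker with explicit timeouts and concurrency. The sketch below is a generic pattern, not how IABot works: it issues HEAD requests with a timeout and reports network failures separately from HTTP errors, since transient failures deserve a retry before anything is marked broken.

```python
import concurrent.futures
import urllib.error
import urllib.request

def check_url(url: str, timeout: float = 10.0) -> str:
    # HEAD request with a hard timeout; classify the outcome.
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "link-checker-sketch"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return "ok" if resp.status < 400 else f"http-{resp.status}"
    except urllib.error.HTTPError as e:
        return f"http-{e.code}"          # server answered with an error code
    except Exception as e:
        # Timeouts, DNS failures, TLS errors, connection resets...
        return f"error-{type(e).__name__}"

def check_many(urls, workers: int = 8):
    # Check URLs concurrently so slow hosts don't serialize the whole run.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(urls, pool.map(check_url, urls)))
```

In practice one would re-check "error-*" results later and only treat links as dead after several consecutive failures.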
Hello. I'm not sure if you're still the person to go to on this, but the Internet Archive Bot has now twice replaced links to books on Google Books that were working fine on the James Longstreet biography with versions of the books on Archive.org that have much less of the book available. This clearly isn't helping, and I was wondering how best to prevent it. Thank you. Display name 99 ( talk) 18:37, 11 April 2020 (UTC)
![]() | This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 65 | ← | Archive 68 | Archive 69 | Archive 70 | Archive 71 | Archive 72 | → | Archive 75 |
In this edit the bot for some reason linked to the wrong book (U.S. Coast Pilot for part of the Atlantic coast) rather than the applicable U.S. Coast Pilot volume for Alaska. I don't know what went wrong, and I've emended the error; but I thought you might want to know in case the bot's coding needs revision. Deor ( talk) 20:25, 11 March 2020 (UTC)
![]() |
The Technical Barnstar |
Thanks for making Cyberbot! dibbydib 💬/ ✏ 03:20, 12 March 2020 (UTC) |
Hi Cyberpower, in this InternetArchiveBot edit [1] your bot conducted a number of strange actions I would like to report. When you find the time, can you please investigate and fix the bot accordingly?
Thanks and greetings. -- Matthiaspaul ( talk) 17:34, 13 March 2020 (UTC)
Hi, Cyber. I have create protected Isaac Kwaku Fokuo Jr indefinitely per a request at WP:RFPP. My protection shows clearly in the log, but your bot says I haven't done it. [2] Can you see what the problem is, because I can't? Bishonen | tålk 13:53, 11 March 2020 (UTC).
Hi. From my understanding, external links to official sites should always lead to a live, working website. If an external link (which is not used as a reference) is permanently down, this usually indicates that the link needs to be updated or removed. I don't quite see benefit in piping such links to an archived version. See this edit (from 2017), for example. In this case, this was a false positive, and has been reverted. If the site was permanently down, though, I think it would have been preferable for the bot to alert human editors to identify a new, correct link, instead of archiving it.
I previously filed this on Phabricator, but you closed it as not a bug. Maybe the issue warrants further discussion, though? -- Paul_012 ( talk) 21:21, 14 March 2020 (UTC)
Hello, I am the new owner of the R10.net forum site and said that when I want to create a wiki page, the domain name is forbidden. I have found that my research is due to the content created by a user in the hosting category. ( here) How can I solve this situation and create my own company wiki page? can you help me — Preceding unsigned comment added by 31.223.2.136 ( talk) 18:39, 16 March 2020 (UTC)
Hi there Cyberpower678,
How are you doing? I am new to submitting article for deletion and I accidentally nominated a deletion discussion page rather than the actual page. Could you be of any assistance or would you be able to direct me to someone who manages this section of Wikipedia.
/info/en/?search=Wikipedia:Articles_for_deletion/United_Macedonian_Diaspora
Kind Regards,
James — Preceding unsigned comment added by Jamesrichards12345 ( talk • contribs) 16:07, 16 March 2020 (UTC)
Good evening. Just wanted to make you aware of this edit by the bot. I was able to clean everything up, and what I'm trying to do is a very edge case that's unlikely to happen again, at least not very often. The Squirrel Conspiracy ( talk) 06:33, 17 March 2020 (UTC)
Hi Cyberpower678. As the InternetArchiveBot produces too many false positives I've blocked it on alswiki. We have tried for monthgs to report all the bugs and it seems that nothing changes in the edits. I'm sorry, but imho it makes us more additional work than it takes from us. Best regards. -- Holder ( talk) 12:29, 18 March 2020 (UTC)
I have run IABot several times on Black Death and each time it says it has rescued 76 links and modified the page but nothing has happened. Can you advise please. Dudley Miles ( talk) 15:50, 17 March 2020 (UTC)
Hello cyberpower, it's probably my third time writing to you and waiting for a reply but you didn't even bother to reply.
When I try to prompt the bot to analyse multiple pages in trwiki, I get this error and this makes the bot functionless in Turkish Wikipedia. Could you please do the required set up so we can use the bot? Thanks.-- Yagizhan49 ( talk) 18:54, 16 March 2020 (UTC)
Hi Cyberpower678 :)
Best regards
~ ToBeFree (
talk)
20:42, 18 March 2020 (UTC)
Please see this edit for how to fix it, when you get a chance. Thanks. – Jonesey95 ( talk) 03:35, 19 March 2020 (UTC)
Hello, Cyber! I hope you're having a good time in 2020! The IA Bot has been deactivated in SqWiki for more than 2 months now. Maybe since I gave you that bug report. Is this intentional from your part? In its management interface it says it is active but it hasn't made a single edit since October 2019. - Klein Muçi ( talk) 11:16, 7 January 2020 (UTC)
Hello, Cyber! How's it going? It's been over a month I've been waiting for a response about some issues with IABot so I guess you're busy this time of the year. I hope you haven't forgotten about what I've written above. I also wanted to talk with you about another problem I'm having these days with the bot. What is happening in pages like this and how do I stop it? It's been going on on quite some pages now. - Klein Muçi ( talk) 00:27, 14 February 2020 (UTC)
That strange thing with talk page messages repeating themselves ad infinitum is happening again in a lot of pages. See here:
This is getting out of hand fast unfortunately. :/ - Klein Muçi ( talk) 03:34, 20 March 2020 (UTC)
Hi! I just found out that the Olympics part and maybe other parts of sports-reference.com is going to shut down due to moving servers; this was planned for March 1, 2020, but has not happened yet. This site is used as the sole source for a giant number of articles on Wikipedia, and to have it go down would be bad if there are not extensive archives of it.
Would you be able to run the IABot to make sure there are archives on Wikipedia for every page with links to this site? I tried running a job with ID 5510 but it is taking a really long time and there are many thousands of links to it. Also I don't have a great way to get a complete list of links to it. Thank you! DemonDays64 ( talk) 20:13, 19 March 2020 (UTC) (please ping on reply)
Hi. InternetArchiveBot is making hundreds of edits like this saying Sports Ref is a dead URL. The site isn't dead (yet), it's just no longer being updated. However, the URLs are fine. Apologies if this is covered elsewhere. Thanks. Lugnuts Fire Walk with Me 17:58, 19 March 2020 (UTC)
I'm upgrading the Cantonese Wikipedia copy of Module:Citation/CS1 to mirror the latest English Wikipedia edition, and saw a bunch of "dead-url" parameter errors popping up. I remember that before the migration I got a lot of "url-status" errors - I presume that's the new equivalent parameter. Does IABot have a functionality to do that migration in wikitext automatically? Deryck C. 23:26, 14 March 2020 (UTC)
Sorry for the vagueness.
I was editing /info/en/?search=Shino_Yamanaka And saw that the reference (what is now reference 1) was tagged as failing verifiability. If you go to the archived version of the link ( https://web.archive.org/web/20120909201808/http://www.london2012.com/athlete/yamanaka-shino-1020806/) it takes you to the right page. If you remove the wayback machine bit and just go to http://www.london2012.com/athlete/yamanaka-shino-1020806 instead, it takes you to a different page - possibly because the www.london2012.com website has been smushed with the main olympics.org website. It's an easily fixed problem, just use the waybackmachine version of the url in both the url and archived url boxes, but I thought it would be worth mentioning that it occurs.
Red Fiona ( talk) 20:59, 24 March 2020 (UTC)
Hi. I am trying to use IABot Management Interface to add archive links to all references to the page Second Amendment sanctuary. However when I try to run it on that page, and I enable the "Add archives to all non-dead references (Optional)" option, I get an error telling me the page is too big and to submit a bot job instead. I did that, however there was no option to add archives to all non-dead references and the bot did not do that, it only added archives to dead links and did not add archives to non-dead links. How can I accomplish this task in the bot job interface? Thanks! Terrorist96 ( talk) 18:36, 3 March 2020 (UTC)
Hi! I asked a few weeks ago but my question ended up in the archive. How do we get the bot to work on da.wiki? Do we need someone to operate it locally or how does it work?
We are discussing the bot on da:Wikipedia:Landsbybrønden/Brug archive.org and we had an old discussion in da:Wikipedia:Landsbybrønden/Automatisk bot-kildearkivering (it died because noone knew how to proceed). I hope you can help us how to get started. -- MGA73 ( talk) 13:30, 22 March 2020 (UTC)
Hi- Sorry to bug you with this, guessing it's not your problem, but can you tell what might have gone wrong with how I started this AfD nomination? I got an unexpected result when I first attempted to create the page, and had to make a subsequent edit to post my reason for nomination ( page history). I then noticed your bot message there. Note: The page is not appearing on my (refreshed) watchlist, despite showing as watched. I also had to manually put a notice on the article's talkpage. Thanks in advance for any ideas. Eric talk 15:52, 25 March 2020 (UTC)
[5]. 83.219.136.158 ( talk) 18:25, 25 March 2020 (UTC)
Un-archived discussion with new comment:
Cyberpower678: You deleted the following discussion without taking action. The last time, you ultimately did solve the problem, and I'm confident you will solve this one also. Please do not remove this discussion from your talk page without at least acknowledging. I don't want to go through a long cycle of dredging it up from the page history over and over, like last time. (I have edited my previous comment slightly, mainly inserting several missing right parentheses.) — Anomalocaris ( talk) 05:14, 24 October 2019 (UTC)
You are messing up italics and titles in Book talk articles.
{{book report|Number 1's (Destiny's Child album)|''|GA|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]
''
) and title not displaying where needed; should be (and I fixed it to){{book report|Number 1's (Destiny's Child album)|''Number 1's''|GA|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]
{{book report|Number 1's (Destiny's Child album)|''|GA|chapter=Album|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]
''
) and title not displaying where needed; should be (and I fixed it to){{book report|Number 1's (Destiny's Child album)|''Number 1's''|GA|chapter=Album|problems=|non-free=* [[:File:Destiny's Child – Number 1's.jpg]]
{{book report|''Stargate: The Ark of Truth|The Ark of Truth''|Unassessed|problems=* Page does not exist.
''
); should be (and I fixed it and you reverted and I fixed it again to){{book report|Stargate: The Ark of Truth|''The Ark of Truth''|Unassessed|problems=* Page does not exist.
{{book report|''Stargate: Continuum|Continuum''|Unassessed|problems=|non-free=}}
''
); should be (and I fixed it and you reverted and I fixed it again to){{book report|Stargate: Continuum|''Continuum''|Unassessed|problems=|non-free=}}
{{book report|''Stargate: Revolution|Revolution''|Unassessed|problems=|non-free=}}
''
); should be (and I fixed it and you reverted and I fixed it again to){{book report|Stargate: Revolution|''Revolution''|Unassessed|problems=|non-free=}}
{{book report|Tak to chodí|''Tak to chodí'']] <small>by [[Michal Horáček (lyricist)|Start|chapter=Compilations|problems=|non-free=* [[:File:Tak to chodi front.jpg]]
<small>
; should be (and I fixed it and you reverted){{book report|Tak to chodí|''Tak to chodí'']] <small>by [[Michal Horáček (lyricist)</small>|Start|chapter=Compilations|problems=|non-free=* [[:File:Tak to chodi front.jpg]]
{{book report|''Wieland der Schmied (libretto)|Wieland der Schmied''|Unassessed|problems=* Page does not exist.
''
); should be (and I fixed it and you reverted){{book report|Wieland der Schmied (libretto)|''Wieland der Schmied''|Unassessed|problems=* Page does not exist.
{{WBOOKS|class=book}}{{book report start|<big>'''Wiki How To</big>'''|The basics of Wikipedia}}
{{WBOOKS|class=book}}{{book report start|<big>'''Wiki How To'''</big>|The basics of Wikipedia}}
And similar errors that I fixed that you haven't reverted yet in:
And similar errors that I haven't fixed at Lint errors: Missing end tag in the Book talk namespace. Please deal with these errors, or at least point out how to generate the list of "source code" pages and how to edit them to make the problem go away. — Anomalocaris ( talk) 23:32, 15 October 2019 (UTC)
Please find a moment to investigate this. — Anomalocaris ( talk) 09:52, 16 February 2020 (UTC) and again Anomalocaris ( talk) 05:27, 29 March 2020 (UTC)
Rather than clogging up the thread about ReFill - how am I living dangerously? Giant Snowman 17:52, 30 March 2020 (UTC)
Hello C. As part of your work on reFill, I wonder if you can address the concerns mentioned here: User talk:Zhaofeng Li/reFill#'deadurl' parameter. Since I learned about this situation I try to switch to the correct format for the dead-url field, but I still forget at times. Other editors using reFill may not know about this at all, so it would be great if they didn't have to worry about it. I don't know if this fits into your current work on the changeover you are implementing, or if it can be part of a later upgrade, but I did want you to be aware of it. Best regards. MarnetteD| Talk 19:35, 30 March 2020 (UTC)
I have taken the drastic step of blocking InternetArchiveBot on the Cantonese Wikipedia. After some investigation between User:Dabao qian and me, it seems that the majority of edits made by IABot since its reboot on 27 March have been false positives.
The main issues are:
Since IABot insists on doubling up the webarchive template after other editors have undone the duplication across multiple articles, I have shut down IABot by blocking for now. Is there some setting that I got wrong in the management interface, or was there an error in your recent deployment? Deryck C. 15:05, 29 March 2020 (UTC)
@Cyberpower678: I'm baffled now. I've let it run for a while, and it recognises {{ cite web}} on a bit under half of all pages it edited. The error rate is still too high, so I have reblocked it for now. An analysis of the about two dozen edits it has done:
Also - false positive: [25] (bottom edit; top edit is correct)
-- Deryck C. 21:26, 30 March 2020 (UTC)
Daw5423, you will need to define a couple of archive templates. Enwiki uses {{ Webarchive}}. If you give me the syntax of your templates, I'll turn it into the syntax you need to give the bot to define it. — CYBERPOWER ( Chat) 17:48, 26 December 2019 (UTC)
Hi, it's been a while since the message was archived. You might have missed my question about how to define the templates for the bot. If I complete the bot setup, I can write a guide for private wiki use, because there are US departments that are interested in setting up the bot for their wikis. daw5423 ( talk) 19:55, 31 March 2020 (UTC)
News and updates for administrators from the past month (March 2020).
Arbcom RfC regarding on-wiki harassment: a draft RfC has been posted at Wikipedia:Arbitration Committee/Anti-harassment RfC (Draft) but is not yet open to comments from the community. Interested editors can comment on the RfC itself on its talk page.
Re this edit: the bot switched from several Google Books links, which in most cases are preferred, to links found at archive.org. Several of the new links also return errors. Is it possible to let active editors decide where to link? Sometimes we indeed use archive.org, but not as a rule, and usually only in cases where a publication can't be found via Google Books. Would appreciate it greatly if the editors who are most familiar with the publications were allowed to make the decision. Thanx for your efforts. Cheers. -- Gwillhickers ( talk) 19:40, 29 March 2020 (UTC)
Gwillhickers, I made some changes. When you revert the bot, it will learn from and respect that, and not restore the IA links; this was a design oversight, not intentional. It took me a while to understand what you were saying about 'forbidding Google links', but now that I see your predicament it's understandable, because the bot was acting like a dictator. If it continues to fight your editorial decisions, let me know ASAP; I now understand the situation. The bot has mostly completed its work: it converted less than 10% of the installed Google Books links, and is now only running on newly added links, which results in fewer than 100 changes a week versus thousands of new Google links added during the same period. Google the elephant and IA the mouse. -- Green C 04:52, 31 March 2020 (UTC)
Hello, Cyber! :)
Please, take a look at this page. It is the same link over and over again. What should I do (if I can do anything) in cases like this so I don't bother you in the future? - Klein Muçi ( talk) 10:40, 3 April 2020 (UTC)
Dear cyberpower678, with this edit, InternetArchiveBot changed the homepage of a municipality into an Internet Archive link. However, when I click on the link given in the External links section of the previous version, the municipality's homepage is displayed. Perhaps the solution to this puzzle is that http://www.gemeinde-melsbach.de/ is actually dead, but my browser (Firefox Quantum 68.6.1esr, 64-bit) automatically selects https://www.gemeinde-melsbach.de/ instead? If so, can the bot be enhanced to handle such cases? (I apologize if I'm completely wrong with my idea.) -- Cyfal ( talk) 18:44, 4 April 2020 (UTC)
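A link checker can guard against exactly this case by retrying the URL over https before declaring it dead. A minimal shell sketch, assuming curl is available; the helper names are invented for illustration and are not IABot's actual code:

```shell
# Rewrite an http:// URL to its https:// counterpart.
to_https() {
  printf '%s\n' "$1" | sed 's|^http:|https:|'
}

# Print the final HTTP status for a URL, retrying over https when the
# plain-http request fails outright (000) or returns an error status.
check_with_https_fallback() {
  code=$(curl -s -o /dev/null -w '%{http_code}' -L --max-time 10 "$1")
  if [ "${code:-000}" = "000" ] || [ "$code" -ge 400 ]; then
    curl -s -o /dev/null -w '%{http_code}' -L --max-time 10 "$(to_https "$1")"
  else
    printf '%s\n' "$code"
  fi
}
```

A checker built this way would report the https status for sites like the one above, rather than marking the http URL dead.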
Hello! As you have been inactive on Swedish Wikipedia for over one year, I have now removed your rollback rights according to local policy. Rasmus 28 ( talk) 11:22, 5 April 2020 (UTC)
Hello, Cyber, I took out the advertising material from Tarik Freitekh; it's the only change that caused the delete request. This page has been there for over 10 years with no issue. — Preceding unsigned comment added by 172.8.146.192 ( talk) 06:14, 6 April 2020 (UTC)
Hi. Your bot here is replacing pages viewable through Google Books with Archive.org links that require registration and allow only a limited time for perusal (14 days). I am not sure this is an improvement, as Google Books pages are viewable by almost anyone, almost anytime, and allow for easy reading of the references by going directly to the referenced page. Can you disable this kind of replacement, which I am afraid is not in the interest of the Wikipedia reader? पाटलिपुत्र Pat (talk) 07:16, 6 April 2020 (UTC)
Every single block listed at User:Cyberbot I/Requests for unblock report is shown as having expired 50 years ago. Beyond that, rangeblocked IPs aren't even considered. You may wish to look at the "bkip" parameter available in the API. -- Amanda (aka DQ) 08:35, 7 April 2020 (UTC)
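For reference, the MediaWiki API's list=blocks module takes a bkip parameter that returns every block affecting a given IP, including any rangeblocks that cover it. A sketch of building such a query in shell; the IP is a documentation placeholder:

```shell
# Build a list=blocks query URL; bkip returns all blocks affecting the
# given IP, including rangeblocks that cover it.
build_blocks_query() {
  printf '%s?action=query&list=blocks&bkip=%s&format=json\n' "$1" "$2"
}

url=$(build_blocks_query "https://en.wikipedia.org/w/api.php" "192.0.2.1")
echo "$url"
# → https://en.wikipedia.org/w/api.php?action=query&list=blocks&bkip=192.0.2.1&format=json
# Fetch with: curl -s "$url"
```

Using bkip rather than looking up the username avoids the rangeblock gap described above, since the API does the range matching server-side.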
1. I recently submitted a job for dead links on Wikidata, and about 150 links were rescued. It seems the bot doesn't regularly scan Wikidata; should I submit more jobs to make it rescue links?
2. In addition to rescuing links, on Wikidata we have a "deprecated rank" which can be used to mark broken links; it prevents them from showing up in queries and templates. Can your bot be used to make edits like this? If you don't have time for this, is there a way to use the Internet Archive platform to compile lists of the broken links, so that I can do the edits with my bot? I currently use curl to detect broken links, and I understand there are better ways to detect them, especially those that time out, etc. Thanks, Uziel302 ( talk) 05:12, 8 April 2020 (UTC)
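On the curl question: the usual refinements over a bare request are a hard per-request timeout, a HEAD-then-GET fallback (some servers reject HEAD), and a status-class check. A minimal sketch; the helper names are invented for illustration, and the thresholds are assumptions rather than IABot's actual settings:

```shell
# Map an HTTP status code to live/dead; 2xx and 3xx count as live.
classify() {
  case "$1" in
    2??|3??) echo "live" ;;
    *)       echo "dead" ;;
  esac
}

# Check one URL: try HEAD first, fall back to GET when the server
# rejects or mishandles HEAD, and cap each request with a timeout so
# hanging servers come back as code 000 instead of blocking the run.
check_link() {
  code=$(curl -sI -o /dev/null -w '%{http_code}' --max-time 15 "$1")
  case "$code" in
    000|403|405)
      code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 15 "$1") ;;
  esac
  printf '%s %s\n' "$(classify "$code")" "$code"
}
```

Looping check_link over a list of URLs and collecting the "dead" lines would give the kind of broken-link report described above.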
Hello. I'm not sure if you're still the person to go to on this, but the Internet Archive Bot has now twice replaced links to books on Google Books that were working fine on the James Longstreet biography with versions of the books on Archive.org which have much less of the book available. This clearly isn't helping, and I was wondering how best to prevent this. Thank you. Display name 99 ( talk) 18:37, 11 April 2020 (UTC)