Operator: KyraVixen
Automatic or Manually Assisted: Automatic
Programming Language(s): Python
Function Summary: Clone of BetacommandBot, task 3. Wikipedia:Bots/Requests for approval/BetacommandBot 3
Edit period(s) (e.g. Continuous, daily, one time run):
Edit rate requested: Unknown. Edits as needed.
Already has a bot flag (Y/N): Yes
Function Details: Performs linksearches (i.e., checking whether a specified domain appears within Wikipedia) via Special:Linksearch, using requests taken from IRC, and posts the results of each search to subpages of Wikipedia:WPSPAM. This is a clone of Wikipedia:Bots/Requests for approval/BetacommandBot 3; additional discussion (questions, comments, etc.) is there, but all pertinent information is here.
The following was added 01:13, 23 March 2007 (UTC): When a user initiates a linksearch from IRC, the bot queries Wikipedia using Special:Linksearch and compiles a list of the pages containing the link. If the command passed to the bot is linksearch, the bot creates a page containing the links, such as this one, or updates it with the most current list if the page already exists.
If the request is a linksearch2, the bot does the above, but instead of creating a separate page it logs the number of links here. It does this on a linksearch as well.
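The report-building step described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the bot's actual code (which was never posted); the function names, the wikitext formatting, and the idea of deduplicating before writing are all assumptions:

```python
# Hypothetical sketch of the 'linksearch' / 'linksearch2' reporting step:
# given the page titles Special:Linksearch returned for a domain, build the
# wikitext body for the /LinkSearch/<Site> subpage, or just count the hits.

def build_report(domain, pages):
    """Return deduplicated wikitext listing every page that links to domain."""
    unique = sorted(set(pages))  # drop duplicate hits, keep a stable order
    lines = ["Linksearch results for %s (%d pages):" % (domain, len(unique))]
    lines += ["* [[%s]]" % title for title in unique]
    return "\n".join(lines)

def link_count(pages):
    """The figure a 'linksearch2' request logs instead of a full report page."""
    return len(set(pages))
```

A linksearch would save the output of build_report to the subpage; a linksearch2 would record only link_count.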
I don't plan on using the next function often, if at all, but linkgenupdate uses the entries on this page to update the count of pages containing each domain at Wikipedia:WikiProject Spam/Report. This function updates the corresponding /Linksearch/<Site> page if the number of links within Wikipedia has changed for that domain.
The crosswiki command searches for a supplied domain link throughout the following wikis: en, de, ja, fr, pl, it, nl, es, pt, zh, ru, fi, no, he, and sco. The pages it finds are made into a list, and the result overwrites the list at this page, as well as supplying the username of who initiated the cross-wiki search.
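The crosswiki lookup described above amounts to running the same Special:Linksearch query on each language edition. A minimal sketch, assuming the standard Special:Linksearch URL form (the helper name and URL construction are my assumptions, not the bot's code):

```python
# Hedged sketch of the 'crosswiki' command: one Special:Linksearch query URL
# per target wiki. The language list is the one given in the request above.

WIKIS = ["en", "de", "ja", "fr", "pl", "it", "nl", "es", "pt",
         "zh", "ru", "fi", "no", "he", "sco"]

def linksearch_urls(domain):
    """Build one Special:Linksearch query URL per target language edition."""
    return ["https://%s.wikipedia.org/wiki/Special:Linksearch?target=%s"
            % (lang, domain) for lang in WIKIS]
```

The bot would fetch each URL in turn, merge the resulting page lists, and overwrite the cross-wiki results page along with the requesting user's name.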
The blacklistupdate command is a "privileged" command (meaning only trusted users added to the bot can execute it) which simply performs a standard linksearch for each of the domain names stored in an array maintained by those privileged users.
The bot does not remove any links, nor does it have the capability to do so. It merely reports the links gathered by Special:Linksearch.
The following was added 01:57, 23 March 2007 (UTC): The linkfilter command moves entries with zero links on Wikipedia from Wikipedia:WikiProject Spam/LinkSearch/List to Wikipedia:WikiProject Spam/LinkSearch/Holding 1; the contents of Holding 1 are moved to Wikipedia:WikiProject Spam/LinkSearch/Holding 2, and the contents of Holding 2 are moved to Wikipedia:WikiProject Spam/LinkSearch/Old. No further moves occur. Every 24 hours the bot checks /List, /Holding 1, and /Holding 2; any link with no instances on Wikipedia is shuffled along to the next page, with /Old as the final dumping ground. However, if a link in Holding 1 or 2 is detected with one or more links, it is returned to /List.
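The rotation described above can be sketched as a single 24-hour pass. The page names are the real ones; the data representation (a dict of page to domain list plus a count lookup) and the function name are assumptions, since the bot's own code was not posted:

```python
# A minimal sketch of the 'linkfilter' rotation: zero-link entries demote one
# step per pass (/List -> /Holding 1 -> /Holding 2 -> /Old); entries in either
# holding page that regain links are promoted back to /List. /Old is final.

PAGES = ["/List", "/Holding 1", "/Holding 2", "/Old"]

def rotate(pages, counts):
    """pages: dict mapping each page to its list of domains.
    counts: current on-wiki link count per domain.
    Returns the new layout after one daily pass."""
    new = {p: [] for p in PAGES}
    for i, page in enumerate(PAGES[:-1]):          # /Old is never rechecked
        for domain in pages.get(page, []):
            if counts.get(domain, 0) > 0 and page != "/List":
                new["/List"].append(domain)        # link reappeared: promote
            elif counts.get(domain, 0) == 0:
                new[PAGES[i + 1]].append(domain)   # still zero: demote one step
            else:
                new[page].append(domain)           # nonzero entries stay on /List
    new["/Old"] += pages.get("/Old", [])           # /Old keeps everything it has
    return new
```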
Speedy approved Betacommand ( talk • contribs • Bot) 02:50, 21 March 2007 (UTC) reply
Hold up. Please wait to run this bot until the other request is approved. Also, Betacommand, if you could use the correct templates listed at {{ BAG Admin Tools}} it would help the BAGBot keep track of bot requests. And don't approve your own bots or clones of them before the original one is approved. — METS501 ( talk) 03:11, 21 March 2007 (UTC) reply
{{BotDenied}} See Wikipedia:Bots/Requests for approval/BetacommandBot 3. — METS501 ( talk) 00:29, 22 March 2007 (UTC) reply
Application is being reopened. Mets, who closed the application, has agreed to this action by email. -- kingboyk 20:23, 22 March 2007 (UTC) reply
I have added a more detailed function list of the commands that directly interact with Wikipedia to try and further clarify the function of this code; there are a few others in the code (such as add to a temporary list of 'watchlisted' sites used for a blacklistupdate, or adding users to the list of trusted users), but they seem rather trivial at the moment. If it is desired, I will go back through the code and hash out what they do. Kyra ~(talk) 01:13, 23 March 2007 (UTC) reply
Getting page Wikipedia:WikiProject Spam/LinkSearch/irishabroad.com
Sleeping for 8.5 seconds, 2007-03-23 17:27:31
Changing page Wikipedia:WikiProject Spam/LinkSearch/irishabroad.com
Approved for trial. Please provide a link to the relevant contributions and/or diffs when the trial is complete. You may run this task for up to 3 days while discussion continues. At the request of any admin or BAG member you must stop immediately. Data collected is not to be used at this stage by any automated or semi-automated process for link removal. -- kingboyk 13:12, 23 March 2007 (UTC) reply
Approved. While the results have not been posted here, I've looked over them and checked a sample for accuracy, with no problems detected. It even filters out duplicates nicely. There have been no comments posted to this user's talk page about this bot being problematic either. This is a clone of Betacommand's bot, which was just speedily approved on the basis of this test and its own previous work. -- RM 12:44, 26 March 2007 (UTC) reply