This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 50 | ← | Archive 55 | Archive 56 | Archive 57 | Archive 58 | Archive 59 | Archive 60 |
The Wikipedia 1.0 project needs someone with experience working with large (en.wikipedia-size) MySQL databases. If interested, contact me or the WP 1.0 Editorial Team.— Wolfgang42 ( talk) 02:19, 20 October 2013 (UTC)
SELECT page_title,
IF ( rd_from = page_id,
rd_title,
/*ELSE*/IF (pl_from = page_id,
pl_title,
/*ELSE*/
NULL -- Can't happen, due to WHERE clause below
))
FROM page, redirect, pagelinks
WHERE (rd_from = page_id OR pl_from = page_id)
AND page_is_redirect = 1
AND page_namespace = 0 /* main */
ORDER BY page_id ASC;
— Wolfgang42 ( talk) 22:53, 23 October 2013 (UTC)
Try this and see if it works any better. -- WOSlinker ( talk) 06:00, 24 October 2013 (UTC)
SELECT page_title, COALESCE(rd_title, pl_title)
FROM page
LEFT JOIN redirect ON page_id = rd_from
LEFT JOIN pagelinks ON page_id = pl_from
WHERE page_is_redirect = 1
AND page_namespace = 0 /* main */
ORDER BY page_id ASC;
G'day, WPMILHIST has a quarterly awards system for editors that complete reviews (positive or negative) of articles that fall within the WikiProject. So far we (the project coordinators) have done this tallying manually, which is pretty labour-intensive. We have recently included GA reviews, and are having difficulty identifying negative GA reviews using the standard tools. We were wondering if someone could build a bot that could tally all FA, FL, A-Class, Peer and GA reviews of articles that fall within WikiProject Military history? In terms of frequency, we usually tally the points and hand out awards in the first week after the end of each quarter (first weeks in January, April, July and October), but it would be useful functionality to be able to run the bot as needed if that is possible. Regards, Peacemaker67 ( send... over) 23:36, 13 October 2013 (UTC)
20:11, 17 October 2013 > message A reference problem
05:19, 18 October 2013 > message A reference problem
11:19, 22 October 2013 > message A reference problem Two replys
The discussion is at Wikipedia talk:WikiProject Australian Roads/Shields, but in summary, there are sets of images transferred from Commons to here as {{ PD-ineligible-USonly}}. The user that moved the files (downloaded them from commons then uploaded them here) wants to remove his involvement due to potential legal issues in Australia. Under existing policy, revdel, oversight, and office actions are not appropriate. It was suggested that a bot could upload the same files under a different name and nominates the old ones for deletion per WP:CSD#F1. - Evad37 ( talk) 06:42, 26 October 2013 (UTC)
Can someone make a bot to mark all talk pages of Vital articles (all levels) with {{ VA}}, and fill out its parameters (level, class, topic) if possible. It should also remove such templates from non-VAs.
Ideally this should run on a regular basis, but even a one-off run would be very helpful. -- Ypnypn ( talk) 18:48, 28 October 2013 (UTC)
In this first paragraph, I will summarize my request: It would be good if someone could please create a bot which tags articles which were PRODded but survived (I shall call these "Survivors"). And/or which tags articles which were PROD-deleted then recreated (I shall call these "Recreated Articles"). You may tag them with {{ old prod full}}. You may leave all the template's parameters blank, or you may fill some in.
Rationale: Such tags warn us not to re-add another PROD tag. They also make it more obvious to us that perhaps we should consider nominating the page for WP:AfD.
Here are some things you could do, but which I don't recommend: You could download a database dump with history, parse it, and look for Survivors. But such a dump is over ten terabytes of XML once uncompressed. [1] You could download the dump of all logs, download the dump of all page titles, parse the two, and look for Recreated Articles. User:Tim1357 tried parsing a dump, [2], but he didn't succeed: the matter is still on the latest revision of his to-do list. I suspect it may not be worth attempting either of these difficult tasks.
Here is what I do recommend: It would be worthwhile to create a bot to watch Category:Proposed deletion and tag future Survivors. And to watch for new pages and tag Recreated Articles. User:Abductive suggests some optional refinements. [3]
It would be good if someone could please start writing a bot to do either or both of these tasks. It would be even better if they could provide us with a link to their code-in-progress. User:Kingpin13 and User:ThaddeusB have expressed interest, [4] but nobody seems to have actually written any code to do these tasks on the live Wikipedia.
User:Rockfang started tagging Survivors in 2008 using AWB (the wrong tool for the job) but later stopped. S/he wrote that s/he "got distracted".
AnomieBOT already does one related task. If an article is AfDed, then recreated, AnomieBOT's NewArticleAFDTagger task puts {{ old AfD multi}} on that article's talk page. The task's open-source [5] code is here. Maybe you could build on it, and maybe you could even ask Anomie to run it for you. Dear User:Anomie: Do you know if you or any bot ever tagged the pages which were recreated in the years before you wrote your bot?
Cheers, — Unforgettableid ( talk) 04:32, 16 October 2013 (UTC)
Per WP:BRINT, redirects are undesirable in templates. Currently after a page move, bots (bless their hearts) sweep up all of the broken or double redirects etc., but the links in templates are left untouched. For instance, a page was moved from here to here in January but the accompanying template was not updated until today. Is is possible for a bot to fix redirects on templates that are on a page that is moved? Rgrds. -- 64.85.216.235 ( talk) 05:51, 4 November 2013 (UTC)
Bot for Star Wars articles needed, maybe? Might help monitor changes. 20-13-rila ( talk) 11:19, 5 November 2013 (UTC)
We have WP:FANMP (a list of FAs yet to appear on the main page) and WP:WBFAN (a list of FAs and former FAs by nominator). Can someone think of a way to produce a hybrid for me, i.e. a list of FAs yet to appear on the main page by nominator? Bencherlite Talk 20:30, 5 November 2013 (UTC)
It seems that the articles have been moved to http://community.amd.com and http://developer.amd.com. I think all links to http://blogs.amd.com should be marked with {{ dead link}} at least. Please fix them semi-automatically if you can. -- 4th-otaku ( talk) 12:35, 4 November 2013 (UTC)
User:Femto Bot used to populate Wikipedia:WikiProject Medicine/Recent changes which in turn updated Special:RecentChangesLinked/Wikipedia:WikiProject Medicine/Recent changes. I think that's how it worked. It reported all changes to pages with {{ WPMED}} on their talk page. Anyway, it was an awesome tool for patrolling some of Wikipedia's most sensitive content. But since Rich Farmborough was banned from bot work it's stopped working - it only reports recent changes to pages beginning with "A".
This tool aims to do the same thing but it's slow and often times out, and when it works it's running a couple of days behind.
There was also Tim1357's tool, but his account has expired from the Toolserver.
I was wondering if somebody here would be able to provide WP:MED with something to replace these? With something like this a handful of experienced medical editors can effectively patrol all of Wikipedia's medical content. Without it, there's no telling what's happening. -- Anthonyhcole ( talk · contribs · email) 17:58, 4 November 2013 (UTC)
|RELC list1 namespace1=Article [space] + Talk [space]
, |RELC list2 namespace2=Template + template talk
, |other parameters like 1x/month=
. The template is invisible. Just like what
User:MiszaBot/config does on talkpages to archive.The "OLAC" (Open Language Archives Community) website has consistently helpful pages about resources for the languages of the world, especially the endangered and lesser-taught languages. The OLAC pages use a URL which ends with a three-letter code from the ISO 639-3 language code list, which is found in our language articles infobox. Each OLAC page has a nice descriptive title at the top, such as OLAC resources in and about the Aguaruna language.
Rather than adding several thousand OLAC page links to the External links sections of language articles by hand, couldn't we just write a bot to do this?
I know some languages have multiple language codes in their Wikipedia infobox, due to multiple dialects or language variants. Even if the bot didn't add links for languages with multiple codes, it would still be a big time-saver!
What do you think? Djembayz ( talk)
Without getting too deep into
tin foil territory, encrypting is one of many essential steps to ensure readers' privacy. Since October 24, 2013, the
Internet Archive now uses
HTTP Secure (https://
) by default
[8]. Just this week they updated their server software so it can handle
TLS 1.2, the latest version. It is safe to say they encourage their visitors to access their site using an encrypted connection.
In my opinion, Wikipedia should support this effort and switch all outgoing links to the Internet Archive to HTTPS. According to
Alexa, Wikipedia currently ranks fourth among upstream sites to archive.org
[9]. {{
Wayback}} was
already updated in that regard, but most of the links to the Wayback Machine are implemented in one of the many citation templates as encouraged at
WP:WBM. I started to fix a lot of those links manually, before realizing it would be a perfect job for a bot.
The Wayback Machine links have a common scheme, e.g. https://web.archive.org/web/20020930123525/http://www.wikipedia.org/
. So the task is this: find http://web.archive.org/web/
throughout the article namespace and replace with https://web.archive.org/web/
. That's it. --
bender235 (
talk) 20:51, 8 November 2013 (UTC)
http://archive.org/
to http://web.archive.org/
, which would indeed change nothing. But switching to https
changes the transport mechanism, from unencrypted to encrypted. Even tho it looks simple, it has significant consequences. --
bender235 (
talk) 21:17, 8 November 2013 (UTC)
Would it be possible to import those 3 standardized codes into the professions infoboxes ? --Teolemon
The individual occupation items don't have yet any SOC codes associated with them, but they are in broad occupation categories in enwiki that should make it easier to match:
Here's the list of SOC codes for matching with the existing items.
— Preceding unsigned comment added by 2A01:E35:2EA8:950:5BF:1AF3:3374:F3D0 ( talk • contribs)
copied from WP:VPT -- Frze > talk 07:17, 18 October 2013 (UTC)
DPL bot and
BracketBot are the best inventions of Wikipedia. It's time for a new Bot. We need the
If a user contributes a broken reference name, an incorrect ref formating (or a missing reflist), please inform the user who caused this error. It is so outrageously hard work to correct all these errors afterwards, from someone who is not holding the factual knowledge. For example: it took me a week to work up the backlog of Category:Pages with broken reference names - more than 1500 items, some disregarded more than two years. Search with WikiBlame for first entry of ref, making the changes, inform the users... annoying. Thank you very much. -- Frze > talk 12:25, 17 October 2013 (UTC)
== A reference problem ==
Hi [[User:SpidErxD|SpidErxD]]! Some users have been working hard on [[:Category:Pages with broken reference names]].
[https://en.wikipedia.org/?title=Nuclear_program_of_Iran&diff=577623223&oldid=577620891 Here] you added new references '''ref name=OPBW''' and '''ref name="status"''' but didn't define it. This has been showing as an error at the bottom of the article. <small>'''''Cite error: The named reference was invoked but never defined.'''''</small> Can you take a look and work out what you were trying to do? Thanks --User:REFBot
"Take a look at the page XYZ. There is a citation error. It could be in the text:
- or take a look at the bottom of the page:
Thanks, RefBot talk 10:05, 21 October 2013 (UTC)"
"It might be hard to eliminate false positves though."There might be a small problem with valid checking if templates are present in the article. I've seen a fair share of error messages in articles that resulted from an edit to a template and not an edit to the article itself. The error message still shows up in the article. i.e. if a user adds a citation to the template and there isn't a
{{Reflist}}
template in the article. The bot would have to check for that I assume. —
JJJ (
say hello) 15:15, 22 October 2013 (UTC)Yesterday's mistakes
|
---|
Category:Pages with DOI errors
|
Yesterday's Mistakes
|
---|
:
Arjayay
edited
Transcendental Meditation technique causing Category:Pages with ISBN errors, Category:Pages with citations using unsupported parameters, Category:Pages using citations with old-style implicit et al., Category:Pages using citations with accessdate and no URL
|
A930913 TheJJJunk I'm looking forward with happy anticipation to the implementation of my idea. Thanks to you all. -- Frze > talk 04:28, 30 October 2013 (UTC)
- Forcible-Prevention-Filter. RefBot should be implemented as an edit-filter, and immediately warn the user when they preview or save a busted ref, refusing to let them save it in a broken state (they must fix it first)
- Loud-Warning-Filter. RefBot should be implemented as an edit-filter, and immediately warn the user when they preview or save a busted ref, but permit the user to override and save anyways (in the broken state)... then nothing
- Silent-Warning-Filter. Same as #2. Additionally, RefBot allows the editor to opt-out of receiving RefBot filter-warnings.
- Loud-Fix-It-Later-Filter. RefBot should be implemented as an edit-filter, and immediately warn the user when they preview or save a busted ref, but permit the user to override and save anyways (in the broken state)... however, after their edit is saved, RefBot should rollback that one edit (not rollback the last N edits by the editor in question), and then RefBot should automagically post to the article-talkpage with a diff-link to the attempted-ref-edit that it just reverted
- Silent-Fix-It-Later-Filter. Same as #4. Additionally, RefBot allows the editor to opt-out of receiving RefBot filter-warnings.
- Loud-Warning-Bot. RefBot should be implemented as a bot, and eventually warn the editor on their talkpage, but should leave the article alone (no opt-out capability)
- Silent-Warning-Bot. Same at #6. Additionally, RefBot allows the editor to opt-out of receiving RefBot talkpage-messages.
- Loud-Fix-It-Later-Bot. RefBot should be implemented as a bot, and eventually warn the editor on their talkpage, plus RefBot should rollback that one edit (not rollback the last N edits by the editor in question), and then RefBot should automagically post to the article-talkpage with a diff-link to the attempted-ref-edit that it just reverted. Plus, ideally, RefBot's user-talkpage-message should have a one-click-to-put-my-broken-edit-back hyperlink, which also redirects the editor to the article (this prevents them from needing to manually visit the article, enter the edit-history, manually undo RefBot, and then go back to editing the article). Since the editor might not utilize the one-click-magic 'soon' by standards of how quickly the article in question is changing, prolly the one-click-magic should only work if the sub-section of the article in question has *not* been changed by any editors, since RefBot reverted this editor's work; otherwise, the one-click-magic might do more harm than good.
- Silent-Fix-It-Later-Bot. Same at #8. Additionally, RefBot allows the editor to opt-out of receiving RefBot talkpage-messages.
I support the idea of this bot existing with functionality similar to that of BracketBot. I have specific ideas for a different bot that would actually fix CS1 citation errors, but I will describe that functionality in a separate request.
As stated above by others, I do not think it would be productive to apply this new bot's activity to all of the subcategories of Category:Articles with incorrect citation syntax. That would generate a LOT of error messages on people's Talk pages, and some error messages are not even displayed on the article pages by default, so it will be hard for people to figure out where they made an error or if they have fixed it. I recommend starting with the following categories, each of which has been emptied through diligent work by wikignomes:
Also as requested above, the bot should operate on articles in:
I estimate that a total of 20 to 50 articles are added to all of the categories above (combined) each day; someone here might be able to scrub the logs and get a better count.
The bot should post a message similar to Bracketbot's message on the Talk page of the editor who makes the change. Since these categories are already empty, the situation described above in which a revert reintroduces an error should be a rare case.
Also, the bot should have a built-in waiting period (Bracketbot waits five minutes) to allow editors to fix errors themselves if they notice them. Please contact me if you need help writing the error notification text for each category. – Jonesey95 ( talk) 16:23, 6 November 2013 (UTC)
Examples of what the bot would generate from that
|
---|
On
User:Tesfazgi Teklezgi:
Please check these pages and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:14.139.160.4:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:98.230.108.226:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:Soetermans:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:Chrisd915:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:71.173.129.226:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:128.8.228.120:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) |
{{User:ReferenceBot/inform/top}}
, {{User:ReferenceBot/inform/middle}}
and {{User:ReferenceBot/inform/bottom}}
. See also
User:ReferenceBot.After a requested move and a move review, the page 30 Seconds to Mars was moved to Thirty Seconds to Mars, which is the official name of the band. After long discussions, we came to an end and all links to 30 Seconds to Mars should be replaced with Thirty Seconds to Mars. Please fix them if you can.-- 95.245.58.53 ( talk) 21:16, 11 November 2013 (UTC)
Thanks for your work. The same thing should be done for MTV Unplugged: 30 Seconds to Mars, Attack (30 Seconds to Mars song), Kings and Queens (30 Seconds to Mars song), Hurricane (30 Seconds to Mars song), Night of the Hunter (30 Seconds to Mars song), Search and Destroy (30 Seconds to Mars song), City of Angels (30 Seconds to Mars song), Do or Die (30 Seconds to Mars song).-- Earthh ( talk) 20:13, 13 November 2013 (UTC)
The Wayback Machine respects robots.txt across time. If a website has a robots.txt that permits archiving at one point, an editor could archive that page; a subsequent change to robots.txt on that site could lead to an inaccessible archive. For example:
South Park has link to
WebCite doesn't cause us problems in this way.
I believe a bot is required to repair these archive links. They can be detected by running a report against the database for all external links to archive.org, and for each link checking the link still works (will a HEAD command be sufficient?). Dead archiveurl links would need to be archived at webcite, or if the original link is unavailable then that need flagging with {{ dead}}. Josh Parris 02:22, 16 November 2013 (UTC)
I have a task for bots: It's needed to replace all diacritics Ş ş Ţ ţ with Ș ș Ț ț in articles from categories about Moldova and Romania. In romanian language correct are second variant, but in Windows XP is a bug and in place of them are those wrong ^ diacritics. I have corrected a part of articles about football, but they are more. Its needed an bot that also can rename pages, because sometimes diacritics (wrong) are in title. Examples:
I repeat that in romanian language Ş ş Ţ ţ does not exist. Those are turkish diacritics, so you can freely to run bot in category ″Moldova″ and ″Romania″ + all subcategories. Thanks. XXN ( talk) 14:49, 17 November 2013 (UTC)
Hi, I'm one of the Wikimedia org admins at mw:Google Code-in. We are looking for technical tasks that can be completed by students e.g. create/update a bot, improve its documentation... We also need mentors for these tasks. You can start simple with one mentor proposing one task, or you can use this program to organize a taskforce of mentors with the objective of getting dozens of small technical tasks completed. You can check the current Wikimedia tasks here. The program started on Monday, but there is still time to jump in. Give Google Code-in students a chance!-- Qgil ( talk) 16:11, 21 November 2013 (UTC)
hello, how do i go about getting a bot for my chatroom? — Preceding unsigned comment added by Hannsg8000 ( talk • contribs) 19:06, 21 November 2013 (UTC)
Hi all, I was recently granted a trial with my bot (see Mdann52 bot BRFA). However, it turned out that the script I was trying to use ( mw:Manual:Pywikibot/weblinkchecker.py) did not check links in-between ref tags, so was not very useful for the task as I first thought. As my python skills are not very good at the minute, can someone rewrite the script (or produce a version of it) that only checks links in-between ref tags (and possibly ignores any tagged with {{ dead link}}?) Thanks -- Mdann 52 talk to me! 13:39, 22 November 2013 (UTC)
There's currently a table of women physicists at User:Headbomb/sandbox2. If someone could code a bot to fetch the articles, and fill in the other columns, that would be nice and much appreciated.
For clarity, I've filled the first line of the table. The request is for a one-time run for now, but a weekly/monthly run could be done when at some point in the future when the table gets hosted as its permanent location. Feel free to do tests directly on my sandbox2. Headbomb { talk / contribs / physics / books} 18:01, 26 November 2013 (UTC)
The newly formed WikiProject Women artists could use a bot to add project banners to the talk pages of articles within certain categories. Gobōnobō + c 03:55, 28 November 2013 (UTC)
A bot that finds ban violations (e.g. editing someone's userpage when there is an interaction ban between the new editors, editing during a site ban, etc) and reports and possibly reverts them. 2Awwsome Tell me where I screwed up. See where I screwed up. 20:02, 26 November 2013 (UTC)
I want a bot for that automated or semi-automated for making repetitive edits that would be extremely tedious to do manually. repetitive tasks. for example adding the same category or template for a 1000 article. -- DIYAR DESIGN ( talk) 18:08, 27 November 2013 (UTC)
So you want a bot. What sort of bot? What do you want it to do? Idea is not well explained. 2Awwsome Tell me where I screwed up. See where I screwed up. 17:06, 28 November 2013 (UTC)
My mistake and apologies. — Preceding unsigned comment added by 71.222.78.246 ( talk) 22:38, 1 December 2013 (UTC)
A couple days ago, The Canadian Encyclopedia completely overhauled its website and, unfortunately, completely changed its URL format. This has broken over 5,500 links, but I think many of them could be repaired by a bot. The old url format was like http://www.thecanadianencyclopedia.com/index.cfm?PgNm=TCE&Params=A1ARTA0005015, while the new uses the article's title: http://www.thecanadianencyclopedia.ca/en/article/sir-andrew-macphail/. Would it be possible to have a bot check the title associated with a citation or external links entry and update with the proper URL where it can? I would imagine there would still be plenty of bare references and the like that we would still have to manually fix, but if a bot can take care of most of these, it would make the job manageable. Thanks! Reso lute 22:05, 29 November 2013 (UTC)
Some of you will know that main page images that are hosted at Commons are - or should be - protected at Commons by an adminbot there by adding them to a casade-protected page at Commons. This prevents alterations or fresh versions being uploaded there, while our local cascading protection of files in today's and tomorrow's main pages prevents local upload of images. However, there was this thread at Talk:Main Page recently:
user:KrinkleBot hasn't edited since 9 November, meaning there is no autocascade protection on Commons. Promoting admins, please do check image protection status and upload a local protected copy if you can't protect on Commons - recent TFA and TFP images have not been protected. Materialscientist ( talk) 02:59, 14 November 2013 (UTC)
- This is exactly why I've argued against relying upon KrinkleBot as a first-line file protection measure. It's a useful fallback (its intended purpose), but this isn't the first outage that's occurred (and it probably won't be the last). — David Levy 04:00, 14 November 2013 (UTC)
So it occurs to me that a useful adminbot task would be to check WP:Main Page/Tomorrow and Wikipedia:Main Page queue (perhaps even Template:Did you know/Queue) and usefully uploading local copies of images found there (including the source information and licence tag}, adding {{ Uploaded from Commons}}. Adminbot powers would be useful but not essential (a non-adminbot wouldn't be able to upload local copies of tomorrow's images since cascading protetion would have kicked in, but it would catch TFL/TFA/OTD images scheduled more than a day in advance. Thoughts / volunteers? Bencherlite Talk 23:55, 18 November 2013 (UTC)
Tag all entries in - http://tools.wmflabs.org/betacommand-dev/reports/Media_lacking_US_status_indication.txt
for inclusion in Category:All media files lacking a US status indication to be created.
This can either be done by a bot, or by tweaking templates. I prefer a mass tag run by a bot. Sfan00 IMG ( talk) 15:41, 7 December 2013 (UTC)
Copying from myself at Wikipedia:VPT#Bulk_change_of_domain_in_external_URLs:
I probably have added 50-100+ external URLs to policy and discussion spaces that link back to a personal domain, where I host my academic writings and datasets relevant to Wikipedia and collaborative security. This domain has now changed, and while there is an HTTP redirect in place, administrative policies dictate that will not survive forever. The file paths are constant. This is a touch painstaking manually. Is there a way to automate this? If so, is that solution limited to en.wp or is this something that can be done for all WMF properties (I know I have links on Wikimania wikis and Metawiki, at minimum)?
I am looking to change everything of the form, http://www.cis.upenn.edu/~westand to http://www.andrew-g-west.com. Based on the request history here, it seems like some functionality is in place to take care of this? IOs it worth your troubles? Thanks, West.andrew.g ( talk) 21:39, 11 December 2013 (UTC)
There is a sentiment among some users that PROD (not Sticky prod) is useless because anyone, including the creator, can simply remove the PROD tag. I've seen this expressed a few times recently in various fora. It can be pointed out that every week we successfully delete a few hundred pages through prod, so we know it works and they're not always removed, but it would be nice to see what the real statistics are – what percentage of taggings are successful and toher data about the process. To this end, I thought it would be a simple task to have a bot compile a list of prod taggings over some length of time, say one month. No human being could do this because they would miss all or many of the prods taggings that were placed and then removed within a short time, whereas a bot can simply, inhumanly, keep refreshing today's prod category, compare against a list its been compiling and add any new entry. That's the germ of the idea. A human at the end of the data gathering period can easily calculate a gross percentage of success by the number of red-linked and blue-linked and delve further to make sure deletions weren't by other methods but actual as a result of the prodding (if the bot couldn't do this as well). And there's lots of other data that could be gathered, which could be done through the bot if someone would be willing to set it up or by a human willing to spend the time, such as list how long after creation the prodding occurred, who removed, whether they were the creator, how long between tagging and removal, whether the creator was warned or not and I'm sure there are other interesting areas of inquiry I haven't thought of. Is this feasible? Feasible and easy? Feasible but too difficult to bother with? Anyone willing?-- Fuhghettaboutit ( talk) 00:21, 12 December 2013 (UTC)
Here's a bot that would be super-useful:
Search for links like the one at /info/en/?search=Wikipedia:WikiProject_Inline_Templates#Created Replace broken links links to since-archived discussions to the archived discussion: Replace
/info/en/?search=Wikipedia_talk:WikiProject_Inline_Templates#Fact_template_discusison_needs_comments with
Presumably, we'll need pilot runs, big runs, and ongoing maintenance runs. Anyone up for it?-- Elvey ( talk) 01:19, 12 December 2013 (UTC)
This is actually two requests. In hundreds, perhaps thousands, of articles, en.wikipedia.org is used instead of Wikilinking. In others it is used as a reference.
Could a bot be programmed to replace en.wikipedia.org in the text body with the link that was intended, per WP:WIKILINK?
Separately, could a bot be programmed so that when there any en.wikipedia.org within <ref></ref>, the whole lot is replaced with {{cn}}, per WP:CIRCULAR? Simply south.... .. eating lexicological sandwiches for just 7 years 19:45, 15 December 2013 (UTC)
Hi All, I am Dr. Noa Rappaport, scientific leader of the MalaCards database of human diseases. Following a suggestion by Andrew Su ( /info/en/?search=Wikipedia:WikiProject_Molecular_and_Cellular_Biology/Proposals#MalaCards_-_www.malacards.org) we were asked to write a bot that updates the disease box external references within disease entries in Wikipedia: /info/en/?search=User:ProteinBoxBot/Phase_3#Disease. We found it to be a non trivial task. Does anyone know of any such bot that exists or can help us write it ? Thanks. — Preceding unsigned comment added by Noa.rappaport ( talk • contribs) 10:22, 28 November 2013 (UTC)
I am the webmaster for SeacoastNH.com
The site is built in joomla and we use the extension SEFAdvance that used to use underscores (__) in links, but the new version doesn't allow underscores and instead uses dashes (-) in links.
Consequently many of the links and references on wiki that use the old underscored links now present 404 errors.
Could the bot go and find all links on wiki for seacoastnh.com that use underscores and convert them to dashes? — Preceding unsigned comment added by Adcetera692 ( talk • contribs) 15:29, 12 December 2013 (UTC)
Hello Wiki, I'd like to request for a tutorial guide on "creating and configure a bot" for Age of Wushu. Eg; Harvest, mining or Kidnapping bot, and etc. — Preceding unsigned comment added by 175.139.223.168 ( talk) 05:57, 20 December 2013 (UTC)
Following this RfC, orphan tags should be in the talk namespace now. Where in the talk namespace wasn't addressed, but I believe that below all the existing templates, but before the first section, should be OK. I rewrote the documentation that way. A bot should do the articles currently tagged, and possibly articles tagged in the future by editors unaware of the change. Ramaksoud2000 ( Talk to me) 02:08, 21 December 2013 (UTC)
I've noticed that moving a section from one article to another can break all incoming links to that section. So far, I haven't found any way to automatically redirect a section of one article to a section of another article. ( A comprehensive list of all broken section links can be found here - they are quite numerous, and there is not yet any automated solution for fixing them, as far as I know.)
@ GoingBatty: For example, a template {{anchor|Code readability|redirect=Computer programming#Code readability}} could be used to specify a section of an article that a section anchor would redirect to, and all incoming links to that anchor would be re-targeted by a bot. If this feature were implemented, it would make it much easier to re-target sections from one article to another. Jarble ( talk) 17:07, 21 December 2013 (UTC)
Following
this WP:VPT talk, and tipped by
Anomie (I guess Anomie picks up here).
The new taskforce is in
WP:MEDICINE:
Society_and_medicine.
From my talkpage
[13]:
If I could magically use bots, I'd use a bot to tag every article with the taskforce:
- any article simultaneously under WP:BIO and WP:MED
- any article simultaneously under WP:COMPANIES and WP:MED
That should net the majority of the articles we wish to catch. -- LT910001 ( talk) 15:38, 12 December 2013 (UTC)
The bot edit, I suggest:
{{WikiProject Medicine|...|society=yes|society-imp=<TBD>}}
{{WPMED|...|society=yes|society-imp=<TBD>}}
???
" or "mid
", ask taskforce members.Please do not contact me in this, I am just a middle man for the taskforce @ LT910001, Bluerasberry, and Jinkinson:. User:DePiep 14:18, 13 December 2013 (UTC)
|society=no
and the bot understands? -
DePiep (
talk) 17:11, 13 December 2013 (UTC)
There seems to be some interest from the other task forces in this bot, however I feel that it may be better to first get a functioning bot, and then add additional usage cases for the additional 10+ task-forces after it is functioning. If at a later date this could be expanded to multiple taskforces it would be extremely valuable for WPMED and I am sure many users would be very grateful. If I may add two additional cases, to a total of four:-- LT910001 ( talk) 01:54, 14 December 2013 (UTC)
Question: is it possible to tag articles that have certain categories? I worry the difficulty with that may be that categories have a cascading structure and may be difficult to implement -- LT910001 ( talk) 01:54, 14 December 2013 (UTC)
Hi Hasteur, how is the bot coding going? I understand in many countries the festive season has arrived, so I'll be happy to wait if you're busy, however this bot would be bery useful, so I'm enthusiastic about seeing it acutated. -- LT910001 ( talk) 03:39, 22 December 2013 (UTC)
Tagged under the 'society and medicine' task force:
these entries need to be linked to en.wikipedia.org/wiki/Ampelography in an automated way
Xb2u7Zjzc32 (
talk) 04:18, 23 December 2013 (UTC)
We may need some sort of bot or script at WT:OP#Proposal_to_unblock_indeffed_IPs_en_masse. In particular we would like to know which IPs are rangeblocked or globally blocked. Your input would be appreciated. Thanks. -- zzuuzz (talk) 10:57, 23 December 2013 (UTC)
Here's a bot that would be super-useful:
Search for links like the one at /info/en/?search=Wikipedia:WikiProject_Inline_Templates#Created Replace broken links links to since-archived discussions to the archived discussion: Replace
/info/en/?search=Wikipedia_talk:WikiProject_Inline_Templates#Fact_template_discusison_needs_comments with
Presumably, we'll need pilot runs, big runs, and ongoing maintenance runs. Anyone up for it?-- Elvey ( talk) 01:19, 12 December 2013 (UTC)
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 50 | ← | Archive 55 | Archive 56 | Archive 57 | Archive 58 | Archive 59 | Archive 60 |
The Wikipedia 1.0 project needs someone with experience working with large (en.wikipedia-size) MySQL databases. If interested, contact me or the WP 1.0 Editorial Team.— Wolfgang42 ( talk) 02:19, 20 October 2013 (UTC)
SELECT page_title,
IF ( rd_from = page_id,
rd_title,
/*ELSE*/IF (pl_from = page_id,
pl_title,
/*ELSE*/
NULL -- Can't happen, due to WHERE clause below
))
FROM page, redirect, pagelinks
WHERE (rd_from = page_id OR pl_from = page_id)
AND page_is_redirect = 1
AND page_namespace = 0 /* main */
ORDER BY page_id ASC;
— Wolfgang42 ( talk) 22:53, 23 October 2013 (UTC)
Try this and see if it works any better. -- WOSlinker ( talk) 06:00, 24 October 2013 (UTC)
SELECT page_title, COALESCE(rd_title, pl_title)
FROM page
LEFT JOIN redirect ON page_id = rd_from
LEFT JOIN pagelinks ON page_id = pl_from
WHERE page_is_redirect = 1
AND page_namespace = 0 /* main */
ORDER BY page_id ASC;
G'day, WPMILHIST has a quarterly awards system for editors that complete reviews (positive or negative) of articles that fall within the WikiProject. So far we (the project coordinators) have done this tallying manually, which is pretty labour-intensive. We have recently included GA reviews, and are having difficulty identifying negative GA reviews using the standard tools. We were wondering if someone could build a bot that could tally all FA, FL, A-Class, Peer and GA reviews of articles that fall within WikiProject Military history? In terms of frequency, we usually tally the points and hand out awards in the first week after the end of each quarter (first weeks in January, April, July and October), but it would be useful functionality to be able to run the bot as needed if that is possible. Regards, Peacemaker67 ( send... over) 23:36, 13 October 2013 (UTC)
20:11, 17 October 2013 > message A reference problem
05:19, 18 October 2013 > message A reference problem
11:19, 22 October 2013 > message A reference problem Two replys
The discussion is at Wikipedia talk:WikiProject Australian Roads/Shields, but in summary, there are sets of images transferred from Commons to here as {{ PD-ineligible-USonly}}. The user that moved the files (downloaded them from commons then uploaded them here) wants to remove his involvement due to potential legal issues in Australia. Under existing policy, revdel, oversight, and office actions are not appropriate. It was suggested that a bot could upload the same files under a different name and nominates the old ones for deletion per WP:CSD#F1. - Evad37 ( talk) 06:42, 26 October 2013 (UTC)
Can someone make a bot to mark all talk pages of Vital articles (all levels) with {{ VA}}, and fill out its parameters (level, class, topic) if possible. It should also remove such templates from non-VAs.
Ideally this should run on a regular basis, but even a one-off run would be very helpful. -- Ypnypn ( talk) 18:48, 28 October 2013 (UTC)
In this first paragraph, I will summarize my request: It would be good if someone could please create a bot which tags articles which were PRODded but survived (I shall call these "Survivors"). And/or which tags articles which were PROD-deleted then recreated (I shall call these "Recreated Articles"). You may tag them with {{ old prod full}}. You may leave all the template's parameters blank, or you may fill some in.
Rationale: Such tags warn us not to re-add another PROD tag. They also make it more obvious to us that perhaps we should consider nominating the page for WP:AfD.
Here are some things you could do, but which I don't recommend: You could download a database dump with history, parse it, and look for Survivors. But such a dump is over ten terabytes of XML once uncompressed. [1] You could download the dump of all logs, download the dump of all page titles, parse the two, and look for Recreated Articles. User:Tim1357 tried parsing a dump, [2], but he didn't succeed: the matter is still on the latest revision of his to-do list. I suspect it may not be worth attempting either of these difficult tasks.
Here is what I do recommend: It would be worthwhile to create a bot to watch Category:Proposed deletion and tag future Survivors. And to watch for new pages and tag Recreated Articles. User:Abductive suggests some optional refinements. [3]
It would be good if someone could please start writing a bot to do either or both of these tasks. It would be even better if they could provide us with a link to their code-in-progress. User:Kingpin13 and User:ThaddeusB have expressed interest, [4] but nobody seems to have actually written any code to do these tasks on the live Wikipedia.
User:Rockfang started tagging Survivors in 2008 using AWB (the wrong tool for the job) but later stopped. S/he wrote that s/he "got distracted".
AnomieBOT already does one related task. If an article is AfDed, then recreated, AnomieBOT's NewArticleAFDTagger task puts {{ old AfD multi}} on that article's talk page. The task's open-source [5] code is here. Maybe you could build on it, and maybe you could even ask Anomie to run it for you. Dear User:Anomie: Do you know if you or any bot ever tagged the pages which were recreated in the years before you wrote your bot?
Cheers, — Unforgettableid ( talk) 04:32, 16 October 2013 (UTC)
Per WP:BRINT, redirects are undesirable in templates. Currently after a page move, bots (bless their hearts) sweep up all of the broken or double redirects etc., but the links in templates are left untouched. For instance, a page was moved from here to here in January but the accompanying template was not updated until today. Is is possible for a bot to fix redirects on templates that are on a page that is moved? Rgrds. -- 64.85.216.235 ( talk) 05:51, 4 November 2013 (UTC)
Bot for Star Wars articles needed, maybe? Might help monitor changes. 20-13-rila ( talk) 11:19, 5 November 2013 (UTC)
We have WP:FANMP (a list of FAs yet to appear on the main page) and WP:WBFAN (a list of FAs and former FAs by nominator). Can someone think of a way to produce a hybrid for me, i.e. a list of FAs yet to appear on the main page by nominator? Bencherlite Talk 20:30, 5 November 2013 (UTC)
It seems that the articles have been moved to http://community.amd.com and http://developer.amd.com. I think all links to http://blogs.amd.com should be marked with {{ dead link}} at least. Please fix them semi-automatically if you can. -- 4th-otaku ( talk) 12:35, 4 November 2013 (UTC)
User:Femto Bot used to populate Wikipedia:WikiProject Medicine/Recent changes which in turn updated Special:RecentChangesLinked/Wikipedia:WikiProject Medicine/Recent changes. I think that's how it worked. It reported all changes to pages with {{ WPMED}} on their talk page. Anyway, it was an awesome tool for patrolling some of Wikipedia's most sensitive content. But since Rich Farmborough was banned from bot work it's stopped working - it only reports recent changes to pages beginning with "A".
This tool aims to do the same thing but it's slow and often times out, and when it works it's running a couple of days behind.
There was also Tim1357's tool, but his account has expired from the Toolserver.
I was wondering if somebody here would be able to provide WP:MED with something to replace these? With something like this a handful of experienced medical editors can effectively patrol all of Wikipedia's medical content. Without it, there's no telling what's happening. -- Anthonyhcole ( talk · contribs · email) 17:58, 4 November 2013 (UTC)
|RELC list1 namespace1=Article [space] + Talk [space]
, |RELC list2 namespace2=Template + template talk
, |other parameters like 1x/month=
. The template is invisible. Just like what
User:MiszaBot/config does on talkpages to archive.The "OLAC" (Open Language Archives Community) website has consistently helpful pages about resources for the languages of the world, especially the endangered and lesser-taught languages. The OLAC pages use a URL which ends with a three-letter code from the ISO 639-3 language code list, which is found in our language articles infobox. Each OLAC page has a nice descriptive title at the top, such as OLAC resources in and about the Aguaruna language.
Rather than adding several thousand OLAC page links to the External links sections of language articles by hand, couldn't we just write a bot to do this?
I know some languages have multiple language codes in their Wikipedia infobox, due to multiple dialects or language variants. Even if the bot didn't add links for languages with multiple codes, it would still be a big time-saver!
What do you think? Djembayz ( talk)
Without getting too deep into
tin foil territory, encrypting is one of many essential steps to ensure readers' privacy. Since October 24, 2013, the
Internet Archive now uses
HTTP Secure (https://
) by default
[8]. Just this week they updated their server software so it can handle
TLS 1.2, the latest version. It is safe to say they encourage their visitors to access their site using an encrypted connection.
In my opinion, Wikipedia should support this effort and switch all outgoing links to the Internet Archive to HTTPS. According to
Alexa, Wikipedia currently ranks fourth among upstream sites to archive.org
[9]. {{
Wayback}} was
already updated in that regard, but most of the links to the Wayback Machine are implemented in one of the many citation templates as encouraged at
WP:WBM. I started to fix a lot of those links manually, before realizing it would be a perfect job for a bot.
The Wayback Machine links have a common scheme, e.g. https://web.archive.org/web/20020930123525/http://www.wikipedia.org/
. So the task is this: find http://web.archive.org/web/
throughout the article namespace and replace with https://web.archive.org/web/
. That's it. --
bender235 (
talk) 20:51, 8 November 2013 (UTC)
http://archive.org/
to http://web.archive.org/
, which would indeed change nothing. But switching to https
changes the transport mechanism, from unencrypted to encrypted. Even tho it looks simple, it has significant consequences. --
bender235 (
talk) 21:17, 8 November 2013 (UTC)
Would it be possible to import those 3 standardized codes into the professions infoboxes ? --Teolemon
The individual occupation items don't have yet any SOC codes associated with them, but they are in broad occupation categories in enwiki that should make it easier to match:
Here's the list of SOC codes for matching with the existing items.
— Preceding unsigned comment added by 2A01:E35:2EA8:950:5BF:1AF3:3374:F3D0 ( talk • contribs)
copied from WP:VPT -- Frze > talk 07:17, 18 October 2013 (UTC)
DPL bot and
BracketBot are the best inventions of Wikipedia. It's time for a new Bot. We need the
If a user contributes a broken reference name, an incorrect ref formating (or a missing reflist), please inform the user who caused this error. It is so outrageously hard work to correct all these errors afterwards, from someone who is not holding the factual knowledge. For example: it took me a week to work up the backlog of Category:Pages with broken reference names - more than 1500 items, some disregarded more than two years. Search with WikiBlame for first entry of ref, making the changes, inform the users... annoying. Thank you very much. -- Frze > talk 12:25, 17 October 2013 (UTC)
== A reference problem ==
Hi [[User:SpidErxD|SpidErxD]]! Some users have been working hard on [[:Category:Pages with broken reference names]].
[https://en.wikipedia.org/?title=Nuclear_program_of_Iran&diff=577623223&oldid=577620891 Here] you added new references '''ref name=OPBW''' and '''ref name="status"''' but didn't define it. This has been showing as an error at the bottom of the article. <small>'''''Cite error: The named reference was invoked but never defined.'''''</small> Can you take a look and work out what you were trying to do? Thanks --User:REFBot
"Take a look at the page XYZ. There is a citation error. It could be in the text:
- or take a look at the bottom of the page:
Thanks, RefBot talk 10:05, 21 October 2013 (UTC)"
"It might be hard to eliminate false positves though."There might be a small problem with valid checking if templates are present in the article. I've seen a fair share of error messages in articles that resulted from an edit to a template and not an edit to the article itself. The error message still shows up in the article. i.e. if a user adds a citation to the template and there isn't a
{{Reflist}}
template in the article. The bot would have to check for that I assume. —
JJJ (
say hello) 15:15, 22 October 2013 (UTC)Yesterday's mistakes
|
---|
Category:Pages with DOI errors
|
Yesterday's Mistakes
|
---|
:
Arjayay
edited
Transcendental Meditation technique causing Category:Pages with ISBN errors, Category:Pages with citations using unsupported parameters, Category:Pages using citations with old-style implicit et al., Category:Pages using citations with accessdate and no URL
|
A930913 TheJJJunk I'm looking forward with happy anticipation to the implementation of my idea. Thanks to you all. -- Frze > talk 04:28, 30 October 2013 (UTC)
- Forcible-Prevention-Filter. RefBot should be implemented as an edit-filter, and immediately warn the user when they preview or save a busted ref, refusing to let them save it in a broken state (they must fix it first)
- Loud-Warning-Filter. RefBot should be implemented as an edit-filter, and immediately warn the user when they preview or save a busted ref, but permit the user to override and save anyways (in the broken state)... then nothing
- Silent-Warning-Filter. Same as #2. Additionally, RefBot allows the editor to opt-out of receiving RefBot filter-warnings.
- Loud-Fix-It-Later-Filter. RefBot should be implemented as an edit-filter, and immediately warn the user when they preview or save a busted ref, but permit the user to override and save anyways (in the broken state)... however, after their edit is saved, RefBot should rollback that one edit (not rollback the last N edits by the editor in question), and then RefBot should automagically post to the article-talkpage with a diff-link to the attempted-ref-edit that it just reverted
- Silent-Fix-It-Later-Filter. Same as #4. Additionally, RefBot allows the editor to opt-out of receiving RefBot filter-warnings.
- Loud-Warning-Bot. RefBot should be implemented as a bot, and eventually warn the editor on their talkpage, but should leave the article alone (no opt-out capability)
- Silent-Warning-Bot. Same at #6. Additionally, RefBot allows the editor to opt-out of receiving RefBot talkpage-messages.
- Loud-Fix-It-Later-Bot. RefBot should be implemented as a bot, and eventually warn the editor on their talkpage, plus RefBot should rollback that one edit (not rollback the last N edits by the editor in question), and then RefBot should automagically post to the article-talkpage with a diff-link to the attempted-ref-edit that it just reverted. Plus, ideally, RefBot's user-talkpage-message should have a one-click-to-put-my-broken-edit-back hyperlink, which also redirects the editor to the article (this prevents them from needing to manually visit the article, enter the edit-history, manually undo RefBot, and then go back to editing the article). Since the editor might not utilize the one-click-magic 'soon' by standards of how quickly the article in question is changing, prolly the one-click-magic should only work if the sub-section of the article in question has *not* been changed by any editors, since RefBot reverted this editor's work; otherwise, the one-click-magic might do more harm than good.
- Silent-Fix-It-Later-Bot. Same at #8. Additionally, RefBot allows the editor to opt-out of receiving RefBot talkpage-messages.
I support the idea of this bot existing with functionality similar to that of BracketBot. I have specific ideas for a different bot that would actually fix CS1 citation errors, but I will describe that functionality in a separate request.
As stated above by others, I do not think it would be productive to apply this new bot's activity to all of the subcategories of Category:Articles with incorrect citation syntax. That would generate a LOT of error messages on people's Talk pages, and some error messages are not even displayed on the article pages by default, so it will be hard for people to figure out where they made an error or if they have fixed it. I recommend starting with the following categories, each of which has been emptied through diligent work by wikignomes:
Also as requested above, the bot should operate on articles in:
I estimate that a total of 20 to 50 articles are added to all of the categories above (combined) each day; someone here might be able to scrub the logs and get a better count.
The bot should post a message similar to Bracketbot's message on the Talk page of the editor who makes the change. Since these categories are already empty, the situation described above in which a revert reintroduces an error should be a rare case.
Also, the bot should have a built-in waiting period (Bracketbot waits five minutes) to allow editors to fix errors themselves if they notice them. Please contact me if you need help writing the error notification text for each category. – Jonesey95 ( talk) 16:23, 6 November 2013 (UTC)
Examples of what the bot would generate from that
|
---|
On
User:Tesfazgi Teklezgi:
Please check these pages and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:14.139.160.4:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:98.230.108.226:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:Soetermans:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:Chrisd915:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:71.173.129.226:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) On
User:128.8.228.120:
Please check this page and fix the errors highlighted. If you think this is a false positive, you can report it to my operator. Thanks, 930913( Congratulate) 16:26, 7 November 2013 (UTC) |
{{User:ReferenceBot/inform/top}}
, {{User:ReferenceBot/inform/middle}}
and {{User:ReferenceBot/inform/bottom}}
. See also
User:ReferenceBot.After a requested move and a move review, the page 30 Seconds to Mars was moved to Thirty Seconds to Mars, which is the official name of the band. After long discussions, we came to an end and all links to 30 Seconds to Mars should be replaced with Thirty Seconds to Mars. Please fix them if you can.-- 95.245.58.53 ( talk) 21:16, 11 November 2013 (UTC)
Thanks for your work. The same thing should be done for MTV Unplugged: 30 Seconds to Mars, Attack (30 Seconds to Mars song), Kings and Queens (30 Seconds to Mars song), Hurricane (30 Seconds to Mars song), Night of the Hunter (30 Seconds to Mars song), Search and Destroy (30 Seconds to Mars song), City of Angels (30 Seconds to Mars song), Do or Die (30 Seconds to Mars song).-- Earthh ( talk) 20:13, 13 November 2013 (UTC)
The Wayback Machine respects robots.txt across time. If a website has a robots.txt that permits archiving at one point, an editor could archive that page; a subsequent change to robots.txt on that site could lead to an inaccessible archive. For example:
South Park has link to
WebCite doesn't cause us problems in this way.
I believe a bot is required to repair these archive links. They can be detected by running a report against the database for all external links to archive.org, and for each link checking the link still works (will a HEAD command be sufficient?). Dead archiveurl links would need to be archived at webcite, or if the original link is unavailable then that need flagging with {{ dead}}. Josh Parris 02:22, 16 November 2013 (UTC)
I have a task for bots: It's needed to replace all diacritics Ş ş Ţ ţ with Ș ș Ț ț in articles from categories about Moldova and Romania. In romanian language correct are second variant, but in Windows XP is a bug and in place of them are those wrong ^ diacritics. I have corrected a part of articles about football, but they are more. Its needed an bot that also can rename pages, because sometimes diacritics (wrong) are in title. Examples:
I repeat that in romanian language Ş ş Ţ ţ does not exist. Those are turkish diacritics, so you can freely to run bot in category ″Moldova″ and ″Romania″ + all subcategories. Thanks. XXN ( talk) 14:49, 17 November 2013 (UTC)
Hi, I'm one of the Wikimedia org admins at mw:Google Code-in. We are looking for technical tasks that can be completed by students e.g. create/update a bot, improve its documentation... We also need mentors for these tasks. You can start simple with one mentor proposing one task, or you can use this program to organize a taskforce of mentors with the objective of getting dozens of small technical tasks completed. You can check the current Wikimedia tasks here. The program started on Monday, but there is still time to jump in. Give Google Code-in students a chance!-- Qgil ( talk) 16:11, 21 November 2013 (UTC)
hello, how do i go about getting a bot for my chatroom? — Preceding unsigned comment added by Hannsg8000 ( talk • contribs) 19:06, 21 November 2013 (UTC)
Hi all, I was recently granted a trial with my bot (see Mdann52 bot BRFA). However, it turned out that the script I was trying to use ( mw:Manual:Pywikibot/weblinkchecker.py) did not check links in-between ref tags, so was not very useful for the task as I first thought. As my python skills are not very good at the minute, can someone rewrite the script (or produce a version of it) that only checks links in-between ref tags (and possibly ignores any tagged with {{ dead link}}?) Thanks -- Mdann 52 talk to me! 13:39, 22 November 2013 (UTC)
There's currently a table of women physicists at User:Headbomb/sandbox2. If someone could code a bot to fetch the articles, and fill in the other columns, that would be nice and much appreciated.
For clarity, I've filled the first line of the table. The request is for a one-time run for now, but a weekly/monthly run could be done when at some point in the future when the table gets hosted as its permanent location. Feel free to do tests directly on my sandbox2. Headbomb { talk / contribs / physics / books} 18:01, 26 November 2013 (UTC)
The newly formed WikiProject Women artists could use a bot to add project banners to the talk pages of articles within certain categories. Gobōnobō + c 03:55, 28 November 2013 (UTC)
A bot that finds ban violations (e.g. editing someone's userpage when there is an interaction ban between the new editors, editing during a site ban, etc) and reports and possibly reverts them. 2Awwsome Tell me where I screwed up. See where I screwed up. 20:02, 26 November 2013 (UTC)
I want a bot for that automated or semi-automated for making repetitive edits that would be extremely tedious to do manually. repetitive tasks. for example adding the same category or template for a 1000 article. -- DIYAR DESIGN ( talk) 18:08, 27 November 2013 (UTC)
So you want a bot. What sort of bot? What do you want it to do? Idea is not well explained. 2Awwsome Tell me where I screwed up. See where I screwed up. 17:06, 28 November 2013 (UTC)
My mistake and apologies. — Preceding unsigned comment added by 71.222.78.246 ( talk) 22:38, 1 December 2013 (UTC)
A couple days ago, The Canadian Encyclopedia completely overhauled its website and, unfortunately, completely changed its URL format. This has broken over 5,500 links, but I think many of them could be repaired by a bot. The old url format was like http://www.thecanadianencyclopedia.com/index.cfm?PgNm=TCE&Params=A1ARTA0005015, while the new uses the article's title: http://www.thecanadianencyclopedia.ca/en/article/sir-andrew-macphail/. Would it be possible to have a bot check the title associated with a citation or external links entry and update with the proper URL where it can? I would imagine there would still be plenty of bare references and the like that we would still have to manually fix, but if a bot can take care of most of these, it would make the job manageable. Thanks! Reso lute 22:05, 29 November 2013 (UTC)
Some of you will know that main page images that are hosted at Commons are - or should be - protected at Commons by an adminbot there by adding them to a casade-protected page at Commons. This prevents alterations or fresh versions being uploaded there, while our local cascading protection of files in today's and tomorrow's main pages prevents local upload of images. However, there was this thread at Talk:Main Page recently:
user:KrinkleBot hasn't edited since 9 November, meaning there is no autocascade protection on Commons. Promoting admins, please do check image protection status and upload a local protected copy if you can't protect on Commons - recent TFA and TFP images have not been protected. Materialscientist ( talk) 02:59, 14 November 2013 (UTC)
- This is exactly why I've argued against relying upon KrinkleBot as a first-line file protection measure. It's a useful fallback (its intended purpose), but this isn't the first outage that's occurred (and it probably won't be the last). — David Levy 04:00, 14 November 2013 (UTC)
So it occurs to me that a useful adminbot task would be to check WP:Main Page/Tomorrow and Wikipedia:Main Page queue (perhaps even Template:Did you know/Queue) and usefully uploading local copies of images found there (including the source information and licence tag}, adding {{ Uploaded from Commons}}. Adminbot powers would be useful but not essential (a non-adminbot wouldn't be able to upload local copies of tomorrow's images since cascading protetion would have kicked in, but it would catch TFL/TFA/OTD images scheduled more than a day in advance. Thoughts / volunteers? Bencherlite Talk 23:55, 18 November 2013 (UTC)
Tag all entries in - http://tools.wmflabs.org/betacommand-dev/reports/Media_lacking_US_status_indication.txt
for inclusion in Category:All media files lacking a US status indication to be created.
This can either be done by a bot, or by tweaking templates. I prefer a mass tag run by a bot. Sfan00 IMG ( talk) 15:41, 7 December 2013 (UTC)
Copying from myself at Wikipedia:VPT#Bulk_change_of_domain_in_external_URLs:
I probably have added 50-100+ external URLs to policy and discussion spaces that link back to a personal domain, where I host my academic writings and datasets relevant to Wikipedia and collaborative security. This domain has now changed, and while there is an HTTP redirect in place, administrative policies dictate that will not survive forever. The file paths are constant. This is a touch painstaking manually. Is there a way to automate this? If so, is that solution limited to en.wp or is this something that can be done for all WMF properties (I know I have links on Wikimania wikis and Metawiki, at minimum)?
I am looking to change everything of the form, http://www.cis.upenn.edu/~westand to http://www.andrew-g-west.com. Based on the request history here, it seems like some functionality is in place to take care of this? IOs it worth your troubles? Thanks, West.andrew.g ( talk) 21:39, 11 December 2013 (UTC)
There is a sentiment among some users that PROD (not Sticky prod) is useless because anyone, including the creator, can simply remove the PROD tag. I've seen this expressed a few times recently in various fora. It can be pointed out that every week we successfully delete a few hundred pages through prod, so we know it works and they're not always removed, but it would be nice to see what the real statistics are – what percentage of taggings are successful and toher data about the process. To this end, I thought it would be a simple task to have a bot compile a list of prod taggings over some length of time, say one month. No human being could do this because they would miss all or many of the prods taggings that were placed and then removed within a short time, whereas a bot can simply, inhumanly, keep refreshing today's prod category, compare against a list its been compiling and add any new entry. That's the germ of the idea. A human at the end of the data gathering period can easily calculate a gross percentage of success by the number of red-linked and blue-linked and delve further to make sure deletions weren't by other methods but actual as a result of the prodding (if the bot couldn't do this as well). And there's lots of other data that could be gathered, which could be done through the bot if someone would be willing to set it up or by a human willing to spend the time, such as list how long after creation the prodding occurred, who removed, whether they were the creator, how long between tagging and removal, whether the creator was warned or not and I'm sure there are other interesting areas of inquiry I haven't thought of. Is this feasible? Feasible and easy? Feasible but too difficult to bother with? Anyone willing?-- Fuhghettaboutit ( talk) 00:21, 12 December 2013 (UTC)
Here's a bot that would be super-useful:
Search for links like the one at /info/en/?search=Wikipedia:WikiProject_Inline_Templates#Created Replace broken links links to since-archived discussions to the archived discussion: Replace
/info/en/?search=Wikipedia_talk:WikiProject_Inline_Templates#Fact_template_discusison_needs_comments with
Presumably, we'll need pilot runs, big runs, and ongoing maintenance runs. Anyone up for it?-- Elvey ( talk) 01:19, 12 December 2013 (UTC)
This is actually two requests. In hundreds, perhaps thousands, of articles, en.wikipedia.org is used instead of Wikilinking. In others it is used as a reference.
Could a bot be programmed to replace en.wikipedia.org in the text body with the link that was intended, per WP:WIKILINK?
Separately, could a bot be programmed so that when there any en.wikipedia.org within <ref></ref>, the whole lot is replaced with {{cn}}, per WP:CIRCULAR? Simply south.... .. eating lexicological sandwiches for just 7 years 19:45, 15 December 2013 (UTC)
Hi All, I am Dr. Noa Rappaport, scientific leader of the MalaCards database of human diseases. Following a suggestion by Andrew Su ( /info/en/?search=Wikipedia:WikiProject_Molecular_and_Cellular_Biology/Proposals#MalaCards_-_www.malacards.org) we were asked to write a bot that updates the disease box external references within disease entries in Wikipedia: /info/en/?search=User:ProteinBoxBot/Phase_3#Disease. We found it to be a non trivial task. Does anyone know of any such bot that exists or can help us write it ? Thanks. — Preceding unsigned comment added by Noa.rappaport ( talk • contribs) 10:22, 28 November 2013 (UTC)
I am the webmaster for SeacoastNH.com
The site is built in joomla and we use the extension SEFAdvance that used to use underscores (__) in links, but the new version doesn't allow underscores and instead uses dashes (-) in links.
Consequently many of the links and references on wiki that use the old underscored links now present 404 errors.
Could the bot go and find all links on wiki for seacoastnh.com that use underscores and convert them to dashes? — Preceding unsigned comment added by Adcetera692 ( talk • contribs) 15:29, 12 December 2013 (UTC)
Hello Wiki, I'd like to request for a tutorial guide on "creating and configure a bot" for Age of Wushu. Eg; Harvest, mining or Kidnapping bot, and etc. — Preceding unsigned comment added by 175.139.223.168 ( talk) 05:57, 20 December 2013 (UTC)
Following this RfC, orphan tags should be in the talk namespace now. Where in the talk namespace wasn't addressed, but I believe that below all the existing templates, but before the first section, should be OK. I rewrote the documentation that way. A bot should do the articles currently tagged, and possibly articles tagged in the future by editors unaware of the change. Ramaksoud2000 ( Talk to me) 02:08, 21 December 2013 (UTC)
I've noticed that moving a section from one article to another can break all incoming links to that section. So far, I haven't found any way to automatically redirect a section of one article to a section of another article. ( A comprehensive list of all broken section links can be found here - they are quite numerous, and there is not yet any automated solution for fixing them, as far as I know.)
@ GoingBatty: For example, a template {{anchor|Code readability|redirect=Computer programming#Code readability}} could be used to specify a section of an article that a section anchor would redirect to, and all incoming links to that anchor would be re-targeted by a bot. If this feature were implemented, it would make it much easier to re-target sections from one article to another. Jarble ( talk) 17:07, 21 December 2013 (UTC)
Following
this WP:VPT talk, and tipped by
Anomie (I guess Anomie picks up here).
The new taskforce is in
WP:MEDICINE:
Society_and_medicine.
From my talkpage
[13]:
If I could magically use bots, I'd use a bot to tag every article with the taskforce:
- any article simultaneously under WP:BIO and WP:MED
- any article simultaneously under WP:COMPANIES and WP:MED
That should net the majority of the articles we wish to catch. -- LT910001 ( talk) 15:38, 12 December 2013 (UTC)
The bot edit, I suggest:
{{WikiProject Medicine|...|society=yes|society-imp=<TBD>}}
{{WPMED|...|society=yes|society-imp=<TBD>}}
???
" or "mid
", ask taskforce members.Please do not contact me in this, I am just a middle man for the taskforce @ LT910001, Bluerasberry, and Jinkinson:. User:DePiep 14:18, 13 December 2013 (UTC)
|society=no
and the bot understands? -
DePiep (
talk) 17:11, 13 December 2013 (UTC)
There seems to be some interest from the other task forces in this bot, however I feel that it may be better to first get a functioning bot, and then add additional usage cases for the additional 10+ task-forces after it is functioning. If at a later date this could be expanded to multiple taskforces it would be extremely valuable for WPMED and I am sure many users would be very grateful. If I may add two additional cases, to a total of four:-- LT910001 ( talk) 01:54, 14 December 2013 (UTC)
Question: is it possible to tag articles that have certain categories? I worry the difficulty with that may be that categories have a cascading structure and may be difficult to implement -- LT910001 ( talk) 01:54, 14 December 2013 (UTC)
Hi Hasteur, how is the bot coding going? I understand in many countries the festive season has arrived, so I'll be happy to wait if you're busy, however this bot would be bery useful, so I'm enthusiastic about seeing it acutated. -- LT910001 ( talk) 03:39, 22 December 2013 (UTC)
Tagged under the 'society and medicine' task force:
these entries need to be linked to en.wikipedia.org/wiki/Ampelography in an automated way
Xb2u7Zjzc32 (
talk) 04:18, 23 December 2013 (UTC)
We may need some sort of bot or script at WT:OP#Proposal_to_unblock_indeffed_IPs_en_masse. In particular we would like to know which IPs are rangeblocked or globally blocked. Your input would be appreciated. Thanks. -- zzuuzz (talk) 10:57, 23 December 2013 (UTC)
Here's a bot that would be super-useful:
Search for links like the one at /info/en/?search=Wikipedia:WikiProject_Inline_Templates#Created Replace broken links links to since-archived discussions to the archived discussion: Replace
/info/en/?search=Wikipedia_talk:WikiProject_Inline_Templates#Fact_template_discusison_needs_comments with
Presumably, we'll need pilot runs, big runs, and ongoing maintenance runs. Anyone up for it?-- Elvey ( talk) 01:19, 12 December 2013 (UTC)