This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 25 | Archive 26 | Archive 27 | Archive 28 | Archive 29 | Archive 30 | → | Archive 35 |
Many, many case citations link to case citation. This was probably done by a bot, and is completely useless, as case citation itself says to hit back and look at the previous page. If these can be linked to the actual cases, great, but otherwise please get rid of these useless, misleading links. I have talked to numerous people who absolutely hate these links and want them gone. However, I think they were put here by a bot, and only really a bot can remove them all. Scientus ( talk) 07:58, 12 May 2009 (UTC)
Yeah, those really are annoying. Of course we don't want to remove all links to case citation; just those that pipe the link, where the shown string starts with a number. Let me look at this. Doing... – Quadell ( talk) 12:35, 12 May 2009 (UTC)
Considering that our audience is mainly laypersons who may not be familiar with the formatting for legal citation, shouldn't at least one of the citations in an article (perhaps the first one) include the link? Jim Simmons ( talk) 15:30, 12 May 2009 (UTC)
I left a note at User talk:Quadell about this. I'd like to see more discussion about a large-scale change like this. At least something more concrete than, "I have talked to numerous people who absolutely hate these links and want them gone." ;-) -- MZMcBride ( talk) 21:28, 12 May 2009 (UTC)
The reason I see for why they might have been put there in the first place is to boost Wikipedia's already crazy-high SEO: having specific links that don't point to what their name suggests misleads not only people but also search engines. Scientus ( talk) 00:15, 13 May 2009 (UTC)
I've never liked having the first link in an article point outside of Wikipedia, which is what you have with external links to the full opinion in the article lead. A better practice might be to link to the article on the specific case reporter, which would go a longer way towards explaining the citation than the general case citation article. At this point we should have articles on all the main American ones, such as United States Reports, Federal Reporter, Federal Supplement...even the regional reporters for states, such as Pacific Reporter. Postdlf ( talk) 15:39, 13 May 2009 (UTC)
The reports were updated with data from the March dump. For the first time we have the full results on the various scans being run. For some we already have identified how these could be fixed by bot (e.g. by AWB or pywikipediabot), for others this still needs to be done. I'd appreciate your help. -- User:Docu
It still needs to be done, so... I don't know what the procedure is for dealing with that... Help? Headbomb { ταλκ κοντριβς – WP Physics} 03:31, 14 May 2009 (UTC)
Is it possible for a bot to go through the history?
For old archives that have a lot of unsigned posts, it would be nice to have a bot that would go through history and figure out where to use the {{ unsigned}} to sign the unsigned posts with name (or IP) and date/time.
Thanks,
-- stmrlbs| talk 05:47, 18 May 2009 (UTC)
Is there a bot that can take a list of titles, check to see which ones aren't already occupied, and then create pages for them with identical prespecified code? That would be really useful for mass de-redlinking of various types of prehistoric animals. Sorry if this is a stupid question, I have no idea how the whole bot thing works here on Wikipedia. Abyssal ( talk) 17:14, 19 May 2009 (UTC)
Oh, and while we're on the subject, could the bot used for this project, or another bot, be used to add project headers to the talk pages of the created articles? Abyssal ( talk) 18:02, 19 May 2009 (UTC)
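For what it's worth, the skeleton of such a bot is small. Below is a hedged sketch using pywikibot; the titles.txt file, the stub text and the {{WikiProject Palaeontology}} banner are all placeholders for whatever the project actually agrees on, and mass creation like this would of course still need a BRFA.
<pre>
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
# Placeholder boilerplate and banner, not agreed wording.
BOILERPLATE = "'''{title}''' is a prehistoric animal.\n\n{{{{paleo-stub}}}}"
BANNER = '{{WikiProject Palaeontology}}'

with open('titles.txt') as f:                      # one title per line
    titles = [line.strip() for line in f if line.strip()]

for title in titles:
    page = pywikibot.Page(site, title)
    if page.exists():
        continue                                   # title already occupied, skip it
    page.text = BOILERPLATE.format(title=title)
    page.save(summary='Creating stub from prespecified text (per bot request)')
    talk = page.toggleTalkPage()
    if not talk.exists():
        talk.text = BANNER
        talk.save(summary='Adding project banner (per bot request)')
</pre>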
Would it be a good idea for us to have a bot take care of the mundane task of updating Wikipedia:Good articles/recent with the titles of recently passed GA noms? This seems like a simple task, and we already have bots to update the articlehistory and {{ GA number}}. -- ErgoSum88 ( talk) 14:50, 10 May 2009 (UTC)
Is it possible for a bot to generate a list of all articles which are both
In other words, can a bot create a list of all redirects which are also tagged in their Talk space with a WP Films project banner?
Many thanks in advance, Girolamo Savonarola ( talk) 01:03, 16 May 2009 (UTC)
All Ohio townships have a section (identical for virtually all townships statewide) detailing their form of government; see the Government section of Madison Township, Richland County, Ohio for an example. You'll see that "...There is also an elected township clerk, who serves a four-year term...Vacancies in the clerkship or on the board of trustees..." is part of this text. The state legislature recently changed the title of this type of official from "clerk" to "fiscal officer"; could someone write a bot to convert "clerk" to "fiscal officer" in these exact strings of text? A few townships have already been updated, but there are well over 1,000 that haven't been updated yet. Nyttend ( talk) 01:08, 20 May 2009 (UTC)
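A minimal sketch of the replacement pass follows; the two patterns are illustrative only, and a real run should match the exact boilerplate sentences quoted above so nothing else is touched.
<pre>
# Illustrative only: the live run should use the exact sentences from the
# shared Government-section boilerplate, not these loose approximations.
REPLACEMENTS = [
    ('elected township clerk, who serves a four-year term',
     'elected township fiscal officer, who serves a four-year term'),
    ('Vacancies in the clerkship or on the board of trustees',
     'Vacancies in the fiscal officership or on the board of trustees'),
]

def update_government_section(wikitext):
    for old, new in REPLACEMENTS:
        wikitext = wikitext.replace(old, new)
    return wikitext
</pre>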
Can a bot notify the creator and contributors with x amount of edits of an AfD? This would be done by examining the wikipedia edit history of a page (or the below two webpages):
If this is not possible, can a bot be made to notify the creator of an article of an AfD?
For example, the first editor on a page is found here:
http://en.wikipedia.org/?title=Wikipedia:Bot_requests&dir=prev&action=history&limit=1
The list of today's afd's is found here: WP:AFDT
Ikip ( talk) 00:14, 3 May 2009 (UTC)
Is this related to Wikipedia:Bots/Requests for approval/CSDCheckBot? – Quadell ( talk) 12:57, 13 May 2009 (UTC)
I've started work on this. -- Erwin ( talk) 21:36, 21 May 2009 (UTC)
The bot's up for approval at Wikipedia:Bots/Requests for approval/Erwin85Bot 8. -- Erwin ( talk) 11:05, 22 May 2009 (UTC)
Request of 14 May 2009, by : — Sniff ( talk) 21:42, 14 May 2009 (UTC)
Request :
List all pages of the category Prince Edward Island and its subcategories, and show the corresponding French articles alongside them (cf. the example). Also, count the number of English articles and French articles. If there are any questions, feel free to ask. Thanks!
It's to help translate your articles into French (or just to prepare a list for translators).
I'm guessing that adding ":fr" to the front of the links that (I think) are in the French language Wikipedia will make this request make more sense, and have done so. I also note that the words "English" are links here, though gray; that's somewhat disconcerting. -- John Broughton (♫♫) 01:14, 16 May 2009 (UTC)
I'd like to request a bot for the Lojban wikipedia (English article on Lojban). The bot ought to do one or more of the following tasks:
text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==third lowest level heading== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ===second lowest level heading=== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ====lowest level headint==== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==third lowest level headint== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text
ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==ni'oni'oni'oni'o third lowest level heading== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ===ni'oni'oni'o second lowest level heading=== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ====ni'oni'o lowest level headint==== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==ni'oni'oni'oni'o third lowest level headint== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text
I know it may not be the most appropriate thing to ask here for a bot for another Wikipedia, but I didn't find anywhere else to do so.-- Homo logos ( talk) 15:57, 23 May 2009 (UTC)
I propose a task for finding pages without any categories and tagging them with {{subst:dated|uncategorized}}. Beagel ( talk) 19:14, 23 May 2009 (UTC)
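A rough sketch of what the core loop could look like with pywikibot; where the candidate pages come from (a dump scan, Special:UncategorizedPages, etc.) is left open, so the generator below is only an assumption.
<pre>
import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site('en', 'wikipedia')
# Assumption: walk mainspace pages; a dump scan or maintenance report
# would be a far cheaper source of candidates in practice.
gen = pagegenerators.AllpagesPageGenerator(site=site, namespace=0, total=500)

for page in gen:
    if page.isRedirectPage():
        continue
    if any(True for _ in page.categories()):
        continue                      # already categorized
    page.text = page.text.rstrip() + '\n\n{{subst:dated|uncategorized}}'
    page.save(summary='Tagging article with no categories (per bot request)')
</pre>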
In Bugzilla:18829 we learn that
however many created previous to that date lack border="1", and will be barely readable when cut and pasted outside Wikipedia, or by text-browser users.
By writing and deploying this bot, you will help Wikipedia better fulfil the Wikipedia:Accessibility#Tables goals for Wikipedia:Accessibility#Users_with_limited_CSS.2FJavaScript_support.
The tables in question, those with class wikitable, are all expected to have borders on them, as that is what is in the stylesheet. So all the author needs to do is check for the class, wikitable, and then double check whether there already is a border=... parameter. Only if not, go ahead and add border="1". (There is no need to also check for "before Aug 2008", as that is not an exact check anyway; people might have cloned the bad tables from elsewhere later.)
Also, tables without class wikitable do not have the problem (they look the same, borders or not, stylesheets or not), so their border= choices should be respected and not tampered with.
Jidanni ( talk) 14:13, 18 May 2009 (UTC)
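A sketch of the per-page logic described above (treat it as a starting point; it only rewrites the opening {| line of tables whose class list contains wikitable and which have no border= attribute at all):
<pre>
import re

# Matches the opening of a wiki table, e.g.  {| class="wikitable sortable"
TABLE_START = re.compile(r'^\{\|(?P<attrs>.*)$', re.MULTILINE)

def add_border(wikitext):
    def fix(match):
        attrs = match.group('attrs')
        if 'wikitable' not in attrs:             # leave non-wikitable tables alone
            return match.group(0)
        if re.search(r'\bborder\s*=', attrs):    # border already set; respect it
            return match.group(0)
        return '{|' + attrs + ' border="1"'
    return TABLE_START.sub(fix, wikitext)
</pre>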
Okay, who wants to write and run a bot to retag 1.6 million files? ;-)
See commons:Commons:License Migration Task Force. Dragons flight ( talk) 22:56, 27 May 2009 (UTC)
AMG has changed their URL format. Because of this, the links may soon no longer work. So can you search and change the two templates, removing the 1: and 2:?
changing them from:
This would be a GREAT help for us. Thanks. -- Phoenix ( talk) 04:31, 19 May 2009 (UTC)
Currently the movie templates are used like {{Amg movie |1:356351 |Quantum of Solace}}, which creates the link http://www.allmovie.com/cg/avg.dll?p=avg&sql=1:356351 which redirects to http://www.allmovie.com/work/356351. Luckily the link http://www.allmovie.com/work/1:356351 also pulls up the page just fine (at the moment), so if you change the template to link to http://www.allmovie.com/work/{{{1}}} it shouldn't break any links. Then someone (I can do it) could run AWB to remove 1: and 2: from the templates. – Quadell ( talk) 01:26, 20 May 2009 (UTC)
Thanks :-) -- Phoenix ( talk) 01:14, 28 May 2009 (UTC)
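For the AWB/regex step Quadell mentions, the find/replace boils down to something like the following sketch; it assumes the id is passed as the first unnamed parameter of the AMG templates, as in the example above, and matches any template whose name starts with "Amg".
<pre>
import re

# {{Amg movie|1:356351|Quantum of Solace}}  ->  {{Amg movie|356351|Quantum of Solace}}
AMG_ID = re.compile(r'(\{\{\s*Amg[ _][^|{}]*\|\s*)[12]:(?=\d)', re.IGNORECASE)

def strip_amg_prefix(wikitext):
    return AMG_ID.sub(r'\1', wikitext)
</pre>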
Some ticker symbols (e.g. JAKK) do not link to the companies they represent. I'd like to see a bot that would ensure that every company that had an article and a ticker symbol had its article linked (either as a redirect or on a disambiguation page) from the ticker symbol. Neon Merlin 05:00, 28 May 2009 (UTC)
Hi there, I often find myself obsessive-compulsively moving around <ref> tags because I believe they don't comply with what I think I once read are their usage guidelines.
So, if we denote our citation with "[1]", I believe the following applies (please kindly point me to the written and agreed-upon guidelines, because I can't find them any more):
1. Ref should go after punctuation
2. No whitespace between punctuation and ref, or between ref and ref.
Would this be something that could be done by a bot? Or does it fall into the "cosmetics" class, which I understand is not bottable? If this is possible and considered useful, please let me know. I could have a go and do it myself. Thank you. 114.150.83.231 ( talk) 18:49, 28 May 2009 (UTC)
Thanks. Could you please give me examples of the possible false positives you have in mind, so I can understand? Cheers. 114.150.83.231 ( talk) 18:59, 28 May 2009 (UTC)
I think at least automatically eating any whitespace to the left of any reference tag (item 2 above) should be fairly safe, no? 114.150.83.231 ( talk) 19:03, 28 May 2009 (UTC)
Thanks. It looks to me like there is a clear consensus for point 2 above (spaces). I'm new to bots. Could you kindly summarize (or point me to relevant discussion/policy) for the rationale behind "it would be a cleanup thing that isn't worth having a bot"? Thank you. 114.150.72.41 ( talk) 00:26, 29 May 2009 (UTC)
I'm not sure I understand. It's bad to have a bot do this, but nobody would object to users wasting their time making the same number of edits manually to fix the refs in compliance with agreed-upon guidelines. If server load is the problem, can we not throttle the bot to whatever level is perceived necessary? 114.150.72.41 ( talk) 02:27, 29 May 2009 (UTC) ...and as for the usefulness of such a bot, I guess it's subjective. To me, it would help make Wikipedia look more consistent and professional. 114.150.72.41 ( talk) 02:30, 29 May 2009 (UTC)
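For what it's worth, both rules reduce to a couple of regular-expression passes. A sketch follows; it is deliberately conservative and still not safe to run blind, since the Manual of Style allows exceptions such as refs placed before dashes.
<pre>
import re

REF = r'(?:<ref[^>/]*/>|<ref[^>]*>.*?</ref>)'

def tidy_refs(wikitext):
    # 2. no whitespace before a ref, or between adjacent refs
    wikitext = re.sub(r'[ \t]+(' + REF + r')', r'\1', wikitext, flags=re.DOTALL)
    # 1. move a ref that sits just before a period, comma, colon or semicolon after it
    wikitext = re.sub('(' + REF + r')([.,;:])', r'\2\1', wikitext, flags=re.DOTALL)
    return wikitext
</pre>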
I need a bot to go through the articles in Category:Mixed Drinks articles by quality and change any banner with the |focus=bar switch so that the banner will contain the |bar=yes switch.
e.g. {{WikiProject Mixed Drinks|focus=bar}} → {{WikiProject Mixed Drinks|bar=yes}}
The total number of articles is around three hundred (300), and about 30% have the |focus=bar switch.
Thank you for your time, -- Jeremy ( blah blah) 02:31, 25 May 2009 (UTC)
I'd love to do this using AWB, I even downloaded it but cannot figure out how to use it. -- Jeremy ( blah blah) 06:18, 25 May 2009 (UTC)
Thank you, that would be greatly appreciated. -- Jeremy ( blah blah) 14:30, 25 May 2009 (UTC)
{{Doing}} - I ran a filter through the list, so it's going through all the talk pages in the Category - a total of 298. As you say there are only about 100 find/replaces to be done, so it should go through the list and complete the task soon! :) The Helpful One 14:58, 25 May 2009 (UTC)
Could someone tag all of the articles under Category:Bartending with WPMIX |bar=yes? The reason this was a problem is that not all articles have the |bar= switch on them. -- Jeremy ( blah blah) 04:48, 27 May 2009 (UTC)
I thank you sir. -- Jeremy ( blah blah) 04:07, 30 May 2009 (UTC)
This is a large request and I would like to know if this can be done easily...
I would like to have a bot to go through the Category:Unassessed Food and drink articles and assess the stubs. The problem is there are 10,322 unassessed articles and I am afraid this would bog the system down.
Could this be done without disturbing the system?
-- Jeremy ( blah blah) 02:46, 25 May 2009 (UTC)
Thank you, if that would not be a bother, it would be appreciated. -- Jeremy ( blah blah) 04:44, 25 May 2009 (UTC)
Those are very good; most are dead on. There were a couple that were rated as Start that could be a C. The difference I use is the amount of content in the article. I would say go ahead and do that for us. -- Jeremy ( blah blah) 00:59, 26 May 2009 (UTC)
Thank you very much sir! -- Jeremy ( blah blah) 04:06, 30 May 2009 (UTC)
We need a bot to report cases of certain abuse filters being tripped. I watch several AFs where the only activity is by easy-to-spot long-term-abuse socks. But if I am not actively watching them and do not have abuse IRC open, then the socks just keep trying and trying until they find a way around the filter. If a bot monitored such AFs and immediately reported them to WP:AIV/TB2 - which is watchlisted by many admins - the socks would be blocked very quickly. Recommend the bot operator have sole discretion over which AFs are reported immediately, taking discussion/consensus into account in contentious cases. As a bonus, maybe other AFs are reported only if tripped a certain number of times - as happens in the abuse IRC. Thanks! Wknight94 talk 18:32, 30 May 2009 (UTC)
List all articles of the category Prince Edward Island and its subcategories alphabetically (no duplicates please) here. Thanks! — Sniff ( talk) 14:05, 31 May 2009 (UTC)
I'm requesting a bot to stick the {{ Longtalk}} template onto talk pages that are indeed excessively long. This should be a very, very simple bot to code. =D Smallman12q ( talk) 18:15, 31 May 2009 (UTC)
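It is indeed a small job; here is a sketch assuming pywikibot, a 75,000-byte threshold (that number is an assumption, not policy) and some external list of talk pages to feed it.
<pre>
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
THRESHOLD = 75000  # bytes; an assumed figure, not an agreed one

def tag_if_long(title):
    talk = pywikibot.Page(site, title)
    if len(talk.text.encode('utf-8')) < THRESHOLD:
        return
    if '{{longtalk' in talk.text.lower():
        return                          # already tagged
    talk.text = '{{Longtalk}}\n' + talk.text
    talk.save(summary='Tagging excessively long talk page (per bot request)')
</pre>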
Will one of the message delivery bots please deliver the message discussed at WP:VPP#NoMultiLicense template to the talk pages of each user using {{ NoMultiLicense}}, in preparation for the Wikimedia relicensing? There seems to be about 127 of them. Thanks. Anomie ⚔ 01:57, 2 June 2009 (UTC)
Hello! I was wondering if it might be possible to see a list of articles which are tagged for both WikiProject Films and WikiProject Biography, since the former's scope does not include biographies, but the project has a tough time tracking mistagged articles. A one-time generated list is fine; something on the toolserver which dynamically updates would be even better! :) Many thanks in advance, Girolamo Savonarola ( talk) 18:03, 2 June 2009 (UTC)
Could you tag the talk pages of these portals and their subpages as follows:
{{WikiProject Food and drink|drink=yes}}, and categorize the article pages as Category:Drink Portal
{{WikiProject Food and drink}}, and categorize the article pages as Category:Food Portal
{{WikiProject Wine}}, and categorize the article pages as Category:Wine Portal
{{WikiProject Beer}}, and categorize the article pages as Category:Beer Portal
{{WikiProject France}}, and categorize the article pages as Category:Lyon Portal
{{WikiProject Massachusetts}}, and categorize the article pages as Category:Massachusetts Portal
{{WikiProject Visual arts}}, and categorize the article pages as Category:Arts portals
Thank you for your time, -- Jeremy ( blah blah) 06:40, 1 June 2009 (UTC)
Sorry about not being more specific; here is what I would like to be done:
Is that a little more helpful? -- Jeremy ( blah blah) 21:58, 1 June 2009 (UTC)
A few more answers:
Are there any other questions you need me to answer? I want to make this as smooth as possible for you and will gladly help. -- Jeremy ( blah blah) 23:18, 1 June 2009 (UTC)
Thank you -- Jeremy ( blah blah) 05:09, 3 June 2009 (UTC)
Would a bot owner deliver the message at Wikipedia talk:Meetup/DC 7#Final announcement? (That page lists the three user categories for whom the message delivery is requested.)
Thanks!
-- John Broughton (♫♫) 15:59, 2 June 2009 (UTC)
A bot is needed to update Wikipedia:WikiProject Football/Unreferenced BLPs/Sorted by country by computing the intersections of each subcategory of Category:Football (soccer) players by nationality with Category:All unreferenced BLPs, per the discussion on my talk page. Though I'm a bot operator myself, this is beyond what I can easily handle. Erik9 ( talk) 23:51, 28 May 2009 (UTC)
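A sketch of the intersection step with pywikibot follows; the outer loop over the subcategories of Category:Football (soccer) players by nationality is omitted, and the example category name at the end is only illustrative.
<pre>
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

# Fetch the (large) unreferenced-BLP set once and reuse it for every country.
UNREFERENCED = {p.title() for p in
                pywikibot.Category(site, 'Category:All unreferenced BLPs').articles()}

def unreferenced_players(nationality_category):
    """Titles appearing in both the nationality category and the unreferenced-BLP set."""
    cat = pywikibot.Category(site, nationality_category)
    return sorted(p.title() for p in cat.articles() if p.title() in UNREFERENCED)

# e.g. unreferenced_players('Category:English footballers')
</pre>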
We need a bot to check subcategories of commons:Category:OTRS pending (the ones that are older than 30 days) and notify the uploaders of those files (using a template) and also replace the {{ OTRS pending}} tag on the file page with {{ No OTRS permission since}}. There are further details here. Feel free to ask any questions here, there, or on either of my talk pages. - Rjd0060 ( talk) 23:38, 2 June 2009 (UTC)
Hello people! I'm here to ask the English Wikipedia's community about importing a bot to the Portuguese Wikipedia. The bot is CorenSearchBot ( talk · contribs · count), which is operated by Coren ( talk · contribs · count), and it searches for copyright violations in articles. The user HyperBroad suggested it there, and the suggestion was approved ( link here). But I talked with Coren, and he told me that he doesn't have time and doesn't speak Portuguese, so he can't operate it. So he told me to write here to ask: does anybody want to operate it there? Coren will help and will share the source code if anyone wants. Vitorbraziledit ( talk) 02:59, 4 June 2009 (UTC)
It would be nice if a bot retrieved the |journal= parameter from {{ citation}} and {{ cite journal}} (probably using data dumps) and built a list of journals and journal abbreviations with the number of times they are found. This would be useful for Wikipedia:WikiProject Journals, so they could assess what are the high-priority missing journals, redirect to main articles, etc...
The list should be alphabetically ordered, with entries linked. Redirects should be italicized. Place the list at Wikipedia:WikiProject Academic Journals/Bot compilation/X1, where X is the appropriate letter. If articles start with The X, then classify according to X. A 500 entries per page limit would be a good idea (then go to X2, X3...). After this is done, any redlink with a count of over 10 hits (1 citation = 1 hit) should be placed at Wikipedia:WikiProject Academic Journals/Bot compilation/Missing articles and redirects/1. Again a 500 entries per page limit would be a good idea (then go to /2, /3, ...). Headbomb { ταλκ κοντριβς – WP Physics} 03:10, 30 May 2009 (UTC)
(unindent) Oh, and to the bot coder, if you could also place existing articles & redirect in bold (in addition to what I just wrote above), it would make it much easier to keep track of progress. Here's an example of what the end product should look like in case I'm not very clear.
Alphabetical | By hits |
Thanks. Headbomb { ταλκ κοντριβς – WP Physics} 15:07, 2 June 2009 (UTC)
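A sketch of the extraction-and-counting pass over a dump (building the alphabetical and by-hits subpages is then just formatting); the dump file name is a placeholder, and templates that spread their parameters over several lines would need a smarter parser than this line-by-line scan.
<pre>
import bz2
import re
from collections import Counter

JOURNAL = re.compile(
    r'\{\{\s*(?:citation|cite journal)\b[^{}]*?\|\s*journal\s*=\s*([^|{}]+)',
    re.IGNORECASE)

counts = Counter()
with bz2.open('enwiki-pages-articles.xml.bz2', 'rt', encoding='utf-8') as dump:
    for line in dump:
        for name in JOURNAL.findall(line):
            counts[name.strip()] += 1

for name, hits in counts.most_common():
    print('%d\t[[%s]]' % (hits, name))
</pre>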
Currently, WP Elements and the inactive WP Isotopes are using multiple banners with various parameters used in a more or less consistent manner to achieve various things. The request is sort of "three-step", although implementation doesn't have to be. These tables might prove useful.
{{WP Elements}} and {{V0.5}}. The parameters |class= and |importance= should be imported into {{WP Elements}} and be written if not present. The parameter |peer-review=yes should be imported if present, but otherwise should be removed.
{{WP Elements|isotope=yes}}. The parameters |class= and |importance= should be imported when possible, and be written if not present. The parameter |peer-review=yes should be imported if present, but otherwise should be removed. If the pages are redirects, they should be tagged with |class=redirect.
{{WP Elements|isotope=yes}}. The parameters |class= and |importance= should be imported when possible, and be written if not present. The parameter |peer-review=yes should be imported if present, but otherwise should be removed. If the pages are redirects, they should be tagged with |class=redirect.
{{WP Elements|class=category|importance=na}}, and the categories of Category:Isotopes of elementname should be tagged with {{WP Elements|class=category|importance=na|isotopes=yes}}.
{{WikiProjectNotice|PROJECT=XXX}}, where XXX contains the words "chemistry", "isotope/isotopes" or "element/elements", or combinations of them, should be removed, as they are redundant with the new banner.
That was a mouthful. Headbomb { ταλκ κοντριβς – WP Physics} 21:09, 6 June 2009 (UTC)
I'd like to request a bot that would go through the news and discover when links are made to wikipedia articles and create a press multi on the talk page. Ideally, the bot would scrape google news and see which articles contain a wikipedia link. Does anyone support this? Smallman12q ( talk) 20:09, 5 May 2009 (UTC)
(New indent) No, I suppose - I think I just assumed; well, I don't really know! (I was a bit rushed!) OK, I'll keep looking! dottydotdot ( talk) 07:27, 20 May 2009 (UTC)
Regardless, accessing Google News for links to Wikipedia via a bot would be a breach of their Terms of Service:
"You agree not to access (or attempt to access) any of the Services by any means other than through the interface that is provided by Google, unless you have been specifically allowed to do so in a separate agreement with Google. You specifically agree not to access (or attempt to access) any of the Services through any automated means (including use of scripts or web crawlers) and shall ensure that you comply with the instructions set out in any robots.txt file present on the Services." — Google Terms of Service, Section 5.3
You'd have to do it manually, as Google does not issue API keys anymore. Another option is to try another news search engine, such as Yahoo News, that does give out API keys. I ran into the same problem with EarwigBot II ( BRFA · contribs · actions log · block log · flag log · user rights), as did Coren with CorenSearchBot ( BRFA · contribs · actions log · block log · flag log · user rights). I hope that helps! The Earwig ( Talk | Editor review) 02:58, 25 May 2009 (UTC)
Still doing... Most of the code is done. I just have to finish writing two of the functions (query() and makechanges()), and then I'll put up a BRFA for it. The Earwig ( Talk | Editor review) 23:56, 30 May 2009 (UTC)
What if the bot gave {{press}} a parameter, such as {{press|bot=yes}}, identifying it as done automatically? Then, users viewing the talk page could notice it, and verify whether there was actually a citation on the page. Does anyone support this? The Earwig ( Talk | Editor review) 01:50, 1 June 2009 (UTC)
I'm almost done. One of the scripts is complete, while the other has two remaining elements to be coded before it will be ready. I should have been done sooner, but some unexpected work came up, and I didn't have time to finish the code. I'll definitely be done by the week's end. The Earwig ( Talk | Editor review) 20:01, 2 June 2009 (UTC)
I did a Google search for "wikipedia.org" for May and June... this is what I [http://news.google.com/news?pz=1&ned=us&hl=en&q=%22wikipedia.org%22&as_drrb=b&as_minm=6&as_mind=1&as_maxm=6&as_maxd=8 found].
There are many more that have a link title or are referred such as
I believe that either the Yahoo API isn't being used correctly, or simply that Yahoo has feeds from fewer sites. Perhaps you made a mistake somewhere. But on Google News and Bing News, I get results. I am still on a wikibreak, and have responded to show that you probably made a mistake somewhere, and as such, I hope you won't abandon the bot. Thanks. Smallman12q ( talk) 00:20, 8 June 2009 (UTC)
Would scraping Wikipedia's referer logs be of any use? — Dispenser 17:22, 8 June 2009 (UTC)
I think Mr.Z-man brings up an excellent point. I'm not willing to run a bot that has a potential to make edits with a 63% error rate— we really need to refine this first. To be less error-prone, per the statistics provided, {{ onlinesource}} might be a better template to use, even though {{ press}} is written with a broader audience in mind. Here's another idea: I didn't want to do this, because I absolutely despise this kind of thing when it comes to unsupervised bots, but what about if the bot simply produced a report in the userspace and didn't tag any talk pages at all? It would require a user to manually review each one, which I don't think will work at all, and may produce a backlog. It will, however, lower the error rate to virtually zero. What do we think of these ideas?
This is why I did not want to run this bot initally, because I knew there would be some annoying issues to sort through. Hopefully, there's some way to get this to work. I have the code, but there's still some things we need to do before this could work. Thanks, The Earwig ( Talk | Editor review) 21:57, 8 June 2009 (UTC)
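If the report-only idea wins out, the writing half is trivial; a sketch follows, assuming the search step has already produced (article, source URL) pairs, and with the report title being a placeholder.
<pre>
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def write_report(candidates, report_title='User:ExampleBot/Press report'):
    """candidates: iterable of (article_title, source_url) pairs from the search step."""
    lines = ['Articles that appear to be cited by news sources; please review before tagging.',
             '']
    for article, url in candidates:
        lines.append('* [[%s]]: %s' % (article, url))
    report = pywikibot.Page(site, report_title)
    report.text = '\n'.join(lines)
    report.save(summary='Updating press-coverage report (per bot request)')
</pre>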
As discussed here, it would be useful to see the evolution of the data from Special:Statistics. I'd like to request a bot regularly saving this data in Wikipedia space, so we can have regular data for each parameter, then create graphs, etc, with the exceptions of Founder, Stewards, Importers, Transwiki importers and Uploaders, as they are not changing or unused. The first eight parameters change more often, so they may need to be saved more often than others, say every day, while others maybe every week. Cenarium ( talk) 13:42, 27 May 2009 (UTC)
A daily snapshot is already being recorded by a bot on this page. The stats go back to November 2008. Not sure if that's what you're looking for, but there you go. -- Andrew Kelly ( talk) 03:47, 10 June 2009 (UTC)
Template:Startrekproject is used on a whole lot of WP:TREK pages, redirecting to the current template at Template:WikiProject Star Trek. If possible, can a bot replace the old template with the new one on all pages, but LEAVE the existing quality and importance ratings in place? -- Aatrek / TALK 18:15, 9 June 2009 (UTC)
…is very small.
Hello,
Is an English bot allowed to work in other Wikipedias? Please see here and here. Thanks.
Budelberger ( ) 15:09, 11 June 2009 (UTC).
Where can I find a bot that only grabs pages and searches text inside? For example, I have a list of 50 articles, and I want to see which ones contain „string”. So I need a bot that just reads, and does not save, does not even need to authenticate. Thanks Ark25 ( talk) 18:52, 11 June 2009 (UTC)
Thanks. Still, I need some tool to fetch the codes of the pages. Or can AWB save the content of pages to files? Sometimes I do not need to modify the pages, just need to see what is the code, and check which ones contain a certain string. Ark25 ( talk) 10:36, 15 June 2009 (UTC)
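For read-only checks like this you don't need a bot framework or an account at all; here is a sketch that fetches the raw wikitext of each title and looks for the string (the example titles at the end are arbitrary).
<pre>
import urllib.parse
import urllib.request

def pages_containing(titles, needle, lang='en'):
    hits = []
    for title in titles:
        url = ('https://%s.wikipedia.org/w/index.php?action=raw&title=%s'
               % (lang, urllib.parse.quote(title, safe='')))
        req = urllib.request.Request(url, headers={'User-Agent': 'page-search-script/0.1'})
        with urllib.request.urlopen(req) as resp:
            text = resp.read().decode('utf-8')
        if needle in text:
            hits.append(title)
    return hits

# e.g. pages_containing(['Lojban', 'Prince Edward Island'], 'string')
</pre>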
Does someone have a Bot that can go through the 100+ archives and add {{talkarchive}} and {{archive-nav|xx}} to the top of each page, where xx is the archive page number (without the extra markup - tl and br)? Most or even all of them already have talkarchive, few have archive-nav. I'm up to 19 but realized there may be an easier solution. 199.125.109.126 ( talk) 01:43, 14 June 2009 (UTC)
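For reference, a sketch of how the prepend could be scripted with pywikibot; the base title and the archive range are placeholders for the actual pages.
<pre>
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
BASE = 'Talk:Example/Archive %d'   # placeholder base title

for num in range(1, 101):
    page = pywikibot.Page(site, BASE % num)
    if not page.exists():
        continue
    text = page.text
    if '{{archive-nav' not in text.lower():
        text = '{{archive-nav|%d}}\n' % num + text
    if '{{talkarchive' not in text.lower():
        text = '{{talkarchive}}\n' + text
    if text != page.text:
        page.text = text
        page.save(summary='Adding {{talkarchive}}/{{archive-nav}} (per bot request)')
</pre>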
Done. Thanks Chris G. I don't know if you type fast or used a bot, but thanks. 199.125.109.126 ( talk) 03:40, 14 June 2009 (UTC)
Please make my bot called GameBot please. Thanks, WimpyKid ( talk) 11:45, 15 June 2009 (UTC)
Please make my bot "WorldBot"
It will help out newcomers, write on wikipedians' talk pages, and create new articles.
Thanks, WimpyKid ( talk) 00:49, 16 June 2009 (UTC)
Hello, can someone go through all of the articles in Category:Food and drink stubs and tag those articles that do not have the {{ WikiProject Food and drink}} banner on their talk page? Also, if there are any of these stubs that do have the banner but do not have a class assignment please add it (stub of course).
Thanks, Jeremy ( blah blah) 06:11, 16 June 2009 (UTC)
[I'm relisting this August 2008 request (including subsequent revisions), as the editor who said he would make the edits has not done so, nor replied to many enquiries as to progress (due at least in part to understandable family matters). Since there are hundreds of templates in need of this overdue change, and these are currently emitting broken microformats, the need may be considered pressing]
I've compiled a list of relevant infoboxes at User:Pigsonthewing/to-do#Date conversions.
Thank you. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 12:10, 16 June 2009 (UTC)
Hi, this template: {{ user lua}} was recently moved to {{ user LUA}} because the letters "lua" referred to a language code. However, the template was not updated on people's userpages; this affects the following people [13] and needs to be fixed. The full discussion can be found here. Thanks. -- Amazonien ( talk) 05:49, 17 June 2009 (UTC)
Could someone run a bot on articles found in Category:Needed-Class articles (and subcats) and update the ratings? It seems like most of them are redirects or disambigs. Or is this normal? Headbomb { ταλκ κοντριβς – WP Physics} 15:14, 17 June 2009 (UTC)
The {{ translated page}} template is often added incorrectly to articles, rather than to the talk page (in violation of WP:SELF). I've been fixing them manually when they appear, but if there's a bot that could do this automatically on a regular basis (say once or twice a week) it would save me the effort. This could probably be done by an existing template clean-up bots. The task would be as follows:
Are there any existing bot owners who are willing to add this task? — Tivedshambo ( t/ c) 06:50, 17 June 2009 (UTC)
This is a pretty simple request, and I could do it myself, but if memory serves, someone else already has an approved bot that does this.
The short version of the request is: I need a bot to go through the pages in Category:Possible cut-and-paste moves and remove said category from the page. All of the involved pages will be redirects. There's about 2,500 pages involved.
At Magioladitis's request, I had the bot tack on this category to any redirects, so that a person could go through and figure out whether or not they needed to be history merged. I since came up with a much better method (see User:Mikaey/Possible cut-and-paste moves), so now I just need a bot to go through and detag all these pages for me.
Thanks, Matt ( talk) 09:38, 18 June 2009 (UTC)
I don't usually like history merges in general. If there's any overlap between the two pages, the page history gets mangled, diffs become confusing and unreadable, etc. There's a core feature (disabled by default) that was written by one of Wikimedia's contractors to deal with split histories safely (see "#$wgGroupPermissions['sysop']['mergehistory'] = true;" in DefaultSettings.php). If possible, I'd much rather see that activated than an ad hoc bot solution. For what it's worth. Perhaps someone can convince me this is a good idea for a bot request, given appropriate safety measures.... -- MZMcBride ( talk) 02:46, 19 June 2009 (UTC)
I'd like a bot to go through all articles using {{ Nihongo}} and check for capitalization errors in the content of that template. For example, in Fukuyama University, the {{ nihongo}} template appears like this: {{nihongo|'''Fukuyama University'''|福山大学|Fukuyama daigaku}}. The bot would need to check the first parameter ('''Fukuyama University''') and check to see if the contents of the third parameter used the same capitalization. In this case, it is using "Fukuyama daigaku" instead of "Fukuyama Daigaku". The bot would change the capitalization of the contents of the third parameter if they did not match, in this case changing them to "Fukuyama Daigaku". I hope that makes sense. Can this be done? Thanks! ··· 日本穣 ? · Talk to Nihonjoe 21:34, 20 June 2009 (UTC)
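A sketch of the comparison logic only; it skips anything where the word counts of the English and romaji parameters differ, and a production version would need to cope with markup, particles and templates nested inside the parameters.
<pre>
import re

NIHONGO = re.compile(r"\{\{\s*nihongo\s*\|([^|{}]*)\|([^|{}]*)\|([^|{}]*)", re.IGNORECASE)

def fix_romaji_caps(wikitext):
    def fix(match):
        english = match.group(1).replace("'''", '').strip()
        romaji = match.group(3).strip()
        eng_words, rom_words = english.split(), romaji.split()
        if not rom_words or len(eng_words) != len(rom_words):
            return match.group(0)            # shapes differ; skip rather than guess
        fixed = ' '.join(rw[:1].upper() + rw[1:] if ew[:1].isupper() else rw
                         for ew, rw in zip(eng_words, rom_words))
        head = match.group(0)[:match.start(3) - match.start()]
        return head + match.group(3).replace(romaji, fixed)
    return NIHONGO.sub(fix, wikitext)
</pre>
For the Fukuyama University example above, this would turn "Fukuyama daigaku" into "Fukuyama Daigaku" while leaving the other parameters untouched.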
In the recent overhaul of WP:Elements templates and categories I've started creating redirects to main articles for each isotope, but this is very tedious, as there are lots of isotopes (thousands).
For example, Hydrogen-7 redirects to Isotopes of hydrogen#Hydrogen-7 and is categorized in Category:Isotopes of hydrogen. Such redirects need to be created for all isotopes found in the lists of isotopes (don't bother with metastable isotopes for now).
The text for creation is (pay attention to capitalization please)
#REDIRECT[[Isotopes of elementname#Elementname-XX]] [[Category:Isotopes of elementname]]
Here are the Elementnames and the range of XX that needs to be created.
More to come. Headbomb { ταλκ κοντριβς – WP Physics} 02:29, 18 June 2009 (UTC)
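A sketch of the creation loop with pywikibot; the element names and mass-number ranges in the dictionary are placeholders for the real list still to come.
<pre>
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

# Placeholder data; the real element names and mass-number ranges are
# the ones to be listed above.
ISOTOPE_RANGES = {'hydrogen': (1, 7), 'helium': (2, 10)}

for element, (low, high) in ISOTOPE_RANGES.items():
    for mass in range(low, high + 1):
        title = '%s-%d' % (element.capitalize(), mass)
        page = pywikibot.Page(site, title)
        if page.exists():
            continue
        page.text = ('#REDIRECT[[Isotopes of %s#%s]]\n[[Category:Isotopes of %s]]'
                     % (element, title, element))
        page.save(summary='Creating isotope redirect (per bot request)')
</pre>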
Hi, I have a bot request partially drafted at Wikipedia:WikiProject National Register of Historic Places/Botrequest2; I am hoping to get the NRHP wikiproject banner added to many articles, perhaps thousands, that currently lack it. Currently the wikiproject has about 20,000 articles tagged with its banner; I am thinking a bot could add several thousand more by addressing articles in certain categories and with certain titles. And I am hoping a bot could complete a day or two before July 4, the end date of a cleanup drive we have going. It is taking time to get all the categories stated and checked carefully for the bot request. I wonder if I could ask:
Any responses will be appreciated. doncram ( talk) 11:38, 21 June 2009 (UTC)
I'd like to make a simple Python script which, given two Wikipedia articles, attempts to find a way to get from one to the other via a series of intermediary links. It would not have to log in as a user; it only has to be able to browse Wikipedia pages normally, as a user would, and would be completely harmless in every way. —Preceding unsigned comment added by 124.170.152.177 ( talk • contribs) 14:44, 20 June 2009 (UTC)
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 25 | Archive 26 | Archive 27 | Archive 28 | Archive 29 | Archive 30 | → | Archive 35 |
Many, many case citations link to case citation. THis was probably done by a bot, and is completely useless as case citation itsself, says to hit back and look on the previous page. If these can be linked to the actual cases, great, but otherwise please get rid of these useless, misleading, links. I have talked to numerous people who absolutely hate these links and want them gone. However, i think they were put ehre by a bot, and only really a bot can remove them all. Scientus ( talk) 07:58, 12 May 2009 (UTC)
Yeah, those really are annoying. Of course we don't want to remove all links to case citation; just those that pipe the link, where the shown string starts with a number. Let me look at this. Doing... – Quadell ( talk) 12:35, 12 May 2009 (UTC)
Considering that our audience is mainly laypersons who may not be familiar with the formatting for legal citation, shouldn't at least one of the citations in an article (perhaps the first one) include the link? Jim Simmons ( talk) 15:30, 12 May 2009 (UTC)
I left a note at User talk:Quadell about this. I'd like to see more discussion about a large-scale change like this. At least something more concrete than, "I have talked to numerous people who absolutely hate these links and want them gone." ;-) -- MZMcBride ( talk) 21:28, 12 May 2009 (UTC)
The reason I see for why they might have been put there in the first place is to boost the already crazy-high Wikipedia SEO: having specific links that don't point to the context of their name mislead not only people but search engines. Scientus ( talk) 00:15, 13 May 2009 (UTC)
I've never liked having the first link in an article point outside of Wikipedia, which is what you have with external links to the full opinion in the article lead. A better practice might be to link to the article on the specific case reporter, which would go a longer way towards explaining the citation than the general case citation article. At this point we should have articles on all the main American ones, such as United States Reports, Federal Reporter, Federal Supplement...even the regional reporters for states, such as Pacific Reporter. Postdlf ( talk) 15:39, 13 May 2009 (UTC)
The reports were updated with data from the March dump. For the first time we have the full results on the various scans being run. For some we already have identified how these could be fixed by bot (e.g. by AWB or pywikipediabot), for others this still needs to be done. I'd appreciate your help. -- User:Docu
It still needs to be done, so... I don't know what's the procedure for dealing with that... Help? Headbomb { ταλκ κοντριβς – WP Physics} 03:31, 14 May 2009 (UTC)
Is it possible for a bot to go through the history?
For old archives that have a lot of unsigned posts, it would be nice to have a bot that would go through history and figure out where to use the {{ unsigned}} to sign the unsigned posts with name (or IP) and date/time.
Thanks,
-- stmrlbs| talk 05:47, 18 May 2009 (UTC)
Is there a bot that can take a list of titles, check to see which ones aren't already occupied, and then create pages for them with identical prespecified code? That would be really useful for mass de-redlinking of various types of prehistoric animals. Sorry if this is a stupid question, I have no idea how the whole bot thing works here on Wikipedia. Abyssal ( talk) 17:14, 19 May 2009 (UTC)
Oh, and while we're on the subject, could the bot used for this project, or another bot, be used to add project headers to the talk pages of the created articles? Abyssal ( talk) 18:02, 19 May 2009 (UTC)
Would it be a good idea for us to have a bot take care of the mundane task of updating Wikipedia:Good articles/recent with the titles of recently passed GA noms? This seems like a simple task, and we already have bots to update the articlehistory and {{ GA number}}. -- ErgoSum88 ( talk) 14:50, 10 May 2009 (UTC)
Is it possible for a bot to generate a list of all articles which are both
In other words, can a bot create a list of all redirects which are also tagged in their Talk space with a WP Films project banner?
Many thanks in advance, Girolamo Savonarola ( talk) 01:03, 16 May 2009 (UTC)
All Ohio townships have a section (identical for virtually all townships statewide) detailing their form of government; see the Government section of Madison Township, Richland County, Ohio for an example. You'll see that "...There is also an elected township clerk, who serves a four-year term...Vacancies in the clerkship or on the board of trustees..." is part of this text. The state legislature recently changed the title of this type of official from "clerk" to "fiscal officer"; could someone write a bot to convert "clerk" to "fiscal officer" in these exact strings of text? A few townships have already been updated, but there are well over 1,000 that haven't been updated yet. Nyttend ( talk) 01:08, 20 May 2009 (UTC)
Can a bot notify the creator and contributors with x amount of edits of an AfD? This would be done by examining the wikipedia edit history of a page (or the below two webpages):
If this is not possible, can a bot be made to notify the creator of an article of an AfD?
For example, the first editor on a page is found here:
http://en.wikipedia.org/?title=Wikipedia:Bot requests&dir=prev&action=history&limit=1
The list of today's afd's is found here: WP:AFDT
Ikip ( talk) 00:14, 3 May 2009 (UTC)
Is this related to Wikipedia:Bots/Requests for approval/CSDCheckBot? – Quadell ( talk) 12:57, 13 May 2009 (UTC)
I've started work on this. -- Erwin ( talk) 21:36, 21 May 2009 (UTC)
The bot's up for approval at Wikipedia:Bots/Requests for approval/Erwin85Bot 8. -- Erwin ( talk) 11:05, 22 May 2009 (UTC)
Request of 14 May 2009, by : — Sniff ( talk) 21:42, 14 May 2009 (UTC)
Request :
List all pages of your category
Prince Edward Island in subcategorys and show french articles with him (cf. Exemple) Also, count the number of English articles and French articles. If there are any questions, feel free. Thanks!
It's to translate your articles in French (or just to prepare a list for translators).
I'm guessing that adding ":fr" to the front of the links that (I think) are in the French language Wikipedia will make this request make more sense, and have done so. I also note that the words "English" are links here, though gray; that's somewhat disconcerting. -- John Broughton (♫♫) 01:14, 16 May 2009 (UTC)
I'd like to request a bot for the Lojban wikipedia (English article on Lojban). The bot ought to do one or more of the following tasks:
text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==third lowest level heading== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ===second lowest level heading=== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ====lowest level headint==== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==third lowest level headint== text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text
ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==ni'oni'oni'oni'o third lowest level heading== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ===ni'oni'oni'o second lowest level heading=== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ====ni'oni'o lowest level headint==== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text ==ni'oni'oni'oni'o third lowest level headint== ni'o text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text text
I know it may not be the most apropiate thing to ask here for a bot for another wikipedia, but didn't find anywhere else to do so.-- Homo logos ( talk) 15:57, 23 May 2009 (UTC)
I propose a task for finding and tagging with the tag {{subst:dated|uncategorized}} pages without any categories. Beagel ( talk) 19:14, 23 May 2009 (UTC)
In Bugzilla:18829 we learn that
however many created previous to that date lack border="1", and will barely readable when cut and pasted outside wikipedia, or by text browser users.
By writing and deploying this bot, you will help Wikipedia better fulfil its Wikipedia:Accessibility#Users_with_limited_CSS.2FJavaScript_support goals of Wikipedia:Accessibility#Tables.
The tables in question, those with class wikitable, are all expected to have borders on them, as that is what is in the stylesheet. So all the author needs to do is check for the class, wikitable, and then double check if there already is a border=... parameter. And then only if not, go ahead and add border="1". (There is no need to also check for "before Aug 2008", as that is not an exact check anyway, as people might have cloned the bad tables from elsewhere later.
Also tables without class wikitable do not have the problem (they look the same, borders or not, stylesheets or not.) so their border= choices should be respected and not tampered with.
Jidanni ( talk) 14:13, 18 May 2009 (UTC)
Okay, who wants to write and run a bot to retag 1.6 million files? ;-)
See commons:Commons:License Migration Task Force. Dragons flight ( talk) 22:56, 27 May 2009 (UTC)
AMG has changed their url format. Because of this the link may soon no longer work. So can you search and change the two templates removing the 1: and 2:.
changing them from:
This would be a GREAT help for us. Thanks. -- Phoenix ( talk) 04:31, 19 May 2009 (UTC)
Insert non-formatted text here
Currently the movie templates are used like {{Amg movie |1:356351 |Quantum of Solace}}, which creates the link http://www.allmovie.com/cg/avg.dll?p=avg&sql=1:356351 which redirects to http://www.allmovie.com/work/356351. Luckily the link http://www.allmovie.com/work/1:356351 also pulls up the page just fine (at the moment), so if you change the template to link to http://www.allmovie.com/work/{{{1}}} it shouldn't break any links. Then someone (I can do it) could run AWB to remove 1: and 2: from the templates. – Quadell ( talk) 01:26, 20 May 2009 (UTC)
Thanks :-) -- Phoenix ( talk) 01:14, 28 May 2009 (UTC)
Some ticker symbols (e.g. JAKK) do not link to the companies they represent. I'd like to see a bot that would ensure that every company that had an article and a ticker symbol had its article linked (either as a redirect or on a disambiguation page) from the ticker symbol. Neon Merlin 05:00, 28 May 2009 (UTC)
Hi there, I often find myself obsessive-compulsively moving around <ref> tags because I believe they don't comply to what I think I once read are their usage guidelines.
So, if we denote our citation with "[1]", I believe the following applies (please kindly point me to the written and agreed-upon guidelines because I can't find them any more)
1. Ref should go after punctuation
2. No whitespace between punctuation and ref, or between ref and ref.
Would this be something that could be done by a bot? Or does it fall into the "cosmetics" class, which I understand is not bottable? If this is possible and considered useful, please let me know. I could have a go and do it myself. Thank you. 114.150.83.231 ( talk) 18:49, 28 May 2009 (UTC)
Thanks. Could you please give me examples of the possible false positives you have in mind, so I can understand? Cheers. 114.150.83.231 ( talk) 18:59, 28 May 2009 (UTC)
I think at least automatically eating any whitespace to the left of any reference tag (item 2 above) should be fairly safe, no? 114.150.83.231 ( talk) 19:03, 28 May 2009 (UTC)
Thanks. It looks to me like there is a clear consensus for point 2 above (spaces). I'm new to bots. Could you kindly summarize (or point me to relevant discussion/policy) for the rationale behind "it would be a cleanup thing that isn't worth having a bot"? Thank you. 114.150.72.41 ( talk) 00:26, 29 May 2009 (UTC)
I'm not sure I understand. It's bad to have a bot do this, but nobody would object to users wasting their time effecting the same number of edits manually fixing the refs in compliance with agreed-upon guidelines. If server load is the problem, can we not throttle the bot to whatever level is perceived necessary? 114.150.72.41 ( talk) 02:27, 29 May 2009 (UTC) ...and as for the usefulness of such a bot, I guess it's subjective. To me, it would be helping making Wikipedia look more consistent and professional. 114.150.72.41 ( talk) 02:30, 29 May 2009 (UTC)
I need a bot to go through the articles in Category:Mixed Drinks articles by quality and change any banner with the |focus=bar switch so that the banner will contain the |bar=yes switch.
eg
{{
WikiProject Mixed Drinks|focus=bar}}
→ {{
WikiProject Mixed Drinks|bar=yes}}
The total number of articles is around three hundred (300) and about 30% have the |focus=bar switch
Thank you for your time, -- Jeremy ( blah blah) 02:31, 25 May 2009 (UTC)
I'd love to do this using AWB, I even downloaded it but cannot figure out how to use it. -- Jeremy ( blah blah) 06:18, 25 May 2009 (UTC)
Thank you, that would be greatly appreciated. -- Jeremy ( blah blah) 14:30, 25 May 2009 (UTC)
{{
Doing}}
- I ran a filter through the list, so it's going through all the talk pages in the Category - a total of 298. As you say there are only about 100 find/replaces to be done, it should go through the list and complete the task soon! :)
The
Helpful
One 14:58, 25 May 2009 (UTC)Could some one tag all of the articles under Category:Bartending with WPMIX |bar=yes? The reason this was the problem is that not all articles have the |bar= switch on them. -- Jeremy ( blah blah) 04:48, 27 May 2009 (UTC)
I thank you sir. -- Jeremy ( blah blah) 04:07, 30 May 2009 (UTC)
This is a large request and I would like to know if this can be done easily...
I would like to have a bot to go through the Category:Unassessed Food and drink articles and assess the stubs. The problem is there are 10,322 unassessed articles and I am afraid this would bog the system down.
Could this be done without disturbing the system?
-- Jeremy ( blah blah) 02:46, 25 May 2009 (UTC)
Thank you, if that would not be a bother, it would be appreciated. -- Jeremy ( blah blah) 04:44, 25 May 2009 (UTC)
Those are very good; most are dead on, though there were a couple rated as Start that could be a C. The difference I use is the amount of content in the article. I would say go ahead and do that for us. -- Jeremy ( blah blah) 00:59, 26 May 2009 (UTC)
Thank you very much sir! -- Jeremy ( blah blah) 04:06, 30 May 2009 (UTC)
We need a bot to report cases of certain abuse filters being tripped. I watch several AFs where the only activity is by easy-to-spot long-term-abuse socks. But if I am not actively watching them and do not have abuse IRC open, then the socks just keep trying and trying until they find a way around the filter. If a bot monitored such AFs and immediately reported them to WP:AIV/TB2 - which is watchlisted by many admins - the socks would be blocked very quickly. Recommend the bot operator have sole discretion over which AFs are reported immediately, taking discussion/consensus into account in contentious cases. As a bonus, maybe other AFs are reported only if tripped a certain number of times - as happens in the abuse IRC. Thanks! Wknight94 talk 18:32, 30 May 2009 (UTC)
List all articles alphabetically (no duplicates please) of the category Prince Edward Island and its subcategories here. Thanks! — Sniff ( talk) 14:05, 31 May 2009 (UTC)
I'm requesting a bot to stick the {{ Longtalk}} template onto talk pages that are indeed excessively long. This should be a very, very simple bot to code. =D Smallman12q ( talk) 18:15, 31 May 2009 (UTC)
Will one of the message delivery bots please deliver the message discussed at WP:VPP#NoMultiLicense template to the talk pages of each user using {{ NoMultiLicense}}, in preparation for the Wikimedia relicensing? There seems to be about 127 of them. Thanks. Anomie ⚔ 01:57, 2 June 2009 (UTC)
Hello! I was wondering if it might be possible to see a list of articles which are tagged for both WikiProject Films and WikiProject Biography, since the former's scope does not include biographies, but the project has a tough time tracking mistagged articles. A one-time generated list is fine; something on the toolserver which dynamically updates would be even better! :) Many thanks in advance, Girolamo Savonarola ( talk) 18:03, 2 June 2009 (UTC)
Could you tag the talk pages of these portals and their subpages as follows:
{{ WikiProject Food and drink|drink=yes}}, and categorize the article pages as Category:Drink Portal
{{ WikiProject Food and drink}}, and categorize the article pages as Category:Food Portal
{{ WikiProject Wine}}, and categorize the article pages as Category:Wine Portal
{{ WikiProject Beer}}, and categorize the article pages as Category:Beer Portal
{{ WikiProject France}}, and categorize the article pages as Category:Lyon Portal
{{ WikiProject Massachusetts}}, and categorize the article pages as Category:Massachusetts Portal
{{ WikiProject Visual arts}}, and categorize the article pages as Category:Arts portals
Thank you for your time, -- Jeremy ( blah blah) 06:40, 1 June 2009 (UTC)
Sorry about not being more specific; here is what I would like to be done:
Is that a little more helpful? -- Jeremy ( blah blah) 21:58, 1 June 2009 (UTC)
A few more answers:
Are there any other questions you need me to answer? I want to make this as smooth as possible for you and will gladly help. -- Jeremy ( blah blah) 23:18, 1 June 2009 (UTC)
Thank you -- Jeremy ( blah blah) 05:09, 3 June 2009 (UTC)
Would a bot owner deliver the message at Wikipedia talk:Meetup/DC 7#Final announcement? (That page lists the three user categories for whom the message delivery is requested.)
Thanks!
-- John Broughton (♫♫) 15:59, 2 June 2009 (UTC)
A bot is needed to update Wikipedia:WikiProject Football/Unreferenced BLPs/Sorted by country by computing the intersections of each subcategory of Category:Football (soccer) players by nationality with Category:All unreferenced BLPs, per the discussion on my talk page. Though I'm a bot operator myself, this is beyond what I can easily handle. Erik9 ( talk) 23:51, 28 May 2009 (UTC)
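In case it helps whoever picks this up, here is a rough sketch of the intersection step using the API (read-only; the real report would loop over every nationality subcategory, and the example subcategory name below is only an illustration):
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def category_members(cat, session):
    """Yield article titles in a category, following API continuation."""
    params = {"action": "query", "list": "categorymembers", "cmtitle": cat,
              "cmnamespace": 0, "cmlimit": "max", "format": "json"}
    while True:
        data = session.get(API, params=params).json()
        for page in data["query"]["categorymembers"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])

session = requests.Session()
unref_blps = set(category_members("Category:All unreferenced BLPs", session))
players = set(category_members("Category:English footballers", session))  # one example subcategory
print(len(unref_blps & players), "unreferenced BLPs among English footballers")
</syntaxhighlight>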
We need a bot to check subcategories of commons:Category:OTRS pending (the ones that are older than 30 days) and notify the uploaders of those files (using a template) and also replace the {{ OTRS pending}} tag on the file page with {{ No OTRS permission since}}. There are further details here. Feel free to ask any questions here, there, or on either of my talk pages. - Rjd0060 ( talk) 23:38, 2 June 2009 (UTC)
Hello people! I'm here to ask the English Wikipedia's community about importing a bot to the Portuguese Wikipedia. The bot is CorenSearchBot ( talk · contribs · count), operated by Coren ( talk · contribs · count), which searches for copyright violations in articles. The user HyperBroad suggested it there, and the suggestion was approved ( link here). But I talked with Coren, and he told me that he doesn't have time and doesn't speak Portuguese, so he can't operate it there. He asked me to write here and ask: does anybody want to operate it there? Coren will help and will share the source code with anyone who wants it. Vitorbraziledit ( talk) 02:59, 4 June 2009 (UTC)
It would be nice if a bot retrieved the |journal= parameter from {{ citation}} and {{ cite journal}} (probably using data dumps) and built a list of journals and journal abbreviations with the number of times they are found. This would be useful for Wikipedia:WikiProject Journals, so they could assess what are the high-priority missing journals, redirect to main articles, etc...
The list should be alphabetically ordered, with entries linked. Redirects should be italicized. Place the list at Wikipedia:WikiProject Academic Journals/Bot compilation/X1, where X is the appropriate letter. If articles start with The X, then classify according to X. A 500 entries per page limit would be a good idea (then go to X2, X3...). After this is done, any redlink with a count of over 10 hits (1 citation = 1 hit) should be placed at Wikipedia:WikiProject Academic Journals/Bot compilation/Missing articles and redirects/1. Again a 500 entries per page limit would be a good idea (then go to /2, /3, ...). Headbomb { ταλκ κοντριβς – WP Physics} 03:10, 30 May 2009 (UTC)
(unindent) Oh, and to the bot coder, if you could also place existing articles & redirect in bold (in addition to what I just wrote above), it would make it much easier to keep track of progress. Here's an example of what the end product should look like in case I'm not very clear.
[Example table: Alphabetical | By hits]
Thanks. Headbomb { ταλκ κοντριβς – WP Physics} 15:07, 2 June 2009 (UTC)
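A rough idea of how the tallying pass over a dump could work (a sketch only; the dump filename is a placeholder, and this simple pattern counts |journal= wherever it appears rather than strictly inside {{ citation}}/{{ cite journal}}, so it would need refining):
<syntaxhighlight lang="python">
import bz2
import re
from collections import Counter

journal_re = re.compile(r'\|\s*journal\s*=\s*([^|}\n]+)', re.IGNORECASE)
counts = Counter()

# Stream the pages-articles dump line by line and tally |journal= values.
with bz2.open("enwiki-pages-articles.xml.bz2", "rt", encoding="utf-8") as dump:
    for line in dump:
        for match in journal_re.finditer(line):
            counts[match.group(1).strip()] += 1

# The 20 most-cited journals, as a first look at the data.
for journal, hits in counts.most_common(20):
    print(hits, journal)
</syntaxhighlight>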
Currently, WP Elements and the inactive WP Isotopes are using multiple banners with various parameters used in a more or less consistent manner to achieve various things. The request is sort of "three-step", although implementation doesn't have to be. These tables might prove useful.
{{ WP Elements}} and {{ V0.5}}. The parameters |class= and |importance= should be imported into {{ WP Elements}} and be written if not present. The parameter |peer-review=yes should be imported if present, but otherwise should be removed.
{{ WP Elements|isotope=yes}}. The parameters |class= and |importance= should be imported when possible, and be written if not present. The parameter |peer-review=yes should be imported if present, but otherwise should be removed. If the pages are redirects, they should be tagged with |class=redirect.
{{ WP Elements|isotope=yes}}. The parameters |class= and |importance= should be imported when possible, and be written if not present. The parameter |peer-review=yes should be imported if present, but otherwise should be removed. If the pages are redirects, they should be tagged with |class=redirect.
{{ WP Elements|class=category|importance=na}} and the categories of Category:Isotopes of elementname should be tagged with {{ WP Elements|class=category|importance=na|isotopes=yes}}.
{{ WikiProjectNotice|PROJECT=XXX}}, where XXX contains the words "chemistry", "isotope/isotopes" or "element/elements", or combinations of them, should be removed, as they are redundant with the new banner.
That was a mouthful. Headbomb { ταλκ κοντριβς – WP Physics} 21:09, 6 June 2009 (UTC)
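To make the parameter-import part concrete, here is a minimal sketch of just that sub-step (the old banner name {{ WikiProject Isotopes}} is only an assumption for the example, not the actual spec):
<syntaxhighlight lang="python">
import re

def import_params(talk_text, old_banner="WikiProject Isotopes"):  # banner name assumed
    old = re.search(r'\{\{\s*' + re.escape(old_banner) + r'([^}]*)\}\}', talk_text)
    if not old:
        return talk_text
    # Pull |class= and |importance= out of the old banner.
    keep = {}
    for name, value in re.findall(r'\|\s*(class|importance)\s*=\s*([^|}\n]*)', old.group(1)):
        keep[name] = value.strip()
    # Write them into {{WP Elements}} only if not already present there.
    def merge(match):
        body = match.group(1)
        for name, value in keep.items():
            if not re.search(r'\|\s*' + name + r'\s*=', body):
                body += '|%s=%s' % (name, value)
        return '{{WP Elements' + body + '}}'
    return re.sub(r'\{\{\s*WP Elements([^}]*)\}\}', merge, talk_text)

print(import_params("{{WikiProject Isotopes|class=Stub|importance=Low}} {{WP Elements}}"))
# {{WikiProject Isotopes|class=Stub|importance=Low}} {{WP Elements|class=Stub|importance=Low}}
</syntaxhighlight>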
I'd like to request a bot that would go through the news and discover when links are made to wikipedia articles and create a press multi on the talk page. Ideally, the bot would scrape google news and see which articles contain a wikipedia link. Does anyone support this? Smallman12q ( talk) 20:09, 5 May 2009 (UTC)
(New indent) No, I suppose not - I think I just assumed; well, I don't really know! (I was a bit rushed!) OK, I'll keep looking! dottydotdot ( talk) 07:27, 20 May 2009 (UTC)
Regardless, accessing Google News for links to Wikipedia via a bot would be a breach of their Terms of Service:
"You agree not to access (or attempt to access) any of the Services by any means other than through the interface that is provided by Google, unless you have been specifically allowed to do so in a separate agreement with Google. You specifically agree not to access (or attempt to access) any of the Services through any automated means (including use of scripts or web crawlers) and shall ensure that you comply with the instructions set out in any robots.txt file present on the Services." — Google Terms of Service, Section 5.3
You'd have to do it manually, as Google does not issue API keys anymore. Another option is to try another news search engine, such as Yahoo News, that does give out API keys. I ran into the same problem with EarwigBot II ( BRFA · contribs · actions log · block log · flag log · user rights), as did Coren with CorenSearchBot ( BRFA · contribs · actions log · block log · flag log · user rights). I hope that helps! The Earwig ( Talk | Editor review) 02:58, 25 May 2009 (UTC)
Still doing... Most of the code is done. I just have to finish writing two of the functions (query() and makechanges()), and then I'll put up a BRFA for it. The Earwig ( Talk | Editor review) 23:56, 30 May 2009 (UTC)
We could give {{ press}} a parameter, such as {{ press|bot=yes}}, identifying it as done automatically. Then, users viewing the talk page could notice it, and verify whether there was actually a citation on the page. Does anyone support this? The Earwig ( Talk | Editor review) 01:50, 1 June 2009 (UTC)
I'm almost done. One of the scripts is complete, while the other has two remaining elements to be coded before it will be ready. I should have been done sooner, but some unexpected work came up, and I didn't have time to finish the code. I'll definitely be done by the week's end. The Earwig ( Talk | Editor review) 20:01, 2 June 2009 (UTC)
I did a Google News search for "wikipedia.org" for May and June... this is what I [ http://news.google.com/news?pz=1&ned=us&hl=en&q=%22wikipedia.org%22&as_drrb=b&as_minm=6&as_mind=1&as_maxm=6&as_maxd=8 found]
There are many more that have a link title or are referred to, such as
I believe that either the Yahoo API isn't being used correctly, or simply that Yahoo has feeds from fewer sites. Perhaps you made a mistake somewhere, but on Google News and Bing News I do get results. I am still on a wikibreak, and have responded to show that you probably made a mistake somewhere; as such, I hope you won't abandon the bot. Thanks. Smallman12q ( talk) 00:20, 8 June 2009 (UTC)
Would scraping Wikipedia's referer logs be of any use? — Dispenser 17:22, 8 June 2009 (UTC)
I think Mr.Z-man brings up an excellent point. I'm not willing to run a bot that has a potential to make edits with a 63% error rate— we really need to refine this first. To be less error-prone, per the statistics provided, {{ onlinesource}} might be a better template to use, even though {{ press}} is written with a broader audience in mind. Here's another idea: I didn't want to do this, because I absolutely despise this kind of thing when it comes to unsupervised bots, but what about if the bot simply produced a report in the userspace and didn't tag any talk pages at all? It would require a user to manually review each one, which I don't think will work at all, and may produce a backlog. It will, however, lower the error rate to virtually zero. What do we think of these ideas?
This is why I did not want to run this bot initally, because I knew there would be some annoying issues to sort through. Hopefully, there's some way to get this to work. I have the code, but there's still some things we need to do before this could work. Thanks, The Earwig ( Talk | Editor review) 21:57, 8 June 2009 (UTC)
As discussed here, it would be useful to see the evolution of the data from Special:Statistics. I'd like to request a bot regularly saving this data in Wikipedia space, so we can have regular data for each parameter, then create graphs, etc, with the exceptions of Founder, Stewards, Importers, Transwiki importers and Uploaders, as they are not changing or unused. The first eight parameters change more often, so they may need to be saved more often than others, say every day, while others maybe every week. Cenarium ( talk) 13:42, 27 May 2009 (UTC)
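For reference, the underlying numbers are also available from the API, so the bot itself would be small; a minimal sketch of taking one snapshot (appending the row to a page in Wikipedia space is left out):
<syntaxhighlight lang="python">
import datetime
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {"action": "query", "meta": "siteinfo", "siprop": "statistics",
          "format": "json", "formatversion": 2}
stats = requests.get(API, params=params).json()["query"]["statistics"]

# One wiki-table row per day: date, pages, articles, edits, users, active users, admins.
today = datetime.date.today().isoformat()
row = "|-\n| {} || {pages} || {articles} || {edits} || {users} || {activeusers} || {admins}".format(
    today, **stats)
print(row)
</syntaxhighlight>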
A daily snapshot is already being recorded by a bot on this page. The stats go back to November 2008. Not sure if that's what you're looking for, but there you go. -- Andrew Kelly ( talk) 03:47, 10 June 2009 (UTC)
Template:Startrekproject is still in place on a whole lot of WP:TREK pages, redirecting to the current template at Template:WikiProject Star Trek. If possible, can a bot replace the old template with the new one on all pages, but LEAVE the existing quality and importance ratings in place? -- Aatrek / TALK 18:15, 9 June 2009 (UTC)
…is very small.
Hello,
Is an English bot allowed to work on other Wikipedias? Please see here and here. Thanks.
Budelberger ( ) 15:09, 11 June 2009 (UTC).
Where can I find a bot that only grabs pages and searches text inside? For example, I have a list of 50 articles, and I want to see which ones contain „string”. So I need a bot that just reads, and does not save, does not even need to authenticate. Thanks Ark25 ( talk) 18:52, 11 June 2009 (UTC)
Thanks. Still, I need some tool to fetch the codes of the pages. Or can AWB save the content of pages to files? Sometimes I do not need to modify the pages, just need to see what is the code, and check which ones contain a certain string. Ark25 ( talk) 10:36, 15 June 2009 (UTC)
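If a full bot framework is overkill, a short read-only script against the API can do this; a minimal sketch (no login, nothing saved):
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def pages_containing(titles, needle):
    """Fetch the current wikitext of each title and report which ones contain the string."""
    hits = []
    for title in titles:
        params = {"action": "query", "prop": "revisions", "rvprop": "content",
                  "rvslots": "main", "titles": title,
                  "format": "json", "formatversion": 2}
        page = requests.get(API, params=params).json()["query"]["pages"][0]
        if "revisions" not in page:  # skip missing pages
            continue
        text = page["revisions"][0]["slots"]["main"]["content"]
        if needle in text:
            hits.append(title)
    return hits

print(pages_containing(["Prince Edward Island", "Hydrogen-7"], "Canada"))
</syntaxhighlight>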
Does someone have a Bot that can go through the 100+ archives and add {{ talkarchive}} {{ archive-nav|xx}} to the top of each page, where xx is the archive page number (without the extra markup - tl and br)? Most or even all of them already have talkarchive, few have archive-nav. I'm up to 19 but realized there may be an easier solution. 199.125.109.126 ( talk) 01:43, 14 June 2009 (UTC)
Done. Thanks Chris G. I don't know if you type fast or used a bot, but thanks. 199.125.109.126 ( talk) 03:40, 14 June 2009 (UTC)
Please make my bot, called GameBot. Thanks, WimpyKid ( talk) 11:45, 15 June 2009 (UTC)
Please make my bot "WorldBot"
It will help out newcomers, write on wikipedians' talk pages, and create new articles.
Thanks, WimpyKid ( talk) 00:49, 16 June 2009 (UTC)
Hello, can someone go through all of the articles in Category:Food and drink stubs and tag those articles that do not have the {{ WikiProject Food and drink}} banner on their talk page? Also, if there are any of these stubs that do have the banner but do not have a class assignment please add it (stub of course).
Thanks, Jeremy ( blah blah) 06:11, 16 June 2009 (UTC)
[I'm relisting this August 2008 request (including subsequent revisions), as the editor who said he would make the edits has not done so, nor replied to many enquiries as to progress (due at least in part to understandable family matters). Since there are hundreds of templates in need of this overdue change, and these are currently emitting broken microformats, the need may be considered pressing]
I've compiled a list of relevant infoboxes at User:Pigsonthewing/to-do#Date conversions.
Thank you. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 12:10, 16 June 2009 (UTC)
Hi, this template: {{ user lua}} was recently moved to {{ user LUA}} because the letters "lua" referred to a language code. However, the template was not updated on people's userpages; this affects the following people [13] and needs to be fixed. The full discussion can be found here. Thanks. -- Amazonien ( talk) 05:49, 17 June 2009 (UTC)
Could someone run a bot on articles found in Category:Needed-Class articles (and subcats) and update the ratings? It seems like most of them are redirects or disambigs. Or is this normal? Headbomb { ταλκ κοντριβς – WP Physics} 15:14, 17 June 2009 (UTC)
The {{ translated page}} template is often added incorrectly to articles, rather than to the talk page (in violation of WP:SELF). I've been fixing them manually when they appear, but if there's a bot that could do this automatically on a regular basis (say once or twice a week) it would save me the effort. This could probably be done by an existing template clean-up bot. The task would be as follows:
Are there any existing bot owners who are willing to add this task? — Tivedshambo ( t/ c) 06:50, 17 June 2009 (UTC)
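The step-by-step list did not survive in this archive, but the core move the request describes could look roughly like the sketch below (a guess at the operation only, not the actual spec):
<syntaxhighlight lang="python">
import re

def move_translated_tag(article_text, talk_text):
    # Cut the {{translated page|...}} call out of the article...
    match = re.search(r'\{\{\s*[Tt]ranslated page\s*\|[^{}]*\}\}\n?', article_text)
    if not match:
        return article_text, talk_text
    article_text = article_text.replace(match.group(0), '', 1)
    # ...and append it to the talk page instead.
    talk_text = talk_text.rstrip() + '\n' + match.group(0).strip() + '\n'
    return article_text, talk_text

art, talk = move_translated_tag("Intro.\n{{translated page|de|Beispiel}}\nMore text.",
                                "{{WikiProject Example}}")
print(art)
print(talk)
</syntaxhighlight>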
This is a pretty simple request, and I could do it myself, but if memory serves, someone else already has an approved bot that does this.
The short version of the request is: I need a bot to go through the pages in Category:Possible cut-and-paste moves and remove said category from the page. All of the involved pages will be redirects. There's about 2,500 pages involved.
At Magioladitis's request, I had the bot tack on this category to any redirects, so that a person could go through and figure out whether or not they needed to be history merged. I since came up with a much better method (see User:Mikaey/Possible cut-and-paste moves), so now I just need a bot to go through and detag all these pages for me.
Thanks, Matt ( talk) 09:38, 18 June 2009 (UTC)
I don't usually like history merges in general. If there's any overlap between the two pages, the page history gets mangled, diffs become confusing and unreadable, etc. There's a core feature (disabled by default) that was written by one of Wikimedia's contractors to deal with split histories safely (see "#$wgGroupPermissions['sysop']['mergehistory'] = true;" in DefaultSettings.php). If possible, I'd much rather see that activated than an ad hoc bot solution. For what it's worth. Perhaps someone can convince me this is a good idea in a bot request, given appropriate safety measures.... -- MZMcBride ( talk) 02:46, 19 June 2009 (UTC)
I'd like a bot to go through all articles using {{ Nihongo}} and check for capitalization errors in the content of that template. For example, in Fukuyama University, the {{ nihongo}} template appears like this: {{nihongo|'''Fukuyama University'''|福山大学|Fukuyama daigaku}}. The bot would need to check the first parameter ('''Fukuyama University''') and check to see if the contents of the third parameter used the same capitalization. In this case, it is using "Fukuyama daigaku" instead of "Fukuyama Daigaku". The bot would change the capitalization of the contents of the third parameter if they did not match, in this case changing them to "Fukuyama Daigaku". I hope that makes sense. Can this be done? Thanks! ··· 日本穣 ? · Talk to Nihonjoe 21:34, 20 June 2009 (UTC)
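It should be doable; here is a rough sketch of the comparison step (it only handles the plain three-parameter form of the template, with simple word-by-word capitalization, so edge cases like "McDonald" would need extra care):
<syntaxhighlight lang="python">
import re

def fix_nihongo(wikitext):
    def fix(match):
        prefix, english_raw, kanji, romaji = match.groups()
        english = english_raw.replace("'''", "")
        eng_words, rom_words = english.split(), romaji.split()
        # Capitalize each romaji word whose English counterpart is capitalized.
        if len(eng_words) == len(rom_words):
            rom_words = [r.capitalize() if e[:1].isupper() else r
                         for e, r in zip(eng_words, rom_words)]
        return '%s|%s|%s|%s}}' % (prefix, english_raw, kanji, ' '.join(rom_words))
    return re.sub(r'(\{\{[Nn]ihongo)\|([^|{}]+)\|([^|{}]+)\|([^|{}]+)\}\}', fix, wikitext)

print(fix_nihongo("{{nihongo|'''Fukuyama University'''|福山大学|Fukuyama daigaku}}"))
# {{nihongo|'''Fukuyama University'''|福山大学|Fukuyama Daigaku}}
</syntaxhighlight>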
In the recent overhaul of WP:Elements templates and categories I've started creating redirects to main articles for each isotope, but this is very tedious, as there are lots of isotopes (thousands).
For example, Hydrogen-7 redirects to Isotopes of hydrogen#Hydrogen-7 and is categorized in Category:Isotopes of hydrogen. Such redirects need to be created for all isotopes found in the lists of isotopes (don't bother with metastable isotopes for now).
The text for creation is (pay attention to capitalization please)
#REDIRECT[[Isotopes of elementname#Elementname-XX]] [[Category:Isotopes of elementname]]
Here are the Elementnames and the range of XX that needs to be created.
More to come. Headbomb { ταλκ κοντριβς – WP Physics} 02:29, 18 June 2009 (UTC)
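Until the full list is posted, the creation loop could look roughly like this (a sketch using the third-party mwclient library; the account name, password and the element/mass-number ranges below are placeholders only):
<syntaxhighlight lang="python">
import mwclient  # pywikipedia would work just as well

site = mwclient.Site('en.wikipedia.org')
site.login('ExampleBot', 'password')  # placeholder credentials

# Placeholder data only; the real element/mass-number ranges are still to come.
ranges = {'hydrogen': (1, 7), 'helium': (2, 10)}

for element, (low, high) in ranges.items():
    for mass in range(low, high + 1):
        title = '%s-%d' % (element.capitalize(), mass)
        page = site.pages[title]
        if page.exists:
            continue  # don't overwrite anything that already exists
        text = ('#REDIRECT[[Isotopes of %s#%s]] [[Category:Isotopes of %s]]'
                % (element, title, element))
        page.save(text, summary='Creating isotope redirect per bot request')
</syntaxhighlight>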
Hi, I have a bot request partially drafted at Wikipedia:WikiProject National Register of Historic Places/Botrequest2; I am hoping to get the NRHP wikiproject banner added to many articles, perhaps thousands, that currently lack it. The wikiproject currently has about 20,000 articles tagged with its banner, and I am thinking a bot could add several thousand more by addressing articles in certain categories and with certain titles. I am hoping a bot could complete this a day or two before July 4, the end date of a cleanup drive we have going. It is taking time to get all the categories listed and checked carefully for the bot request. I wonder if I could ask:
Any responses will be appreciated. doncram ( talk) 11:38, 21 June 2009 (UTC)
I'd like to make a simple Python script which, given two Wikipedia articles, attempts to find a way to get from one to the other via a series of intermediary links. It would not have to log in as a user, only has to be able to browse Wikipedia pages normally as a user would, and would be completely harmless in every way. —Preceding unsigned comment added by 124.170.152.177 ( talk • contribs) 14:44, 20 June 2009 (UTC)
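For what it's worth, the read-only part is straightforward against the API; a breadth-first sketch (it would make a lot of requests in practice, so a real run should be throttled or capped at a small depth):
<syntaxhighlight lang="python">
import requests
from collections import deque

API = "https://en.wikipedia.org/w/api.php"

def links_from(title):
    """Article-namespace titles linked from a page, following API continuation."""
    params = {"action": "query", "prop": "links", "titles": title, "plnamespace": 0,
              "pllimit": "max", "format": "json", "formatversion": 2}
    links = []
    while True:
        data = requests.get(API, params=params).json()
        links += [l["title"] for l in data["query"]["pages"][0].get("links", [])]
        if "continue" not in data:
            return links
        params.update(data["continue"])

def find_path(start, goal, max_depth=3):
    """Breadth-first search from one article to another via wikilinks."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if len(path) > max_depth:  # paths are explored shortest-first
            return None
        for link in links_from(path[-1]):
            if link == goal:
                return path + [goal]
            if link not in seen:
                seen.add(link)
                queue.append(path + [link])
    return None

print(find_path("Hydrogen", "Prince Edward Island"))
</syntaxhighlight>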