This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 5 | ← | Archive 8 | Archive 9 | Archive 10 | Archive 11 | Archive 12 | → | Archive 15 |
I'm building a research (i.e., no-edit) bot in C++... since I'm not really that experienced in programming I was wondering if someone would be willing to check my pseudocode?
The basic concept behind the bot is to identify when a particular string of text was added to an article, using a binary search over the page history. In theory it could search through the history of a page with 10,000 edits in fewer than 15 page requests.
A research program like this would be a helpful tool in tracking down subtle vandals and spammers. So... I've kinda drifted. Anyone more experienced with OOP languages want to audit my pseudocode? --- J.S ( T/ C) 23:59, 28 December 2006 (UTC)
Here is a perl implementation, less the essential bits (which could easily be added). I'm not entirely sure if the algorithm will even work properly, actually. Something seems off about it. -- Jmax- 10:09, 29 December 2006 (UTC)
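For reference, the binary-search idea can be sketched in Python (this is not the perl implementation mentioned above). Given a page's revision ids oldest-to-newest and some way to fetch a revision's text, bisect for the first revision containing the string. It assumes the string, once added, stays in the article; `fetch_revision_text` is a hypothetical stand-in for a page request. For 10,000 revisions this makes about 14 fetches, consistent with the estimate above.

```python
def first_revision_containing(rev_ids, target, fetch_revision_text):
    """rev_ids is oldest-to-newest; returns the id of the first revision
    whose text contains `target`, or None if the latest revision lacks it
    (never added, or added and later removed)."""
    lo, hi = 0, len(rev_ids) - 1
    if target not in fetch_revision_text(rev_ids[hi]):
        return None
    found = rev_ids[hi]
    while lo <= hi:
        mid = (lo + hi) // 2
        if target in fetch_revision_text(rev_ids[mid]):
            found = rev_ids[mid]
            hi = mid - 1   # look for an earlier occurrence
        else:
            lo = mid + 1
    return found
```

The caveat in the comment is the part that "seems off": if the string was added and later removed, or added, removed, and re-added, a plain bisection over history is not guaranteed to find the first addition.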
I know there's already autoarchive bots running such as the one archiving this page, but I think a bot that operates a little differently could be effectively used to archive all article talkpages. First off, it would only archive talk pages that are very long, so 3 year old comments on a tiny talk page would be left untouched. When the bot runs across a very long talk page it will archive similarly to current bots, but with a high threshold, for example all sections older than 28 days (rather than the typical 7 days). Also, unlike current bots I'd suggest we make this opt out rather than opt in, although very busy talk pages or talk pages that are manually archived wouldn't be touched anyway because they'd either be short enough or would have no inactive sections. Vicarious 03:56, 1 January 2007 (UTC)
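The rule described above is mechanical enough to sketch in Python; the length threshold is an assumption, since "very long" isn't pinned down, and section parsing and the actual editing are out of scope.

```python
from datetime import datetime, timedelta

MIN_PAGE_LENGTH = 50_000          # assumption: "very long" in bytes
MAX_SECTION_AGE = timedelta(days=28)

def sections_to_archive(page_length, sections, now):
    """`sections` is a list of (title, last_comment_time) pairs."""
    if page_length < MIN_PAGE_LENGTH:
        return []                  # tiny talk pages are left untouched
    return [title for title, last in sections
            if now - last > MAX_SECTION_AGE]
```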
I suspect this idea isn't even remotely feasible, but I thought I'd suggest it in case I was wrong. A bot that posts a note on a user's talk page when they reach a milestone edit count (1k, 5k, whatever). It'd say congrats and maybe have a time and link for the thousandth edit. Vicarious 05:26, 1 January 2007 (UTC)
I suspect this idea has been thought of before, but I don't see its fruit, so here goes. When I first discovered the talk page here I couldn't for the life of me understand why Wikipedia couldn't have a normal message-box interface, even if it had to be public. This simply means showing the thread of an exchange across different talk pages. It would save us having to keep an eye out for a reply on the page where we left a message the day before, etc. A bot could simply thread a talk exchange; of course this would require tagging our talk as we reply to a message. This is more a navigational issue, but since it's not been integrated into the main wiki software it seems to be left to a bot. I don't know how it would run, though. Suggestions? frummer 17:57, 1 January 2007 (UTC)
I was wondering if it would be possible to create a bot that would serve solely to revert the addition of a link to the Oregon Trail article. Once every other week or so a user adds a link for a free game download that we delete off the article. The bot would just have to monitor the External links section, removing the link: http://www.spesw.com/pc/educational_games/games_n_r/oregon_trail_deluxe.html Oregon Trail Deluxe download whenever it appears. Thanks, please let me know on my talk page if this is a possibility that anyone could take up. Thanks again, b_cubed 17:00, 2 January 2007 (UTC)
May someone please operate this for me? It's already been userpaged, accounted, and flagged. D•a•r•k•n• e• s•s•L•o•r•d• i•a•n••• CCD••• 22:12, 2 January 2007 (UTC)
I would like to suggest the creation of a bot to defend articles for children's shows. For some reason these pages appeal to vandals and I think something needs to help protect them. I'll use an example: before the Dora the Explorer page was put back under protection it was vandalized a lot. One time sticks in my mind the most, by a user named Oddanimals, who stated Dora was 47 and had a sex change, along with a few other sex-related comments, and replaced the word Banana in Boots' article with the S curse word. This is not proper, to say the least, and one of the users I talked to said that the Backyardigans article is also vandalized a lot. Parents, kids, and people, like me, who just enjoy those shows look it up, and this kind of thing should NOT be allowed. Thank You Superx 23:18, 2 January 2007 (UTC)
True, but those bots are checking other pages as well. That vandalism stuck out like a sore thumb and none of those bots caught it, except for one after I fixed it myself. I think that just one bot whose job it is to check those pages would be better than several others that are checking a bunch of other pages as well. Superx 01:10, 3 January 2007 (UTC)
Yes, but that would only apply here if the stuff I mentioned ACTUALLY HAD SOMETHING TO DO WITH THE SHOW! Curse words and other such stuff are only allowed if they are relevant to the article, and none of that is, so the point you mentioned doesn't apply in this situation. Superx 12:00, 5 January 2007 (UTC)
Need to migrate all the existing transclusions of {{ CopyrightedFreeUse}} to {{ PD-release}} per the discussion here. BetacommandBot started on this a few weeks ago and then mysteriously quit about 7/8ths of the way through, and I haven't been able to get a response from Betacommand since then. Could someone else finish this so that we can finally delete that template? Thanks. Kaldari 01:27, 3 January 2007 (UTC)
Please add the ro interwiki to all popes' pages. Just created. Romihaitza 12:31, 3 January 2007 (UTC)
We need to add {{ WikiProject France}} to all the articles belonging to France and its subcategories. It would be nice if someone could do it for us or tell me how to do it. STTW (talk) 09:45, 4 January 2007 (UTC)
People usually do a good job of protecting the templates on the Main Page, but some have slipped through the cracks and the results can be disastrous. I propose a bot that would be given sysop status. I know this is controversial, and there was a big discussion about a similar request at the AFD page a while back. As such, anyone allowed to know the password must have already been approved for adminship through conventional means, and it should be open source. It will protect the next day's templates in advance of them being on the Main Page (say, 24 hours) and then unprotect them afterwards. Preferably, it would make sure the pages stay protected until off the Main Page, and even be able to work with the pictures for POTD, but they'd have to be specified in advance, whereas the templates would run on the {{CURRENTDAY}} magic word system. This would be a big help in reducing the possibility of Main Page vandalism (believe me, it happens).-- Here T oHelp 03:52, 5 January 2007 (UTC)
I have the feeling I'm gonna get yelled at for this one, but how about a bot that deletes articles that have a clear consensus on Wikipedia:Articles for deletion? For example, it's quite obvious that Wikipedia:Articles for deletion/Myspacephobia is going to get deleted, but it's currently waiting for an admin to do the work. Yes, I know this would mean an admin bot, but that's not without precedent. Also, this bot would ONLY work on articles with a very obvious consensus. As for vandals abusing the bot, I don't think it would be an issue. First off, it'd ignore IPs; secondly, it'd have a minimum amount of time for voting; and there are too many legitimate voters to contest a bad-faith deletion for the bot to touch it. Btw, this bot would also close candidates that are clearly keep as well. Vicarious 07:39, 5 January 2007 (UTC)
{{Infobox musical artist 2}} -> {{Infobox musical artist}}
86.201.106.176 13:23, 5 January 2007 (UTC)
I do not have any programming skills for running bots. I can handle and run the bot if someone writes the code to replace the image links in articles with the existing image on Commons under a different name. I suppose this type of bot could be useful on wikis other than the English Wikipedia as well in some cases. There are many examples of such images in this category. Shyam ( T/ C) 19:52, 5 January 2007 (UTC)
In theory, the bot will look through new articles to try and find key phrases like "our products" and "we are a". It then places a template on the page like this:
And places it in a relevant category. A human (or other intelligent individual) would then look through the list and nominate any articles that are blatant ads for WP:SPEEDY.
What do you think? --/// Jrothwell ( talk)/// 13:15, 29 December 2006 (UTC)
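The phrase check itself is trivial to sketch in Python; the phrase list here is illustrative only, not a vetted set, and a real bot would tag matches for human review rather than act on them.

```python
AD_PHRASES = ["our products", "we are a", "our company", "our services"]

def ad_phrases_found(article_text):
    """Return the marketing phrases present in a new article's text."""
    text = article_text.lower()
    return [p for p in AD_PHRASES if p in text]
```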
(undent) The issue of false positives is important. Certainly if a large majority of flaggings related to a particular phrase are in fact in error, that phrase shouldn't be used by the bot. But keep in mind that this flagging will only be used for new articles, which are much more likely to be spam than existing ones, so drawing conclusions from your search of existing articles isn't necessarily a good idea.
In any case, the bot should be tested by seeing what happens using a given phrase for (say) the first ten articles it finds. For example, "our products" looks like a good phrase to use. A Google search on that found Enterprise Engineering Center (the user who created the article has done nothing else), plus several others (in the top 10 results) that were tagged as appearing to be advertisements.
Finally, the bot is only doing flagging. A human has to actually nominate an article for deletion (and it's easy to remove a template). But your comment does raise a point about there being a link to click on to complain about the bot. John Broughton | Talk 15:40, 2 January 2007 (UTC)
I suggest this template:
Cocoaguy contribs talk 03:42, 3 January 2007 (UTC)
I proposed this to Raul654 on his talk page, but he'd rather not add it to his workload, though he supported using a bot instead.
I'd like a bot to watch the Featured log (for successful noms) and Featured archive (for failed noms) and automatically tag each one with a line that indicates when they were closed (i.e. added to the archive) and the result. That way, it'll be possible to determine from the page itself what happened.
I'm thinking it should add
Promoted ~~~~~
or
Not Promoted ~~~~~
at the bottom of each, in line with WP:FPC. Night Gyr ( talk/ Oy) 20:59, 5 January 2007 (UTC)
Yeah, top or bottom isn't really a big issue for me, and top (immediately below the section head) is probably better for quick reference. FPC uses {{ FPCresult}}, so it needn't be a complicated template. Night Gyr ( talk/ Oy) 21:15, 5 January 2007 (UTC)
Would like to have a similar bot do the same (in reverse) for Featured article review; rather than Promoted or Not Promoted, the bot would return Kept or Removed Featured status, based on the Featured article review archive. SandyGeorgia ( Talk) 05:53, 7 January 2007 (UTC)
I will volunteer to write a bot that performs these tasks, presuming no one else would like to or has already started. -- Jmax- 09:19, 7 January 2007 (UTC)
I need a bot to do some routine maintenance tasks for deletion review. Possible tasks would include:
Any help with these tasks is appreciated. ~ trialsanderrors 09:07, 7 January 2007 (UTC)
There is now a project dealing with articles which have not been modified or viewed recently at Wikipedia:WikiProject Abandoned Articles. Would there be any way to generate a bot which might list only articles which haven't been modified since, say, 2005 (or some other really long time, maybe by year), for the use of this project to help find the most overlooked articles? Badbilltucker 20:35, 6 January 2007 (UTC)
I'm in the process of importing a database dump and I'll gather these statistics for you. To be clear: you want a list of pages with the oldest most-recent edit, in the main namespace, excluding disambiguation pages; correct? -- Jmax- 07:33, 7 January 2007 (UTC)
Can someone run a bot to replace all instances of "oftentimes" and "often times" with "often", which is shorter, means exactly the same thing, and doesn't sound so awkward in formal written English? I tried to start doing it manually, but there's too much of it.
You can find the instances via Google:
Obviously, ignoring the talk and user namespaces might be a good idea.
Please. Paul Silverman 12:19, 7 January 2007 (UTC)
Shouldn't we delete the users who have not contributed to Wikipedia for a long period of time?-- Jamesshin92 22:32, 7 January 2007 (UTC)
There are many headings in Wikipedia articles that do not follow Wikipedia:Manual of Style (headings). -- Jamesshin92 22:14, 7 January 2007 (UTC)
=== White House communications ===
=== U.S. Senator ===
I agree. -- Jamesshin92 04:02, 8 January 2007 (UTC)
However, I still think that we can spread this idea with a different approach: for example, detecting repeated headings and special characters (such as %+^@~).
We also might think of modifying headings to standard ones, such as "Also see" into "See also", "Links" into "External links", and so on. Jamesshin92 04:10, 8 January 2007 (UTC)
Unit symbols are invariable (unlike abbreviations), but there are nevertheless hundreds if not thousands of articles where an "s" has been added improperly. A bot could fix this automatically, as the risk of confusion with correct English is about zero. Specifically:
The sought strings should be case sensitive, and the bot should leave instances immediately followed by a period alone (they could be legitimate abbreviations).
Other cases than those listed above probably exist.
Urhixidur 18:55, 8 January 2007 (UTC)
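A sketch of the fix in Python regex terms, with an illustrative (not exhaustive) unit list; it is case-sensitive and leaves instances immediately followed by a period alone, as requested, since those could be legitimate abbreviations.

```python
import re

UNITS = ["km", "kg", "cm", "mm", "mV"]

def fix_plural_units(text):
    # e.g. "450 kms " -> "450 km "; "450 kms." is left alone
    pattern = r"(\d\s?(?:%s))s(?!\.)" % "|".join(UNITS)
    return re.sub(pattern, r"\1", text)
```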
Would it be possible to create a bot that could automatically convert the U.S. standard system of measurement into the European metric system of measurement? I think a number of articles on here could benefit from such a bot if we do not already have one. Note that I know nothing about operating a bot; this is simply an idea of mine which I got while working on the article USS Wisconsin. —The preceding unsigned comment was added by TomStar81 ( talk • contribs) 06:06, 9 January 2007 (UTC).
(undent) It doesn't seem that controversial. Obviously some test runs, and starting slowly, would be appropriate. In general, I think Wikipedia needs more bots like this - human beings just aren't that consistent (much of which comes from not knowing everything in detail), and bots like this can compensate for that. John Broughton | Talk 15:30, 9 January 2007 (UTC)
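One way such a conversion could be sketched in Python: append the metric value rather than replacing the US one, so both systems stay visible to readers. Only miles are shown here; the factor is the exact international mile, and the lookahead keeps the bot from converting the same quantity twice.

```python
import re

def add_metric(text):
    def conv(m):
        miles = float(m.group(1))
        km = miles * 1.609344          # exact international mile
        return "%s miles (%.1f km)" % (m.group(1), km)
    return re.sub(r"(\d+(?:\.\d+)?) miles(?! \()", conv, text)
```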
Bot request: Change article text of the form "[[foo bar|foo]] bar", "foo [[foo bar|bar]]", and "[[foo bar|foo]] [[foo bar|bar]]" to "[[foo bar]]". I call this link normalization. The bot should make one pass of the whole database every month or so.
These oddities exist due to disambiguation. The first editor writes "[[foo]] bar", a disambiguator uses a tool to replace "[[foo]]" with "[[foo bar|foo]]", leaving "[[foo bar|foo]] bar". Clearly "[[foo bar]]" is the better form: it makes the link more sensible if it covers both words. The tools could be altered, but there's more than one, they're already complex, and thousands of these links already exist. -- Randall Bart 07:29, 11 January 2007 (UTC)
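The three forms in the request can be sketched as Python regex rewrites. This handles single-word titles only (`\w+`); real link titles would need a more careful pattern, so this is a sketch of the rule, not a deployable bot.

```python
import re

def normalize_links(text):
    # "[[foo bar|foo]] [[foo bar|bar]]" -> "[[foo bar]]"
    text = re.sub(r"\[\[(\w+) (\w+)\|\1\]\] \[\[\1 \2\|\2\]\]", r"[[\1 \2]]", text)
    # "[[foo bar|foo]] bar" -> "[[foo bar]]"
    text = re.sub(r"\[\[(\w+) (\w+)\|\1\]\] \2\b", r"[[\1 \2]]", text)
    # "foo [[foo bar|bar]]" -> "[[foo bar]]"
    text = re.sub(r"\b(\w+) \[\[\1 (\w+)\|\2\]\]", r"[[\1 \2]]", text)
    return text
```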
Can someone please go over to http://test.wikipedia.org and with a bot populate Category:Really big category with anything, it doesn't matter what. Just dump every page and every image into the category please to test how the category system works when it is pushed to its limit. Testing man 22:53, 4 January 2007 (UTC)
Why is this necessary? -- Jmax- 06:56, 12 January 2007 (UTC)
Could a bot that detects and removes repeats of links to other articles be created? For example the article on Lions might have the word "Africa" in the first paragraph which is linked to an article about Africa and then further down the page there is another link where the word "Africa" appears. The bot could detect this and de-link the second occurrence of the word. —The preceding unsigned comment was added by Mutley ( talk • contribs) 06:21, 8 January 2007 (UTC).
Hmm. This idea has been kicked around many times. I believe it could still work with care and attention. Rich Farmbrough, 15:28 13 January 2007 (GMT).
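The delinking rule can be sketched in Python: keep the first plain [[...]] link to each target and unlink later repeats. Piped links and section links are ignored here for simplicity; the "care and attention" mentioned above would mostly go into deciding which repeats are genuinely redundant.

```python
import re

def delink_repeats(text):
    seen = set()
    def repl(m):
        target = m.group(1)
        if target in seen:
            return target          # repeat: reduce to plain text
        seen.add(target)
        return m.group(0)          # first occurrence: keep the link
    return re.sub(r"\[\[([^\[\]|]+)\]\]", repl, text)
```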
Hey, how about a bot that will put all the commas and periods (all punctuation except semi-colons, in fact) inside quotation marks? It looks quite unprofessional to see articles written with punctuation outside quotations. - Unsigned comment added by User:165.82.156.110
Hello, we are finishing putting in place a new translation project.
There are two things I would greatly appreciate if they were done by a bot.
First, we had to make a small modification to the format of the translation pages used for every translation request. The task is: for every page in Category:Translation sub-pages version 1, this kind of change needs to be made.
Second, there are a lot of categories to initialize with very simple wikicode, 7 for each language, and there are 50 languages. All red links in the array on Wikipedia:Translation/*/Lang (except the first column of the array, which has a different syntax) should be initialized with the syntax explained on this page.
Let me know if you need any further info.
Jmfayard 18:46, 6 January 2007 (UTC)
Moved to User talk:PocklingtonDan/Spelling bot — Mets501 ( talk) 20:35, 13 January 2007 (UTC)
Cleaning up from this category move: Wikipedia:Categories for deletion/Log/2006 December 19#American television series by decade where the meaning of the category was changed, there should be no overlap with Category:Anime by date of first release, because by the English definition no US originated-series that we know of is anime.
I'd like a bot to re-categorize with the following rule: If article in Category:Anime series and in Category:XXXXs American television series then remove from Category:XXXXs American television series and add to Category:Anime of the XXXXs instead. (The latter category includes both films and series.) -- GunnarRene 05:37, 3 January 2007 (UTC)
Category:(....)s American television series
Category:Anime of the $1s
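The recategorization rule above is mechanical enough to sketch in Python, operating on an article's category list; the decade is captured from the old category name and reused in the new one, as the $1 pattern suggests.

```python
import re

def recategorize(categories):
    """Apply the swap only to articles already in Category:Anime series."""
    cats = set(categories)
    if "Anime series" not in cats:
        return sorted(cats)
    for cat in list(cats):
        m = re.match(r"(\d{4})s American television series$", cat)
        if m:
            cats.remove(cat)
            cats.add("Anime of the %ss" % m.group(1))
    return sorted(cats)
```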
I'm wanting to extract a list of regular text lines from the articles in the millennium, century, decade, and year series ( Upper Paleolithic, 10th mil BC, 9th mil BC, 1690s BC, 499 BC, 1 BC, 1st mil AD, 3rd mil, 2066, 2100s, 30th century, and the other 3000 or so articles in between). I've already written Python code that successively downloads user-defined ranges (and throws out formatting lines), but something happened to the database this morning, so I want to change my code to grab everything (I think it's approximately 20 MB) from Special:Export and save it to disk before performing parsing operations on it. I don't know how to interact with the Export function, so I guess this is less of a bot request and more of a bot help request. Xaxafrad 00:06, 15 January 2007 (UTC)
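On the Special:Export part: fetching a URL of the form https://en.wikipedia.org/wiki/Special:Export/Page_title returns XML containing the page's current wikitext. A sketch of the parsing step (the fetch itself is just a urlopen on that URL); the export schema's namespace version varies, so tags are matched by local name.

```python
from xml.etree import ElementTree

def wikitext_from_export(xml_bytes):
    """Return the wikitext of the first <text> element in an export dump."""
    root = ElementTree.fromstring(xml_bytes)
    for elem in root.iter():
        if elem.tag.rsplit("}", 1)[-1] == "text":
            return elem.text
    return None
```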
Shouldn't we have a bot that translates articles from German, or maybe at least a German version of RamBot that uses German-speaking countries' (such as Austria or Germany) information on towns? Tell me what you think. Kitia 00:37, 15 January 2007 (UTC)
Request for a double-redirect bot? Running every 30 seconds? -- Parker007 13:07, 15 January 2007 (UTC)
Hi. I'm looking for a bot which would update this category summary for WP:ALBUM:
The figures are updated manually at present (using AWB for the large ones). The format itself was borrowed from Dragons flight's Category tracker so I presume it would be quite easy to do this... if you know what you are doing with bots... which I don't... so any help would be appreciated! Bubba hotep 13:58, 16 January 2007 (UTC)
Hi.. I'm Glacious. I'm currently an admin on dv.wikipedia, where I want to make a bot which could do various tasks. But I don't know how to make a bot, so if anyone can help me with my project, please leave me a message on my talk page soon. Looking forward to a reply. Thanks... -- Glacious 14:22, 16 January 2007 (UTC)
As I was going over and searching articles yesterday, it shocked me to see how many articles had incorrect spelling, grammar and punctuation. I was thinking about a bot that will put all the punctuation (commas, periods, semi-colons, etc.), grammar (correctly capitalised nouns, etc.) and spelling (commonly known spelling errors) into place on any article it may 'crawl' across. Would appreciate anyone willing to create a bot for me. Many thanks, Extranet ( Talk | Contribs) 08:21, 14 January 2007 (UTC)
So you're saying there isn't really a need for a Punctuation and Grammar Bot? If not, why not suggest a few 'really needed' bots because I have always wanted to run a bot throughout my time here at Wikipedia. -- Extranet ( Talk | Contribs) 03:24, 15 January 2007 (UTC)
Thanks for your comments. I will close this request for now but if anyone has a bot that needs a new owner or has a new bot regarding Punctuation or Grammar editing, please leave a message on my talk page. Thanks! -- Extranet ( Talk | Contribs) 08:25, 17 January 2007 (UTC)
I was thinking how convenient it'd be for a bot to turn the large number of bare web citations we have into the more formal reference tags. More specifically, it would turn this: [www.wikipedia.org] into:
<ref>{{cite web |url=www.wikipedia.org |title=Main Page - Wikipedia, the free encyclopedia |accessdate=2007-01-16}}</ref>
Note, it would assume the date the link was added was the access date, a reasonable assumption. Vicarious 08:10, 17 January 2007 (UTC)
I can see such a bot being extremely valuable. A couple of thoughts: It could go to Special:Linksearch and get a list of all external links for a specific domain (e.g., CNN). It should understand that when a URL appears twice in a page, there should not be two identical footnotes (so, the first ref/footnote gets a name="X" argument). It probably should avoid pages that have both a "Notes" and a "References" section. It probably shouldn't footnote a URL if the URL is in the "External links" section. And, obviously, it should put as much information as possible into the "cite" template. John Broughton | ♫♫ 22:47, 17 January 2007 (UTC)
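The footnote-building step of the idea above can be sketched in Python. In practice the title would be scraped from the page at the URL and the access date taken from the edit that added the link, as suggested; here both are parameters.

```python
from datetime import date

def bare_link_to_ref(url, title, accessdate=None):
    """Wrap a bare external link in a {{cite web}} footnote."""
    accessdate = accessdate or date.today().isoformat()
    return ("<ref>{{cite web |url=%s |title=%s |accessdate=%s}}</ref>"
            % (url, title, accessdate))
```

John Broughton's point about reuse would sit on top of this: before emitting a new footnote, the bot should check whether the same URL already has a named ref on the page.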
A bot to change the default sort key for all people to {{DEFAULTSORT:''lastname'', ''firstname''}}. Would make sorting categories much easier. Λυδαcιτγ 22:11, 17 January 2007 (UTC)
And also delete piped sorting keys that become redundant after {{DEFAULTSORT}} is changed. Λυδαcιτγ 22:13, 17 January 2007 (UTC)
WikiProject Spam would like a list of all stub templates with external links, with external links pointing to wikipedia.org filtered out. We've seen quite a bit of Google bombing on stub templates, and a list like this would be quite helpful for doing an initial cleanup. --- J.S ( T/ C/ WRE) 22:26, 13 January 2007 (UTC)
Please have a look at the discussion going on at Talk:Barack Obama#Consensus on IP edits and let us know there if you can think of ways a bot might help. My first thought is a bot that would detect vandalism, revert, then apply semi-protection for a limited period, automatically toggling it off after a defined interval. -- HailFire 20:37, 17 January 2007 (UTC)
Anyone with AWB and some regex skill: I was thinking a bot could clean up the kind of problem you see at Super Bowl IV#References. Basically, numbered inline citations (usually of the new style nowadays) are mixed with general bulleted references. It produces an awkward-looking format, and really should be split into inline-cited "notes" and more general "references". I haven't really figured out how to do this properly, but it shouldn't be that hard for someone more skilled with AWB than me. -- W.marsh 18:18, 18 January 2007 (UTC)
A recent sneaky spammer was caught and community banned. The spammer owned a dozen-plus domains and was replacing references with references to his sites; he created a dozen or so articles with the only citation being his site, and he added his address to many External links sections. Because of the way he obfuscated it, it took 8 months to notice this activity.
So... I thought up a dream bot that would be able to track this crap down.
Assuming we end up with a list of 1 million links and spend 5 seconds on each one, it will take slightly less than 2 months to finish running the program. I guess that sorta makes my request a little silly. :( Is there any better way to track this down with a bot? --- J.S ( T/ C/ WRE) 22:52, 18 January 2007 (UTC)
I noticed that there was a fairly big number of articles where the spelling "twelwe" was used instead of "twelve". Would anyone fancy taking this on? MoRsE 11:18, 19 January 2007 (UTC)
I am an admin at the Non-canon Star Trek Wiki and we are preparing to have our namespace changed to Memory Beta. Unfortunately, a large number of articles have already been moved into that namespace and we have been told that these articles must be moved back so that the namespace can be changed, or we lose all our content.
Because of this, I wondered if anyone had a bot or could create a bot that would be able to move all the articles starting with Memory Beta: to now begin with Non-canon Star Trek Wiki:, as well as delete the redirects between those pages. All of the articles that begin with the namespace can be found here.
Any help would be greatly appreciated, Thanks. -- The Doctor 20:43, 20 January 2007 (UTC)
I was wondering if it would be possible for a bot to replace {{ WikiProject hip hop}} with {{ WikiProject hip hop|class=|importance=}}. This template redirect was the original template page, but it got moved due to naming concerns (specifically whether "Hip Hop" should be in capitals). Also, a lot of these pages are already tagged, so the bot would have to recognize whether the page is already tagged properly, and not erase the code if it was rated. Is there any existing bot able to do these tasks? Thank you in advance. — Tutmosis 21:31, 16 January 2007 (UTC)
I'm back! I'm afraid that I need some help from a bot for WikiProject MMO's banners again. We are starting up an assessment system and I thought a good way to start off would be to take ratings from WikiProject CVG's banners. Almost all MMO articles have both {{ WP MMOG}} (WikiProject MMO's banner) and {{ Cvgproj}} (WikiProject CVG's banner) on their talk pages. I was hoping that a bot could take what WikiProject CVG assessed the article as and copy it (being {{WP MMOG|class=?}}). The articles that would be assessed are in Category:Massively multiplayer online games as well as its sub-categories. I would prefer that you use this category over the project category so as to hopefully get more articles with our banner on them. TIA! Greeves ( talk • contribs) 02:38, 23 January 2007 (UTC)
I would like a bot to tell me off when I edit project namespace. -- Punk Boi 8 06:56, 23 January 2007 (UTC)
Please ignore this request - I have explained to the user how to recognize which space they are in based on the page title. -- Trödel 15:11, 23 January 2007 (UTC)
The {{ fc}} and {{ nft}} templates are designed for ease of editing, and according to the talk page, they should be subst:'d when added to the page. At one point, I think a bot had run through them, and for a while I had manually kept nft under control. Now, it's out of control again (350+ links on the nft page and 2000+ on the fc page). I'd like to request a bot to go through and subst: all of the fc and nft occurrences. Thanks!! Neier 02:22, 23 January 2007 (UTC)
It would be rather useful if the description page for images included details of when an image starts and stops being used in articles. Often it's quite hard to identify exactly when an image was added or removed (and by whom) from an article, and this bot would enable people to find out very easily by viewing the image discussion page. I suspect a bot may not be the best way to implement this though, as it would probably involve a substantial number of edits in order to achieve this - maybe this functionality would be better implemented within mediawiki itself...? -- Rebroad 19:38, 23 January 2007 (UTC)
I'd like to see a bot that can scan the contents of Category:Fair use images and all its subcategories and identify images that are older than seven days and are not used in any articles, tag them with {{ Orphaned fairuse not replaced}}, and also notify the uploader with {{ Orphaned}}. It would also be nice if the bot could identify FU images that are used outside of article namespace (such as on userpages or templates) and log those so an editor could later go and remove them and notify the user. There are hundreds of thousands of images in these categories (31,000+ in the main category itself, and the template that places them in the main category is outdated, meaning there should be zero in the main category) and it would be really nice to have a bot to search for and tag orphaned FU images; this is now done manually. I believe there used to be a bot that did this but I can't find it now.-- NMajdan• talk 17:11, 17 January 2007 (UTC)
I am fairly new to Wikipedia, so I have yet to find out what to do in this case. There is a large number of links to the FCHD, a database of football (soccer) clubs and statistics. That site has changed its address and I assume the old links will be dead at some point. It is simply a matter of changing the domain from www.fchd.btinternet.co.uk to www.fchd.info; the rest of the URL remains the same. ByteofKnowledge 16:23, 25 January 2007 (UTC)
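Since only the domain changes and the rest of each URL is preserved, the per-page edit is a single string substitution; in Python terms:

```python
def migrate_fchd_links(text):
    """Rewrite old FCHD links to the site's new domain."""
    return text.replace("www.fchd.btinternet.co.uk", "www.fchd.info")
```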
This Discussion has moved to User:Betacommand/Bot Tasks Betacommand ( talk • contribs • Bot) 15:54, 26 January 2007 (UTC)
I run across a lot of images sourced from the "English Wikipedia" in my admin work on Commons. These are tagged on Commons as PD, but on en they are fair use. As these are being uploaded for other projects (like es.wiki), the en page doesn't get tagged with {{ NCT}}, so en admins are not aware of the bad copy, and Commons admins may well miss the copyvio image for some time. Would it be feasible to make a bot to check uploads on Commons for images sourced to en.wikipedia (or synonyms) and check that the Commons image is under the same license as the en source image? ({{ fairuse}} on Commons redirects to copyvio; the other fair use templates could be too if the need arose.)-- Nilf anion ( talk) 18:49, 21 January 2007 (UTC)
I'm currently creating redirects for all the permutations of the two-hybrid screening title. It covers E. coli and yeast (specifically known as Saccharomyces cerevisiae) two-hybrid screening, but the combinations of words I need to put into titles are: (E. coli/Escherichia coli/bacterial/bacteria/yeast/Saccharomyces cerevisiae/S. cerevisiae) (two hybrid/two-hybrid) (*nothing*/test/screen/screening/method/analysis). In total, that's 7x2x6 = 84 possible permutations! Even with clever use of the back button and copy and paste, that's going to be a drawn-out, laborious task to do by hand. Could someone please write a bot for redirect making where you enter the variables and then it goes to work? It should probably be operated only by the creator, as per requests on the bot's talk page, to prevent abuse. Also, it would need to return an error/fail silently when one of the permutations already exists, rather than overwrite, in case that permutation contained a full separate article or a redirect to another article. -- Seans Potato Business 22:41, 18 January 2007 (UTC)
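The 84 titles can be generated mechanically; a Python sketch, using the word lists from the request (a bot would then create each title as a redirect, skipping any that already exist):

```python
from itertools import product

ORGANISMS = ["E. coli", "Escherichia coli", "bacterial", "bacteria",
             "yeast", "Saccharomyces cerevisiae", "S. cerevisiae"]
HYBRID = ["two hybrid", "two-hybrid"]
SUFFIX = ["", "test", "screen", "screening", "method", "analysis"]

def redirect_titles():
    """All 7 x 2 x 6 = 84 title permutations, empty suffix dropped."""
    return [" ".join(part for part in (o, h, s) if part)
            for o, h, s in product(ORGANISMS, HYBRID, SUFFIX)]
```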
Could somebody with a bot add {{ WikiProject GeorgiaUS}} to the talk pages of all of the articles in Category:Georgia (U.S. state)? Thanks in advance, P TO 20:09, 27 January 2007 (UTC)
I suspect this idea has been thought of before, but I don't see its fruit, so here goes. When I first discovered the talk page here I couldn't for the life of me understand why Wikipedia couldn't have a normal message-box interface, even if it need be public. This simply means showing the thread of an exchange across different talk pages. It would save us having to keep an eye out for a reply on the page where we left a message the day before, etc. A bot could simply thread a talk exchange; of course, this would require tagging our talk as we reply to a message. This is more a navigational issue, but since it's not been integrated into the main wiki software it seems to be left for a bot. I don't know how it would run, though. Suggestions? frummer 17:57, 1 January 2007 (UTC)
I was wondering if it would be possible to create a bot that would serve solely to revert the addition of a link to the Oregon Trail article. Once every other week or so a user adds a link for a free game download that we delete off the article. The bot would just have to monitor the External links section, removing the link http://www.spesw.com/pc/educational_games/games_n_r/oregon_trail_deluxe.html (Oregon Trail Deluxe download) whenever it appears. Thanks, please let me know on my talk page if this is a possibility that anyone could take up. Thanks again, b_cubed 17:00, 2 January 2007 (UTC)
May someone please operate this for me? It's already been userpaged, accounted, and flagged. D•a•r•k•n• e• s•s•L•o•r•d• i•a•n••• CCD••• 22:12, 2 January 2007 (UTC)
I would like to suggest the creation of a bot to defend articles for children's shows. For some reason these pages appeal to vandals and I think something needs to help protect them. I'll use an example: before the Dora the Explorer page was put back under protection it was vandalized a lot. One time sticks in my mind the most, by a user named Oddanimals, who stated Dora was 47 and had a sex change, along with a few other sex-related comments, and replaced the word "banana" in Boots' article with the S curse word. This is not proper, to say the least, and one of the users I talked to said that the Backyardigans article is also vandalized a lot. Parents, kids, and people, like me, who just enjoy those shows look it up, and this kind of thing should NOT be allowed. Thank You Superx 23:18, 2 January 2007 (UTC)
True, but those bots are checking other pages as well. That vandalism stuck out like a sore thumb, and none of those bots caught it except for one, after I fixed it myself. I think that a single bot whose job it is to check those pages would be better than several others that are checking a bunch of other pages as well. Superx 01:10, 3 January 2007 (UTC)
Yes, but that would only apply here if the stuff I mentioned ACTUALLY HAD SOMETHING TO DO WITH THE SHOW! Curse words and other such stuff are only allowed if relevant to the article, and none of that was, so the point you mentioned doesn't apply in this situation. Superx 12:00, 5 January 2007 (UTC)
Need to migrate all the existing transclusions of {{ CopyrightedFreeUse}} to {{ PD-release}} per discussion here. BetacommandBot started on this a few weeks ago and then mysteriously quit about 7/8ths of the way through and I haven't been able to get a response from Betacommand since then. Could someone else finish this so that we can finally delete that template. Thanks. Kaldari 01:27, 3 January 2007 (UTC)
Please add the ro interwiki to all popes' pages. Just created. Romihaitza 12:31, 3 January 2007 (UTC)
We need to add the {{ WikiProject France}} to all the articles belonging to France and its sub categories. So would be nice if someone could do it for us or tell me how to do it. STTW (talk) 09:45, 4 January 2007 (UTC)
People usually do a good job of protecting the templates on the Main Page; but there have been some that slip through the cracks and the results can be disastrous. I propose a bot that would be given sysop status. I know this is controversial, and there was a big discussion about a similar request at the AFD page a while back. As such, anyone allowed to know the password must have already been approved for adminship through conventional means, and it should be open-source. It will protect the next day's templates in advance of them being on the Main Page (say, 24 hours) and then unprotect them afterwards. Preferably, it would make sure the pages stay protected until off the Main Page, and even be able to work with the pictures for POTD, but they'd have to be specified in advance, whereas the templates would run on the {{CURRENTDAY}} magic word system. This would be a big help in reducing the possibility of Main Page vandalism (believe me, it happens).-- Here T oHelp 03:52, 5 January 2007 (UTC)
I have the feeling I'm gonna get yelled at for this one, but how about a bot that deletes articles that have a clear consensus on Wikipedia:Articles for deletion. For example, it's quite obvious that Wikipedia:Articles for deletion/Myspacephobia is going to get deleted, but it's currently waiting for an admin to do the work. Yes, I know this would mean an admin bot, but that's not without precedent. Also, this bot would ONLY work on articles with a very obvious consensus. As for vandals abusing the bot, I don't think it would be an issue. First off it'd ignore IPs, secondly it'd have a minimum amount of time for voting, and there are too many legitimate voters to contest a bad-faith deletion for the bot to touch it. Btw, this bot would also close candidates that are clearly keep as well. Vicarious 07:39, 5 January 2007 (UTC)
Replace {{Infobox musical artist 2}} with {{Infobox musical artist}}.
86.201.106.176 13:23, 5 January 2007 (UTC)
I do not have any programming skills for writing bots, but I can handle and run the bot if someone writes the code to replace the image links in articles with the existing image on Commons under a different name. I suppose this type of bot could be useful on wikis other than the English Wikipedia as well in some cases. There are many examples of such images in this category. Shyam ( T/ C) 19:52, 5 January 2007 (UTC)
In theory, the bot will look through new articles to try and find key phrases like "our products" and "we are a". It then places a template on the page like this:
And places it in a relevant category. A human (or other intelligent individual) would then look through the list and nominate any articles that are blatant ads for WP:SPEEDY.
What do you think? --/// Jrothwell ( talk)/// 13:15, 29 December 2006 (UTC)
(undent) The issue of false positives is important. Certainly if a large majority of flaggings related to a particular phrase are in fact in error, that phrase shouldn't be used by the bot. But keep in mind that this flagging will only be used for new articles, which are much more likely to be spam than existing ones, so drawing conclusions from your search of existing articles isn't necessarily a good idea.
In any case, the bot should be tested by seeing what happens using a given phrase for (say) the first ten articles it finds. For example, our products looks like a good phrase to use. A google search on that found Enterprise Engineering Center (user who created article has done nothing else), plus several others (in the top 10 results) that were tagged as appearing to be advertisements.
Finally, the bot is only doing flagging. A human has to actually nominate an article for deletion (and it's easy to remove a template). But your comment does raise a point about there being a link to click on to complain about the bot. John Broughton | Talk 15:40, 2 January 2007 (UTC)
I suggest this template:
Cocoaguy contribs talk 03:42, 3 January 2007 (UTC)
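The phrase-flagging idea discussed above can be sketched in a few lines; the phrase list and the case-insensitive match here are illustrative assumptions, not a tuned implementation, and anything flagged would still go to a human for review:

```python
# Illustrative phrase list; real phrases would be tuned against false positives.
AD_PHRASES = ["our products", "we are a", "our company", "our services"]

def flag_advert(text):
    """Return the advertising phrases found in `text` (case-insensitive)."""
    lowered = text.lower()
    return [phrase for phrase in AD_PHRASES if phrase in lowered]
```

A new-page feed would run each article body through `flag_advert` and tag (not delete) anything with a non-empty result.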
I proposed this to Raul654 on his talk page, but he'd rather not add it to his workload, though he supported using a bot instead.
I'd like a bot to watch the Featured log (for successful noms) and Featured archive (for failed noms) and automatically tag each one with a line that indicates when they were closed (i.e. added to the archive) and the result. That way, it'll be possible to determine from the page itself what happened.
I'm thinking it should add
Promoted ~~~~~
or
Not Promoted ~~~~~
at the bottom of each, in line with WP:FPC. Night Gyr ( talk/ Oy) 20:59, 5 January 2007 (UTC)
Yeah, top or bottom isn't really a big issue for me, and top (immediately below the section head) is probably better for quick reference. FPC uses {{ FPCresult}}, so it needn't be a complicated template. Night Gyr ( talk/ Oy) 21:15, 5 January 2007 (UTC)
Would like to have a similar bot do the same (in reverse) for Featured article review; rather than Promoted or Not Promoted, the bot would return Kept or Removed Featured status, based on the Featured article review archive. SandyGeorgia ( Talk) 05:53, 7 January 2007 (UTC)
I will volunteer to write a bot that performs these tasks, presuming no one else would like to or has already started. -- Jmax- 09:19, 7 January 2007 (UTC)
I need a bot to do some routine maintenance tasks for deletion review. Possible tasks would include:
Any help with these tasks is appreciated. ~ trialsanderrors 09:07, 7 January 2007 (UTC)
There is now a project dealing with articles which have not been modified or viewed recently at Wikipedia:WikiProject Abandoned Articles. Would there be any way to generate a bot which might list only articles which haven't been modified since, say, 2005 (or some other really long time, maybe by year), for the use of this project to help find the most overlooked articles? Badbilltucker 20:35, 6 January 2007 (UTC)
I'm in the process of importing a database dump and I'll gather these statistics for you. To be clear, you want a list of pages with the oldest most recent edit, and is in the main namespace, and is not a disambiguation page; Correct? -- Jmax- 07:33, 7 January 2007 (UTC)
Can someone run a bot to replace all instances of "oftentimes" and "often times", with " often" which is shorter, means exactly the same thing, and doesn't sound so awkward in formal written English. I tried to start doing it manually, but there's too much of it.
You can find the instances via Google:
Perhaps ignoring the talk and user namespaces would be a good idea.
Please. Paul Silverman 12:19, 7 January 2007 (UTC)
Shouldn't we delete the accounts of users who have not contributed to Wikipedia for a long period of time?-- Jamesshin92 22:32, 7 January 2007 (UTC)
There are many headings in Wikipedia articles that do not follow Wikipedia:Manual of Style (headings). -- Jamesshin92 22:14, 7 January 2007 (UTC)
=== White House communications ===
=== U.S. Senator ===
I agree. -- Jamesshin92 04:02, 8 January 2007 (UTC)
However, I still think that we can still spread this idea with a different approach. For example, detecting repeated headings and special characters (such as %+^@~).
We also might think of normalizing headings to the standard forms, such as "Also see" into "See also" and "Links" into "External links". Jamesshin92 04:10, 8 January 2007 (UTC)
Unit symbols are invariable (unlike abbreviations), but there are nevertheless hundreds if not thousands of articles where "s" has been added improperly. A bot could fix this automatically, as the risk of confusion with correct English is about zero. Specifically:
The sought strings should be case sensitive, and the bot should leave instances immediately followed by a period alone (they could be legitimate abbreviations).
Other cases than those listed above probably exist.
Urhixidur 18:55, 8 January 2007 (UTC)
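The rules above (case-sensitive symbols, leave instances followed by a period alone) fit in a single regular expression; a sketch with an illustrative symbol list:

```python
import re

# A few common SI symbols seen with a spurious plural "s"; extend as needed.
UNIT_SYMBOLS = ["km", "kg", "cm", "mm", "mol"]

def fix_unit_plurals(text):
    """Strip the trailing "s" from unit symbols, case-sensitively.
    Instances immediately followed by a period are left alone, since
    they could be legitimate abbreviations."""
    pattern = r"\b(" + "|".join(UNIT_SYMBOLS) + r")s\b(?!\.)"
    return re.sub(pattern, r"\1", text)
```

Because the match is case-sensitive and anchored on word boundaries, ordinary English words are untouched.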
Would it be possible to create a bot that could automatically convert the U.S. Standard System of Measurment into the European Metric System of Measurment? I think a number of articles on here could benifit from such a bot if we do not already have one. Note that I know nothing about operating a bot, this is simply an idea of mind which I got while working on the article USS Wisconsin. —The preceding unsigned comment was added by TomStar81 ( talk • contribs) 06:06, 9 January 2007 (UTC).
(undent) It doesn't seem that controversial. Obviously some test runs, and starting slowly, would be appropriate. In general, I think Wikipedia needs more bots like this - human beings just aren't that consistent (much of which comes from not knowing everything in detail), and bots like this can compensate for that. John Broughton | Talk 15:30, 9 January 2007 (UTC)
Bot request: Change article text of the form "[[foo bar|foo]] bar", "foo [[foo bar|bar]]", and "[[foo bar|foo]] [[foo bar|bar]]" to "[[foo bar]]". I call this link normalization. The bot should make one pass of the whole database every month or so.
These oddities exist due to disambiguation. The first editor writes "[[foo]] bar", a disambiguator uses a tool to replace "[[foo]]" with "[[foo bar|foo]]", leaving "[[foo bar|foo]] bar". Clearly "[[foo bar]]" is the better form; it makes the link more sensible if it covers both words. The tools could be altered, but there's more than one, they're already complex, and thousands of these links already exist. -- Randall Bart 07:29, 11 January 2007 (UTC)
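The first two normalization cases described above can be sketched as follows (the third form, where both words are linked separately, would need an extra pattern):

```python
import re

# [[target|label]] with no "]" or "|" inside either part.
WIKILINK = r"\[\[([^\]|]+)\|([^\]|]+)\]\]"

def normalize_links(text):
    """Merge '[[foo bar|foo]] bar' and 'foo [[foo bar|bar]]' into '[[foo bar]]'."""
    def merge_after(m):
        target, label, tail = m.groups()
        return "[[" + target + "]]" if label + " " + tail == target else m.group(0)

    def merge_before(m):
        head, target, label = m.groups()
        return "[[" + target + "]]" if head + " " + label == target else m.group(0)

    text = re.sub(WIKILINK + r" (\w+)", merge_after, text)
    return re.sub(r"(\w+) " + WIKILINK, merge_before, text)
```

Links where the surrounding word doesn't reassemble the target are left untouched.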
Can someone please go over to http://test.wikipedia.org and with a bot populate Category:Really big category with anything, it doesn't matter what. Just dump every page and every image into the category please to test how the category system works when it is pushed to its limit. Testing man 22:53, 4 January 2007 (UTC)
Why is this necessary? -- Jmax- 06:56, 12 January 2007 (UTC)
Could a bot that detects and removes repeats of links to other articles be created? For example the article on Lions might have the word "Africa" in the first paragraph which is linked to an article about Africa and then further down the page there is another link where the word "Africa" appears. The bot could detect this and de-link the second occurrence of the word. —The preceding unsigned comment was added by Mutley ( talk • contribs) 06:21, 8 January 2007 (UTC).
Hmm. This idea has been kicked around many times. I believe it could still work with care and attention. Rich Farmbrough, 15:28 13 January 2007 (GMT).
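A rough sketch of the de-linking idea, keeping only the first occurrence of each link target (whether repeats should be allowed again in later sections is a separate judgment call):

```python
import re

def delink_repeats(text):
    """De-link repeated wikilinks, keeping the first occurrence of each target."""
    seen = set()

    def repl(m):
        target = m.group(1)
        label = m.group(2) or target  # piped label, or the target itself
        key = target.strip().lower()
        if key in seen:
            return label              # repeat: drop the brackets, keep the text
        seen.add(key)
        return m.group(0)             # first occurrence: keep the link

    return re.sub(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]", repl, text)
```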
Hey, how about a bot that will put all the commas and periods (all punctuation except semi-colons, in fact) inside quotation marks; it looks quite unprofessional to see articles written with punctuation outside quotations. - Unsigned comment added by User:165.82.156.110
Hello, we are finishing putting in place a new translation project.
There are two things I would greatly appreciate being done by a bot.
First, we had to make a small modification of the format of the translation pages which are used for every translation request. The task is : For every page in Category:Translation sub-pages version 1, this kind of change needs to be done.
Second, there are a lot of categories to initialize with a very simple wikicode: 7 for each language, and there are 50 of them. All red links of the array on
Wikipedia:Translation/*/Lang (except the first column of the array which has a different syntax) should be initialized with the syntax explained on this page.
Let me know if you need any further info.
Jmfayard 18:46, 6 January 2007 (UTC)
Moved to User talk:PocklingtonDan/Spelling bot — Mets501 ( talk) 20:35, 13 January 2007 (UTC)
Cleaning up from this category move: Wikipedia:Categories for deletion/Log/2006 December 19#American television series by decade where the meaning of the category was changed, there should be no overlap with Category:Anime by date of first release, because by the English definition no US-originated series that we know of is anime.
I'd like a bot to re-categorize with the following rule: If article in Category:Anime series and in Category:XXXXs American television series then remove from Category:XXXXs American television series and add to Category:Anime of the XXXXs instead. (The latter category includes both films and series.) -- GunnarRene 05:37, 3 January 2007 (UTC)
Category:(....)s American television series
Category:Anime of the $1s
I'm wanting to extract a list of regular text lines from the articles in the millennium, century, decade, and year series ( Upper Paleolithic, 10th mil BC, 9th mil BC, 1690s BC, 499 BC, 1 BC, 1st mil AD, 3rd mil, 2066, 2100s, 30th century, and the other 3000 or so articles in between). I've already written Python code that successively downloads user-defined ranges (and throws out formatting lines), but something happened to the database or something this morning, so I want to change my code to grab everything (I think it's approximately 20 MB) from Special:Export, and save it to disk before performing parsing operations on it. I don't know how to interact with the Export function, so I guess this is less of a bot request and more of a bot help request. Xaxafrad 00:06, 15 January 2007 (UTC)
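For what it's worth, Special:Export serves a single page's XML at a predictable URL, so saving it to disk before parsing needs only the standard library. A sketch, assuming the usual /wiki/Special:Export/Title URL form (error handling and rate limiting omitted):

```python
import urllib.request
from urllib.parse import quote

EXPORT_BASE = "https://en.wikipedia.org/wiki/Special:Export/"

def export_url(title):
    """Build the Special:Export URL for a single page title."""
    return EXPORT_BASE + quote(title.replace(" ", "_"))

def save_export(title, dest):
    """Download the XML export of `title` and save it to `dest` on disk."""
    with urllib.request.urlopen(export_url(title)) as resp:
        with open(dest, "wb") as out:
            out.write(resp.read())
```

For a batch of ~3000 titles, a polite loop over `save_export` with a short sleep between requests would do; Special:Export also accepts multiple titles via its POST form.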
Shouldn't we have a bot that translates articles from German, or maybe at least a German version of RamBot that uses German-speaking countries' (such as Austria or Germany) information on towns? Tell me what you think. Kitia 00:37, 15 January 2007 (UTC)
Request for double redirect bot? Running every 30 seconds? -- Parker007 13:07, 15 January 2007 (UTC)
Hi. I'm looking for a bot which would update this category summary for WP:ALBUM:
The figures are updated manually at present (using AWB for the large ones). The format itself was borrowed from Dragons flight's Category tracker so I presume it would be quite easy to do this... if you know what you are doing with bots... which I don't... so any help would be appreciated! Bubba hotep 13:58, 16 January 2007 (UTC)
Hi.. I'm Glacious. I'm currently an admin on dv.wikipedia, where I want to make a bot which could do various tasks. But I don't know how to make a bot, so if anyone can help me with my project, please leave me a message on my talk page soon. Looking forward to a reply. Thanks... -- Glacious 14:22, 16 January 2007 (UTC)
As I was going over and searching articles yesterday, it shocked me to see how many articles had incorrect spelling, grammar and punctuation. I was thinking about a bot that will put all the punctuation (commas, periods, semi-colons, etc.), grammar (correctly capitalised nouns, etc.) and spelling (commonly known spelling errors) into place on any article it may 'crawl' across. Would appreciate anyone willing to create a bot for me. Many thanks, Extranet ( Talk | Contribs) 08:21, 14 January 2007 (UTC)
So you're saying there isn't really a need for a Punctuation and Grammar Bot? If not, why not suggest a few 'really needed' bots because I have always wanted to run a bot throughout my time here at Wikipedia. -- Extranet ( Talk | Contribs) 03:24, 15 January 2007 (UTC)
Thanks for your comments. I will close this request for now but if anyone has a bot that needs a new owner or has a new bot regarding Punctuation or Grammar editing, please leave a message on my talk page. Thanks! -- Extranet ( Talk | Contribs) 08:25, 17 January 2007 (UTC)
I was thinking how convenient it'd be for a bot to turn the large number of bare web citations we have into the more formal reference tags. More specifically, it would turn this: [www.wikipedia.org] into:
<ref>{{cite web |url=www.wikipedia.org |title=Main Page - Wikipedia, the free encyclopedia |accessdate=2007-01-16}}</ref>
Note, it would assume the date the link was added was the access date, a reasonable assumption. Vicarious 08:10, 17 January 2007 (UTC)
I can see such a bot being extremely valuable. A couple of thoughts: It could go to Special:Linksearch and get a list of all external links for a specific domain (e.g., CNN). It should understand that when a URL appears twice in a page, there should not be two identical footnotes (so, the first ref/footnote gets a name="X" argument). It probably should avoid pages that have both a "Notes" and a "References" section. It probably shouldn't footnote a URL if the URL is in the "External links" section. And, obviously, it should put as much information as possible into the "cite" template. John Broughton | ♫♫ 22:47, 17 January 2007 (UTC)
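A bare-bones sketch of the core rewrite. It only handles bracketed http(s) links, and leaves |title= empty, since filling it (like most of the refinements listed above) would require actually fetching the page:

```python
import re
from datetime import date

def bare_link_to_ref(text, access_date=None):
    """Wrap unlabeled bracketed external links in <ref>{{cite web}}</ref>.
    |title= is left empty here; filling it requires fetching the page."""
    access = access_date or date.today().isoformat()

    def repl(m):
        return ("<ref>{{cite web |url=" + m.group(1)
                + " |title= |accessdate=" + access + "}}</ref>")

    # \S+? cannot cross a space, so labeled links "[http://... label]" are left alone.
    return re.sub(r"\[(https?://\S+?)\]", repl, text)
```

Per the suggestion above, the access date would come from the revision in which the link was added, passed in as `access_date`.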
A bot to change the default sort key for all people to {{DEFAULTSORT:''lastname'', ''firstname''}}. Would make sorting categories much easier. Λυδαcιτγ 22:11, 17 January 2007 (UTC)
And also delete piped sorting keys that become redundant after {{DEFAULTSORT}} is changed. Λυδαcιτγ 22:13, 17 January 2007 (UTC)
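Building the tag from an article title is simple in the common case; a sketch (the naive last-word split is an assumption — surname particles like "van" or "de", and non-Western name order, would need human review):

```python
def defaultsort(name):
    """Build a {{DEFAULTSORT}} tag from a person's name.
    Naive: treats the last space-separated word as the surname."""
    parts = name.rsplit(" ", 1)
    if len(parts) == 1:
        return "{{DEFAULTSORT:" + name + "}}"
    first, last = parts
    return "{{DEFAULTSORT:" + last + ", " + first + "}}"
```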
WikiProject Spam would like a list of all stub-templates with external links, with external links pointing to wikipedia.org filtered out. We've seen quite a bit of Google bombing on stub templates and a list like this would be quite helpful doing an initial cleanup. --- J.S ( T/ C/ WRE) 22:26, 13 January 2007 (UTC)
Please have a look at the discussion going on at Talk:Barack Obama#Consensus on IP edits and let us know there if you can think of ways a bot might help. My first thought is a bot that would detect vandalism, revert, then apply semi-protection for a limited period, automatically toggling it off after a defined interval. -- HailFire 20:37, 17 January 2007 (UTC)
Anyone with AWB and some regex skill: I was thinking a bot could clean up the kind of problem you see at Super Bowl IV#References. Basically, numbered inline citations (usually of the new style nowadays) are mixed with general bulleted references. It produces an awkward-looking format, and really should be split into inline-cited "notes" and more general "references". I haven't really figured out how to do this properly, but it shouldn't be that hard for someone more skilled with AWB than me. -- W.marsh 18:18, 18 January 2007 (UTC)
A recent sneaky spammer was caught and community banned. The spammer owned a dozen-plus domains and was replacing references with references to his sites; he created a dozen or so articles with the only citation being to his site, and he added his address to many External links sections. Because of the way he obfuscated his activity, it took 8 months to notice.
So... I thought up a dream bot that would be able to track this crap down.
Assuming we end up with a list of 1 million links and spend 5 seconds on each one, then it will take slightly less than 2 months to finish running the program. I guess that sorta makes my request a little silly. :( Is there any better way to track this down with a bot? --- J.S ( T/ C/ WRE) 22:52, 18 January 2007 (UTC)
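Checking the arithmetic behind the "slightly less than 2 months" figure (the 20-worker parallel case is an illustrative assumption; any concurrency would still need to respect the site's rate limits):

```python
links = 1_000_000
seconds_per_link = 5

# One request at a time: ~58 days, matching the estimate above.
serial_days = links * seconds_per_link / 86400

# 20 concurrent (rate-limited) workers would divide that roughly by 20.
parallel_days = serial_days / 20
```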
I noticed that there is a fairly large number of articles where the spelling "twelwe" was used instead of "twelve". Would anyone fancy taking this on? MoRsE 11:18, 19 January 2007 (UTC)
I am an admin at the Non-canon Star Trek Wiki and we are preparing to have our namespace changed to Memory Beta. Unfortunately, a large number of articles have already been moved into that namespace and we have been told that these articles must be moved back so that the namespace can be changed, or we lose all our content.
Because of this, I wondered if anyone had a bot or could create a bot that would be able to move all the articles starting with Memory Beta: to now begin with Non-canon Star Trek Wiki:, as well as delete the redirects between those pages. All of the articles that begin with the namespace can be found here.
Any help would be greatly appreciated, Thanks. -- The Doctor 20:43, 20 January 2007 (UTC)
I was wondering if it is possible for a bot to replace {{ WikiProject hip hop}} with {{ WikiProject hip hop|class=|importance=}}. This template redirect was the original template page but it got moved due to naming concerns (specifically whether "Hip Hop" should be in capitals). Also, a lot of these pages are already tagged, so the bot would have to recognize whether the page is already tagged properly or not, and not erase the code if the article was rated. Is there any existing bot able to do these tasks? Thank you in advance. — Tutmosis 21:31, 16 January 2007 (UTC)
I'm back! I'm afraid that I need some help from a bot for WikiProject MMO's banners again. We are starting up an assessment system and I thought a good way to start off would be to take ratings from WikiProject CVG's banners. Most all MMO articles have both {{ WP MMOG}} (WikiProject MMO's banner) and {{ Cvgproj}} (WikiProject CVG's banner) on their talk pages. I was hoping that a bot could take what WikiProject CVG assessed the article as and copy it (being {{WP MMOG|class=?}}). The articles that would be assessed are in Category:Massively multiplayer online games as well as its sub-categories. I would prefer that you use this category over the project category, as to hopefully get more articles with our banner on it. TIA! Greeves ( talk • contribs) 02:38, 23 January 2007 (UTC)
I would like a bot to tell me off when I edit project namespace. -- Punk Boi 8 06:56, 23 January 2007 (UTC)
Please ignore this request - I have explained to the user how to recognize which space they are in based on the page title. -- Trödel 15:11, 23 January 2007 (UTC)
The {{ fc}} and {{ nft}} templates are designed for ease-of-editing, and according to the talk page, they should be subst: when added to the page. At one point, I think a bot had run through them, and for a while I had manually kept nft under control. Now, it's out of control again (350+ on nft page links and 2000+ on fc page links). I'd like to request a bot to go through and subst: all of the fc and nft occurrences. Thanks!! Neier 02:22, 23 January 2007 (UTC)
It would be rather useful if the description page for images included details of when an image starts and stops being used in articles. Often it's quite hard to identify exactly when an image was added or removed (and by whom) from an article, and this bot would enable people to find out very easily by viewing the image discussion page. I suspect a bot may not be the best way to implement this though, as it would probably involve a substantial number of edits in order to achieve this - maybe this functionality would be better implemented within mediawiki itself...? -- Rebroad 19:38, 23 January 2007 (UTC)
I'd like to see a bot that can scan the contents of Category:Fair use images and all its subcategories, identify images that are older than seven days and are not used in any article, and tag them with {{ Orphaned fairuse not replaced}}, also notifying the uploader with {{ Orphaned}}. It would also be nice if the bot could identify FU images that are used outside of article namespace (such as on userpages or templates) and log those, so an editor could later go and remove them and notify the user. There are hundreds of thousands of images in these categories (31,000+ in the main category itself, and the template that places them in the main category is outdated, meaning there should be zero in the main category), and it would be really nice to have a bot to search for and tag orphaned FU images, as this is now done manually. I believe there used to be a bot that did this but I can't find it now.-- NMajdan• talk 17:11, 17 January 2007 (UTC)
I am fairly new to Wikipedia so I have yet to find what to do in this case. There is a large number of links to the FCHD, a database of football (soccer) clubs and statistics. That site has changed its address and I assume the old links will be dead at some point. It is simply a matter of changing the domain from www.fchd.btinternet.co.uk to www.fchd.info, the rest of the URL remains the same. ByteofKnowledge 16:23, 25 January 2007 (UTC)
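Since only the domain changes and the rest of each URL stays the same, the rewrite is a plain string substitution; a sketch:

```python
OLD_DOMAIN = "www.fchd.btinternet.co.uk"
NEW_DOMAIN = "www.fchd.info"

def migrate_fchd_links(wikitext):
    """Swap the FCHD domain; the rest of each URL remains unchanged."""
    return wikitext.replace(OLD_DOMAIN, NEW_DOMAIN)
```

A bot would pull the affected pages from Special:Linksearch for the old domain and apply this to each.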
This Discussion has moved to User:Betacommand/Bot Tasks Betacommand ( talk • contribs • Bot) 15:54, 26 January 2007 (UTC)