This page is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
re: Suggestion to merge CPU and Microprocessor
I've done some preliminary work on this user subpage of mine. Work is still ongoing, though, and it's far from complete.
Anyone care to load it up, tell me what you think, maybe make a few changes to it? Leave any comments on my user talk. Thanks.
Also, if we do go ahead with the merger, we'll need to decide which page to keep and which to change to a #redirect.
splintax 14:47, 8 September 2005 (UTC)
Some parts of this article are also similar to integrated circuit. — Preceding unsigned comment added by Twilight Realm ( talk • contribs) 01:19, September 27, 2005 (UTC)
"As with many advances in technology, the microprocessor was an idea whose time had come." This isn't the right style of writing, and it isn't neutral. I haven't read much of the article, so there may be more. Also, there's a grammar error right after the explanation of Moore's Law. I would normally fix it, but... not now. Too tired; I'd mess it up. Twilight Realm 01:22, 27 September 2005 (UTC)
This article, along with the articles for integrated circuit, CPU, etc., doesn't make it clear what's different about them. How exactly is a microprocessor different from an IC, other than "microprocessors are very advanced integrated circuits"? Twilight Realm 01:46, September 27, 2005 (UTC)
Are there any criteria defining microchips? Or is it just a subjective term? The IC article says "For the first time it became possible to fabricate a CPU or even an entire microprocessor on a single integrated circuit." To me at least, that sounds like there's a specific level an IC must pass to be considered a microprocessor, or that there's a difference, that a microprocessor contains some components that an IC doesn't necessarily have to have. Clarification would be appreciated, even though it's not in this article. Twilight Realm 01:16, 30 September 2005 (UTC)
Sorry that I didn't use an edit summary on my last edit, but I clicked on save page instead of minor edit. It was just reverting vandalism though. -- Apyule 12:36, 2 November 2005 (UTC)
This section is totally misplaced here. Not only does it have NOTHING to do with microprocessors, but it was TOTALLY wrong. Linux support for 64 bit microprocessors dates back to the Alpha and MIPS ports (LONG before x86-64). Windows support also dates back to NT 3's Alpha and MIPS R4xxx ports. Likewise, Mac OSX's blood relatives Darwin, Mach, and L4 all ran on 64-bit microprocessors before OSX was compiled for PowerPC64.
The section was REALLY 'history of OS support for x86-64,' which is already included in the AMD64 article (in much more complete form). -- uberpenguin 02:38, 18 December 2005 (UTC)
"In 64-bit computing, the DEC(-Intel) ALPHA, the AMD 64, and the HP-Intel Itanium are the most popular designs as of late 2004." Was that really true? And, if so, is it still true? Or is "popular" defined as something other than "most common"? I suspect there might be more 64-bit SPARC machines and 64-bit POWER/PowerPC machines (especially if you include AS/400 and iSeries PowerAS) than Alpha machines, much less Itanium machines. Guy Harris 19:06, 24 December 2005 (UTC)
Hey didn't you forget the Intel Pentium D (Dual Core) processors. I feel the performance of Intel is way better than the DEC, AMD, others. — Preceding unsigned comment added by 167.206.128.33 ( talk) 00:52, January 26, 2006 (UTC)
I think there should be more links to processor architecture from this page. Von neuman, Harvard, DIB, etc. — Preceding unsigned comment added by 167.206.128.33 ( talk) 00:53, January 26, 2006 (UTC)
I'm not sure if MIPS was really the first here. ARM was made in (working!) silicon (ARM1) on April 26, 1985; first products were sold in 1986 (exact date missing; the "ARM Development System", a second-processor card for the BBC Micro ), and the first workstations were released in June 1987 ( Acorn Archimedes). But I don't know when the first working MIPS silicon was made (I find 1985-1987 on the web; mips.com says nothing), what the first MIPS-based products were, and when they were released. Some of the early products I know of are the DECstation 2100 (1989), SGI Indigo (1990), and MIPS Magnum 3000 (1990). Another candidate would be the IBM ROMP; its first workstation was released in 1986 (exact date missing), other products before that unlikely. - Alureiter 16:02, 7 February 2006 (UTC)
The first paragraph tells me what a microprocessor is made of but doesn't tell me what it does. I would like to see a succinct sentence about what a microprocessor actually does (execute instructions, for example), and then perhaps explain it a bit more in a section farther down in the article.
Then at the end of the article there are three screens full of lists of various stuff. On Wikipedia it's easy to allow lists to get out of control and lose sight of what makes a thorough, balanced article. And complete doesn't mean we have to make a list of every possible internal and external link that might be somehow related!
So, tell me what the thing does and judiciously select a very few closely related links that might also be helpful. JonHarder 22:10, 16 July 2006 (UTC)
WRT the revert to transistors doubling every 18 months: I initially thought that this was wrong, but on checking the article, even though 18 months is oft quoted, 24 seems to fit the data much better. Also, from Moore's law:
I think the best thing may be to change the 18 at the top of the Moore's Law article to 24, and re-revert the change here. Comments? -- Mike Van Emmerik 22:42, 27 February 2006 (UTC)
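A quick back-of-the-envelope check supports the 24-month figure. This is my own illustration, using the approximate transistor counts of the Intel 4004 (1971, ~2,300) and the Pentium 4 (2000, ~42 million) as reference points:

```python
def projected_transistors(start_count, years, doubling_period_months):
    """Exponential growth: one doubling every `doubling_period_months` months."""
    return start_count * 2 ** (years * 12 / doubling_period_months)

# 4004 (1971): ~2,300 transistors; Pentium 4 (2000): ~42 million.
t24 = projected_transistors(2300, 29, 24)  # ~53 million, close to the real figure
t18 = projected_transistors(2300, 29, 18)  # ~1.5 billion, far too high
```

Over 29 years, a 24-month doubling period lands within a factor of about 1.3 of the actual count, while 18 months overshoots by more than 30x.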
From text:
The world's first single-chip 32-bit microprocessor was the AT&T Bell Labs BELLMAC-32A, with first samples in 1980, and general production in 1982(...)
but a few lines later:
The most famous of the 32-bit designs is the MC68000, introduced in 1979.
So, which one is right? Was it the BELLMAC-32A or the MC68k?
Alejandro Matos 14:47, 20 November 2006 (UTC)
I suggest a link to my site called 'How Computers Work: Processor and Main Memory' at http://www.fastchip.net/howcomputerswork/p1.html . It tells how a processor and memory work, simply and in COMPLETE DETAIL. A microprocessor is a processor on a single chip. It is not to replace the 'How Stuff Works' link but to complement it. If you understand this book/site, you will understand PRECISELY what a microprocessor and its main parts are and how they work together. Thinkorrr 01:09, 4 December 2006 (UTC)
I just googled "Hyatt microprocessor" and found this. Apparently TI overturned the earlier patent on the grounds that it was never implemented at the time. -- ArtifexCrastinus 06:57, 12 December 2006 (UTC)
I'm a little wary that the article classifies GPUs as microprocessors. I have always seen the term "microprocessor" applied to an IC-based CPU. As I'm sure most readers realize, GPUs are much more akin to DSPs or stream processors than CPUs, despite the unfortunate acronym similarity. The programmability and general design model of GPUs certainly do not qualify them to be called CPUs. So my question is, is it appropriate to call a GPU a microprocessor, given that I've always known the term microprocessor to be related to CPUs? I'm not entirely sure, thoughts? -- uberpenguin 12:59, 20 October 2005 (UTC)
Halfway through making a list of papers from IEEE journals to demonstrate the term's usage, I decided that all this rhetoric is really silly over a minor terminology disagreement. I went ahead and wrote a section describing the usage of "microprocessor" to mean something other than a CPU; feel free to add to it or revise it as you see fit. I still hold that DSPs and GPUs are not in themselves microprocessors, but I doubt many people would have such issues with using the term thus. I do feel strongly, however, that when no further clarification is given, the term "microprocessor" can safely be assumed to refer to a CPU. The section I wrote reflects that point. -- uberpenguin 22:40, 19 December 2005 (UTC)
{{citation}} - Paper describing the architecture of the TM3270 media processor. It's somewhat similar to a DSP/GPU, but is actually much closer architecturally to a CPU than GPUs are. The article never refers to the TM3270 as a CPU or a microprocessor, but as a "media processor" (actually, I think a very apt term for GPUs and CPUs).
{{citation}} - Paper specifically addressing general-purpose programming on the latest generation of programmable GPUs (this was only published in October of this year). It refers to GPUs as "stream processors," never microprocessors. It even makes a very clear distinction between GPUs and CPUs (as, IMO, it should).
Here are some reference points for inclusion of a "specialized microprocessor" subsection. MFNickster 05:58, 18 December 2005 (UTC)
Even though they dominate desktop computers, there is almost no mention of the x86 family of processors at all in the history section after the i386?
MIPS is not only used in embedded systems "like Cisco routers". The PlayStation game consoles are perhaps better known? — Preceding unsigned comment added by 80.202.211.146 ( talk) 16:49, February 6, 2005 (UTC)
I find it odd that the notable 32-bit section says the following: "The most famous of the 32-bit designs is the MC68000, introduced in 1979." The question here is, if the word famous is being used in the normal fashion, shouldn't the MOST famous 32-bit design be a member of the x86 family? Regardless of how many applications there were of the 68k series, fame is a measure of popular knowledge. I'm not saying that the x86 family needs a boost in the article so much as that a word other than famous should be used to describe why the 68k series is more SIGNIFICANT than the x86, which I would argue it is. Jo7hs2 22:15, 2 November 2007 (UTC)
There is no such article. If someone removed it, please provide a substitute. If not, please remove the link. Landroo 13:31, 1 September 2007 (UTC)
Look at these articles everyone!
http://www.indybay.org/newsitems/2004/12/08/17088681.php
http://www.thocp.net/biographies/pickette_wayne.html
It's about the real brains and the actual "father" of the microprocessor. How come he isn't included in this article? And there isn't a single mention of him in Wikipedia either! His name doesn't appear anywhere as far as I've seen! Seriously, this is one great guy screwed by Intel, Fairchild etc. big time!
And this is to the moderator(s): kindly don't hide what I've just written (with a * or whatever). Certain stuff needs to be spoken out loud!
Hope he gets the credit due to him soon! Krishvanth ( talk) 06:50, 5 January 2008 (UTC)
Regarding the claim: There have even been designs for simple computing machines based on mechanical parts such as gears, shafts, levers, Tinkertoys, etc. Leonardo DaVinci made one such design, although none were possible to construct using the manufacturing techniques of the time. ... Does anyone know if the Leonardo DaVinci mechanical 'computer' or 'processor' claim is true? It's not mentioned in the Leonardo article, unless Leonardo's robot is considered a computing device. Reading up on the 'robot' does not sell me on the 'computing' possibility, though it is obviously an impressive contraption for the time. -- Ds13 03:31, 2004 Apr 15 (UTC)
Yes, mechanical computers have been designed and built. I suspect the original writer is thinking about the difference engine and analytical engine designed by Charles Babbage. -- 68.0.124.33 ( talk) 02:16, 8 March 2008 (UTC)
It's funny how this article explains jack shit about how microprocessors work. The simplest thing this article should have is somehow nonexistent. —Preceding unsigned comment added by 137.28.55.114 ( talk) 21:55, 31 January 2008 (UTC)
added a small section on history of general purpose microprocessors Matsuiny2004 ( talk) 22:11, 18 April 2009 (UTC)
added citations to the first types section Matsuiny2004 ( talk) 21:58, 18 April 2009 (UTC)
Can somebody do some more research on the TMS 1000? The source I have used considers it a microcontroller. If this is so, then should it not be moved to the microcontroller article? Matsuiny2004 ( talk) 22:37, 18 April 2009 (UTC)
It's been decades since I've been that deep into the matter, but we used to have these simple block diagrams of the essential components of a microprocessor. If someone knows what I'm talking about and still has one or can find one, it would be nice if we could add something like that to this page. 71.236.24.129 ( talk) 09:59, 13 May 2009 (UTC)
Datapoint never used the 8008 or 8080, although they did play a role in their creation. They were too slow. The only unit I recall that used a single-chip "microprocessor" was their 15xx series, which used the Z80. More info here: http://www.old-computers.com/museum/doc.asp?c=596
and more: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9111341
Ken ( talk) 15:40, 26 May 2009 (UTC)
I have changed the appropriate text in the main article to reflect this. Ken ( talk) 02:31, 5 June 2009 (UTC)
So, I'm in a minor edit war with what I assume to be the same anonymous contributor (IP address varies, but writing style and method is the same -- you may want to register an account to make things clearer, or at least provide a handle in the edit summary). I keep removing a giant list of manufacturers, and the other contributor keeps putting it back in, with an edit summary that implies they are concerned that the article gives the impression that the microprocessors used in general-purpose PCs are the only applications.
I find that's a reasonable concern. If you want to make sure it is understood that microprocessors are used both in general-purpose and embedded designs, by all means, do so. But please do so in prose, by discussing applications in both GP PCs and embedded systems. Ideally, cite market-share figures in reliable sources, for both applications. It would be nice to know what the percentages are. (Be aware that we currently draw a distinction between microprocessor and microcontroller. Perhaps both articles should be clarified.)
However, I must insist that dumping a huge list of manufacturers into the article is the wrong thing to do. This is purely an editorial/style objection. Lists belong in the list pages we already have. They should not be duplicated here.
Thanks. — DragonHawk ( talk| hist) 17:59, 26 December 2009 (UTC)
Well, at least according to the RCA 1802 article it didn't. —Preceding unsigned comment added by Stib ( talk • contribs) 23:44, 25 May 2010 (UTC)
http://en.wikipedia.org/wiki/Transputer
Skyshack ( talk) 17:45, 1 April 2011 (UTC)
Getting back to the topic of this article, was the Transputer as big a deal as it seemed at the time? It got a lot of press but seems to have faded away as "regular" microprocessors caught up; I wonder why the Transputer didn't keep its lead over more complicated processors. -- Wtshymanski ( talk) 23:37, 1 April 2011 (UTC)
I think that this page doesn't offer enough information, such as how they function, how the transistors work, and what types of transistors there are, such as MOSFETs. Then again, not a lot of people need to learn all that. — Preceding unsigned comment added by Patrick-liu11 ( talk • contribs) 19:14, 3 April 2011 (UTC)
This section implies that in the opinion of the Smithsonian staff the TMS 1000 was the first microprocessor. In fact, the link is to a page from a book that the Smithsonian has scanned in called STATE OF THE ART. The bottom of the page says "The National Museum of American History and the Smithsonian Institution make no claims as to the accuracy or completeness of this work." The information in this section was discredited in connection with litigation in the 1990s, when Texas Instruments claimed to have patented the microprocessor. In response, Lee Boysel assembled a system in which a single 8-bit AL1 was used as part of a courtroom demonstration computer system, together with ROM, RAM and an input-output device. See the Wikipedia article on Four Phase Systems: http://en.wikipedia.org/wiki/Four_Phase_Systems_AL1 — Preceding unsigned comment added by GilCarrick ( talk • contribs) 16:43, 8 June 2011 (UTC)
...see http://home.comcast.net/~gordonepeterson2/schaller_dissertation_2004.pdf
The main article is missing, among other things, the Four Phase AL1 (one of several claims prior to the Intel 4004). Schaller's discussion is even-handed and makes it clear that the history is complicated enough for it to be impossible to simply pick a "winner" as being "the first".
Schaller begins "CHAPTER 7: The Invention of the Microprocessor, Revisited" with an excellent selection of quotes from other cited sources:
"The 4004, invented by Intel, was the world's first commercially available microprocessor." (Intel website)1
"TI invents the single-chip microcomputer and receives the first patent for the single-chip microprocessor, ushering in the personal computer era." (Texas Instruments website)2
"The first microprocessor in a commercial product was Lee Boysel's AL1, which was designed and built at Four-Phase for use in a terminal application in 1969." (Nick Tredennick)3
"Alongside to the IC, the invention of the 'micro-processor' (MPU - Micro Processing Unit) is the greatest invention of the 20th century in the field of electronics." (Busicom Corp.)4
"[T]he idea of putting the computer on a chip was a fairly obvious thing to do. People had been talking about it in the literature for some time, it's just... I don't think at that point anybody realized that the technology had advanced to the point where if you made a simple enough processor, it was now feasible." (Ted Hoff)5
"Having been involved with integrated electronics when I was at Intel, we never conceived of patenting a computer on a chip or CPU on a chip, because the idea was patently obvious. That is you worked on a processor with 25 chips, then 8 chips, and by God eventually you get one chip so where's 'the invention'." (Stan Mazor)6
Such inventions don't come from new scientific principles but from the synthesis of existing principles... Because these inventions have a certain inevitability about them, the real contribution lies in making them work. (Federico Faggin)7
[A]t the time in the early 1970s, late 1960s, the industry was ripe for the invention of the microprocessor. With the industry being ready for it, I think the microprocessor would have been born in 1971 or 1972, just because the technology and the processing capability were there. (Hal Feeney)8
"I don't think anyone 'invented' the microprocessor. Having lived through it, this [claim] sounds so silly." (Victor Poor)9
"It is problematic to call the microprocessor an 'invention' when every invention rides on the shoulders of past inventions." (Ted Hoff)10
"Most of us who have studied the question of the origin of the microprocessor have concluded that it was simply an idea whose time had come. Throughout the 1960's there was an increasing count of the number of transistors that could be fabricated on one substrate, and there were several programs in existence, both commercial and government funded, to fabricate increasingly complex systems in a monolithic fashion." (Robert McClure)11
"The question of 'who invented the microprocessor?' is, in fact, a meaningless one in any non-legal sense. The microprocessor is not really an invention at all; it is an evolutionary development, combining functions previously implemented on separate devices into one chip. Furthermore, no one individual was responsible for coming up with this idea or making it practical. There were multiple, concurrent efforts at several companies, and each was a team effort that relied on the contributions of several people." (Microprocessor Report)12
"The emergence of microprocessors is not due to foresight, astute design or advanced planning. It has been accidental." (Rodnay Zaks)13
"The only thing that was significant about the microprocessor was that it was cheap! People now miss this point entirely." (Stan Mazor)14
1 "Intel Consumer Desktop PC Microprocessor History Timeline," http://www.intel.com/pressroom/archive/backgrnd/30thann_timeline.pdf
2 "History of Innovation: 1970s," http://www.ti.com/corp/docs/company/history/1970s.shtml
3 Nick Tredennick, "Technology and Business: Forces Driving Microprocessor Evolution," Proceedings of the IEEE, Vol. 83, No. 12, December 1995, 1647.
4 "Innovation: The World's first MPU 4004," http://www.dotpoint.com/xnumber/agreement0.htm
5 Ted Hoff as quoted in Rob Walker, "Silicon Genesis: Oral Histories of Semiconductor Industry Pioneers, Interview with Marcian (Ted) Hoff, Los Altos Hills, California" Stanford University, March 3, 1995.
6 Stan Mazor, Stanford University Online Lecture, May 15, 2002, 020515-ee380-100, http://www.stanford.edu/class/ee380/
7 Federico Faggin, "The Birth Of The Microprocessor: An invention of major social and technological impact reaches its twentieth birthday," Byte, Volume 2, 1992, 145, http://www.uib.es/c-calculo/scimgs/fc/tc1/html/MicroProcBirth.html
8 "Microprocessor pioneers reminisce: looking back on the world of 16-pin, 2000-transistor microprocessors," Microprocessor Report, Vol. 5, No. 24, December 26, 1991, 13(6). Hal Feeney helped design the 8008 at Intel.
9 Vic Poor, former vice president of research R&D for Datapoint, telephone interview with the author, June 5, 2003.
10 Dean Takahashi, "Yet Another 'Father' of the Microprocessor Wants Recognition From the Chip Industry," Wall Street Journal, September 22, 1998, http://www.microcomputerhistory.com/f14wsj1.htm
11 See e-mail/newsgroup posting to Dave Farber's IP list (dave@farber.net), dated May 12, 2002. McClure was formerly with TI and helped found CTC; he also was an expert witness in the Boone patent case.
12 Microprocessor Report, op. cit.
13 Rodnay Zaks, Microprocessors: from chips to systems, 3/e, SYBEX Inc., 1980, First Edition Published 1977, 29.
14 Stan Mazor, telephone interview with the author, June 10, 2003.
It's a rich source of information for enhancing the main article (and quite interesting reading for its own sake).
Dougmerritt 04:32, 23 January 2007 (UTC)
This page has obviously gone through a lot of editing, and the result is that it contradicts itself in several places. The section on the Four-Phase Systems AL1 was apparently added somewhat late in the evolution. It refers to the litigation where TI tried to overturn Intel microprocessor patents. The case was dismissed when Lee Boysel demonstrated that the Four Phase AL1 processor predated both the TI and Intel designs.
The section titled "Firsts" says that "Three projects delivered a microprocessor at about the same time," and mentions TI, Intel and the CADC. It should at least also mention the AL1 since it was clearly first.
The section titled "Intel 4004" says "The Intel 4004 is generally regarded as the first microprocessor." This is contradicted by the section on the AL1.
The section titled "8-bit designs" says "The Intel 4004 was followed in 1972 by the Intel 8008, the world's first 8-bit microprocessor." The AL1 was an 8-bit processor and predated the 4004, much less the 8008. See the Wikipedia article on the Four Phase AL1: http://en.wikipedia.org/wiki/Four_Phase_Systems_AL1 — Preceding unsigned comment added by GilCarrick ( talk • contribs) 17:25, 8 June 2011 (UTC)
I just recently made an edit to the article which included a small change on the subject named by the title of this section, and I left a note in the edit summary referring here. The article had described the x86 memory segmentation model prior to the introduction of the 80286 (i.e. the x86 real mode memory model) as "crude"; I slightly adjusted this to remove bias. While it is undeniable that many programmers disliked (and even hated, sometimes intensely) the 8086 segment register design, calling 8086 memory segmentation "crude" is an opinion; the word has a pejorative connotation and implies a negative judgement along with the objective characterization that this aspect of the 8086 is not sophisticated or advanced. Contrasting 8086 segmentation with other processors' designs, it was clearly innovative (noting that not all innovations are improvements over past designs); this is evident from the simple fact that no microprocessor before the 8086 used any memory segmentation method quite like it, and none received qualitatively similar criticism. 8086 segmentation is also undeniably limited, particularly in that every segment is 64 KB in size, making it undisputedly difficult (or at least a non-trivial problem) to deal with large data objects (such as arrays or instances of implementations of any kind of ADT), i.e. ones larger than 64K bytes (= 64 KiB = 65536 Bytes). This is a tradeoff in an engineering design that, it should be remembered, was a solution to the problem of making a moderate-cost 16-bit processor able to address more than 2^16 words, (i.e. able to drive more than 16 address bus lines). But nonetheless, it is a limit, and one that many 8086 programmers found themselves having to deal with frequently.
On the other hand, in my opinion, I find this (the difficulty with >64 K memory blocks) to be the only really major disadvantage of the strategy the 8086 design engineers chose, and I otherwise find the 8086 memory segmentation model extremely flexible. You can use it like a bank-switching system, like a double-register addressing system (in the manner of the HL register pair of the 8080 CPU), or for up to two-level plus immediate indexed addressing ([base address in segment] + BX + SI + [immediate displacement]). Perhaps it takes an imaginative attitude and a fresh perspective.
Considering all of this, I have changed "crude" to "innovative but limited", which is objective and, I believe, fair. I call it fair because it balances what is generally considered a positive quality (innovation) with a negative one (limitation). It also avoids injecting inappropriate details into this article, as a more detailed characterization of the processor's memory addressing model would.
(For readers unfamiliar with x86 real mode, it basically works like this: All addresses are 20-bit values that are each built from two 16-bit values, called the "segment" and the "offset". The innovative part is that rather than each bit of the finished address coming from either the segment or the offset, the segment is shifted left by four bits [i.e. multiplied by sixteen] and then added to the offset to generate the 20-bit address. For dealing with the segment parts of addresses, the CPU has four segment registers: one for code (which is always combined with the IP to generate the execution address), two for data, and one for the stack. Most instructions have default segment registers from which they get the segment part of any addresses in their operands, but those can be overridden with opcode prefixes. A key aspect that is unusual is that there are multiple segment and offset combinations [4096 of them, in fact] that correspond to each physical address. This system has been much maligned for being allegedly too complicated and illogical, but it actually makes sense, and it in fact does work, as the existence of thousands of MS-DOS software titles attests. Of course, just because it works doesn't logically imply it's any good.)
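The segment:offset arithmetic described in the parenthetical above can be captured in a couple of lines. This is my own illustrative sketch of the address calculation, not code from any particular emulator:

```python
def real_mode_address(segment, offset):
    """8086 real mode: shift the 16-bit segment left 4 bits, add the
    16-bit offset, and keep 20 bits (the 8086 has 20 address lines)."""
    return ((segment << 4) + offset) & 0xFFFFF

# Example: 0x1234:0x5678 maps to physical address 0x179B8.
assert real_mode_address(0x1234, 0x5678) == 0x179B8

# Aliasing: many segment:offset pairs name the same physical byte.
assert real_mode_address(0x0001, 0x0000) == real_mode_address(0x0000, 0x0010)
```

The final mask models the wraparound at the top of the 1 MB address space (the behavior the A20 gate later controlled on PC-compatible machines).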
It appears to me that this topic could easily be the subject of a flame war (among people who care about old CPUs, of which I of course am one), and I certainly am not out to start one of those pointless wastes of time, especially here on WP of all places. I personally have a decent respect for the 8086 and the accomplishment of the Intel engineers that designed it, while admitting that the 80286 is better, the 80386 is even better, and the Motorola 68000 is better than either of the first two or maybe all three of those, discounting cost and from a programmer's perspective. Still, I think I removed a significant but subtle bias from the sentence about the 8086 (a.k.a. x86 real mode) segmented memory model, and I hope the WP community will agree. I just wanted to explain my reasoning. --Stephen 74.109.5.17 ( talk) 12:29, 24 July 2011 (UTC)
Once again, we get a list of part numbers but very little explanation as to *why* there were so many part numbers. Why did we waste all that money on 6502s when the Itanium is clearly a better processor? A little history might be more encyclopedic than a recitation of part numbers, as popular as those are. Isn't there a List of microprocessors somewhere that we can point at here instead of reciting numbers with no reasons behind them? -- Wtshymanski ( talk) 16:30, 23 August 2011 (UTC)
This article has been found to be edited by students of the Wikipedia:India Education Program project as part of their (still ongoing) course-work. Unfortunately, many of the edits in this program so far have been identified as plain copy-jobs from books and online resources and therefore had to be reverted. See the India Education Program talk page for details. In order to maintain the WP standards and policies, let's all have a careful eye on this and other related articles to ensure that no copyrighted material remains in here. -- Matthiaspaul ( talk) 14:15, 30 October 2011 (UTC)
Please check/correct
"l, with TI as intervenor and owner of the microprocessor patent." [sic]
shouldn't this be "inventor" or am I missing something legalistic? — Preceding unsigned comment added by 69.86.252.239 ( talk) 22:55, 3 December 2011 (UTC)
Can we interface systems with two different bit widths? I.e., if a microcontroller is 32-bit but the output device is 64-bit, will data be lost or not when it is transferred from the MPU to the output device? — Preceding unsigned comment added by Pradyuman Katiyar ( talk • contribs) 12:34, 27 August 2012 (UTC)
In 1968, Garrett AiResearch, with designers Ray Holt and Steve Geller, was invited to produce a digital computer to compete with electromechanical systems then under development for the main flight control computer in the US Navy's new F-14 Tomcat fighter.
The processor was used for the flight control computer, or for the Fire Control System (FCS)? Because as far as I know, the Tomcat didn't have a fly-by-wire control system. Maybe the author meant the FCS but got confused? —Preceding unsigned comment added by 79.107.73.166 ( talk) 05:21, 28 October 2008 (UTC)
Added April 8, 2013. I am Ray Holt and I meant flight control computer. The F-14 CADC was a fly-by-wire control of the moving surfaces, including the wings, and provided real-time data to the communications computer and the weapons computer. It also provided altitude, Mach number, and angle of attack to the pilot ... as well as many other functions. — Preceding unsigned comment added by Zajacik ( talk • contribs) 16:29, 8 April 2013 (UTC)
The intro to this page makes no sense to someone without a background. —Preceding unsigned comment added by 71.163.67.111 ( talk) 04:54, 6 June 2009 (UTC)
there's a similar list on Central processing unit. Do these need merging, or is one the parent article of the other? -- Tarquin 16:57 Jan 5, 2003 (UTC)
I'm a bit concerned about the assertion that a microprocessor is programmable. Is it not the computer that is programmable? Programmable suggests that the program is within what is being programmed. A microprocessor can execute program code, but the code that it executes is external to it (OK, perhaps this dubious use of language is now standard, in which case I suppose this can stay. Let the experts decide here). Brian Josephson ( talk) 21:09, 22 April 2013 (UTC)
The first 16-bit single-chip microprocessor is the Hewlett-Packard BPC, released in late 1973. The BPC is basically a single-chip implementation of an HP 2100-series minicomputer. The BPC was available as a stand-alone device, but was more commonly implemented in a single package with other HP support chips known as the IOC, EMC and AEC. The BPC was originally designed as the CPU of the HP 9825 computer, but was later used in the 9845 and various pieces of computerized HP test equipment, such as the 4955A Protocol Analyzer and 64000 development system. -- Accutron ( talk) 15:10, 16 June 2013 (UTC)
The combination of an ALU and a CU is called a CPU — Preceding unsigned comment added by 182.68.97.208 ( talk) 05:17, 26 February 2014 (UTC)
What sort of materials are used in production? What chemical elements are present in the final product (especially those other than silicon)? Apparently some are conflict minerals, which is notable. -- Beland ( talk) 16:22, 8 April 2014 (UTC)
What is this supposed to be saying? "Microprocessors integrated into one or a few large-scale ICs the architectures that had previously been implemented using many medium- and small-scale integrated circuits." That smells like bad cut and paste. It appears to try to explain that microprocessors were previously a few separate ICs before they became integrated, or something. 108.33.72.18 ( talk) 14:59, 8 May 2014 (UTC)
Nobody who knows what they are talking about would describe this processor as a 32-bit processor. The only people who regularly did so were marketing people, for rather obvious reasons. Virtually every reference work describes the processor as 16 bit. [1] [2] In the case of both references, although they make reference to the 68000's 32 bit data and address registers, nowhere does either work claim that the 68000 is a 32-bit processor. If anyone really wants to insist that the 68000 is listed as a 32-bit processor, then I shall insist that the 8080 and the Z80 are listed as 16-bit processors (precisely because both have 16-bit data and address registers, and the Z80 even has 16-bit arithmetic instructions). DieSwartzPunkt ( talk) 18:10, 22 January 2015 (UTC)
References
INX, DCX, and DAD. The Z80 expands on these. They are implemented in precisely the same way as 32-bit operations on the MC68000: microcoded double-precision arithmetic using the normal ALU. (Actually, the 8080 & 8085 block diagrams show a 16-bit address incrementer/decrementer—it is unclear whether this relates only to SP or can operate on all register pairs—so while DAD is a double-precision main-ALU addition in microcode, INX and DCX might not be.) 74.103.131.73 ( talk) 06:43, 22 March 2016 (UTC)
"The founders of Pico had the idea that it was possible to develop a single chip calculator (most calculators at the time used at least 5 ICs). Pico did this and this calculator IC was actually the world's first microprocessor (despite what Intel, or Texas Instruments would like you to believe)." [28] (used as a source at X10 (industry standard)). Integrated circuit says 4004 is "the world's first microprocessor" while this article qualifies. [I know the difference of a calculator and a PC, i.e. the calculator was not programmable, but was the chip?]
I see now, Pico is already in the article and "lay claim to be one of the first microprocessors or microcontrollers having ROM, RAM and a RISC instruction set on-chip." Having RAM on-chip is a stronger claim than later systems (the 4004, and even most current microprocessors) or earlier systems in the article can make. RISC may be dubious or not; I doubt it means what it does now. Is "The key team members had originally been tasked by Elliott Automation to create an 8-bit computer in MOS" about the same chip? Then 8-bit vs. 4-bit for the 4004, and I'm curious about size/transistor count. The lowest I've heard for any CPU is about 4000; curious how it compares and if anyone has beaten that..
There is "Category:American inventions" but possibly should say "Category:Scottish inventions"? comp.arch ( talk) 14:46, 20 September 2016 (UTC)
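Returning to the 8080/Z80 thread above: the claim that DAD-style 16-bit operations amount to microcoded double-precision passes through an ordinary 8-bit ALU can be illustrated with a minimal sketch. The helper names here are hypothetical, for illustration only; this is not actual 8080 microcode.

```python
def alu8_add(a, b, carry_in=0):
    # One pass through an 8-bit ALU: returns (8-bit sum, carry out).
    total = (a & 0xFF) + (b & 0xFF) + carry_in
    return total & 0xFF, total >> 8

def dad(hl, rp):
    # 16-bit add built from two 8-bit ALU passes, low byte first,
    # with the carry rippling into the high-byte pass -- the
    # double-precision scheme described in the comment above.
    lo, c = alu8_add(hl & 0xFF, rp & 0xFF)
    hi, carry_out = alu8_add(hl >> 8, rp >> 8, c)
    return (hi << 8) | lo, carry_out

# 0x1234 + 0xF0F0 = 0x10324 -> 16-bit result 0x0324, carry flag set
print(dad(0x1234, 0xF0F0))
```

The same two-pass pattern generalizes: a 16-bit ALU doing 32-bit arithmetic (as on the MC68000) just widens each pass.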
Hello fellow Wikipedians,
I have just modified 3 external links on Microprocessor. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018.
After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 18:18, 10 June 2017 (UTC)
Is a microprocessor actually abbreviated μP? It certainly isn't an abbreviation in common usage, so if this is some specific jargon it should be labeled as such. -- Delirium 04:51, Dec 12, 2003 (UTC)
The initial definition says a microprocessor is implemented on a single chip, which I have always understood to be an essential feature. However, further down the page there is mention of multi-chip 16-bit "microprocessors", which by this definition cannot exist. — Preceding unsigned comment added by 212.44.25.184 ( talk) 14:17, February 8, 2005 (UTC)
Is there any support for moving the section History of Operating System support for 64 bit microchips somewhere else, like maybe Operating Systems? It doesn't seem to serve much purpose here (other than a thinly veiled Linux good M$ bad dig) — Preceding unsigned comment added by Alf Boggis ( talk • contribs) 15:32, September 1, 2005 (UTC)
How about this The article. 134.250.72.176 — Preceding undated comment added 04:06, October 28, 2005 (UTC)
"National introduced the first 16-bit single-chip microprocessor, the National Semiconductor PACE..." and then a paragraph or so later, "The first single-chip 16-bit microprocessor was TI's TMS 9900..." — Preceding unsigned comment added by 66.41.35.114 ( talk) 19:00, August 10, 2007 (UTC)
Sorry that I didn't use an edit summary on my last edit, but I clicked on save page instead of minor edit. It was just reverting vandalism though. -- Apyule 12:36, 2 November 2005 (UTC)
This section is totally misplaced here. Not only does it have NOTHING to do with microprocessors, but it was TOTALLY wrong. Linux support for 64 bit microprocessors dates back to the Alpha and MIPS ports (LONG before x86-64). Windows support also dates back to NT 3's Alpha and MIPS R4xxx ports. Likewise, Mac OSX's blood relatives Darwin, Mach, and L4 all ran on 64-bit microprocessors before OSX was compiled for PowerPC64.
The section was REALLY 'history of OS support for x86-64,' which is already included in the AMD64 article (in much more complete form). -- uberpenguin 02:38, 18 December 2005 (UTC)
"In 64-bit computing, the DEC(-Intel) ALPHA, the AMD 64, and the HP-Intel Itanium are the most popular designs as of late 2004." Was that really true? And, if so, is it still true? Or is "popular" defined as something other than "most common"? I suspect there might be more 64-bit SPARC machines and 64-bit POWER/PowerPC machines (especially if you include AS/400 and iSeries PowerAS) than Alpha machines, much less Itanium machines. Guy Harris 19:06, 24 December 2005 (UTC)
Hey didn't you forget the Intel Pentium D (Dual Core) processors. I feel the performance of Intel is way better than the DEC, AMD, others. — Preceding unsigned comment added by 167.206.128.33 ( talk) 00:52, January 26, 2006 (UTC)
I think there should be more links to processor architecture from this page: Von Neumann, Harvard, DIB, etc. — Preceding unsigned comment added by 167.206.128.33 ( talk) 00:53, January 26, 2006 (UTC)
I'm not sure if MIPS was really the first here. ARM was made in (working!) silicon (ARM1) on April 26, 1985; first products were sold in 1986 (exact date missing, the "ARM Development System", a second processor card for the BBC Micro); first workstations were released in June 1987 ( Acorn Archimedes). But I don't know when the first working MIPS silicon was made (I find 1985-1987 on the web, mips.com says nothing), what the first MIPS based products were, and when they were released. Some of the early products I know are the DECstation 2100 (1989), SGI Indigo (1990), MIPS Magnum 3000 (1990). Another candidate would be IBM ROMP; the first workstation was released in 1986 (exact date missing), other products before that unlikely. - Alureiter 16:02, 7 February 2006 (UTC)
The first paragraph tells me what a microprocessor is made of but doesn't tell me what it does. I would like to see a succinct sentence about what a microprocessor actually does (execute instructions, for example), and then perhaps explain it a bit more in a section farther down in the article.
Then at the end of the article there are three screens full of lists of various stuff. On Wikipedia it's easy to allow lists to get out of control and lose sight of what makes a thorough, balanced article. And complete doesn't mean we have to make a list of every possible internal and external link that might be somehow related!
So, tell me what the thing does and judiciously select a very few closely related links that might also be helpful. JonHarder 22:10, 16 July 2006 (UTC)
@ 2006-07-16 23:01Z
WRT the reverting to #transistors doubling every 18 months: I initially thought that this was wrong, but on checking the article, even though 18 months is oft quoted, 24 seems to fit the data much better. Also, from Moore's law:
I think the best thing may be to change the 18 at the top of the Moore's Law article to 24, and re-revert the change here. Comments? -- Mike Van Emmerik 22:42, 27 February 2006 (UTC)
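For reference, the 18-versus-24-month question above can be sanity-checked against two widely published data points. The transistor counts below are approximate figures assumed for illustration (roughly 2,300 for the 1971 Intel 4004 and 3.1 million for the 1993 Pentium), not numbers from this article.

```python
import math

# Approximate, widely cited transistor counts (assumptions for this sketch):
year0, n0 = 1971, 2_300        # Intel 4004
year1, n1 = 1993, 3_100_000    # Intel Pentium

months = (year1 - year0) * 12
for period in (18, 24):
    projected = n0 * 2 ** (months / period)
    print(f"{period}-month doubling projects {projected:,.0f} transistors by {year1}")

# Doubling period actually implied by these two data points:
implied = months / math.log2(n1 / n0)
print(f"implied doubling period: {implied:.1f} months")
```

Under these assumed figures, 18-month doubling overshoots by an order of magnitude while 24-month doubling lands close to the actual count, which supports the comment's reading of the data.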
From text:
The world's first single-chip 32-bit microprocessor was the AT&T Bell Labs BELLMAC-32A, with first samples in 1980, and general production in 1982(...)
but a few lines later:
The most famous of the 32-bit designs is the MC68000, introduced in 1979.
so, which one is right? it was the bellmac-3a or the mc68k?
Alejandro Matos 14:47, 20 November 2006 (UTC)
I suggest a link to my site called 'How Computers Work: Processor and Main Memory' at http://www.fastchip.net/howcomputerswork/p1.html . It tells how a processor and memory work simply and in COMPLETE DETAIL. A microprocessor is a processor on a single chip. It is not to replace the 'How Stuff Works' link but complement it. If you understand this book/site, you will understand PRECISELY what a microprocessor and its main parts are and how they work together. Thinkorrr 01:09, 4 December 2006 (UTC)
I just googled "Hyatt microprocessor" and found this. Apparently TI overturned the earlier patent on the grounds that it was never implemented at the time. -- ArtifexCrastinus 06:57, 12 December 2006 (UTC)
I'm a little wary that the article classifies GPUs as microprocessors. I have always seen the term "microprocessor" applied to an IC-based CPU. As I'm sure most readers realize, GPUs are much more akin to DSPs or stream processors than CPUs, despite the unfortunate acronym similarity. The programmability and general design model of GPUs certainly does not qualify it to be called a CPU. So my question is, is it appropriate to call a GPU a microprocessor, given that I've always known the term microprocessor to be related to CPUs? I'm not entirely sure, thoughts? -- uberpenguin 12:59, 20 October 2005 (UTC)
Halfway through making a list of papers from IEEE journals to demonstrate the term's usage, I decided that all this rhetoric is really silly over a minor terminology disagreement. I went ahead and wrote a section describing the usage of "microprocessor" to mean something other than a CPU; feel free to add to it or revise it as you see fit. I still hold that DSPs and GPUs are not in themselves microprocessors, but I doubt many people would have such issues with using the term thus. I do feel strongly, however, that when no further clarification is given, the term "microprocessor" can safely be assumed to refer to a CPU. The section I wrote reflects that point. -- uberpenguin 22:40, 19 December 2005 (UTC)
{{citation}} - Paper describing the architecture of the TM3270 media processor. It's somewhat similar to a DSP/GPU, but is actually much closer architecturally to a CPU than GPUs are. The article never refers to the TM3270 as a CPU or a microprocessor, but as a "media processor" (actually, I think a very apt term for GPUs and CPUs).
{{citation}} - Paper specifically addressing general purpose programming on the latest generation of programmable GPUs (this was only published in October of this year). It refers to GPUs as "stream processors," never microprocessors. It even makes a very clear distinction between GPUs and CPUs (as, IMO, it should).
Here are some reference points for inclusion of a "specialized microprocessor" subsection. MFNickster 05:58, 18 December 2005 (UTC)
Even though they dominate desktop computers, there is almost no mention of the x86 family of processors in the history section after the i386?
MIPS is not only used in embedded systems "like Cisco routers". The PlayStation game consoles are perhaps more well-known? — Preceding unsigned comment added by 80.202.211.146 ( talk) 16:49, February 6, 2005 (UTC)
I find it odd that the notable 32-bit section says the following: "The most famous of the 32-bit designs is the MC68000, introduced in 1979." The question here is, if the word famous is being used in the normal fashion, shouldn't the MOST famous 32-bit be a member of the x86 family? Regardless of how many applications there were of the 68k series, fame is a measure of popular knowledge. I'm not saying that the x86 family needs a boost in the article so much as that a word other than famous should be used to describe why the 68k series is more SIGNIFICANT than the x86, which I would argue it is. Jo7hs2 22:15, 2 November 2007 (UTC)
There is no such article. If someone removed it, please provide a substitute. If not, please remove the link. Landroo 13:31, 1 September 2007 (UTC)
Look at these articles everyone!
http://www.indybay.org/newsitems/2004/12/08/17088681.php
http://www.thocp.net/biographies/pickette_wayne.html
It's about the real brains and the actual "father" of the microprocessor. How come he isn't included in this article? And there isn't a single mention of him in Wikipedia either! His name doesn't appear anywhere as far as I've seen! Seriously, this is one great guy screwed by Intel, Fairchild etc. big time!
And this is to the moderator(s): kindly don't hide what I've just written (with a * or whatever). Certain stuff needs to be spoken out loud!
Hope he gets the credit due to him soon! Krishvanth ( talk) 06:50, 5 January 2008 (UTC)
Regarding the claim: There have even been designs for simple computing machines based on mechanical parts such as gears, shafts, levers, Tinkertoys, etc. Leonardo DaVinci made one such design, although none were possible to construct using the manufacturing techniques of the time. ... Does anyone know if the Leonardo DaVinci mechanical 'computer' or 'processor' claim is true? It's not mentioned in the Leonardo article, unless Leonardo's robot is considered a computing device. Reading up on the 'robot' does not sell me on the 'computing' possibility, though it is obviously an impressive contraption for the time. -- Ds13 03:31, 2004 Apr 15 (UTC)
Yes, mechanical computers have been designed and built. I suspect the original writer is thinking about the difference engine and analytical engine designed by Charles Babbage. -- 68.0.124.33 ( talk) 02:16, 8 March 2008 (UTC)
It's funny how this article explains jack shit about how microprocessors work. The most simple thing this article should have is somehow nonexistant. —Preceding unsigned comment added by 137.28.55.114 ( talk) 21:55, 31 January 2008 (UTC)
added a small section on history of general purpose microprocessors Matsuiny2004 ( talk) 22:11, 18 April 2009 (UTC)
added citations to the first types section Matsuiny2004 ( talk) 21:58, 18 April 2009 (UTC)
can somebody do some more research on the TMS 1000 since the source I have used considers it a microcontroller. If this is so then should it not be moved to the micro controllers article? Matsuiny2004 ( talk) 22:37, 18 April 2009 (UTC)
It's been decades since I've been that deep into the matter, but we used to have these simple block diagrams of the essential components of a microprocessor. If someone knows what I'm talking about and still has one or can find one, it would be nice if we could add something like that to this page. 71.236.24.129 ( talk) 09:59, 13 May 2009 (UTC)
Datapoint never used the 8008 or 8080, although they did play a role in their creation. They were too slow. The only unit I recall that used a single-chip "microprocessor" was their 15xx series, which used the Z80. More info here: http://www.old-computers.com/museum/doc.asp?c=596
and more: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9111341
Ken ( talk) 15:40, 26 May 2009 (UTC)
I have changed the appropriate text in the main article to reflect this. Ken ( talk) 02:31, 5 June 2009 (UTC)
So, I'm in a minor edit war with what I assume to be the same anonymous contributor (IP address varies, but writing style and method is the same -- you may want to register an account to make things clearer, or at least provide a handle in the edit summary). I keep removing a giant list of manufacturers, and the other contributor keeps putting it back in, with an edit summary that implies they are concerned that the article gives the impression that the microprocessors used in general-purpose PCs are the only applications.
I find that's a reasonable concern. If you want to make sure it is understood that microprocessors are used both in general-purpose and embedded designs, by all means, do so. But please do so in prose, by discussing applications in both GP PCs and embedded systems. Ideally, cite market-share figures in reliable sources, for both applications. It would be nice to know what the percentages are. (Be aware that we currently draw a distinction between microprocessor and microcontroller. Perhaps both articles should be clarified.)
However, I must insist that dumping a huge list of manufacturers into the article is the wrong thing to do. This is purely an editorial/style objection. Lists belong in the list pages we already have. They should not be duplicated here.
Thanks. — DragonHawk ( talk| hist) 17:59, 26 December 2009 (UTC)
Well, at least according to the RCA 1802 article it didn't. —Preceding unsigned comment added by Stib ( talk • contribs) 23:44, 25 May 2010 (UTC)
http://en.wikipedia.org/wiki/Transputer
Skyshack ( talk) 17:45, 1 April 2011 (UTC)
Getting back to the topic of this article, was the Transputer as big a deal as it seemed at the time? It got a lot of press but seems to have faded away as "regular" microprocessors caught up; I wonder why the Transputer didn't keep its lead over more complicated processors. -- Wtshymanski ( talk) 23:37, 1 April 2011 (UTC)
I think that this page doesn't offer enough information, such as how they function, how the transistors work, and what types of transistors there are, such as MOSFETs. Then again, not a lot of people need to learn all that. — Preceding unsigned comment added by Patrick-liu11 ( talk • contribs) 19:14, 3 April 2011 (UTC)
This section implies that in the opinion of the Smithsonian staff the TMS 1000 was the first microprocessor. In fact, the link is to a page from a book that the Smithsonian has scanned in called STATE OF THE ART. The bottom of the page says "The National Museum of American History and the Smithsonian Institution make no claims as to the accuracy or completeness of this work." The information in this section was discredited in connection with litigation in the 1990s, when Texas Instruments claimed to have patented the microprocessor. In response, Lee Boysel assembled a system in which a single 8-bit AL1 was used as part of a courtroom demonstration computer system, together with ROM, RAM and an input-output device. See the Wikipedia article on Four Phase Systems: http://en.wikipedia.org/wiki/Four_Phase_Systems_AL1 — Preceding unsigned comment added by GilCarrick ( talk • contribs) 16:43, 8 June 2011 (UTC)
...see http://home.comcast.net/~gordonepeterson2/schaller_dissertation_2004.pdf
The main article is missing, among other things, the Four Phase AL1 (one of several claims prior to the Intel 4004). Schaller's discussion is even-handed and makes it clear that the history is complicated enough for it to be impossible to simply pick a "winner" as being "the first".
Schaller begins "CHAPTER 7: The Invention of the Microprocessor, Revisited" with an excellent selection of quotes from other cited sources:
"The 4004, invented by Intel, was the world's first commercially available microprocessor." (Intel website)1
"TI invents the single-chip microcomputer and receives the first patent for the single-chip microprocessor, ushering in the personal computer era." (Texas Instruments website)2
"The first microprocessor in a commercial product was Lee Boysel's AL1, which was designed and built at Four-Phase for use in a terminal application in 1969." (Nick Tredennick)3
"Alongside to the IC, the invention of the 'micro-processor' (MPU - Micro Processing Unit) is the greatest invention of the 20th century in the field of electronics." (Busicom Corp.)4
"[T]he idea of putting the computer on a chip was a fairly obvious thing to do. People had been talking about it in the literature for some time, it's just... I don't think at that point anybody realized that the technology had advanced to the point where if you made a simple enough processor, it was now feasible." (Ted Hoff)5
"Having been involved with integrated electronics when I was at Intel, we never conceived of patenting a computer on a chip or CPU on a chip, because the idea was patently obvious. That is, you worked on a processor with 25 chips, then 8 chips, and by God eventually you get one chip, so where's 'the invention'." (Stan Mazor)6
Such inventions don't come from new scientific principles but from the synthesis of existing principles... Because these inventions have a certain inevitability about them, the real contribution lies in making them work. (Federico Faggin)7
[A]t the time in the early 1970s, late 1960s, the industry was ripe for the invention of the microprocessor. With the industry being ready for it, I think the microprocessor would have been born in 1971 or 1972, just because the technology and the processing capability were there. (Hal Feeney)8
"I don't think anyone 'invented' the microprocessor. Having lived through it, this [claim] sounds so silly." (Victor Poor)9
"It is problematic to call the microprocessor an 'invention' when every invention rides on the shoulders of past inventions." (Ted Hoff)10
"Most of us who have studied the question of the origin of the microprocessor have concluded that it was simply an idea whose time had come. Throughout the 1960's there was an increasing count of the number of transistors that could be fabricated on one substrate, and there were several programs in existence, both commercial and government funded, to fabricate increasingly complex systems in a monolithic fashion." (Robert McClure)11
"The question of 'who invented the microprocessor?' is, in fact, a meaningless one in any non-legal sense. The microprocessor is not really an invention at all; it is an evolutionary development, combining functions previously implemented on separate devices into one chip. Furthermore, no one individual was responsible for coming up with this idea or making it practical. There were multiple, concurrent efforts at several companies, and each was a team effort that relied on the contributions of several people." (Microprocessor Report)12
"The emergence of microprocessors is not due to foresight, astute design or advanced planning. It has been accidental." (Rodnay Zaks)13
"The only thing that was significant about the microprocessor was that it was cheap! People now miss this point entirely." (Stan Mazor)14
1 "Intel Consumer Desktop PC Microprocessor History Timeline," http://www.intel.com/pressroom/archive/backgrnd/30thann_timeline.pdf
2 "History of Innovation: 1970s," http://www.ti.com/corp/docs/company/history/1970s.shtml
3 Nick Tredennick, "Technology and Business: Forces Driving Microprocessor Evolution," Proceedings of the IEEE, Vol. 83, No. 12, December 1995, 1647.
4 "Innovation: The World's first MPU 4004," http://www.dotpoint.com/xnumber/agreement0.htm
5 Ted Hoff as quoted in Rob Walker, "Silicon Genesis: Oral Histories of Semiconductor Industry Pioneers, Interview with Marcian (Ted) Hoff, Los Altos Hills, California" Stanford University, March 3, 1995.
6 Stan Mazor, Stanford University Online Lecture, May 15, 2002, 020515-ee380-100, http://www.stanford.edu/class/ee380/
7 Federico Faggin, "The Birth Of The Microprocessor: An invention of major social and technological impact reaches its twentieth birthday," Byte, Volume 2, 1992, 145, http://www.uib.es/c- calculo/scimgs/fc/tc1/html/MicroProcBirth.html
8 "Microprocessor pioneers reminisce: looking back on the world of 16-pin, 2000-transistor microprocessors," Microprocessor Report, Vol. 5, No. 24, December 26, 1991, 13(6). Hal Feeney helped design the 8008 at Intel.
9 Vic Poor, former vice president of research R&D for Datapoint, telephone interview with the author, June 5, 2003.
10 Dean Takahashi, "Yet Another 'Father' of the Microprocessor Wants Recognition From the Chip Industry," Wall Street Journal, September 22, 1998, http://www.microcomputerhistory.com/f14wsj1.htm
11 See e-mail/newsgroup posting to Dave Farber's IP list dated May 12, 2002 to Dave Farber dave@farber.net McClure was formerly with TI and helped found CTC; he also was an expert witness in the Boone patent case.
12 Microprocessor Report, op. cit.
13 Rodnay Zaks, Microprocessors: from chips to systems, 3/e, SYBEX Inc., 1980, First Edition Published 1977, 29.
14 Stan Mazor, telephone interview with the author, June 10, 2003.
It's a rich source of information for enhancing the main article (and quite interesting reading for its own sake).
Dougmerritt 04:32, 23 January 2007 (UTC)
This page has obviously gone through a lot of editing and the result it that it contradicts itself in several places. The section on the Four-Phase Systems AL1 was apparently added somewhat late in the evolution. It refers to the litigation where TI tried to overturn Intel microprocessor patents. The case was dismissed when Lee Boysel demonstrated that the Four Phase AL1 processor predated both the TI and Intel designs.
The section titled "Firsts" says that "Three projects delivered a microprocessor at about the same time," and mentions TI, Intel and the CADC. It should at least also mention the AL1 since it was clearly first.
The section titled "Intel 4004" says "The Intel 4004 is generally regarded as the first microprocessor." This is contradicted by the section on the AL1.
The section titled "8-bit designs" says "The Intel 4004 was followed in 1972 by the Intel 8008, the world's first 8-bit microprocessor." The AL1 was an 8 bit processor and predated the 4004, much less the 8008. See the Wikipedia article on the Four Phase AL1: http://en.wikipedia.org/wiki/Four_Phase_Systems_AL1 — Preceding unsigned comment added by GilCarrick ( talk • contribs) 17:25, 8 June 2011 (UTC)
I just recently made an edit to the article which included a small change on the subject named by the title of this section, and I left a note in the edit summary referring here. The article had described the x86 memory segmentation model prior to the introduction of the 80286 (i.e. the x86 real mode memory model) as "crude"; I slightly adjusted this to remove bias. While it is undeniable that many programmers disliked (and even hated, sometimes intensely) the 8086 segment register design, calling 8086 memory segmentation "crude" is an opinion; the word has a pejorative connotation and implies a negative judgement along with the objective characterization that this aspect of the 8086 is not sophisticated or advanced. Contrasting 8086 segmentation with other processors' designs, it was clearly innovative (noting that not all innovations are improvements over past designs); this is evident from the simple fact that no microprocessor before the 8086 used any memory segmentation method quite like it, and none received qualitatively similar criticism. 8086 segmentation is also undeniably limited, particularly in that every segment is 64 KB in size, making it undisputedly difficult (or at least a non-trivial problem) to deal with large data objects (such as arrays or instances of implementations of any kind of ADT), i.e. ones larger than 64K bytes (= 64 KiB = 65536 Bytes). This is a tradeoff in an engineering design that, it should be remembered, was a solution to the problem of making a moderate-cost 16-bit processor able to address more than 2^16 words, (i.e. able to drive more than 16 address bus lines). But nonetheless, it is a limit, and one that many 8086 programmers found themselves having to deal with frequently.
On the other hand, in my opinion, I find this (the difficulty with >64 K memory blocks) to be the only really major disadvantage of the strategy the 8086 design engineers chose, and I otherwise find the 8086 memory segmentation model extremely flexible. You can use it like a bank-switching system, like a double-register addressing system (in the mode of the HL register of the 6502 CPU), or for up to two-level plus immediate indexed addressing ([base address in segment] + BX + SI + [immediate displacement]). Perhaps it takes an imaginative attitude and a fresh perspective.
Considering all of this, I have changed "crude" to "innovative but limited", which is objective and, I believe, fair. I call it fair because it balances what is generally considered a positive quality (innovation) with a negative one (limitation). It also avoids injecting inappropriate details into this article, as a more detailed characterization of the processor's memory addressing model would.
(For readers unfamiliar with x86 real mode, it basically works like this: All addresses are 20-bit values that are each built from two 16-bit values, called the "segment" and the "offset". The innovative part is that rather than each bit of the finished address coming from either the segment or the offset, the segment is shifted left by four bits [i.e. multiplied by sixteen] and then added to the offset to generate the 20-bit address. For dealing with the segment parts of addresses, the CPU has four segment registers, one for code (which is always combined with the IP to generate the execution address), two for data, and one for the stack. Most instructions have default segment registers from which they get the segment part of any addresses in their operands, but those can be overridden with opcode prefixes. A key aspect that is unusual is that there are multiple segment and offset combinations [4096 of them, in fact] that correspond to each physical address. This system has been much maligned for being allegedly too complicated and illogical, but it actually makes sense, and it in fact does work, as the existence of thousands of MS-DOS software titles attests. Of course, just because it works doesn't logically imply it's any good.)
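The address arithmetic described above can be sketched in a few lines. This is a minimal illustration in Python, not any actual CPU interface; the function name `real_mode_phys` is made up for the example:

```python
def real_mode_phys(segment: int, offset: int) -> int:
    """20-bit physical address from a 16-bit segment:offset pair.

    The segment is shifted left four bits (i.e. multiplied by 16)
    and added to the offset; the 8086 wraps the result at 1 MiB.
    """
    return ((segment << 4) + offset) & 0xFFFFF

# Many distinct segment:offset pairs alias the same physical address:
assert real_mode_phys(0x1234, 0x0005) == 0x12345
assert real_mode_phys(0x1000, 0x2345) == 0x12345  # same byte of memory
```

For any physical address far enough from the ends of the 1 MiB space, 4096 distinct segment:offset pairs map to it, which is the aliasing mentioned above.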
It appears to me that this topic could easily be the subject of a flame war (among people who care about old CPUs, of which I of course am one), and I certainly am not out to start one of those pointless wastes of time, especially here on WP of all places. I personally have a decent respect for the 8086 and the accomplishment of the Intel engineers that designed it, while admitting that the 80286 is better, the 80386 is even better, and the Motorola 68000 is better than either of the first two or maybe all three of those, discounting cost and from a programmer's perspective. Still, I think I removed a significant but subtle bias from the sentence about the 8086 (a.k.a. x86 real mode) segmented memory model, and I hope the WP community will agree. I just wanted to explain my reasoning. --Stephen 74.109.5.17 ( talk) 12:29, 24 July 2011 (UTC)
Once again, we get a list of part numbers but very little explanation as to *why* there were so many part numbers. Why did we waste all that money on 6502s when the Itanium is clearly a better processor? A little history might be more encyclopedic than a recitation of part numbers, as popular as those are. Isn't there a List of microprocessors somewhere that we can point at here instead of reciting numbers with no reasons behind them? -- Wtshymanski ( talk) 16:30, 23 August 2011 (UTC)
This article has been found to be edited by students of the Wikipedia:India Education Program project as part of their (still ongoing) course-work. Unfortunately, many of the edits in this program so far have been identified as plain copy-jobs from books and online resources and therefore had to be reverted. See the India Education Program talk page for details. In order to maintain the WP standards and policies, let's all have a careful eye on this and other related articles to ensure that no copyrighted material remains in here. -- Matthiaspaul ( talk) 14:15, 30 October 2011 (UTC)
Please check/correct
"l, with TI as intervenor and owner of the microprocessor patent." [sic]
shouldn't this be "inventor" or am I missing something legalistic? — Preceding unsigned comment added by 69.86.252.239 ( talk) 22:55, 3 December 2011 (UTC)
Can we interface two systems with different numbers of bits, i.e. a 32-bit microcontroller with a 64-bit output device? Will data be lost or not when data are transferred from the MPU to the output device? — Preceding unsigned comment added by Pradyuman Katiyar ( talk • contribs) 12:34, 27 August 2012 (UTC)
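For what it's worth, no data need be lost in such a transfer as long as the wider value is moved as multiple narrower words. A minimal Python sketch of the principle (the helper names `split64`/`join64` are made up for illustration):

```python
def split64(value: int) -> list[int]:
    """Split a 64-bit value into two 32-bit words, low word first,
    as a 32-bit MPU would send it over a 32-bit bus."""
    return [value & 0xFFFFFFFF, (value >> 32) & 0xFFFFFFFF]

def join64(words: list[int]) -> int:
    """Reassemble the original 64-bit value on the receiving device."""
    return words[0] | (words[1] << 32)

v = 0x1122334455667788
assert join64(split64(v)) == v  # the round trip loses nothing
```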
In 1968, Garrett AiResearch, with designer Ray Holt and Steve Geller, were invited to produce a digital computer to compete with electromechanical systems then under development for the main flight control computer in the US Navy's new F-14 Tomcat fighter.
The processor was used for the flight control computer, or for the Fire Control System (FCS)? Because as far as I know, the Tomcat didn't have a fly-by-wire control system. Maybe the author meant the FCS but got confused? —Preceding unsigned comment added by 79.107.73.166 ( talk) 05:21, 28 October 2008 (UTC)
Added April 8, 2013. I am Ray Holt and I meant flight control computer. The F-14 CADC was a fly-by-wire control of the moving surfaces including the wings and provided real-time data to the communications computer and the weapons computer. It also provided altitude, Mach number, and angle of attack to the pilot, as well as many other functions. — Preceding unsigned comment added by Zajacik ( talk • contribs) 16:29, 8 April 2013 (UTC)
The intro to this page makes no sense to someone without a background. —Preceding unsigned comment added by 71.163.67.111 ( talk) 04:54, 6 June 2009 (UTC)
there's a similar list on Central processing unit. Do these need merging, or is one the parent article of the other? -- Tarquin 16:57 Jan 5, 2003 (UTC)
I'm a bit concerned about the assertion that a microprocessor is programmable. Is it not the computer that is programmable? Programmable suggests that the program is within what is being programmed. A microprocessor can execute program code, but the code that it executes is external to it (OK, perhaps this dubious use of language is now standard, in which case I suppose this can stay. Let the experts decide here). Brian Josephson ( talk) 21:09, 22 April 2013 (UTC)
The first 16-bit single-chip microprocessor is the Hewlett-Packard BPC, released in late 1973. The BPC is basically a single-chip implementation of an HP 2100-series minicomputer. The BPC was available as a stand-alone device, but was more commonly implemented in a single package with other HP support chips known as the IOC, EMC and AEC. The BPC was originally designed as the CPU of the HP 9825 computer, but was later used in the 9845 and various pieces of computerized HP test equipment, such as the 4955A Protocol Analyzer and 64000 development system. -- Accutron ( talk) 15:10, 16 June 2013 (UTC)
The combination of ALU and CU is called CPU — Preceding unsigned comment added by 182.68.97.208 ( talk) 05:17, 26 February 2014 (UTC)
What sort of materials are used in production? What chemical elements are present in the final product (especially those other than silicon)? Apparently some are conflict minerals, which is notable. -- Beland ( talk) 16:22, 8 April 2014 (UTC)
What is this supposed to be saying? "Microprocessors integrated into one or a few large-scale ICs the architectures that had previously been implemented using many medium- and small-scale integrated circuits." That smells like bad cut and paste. It appears to try to explain the microprocessors were previously a few separate ICs before they become integrated, or something. 108.33.72.18 ( talk) 14:59, 8 May 2014 (UTC)
Nobody who knows what they are talking about would describe this processor as a 32-bit processor. The only people who regularly did so were marketing people for rather obvious reasons. Virtually every reference work describes the processor as 16 bit. [1] [2] In the case of both references, although they make reference to the 68000's 32 bit data and address registers, nowhere does either work claim that the 68000 is a 32-bit processor. If anyone really wants to insist that the 68000 is listed as a 32-bit processor, then I shall insist that the 8080 and the Z80 are listed as 16-bit processors (precisely because both have 16-bit data and address registers, and the Z80 even has 16-bit arithmetic instructions). DieSwartzPunkt ( talk) 18:10, 22 January 2015 (UTC)
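Incidentally, the "wider arithmetic on a narrower ALU" point at issue in this thread is easy to illustrate: a 16-bit ALU can perform a 32-bit addition in two passes, propagating the carry between them, which is essentially what the 68000's microcode does. A Python sketch under that assumption (the function name is hypothetical):

```python
MASK16 = 0xFFFF

def add32_on_16bit_alu(a: int, b: int) -> int:
    """32-bit addition performed as two 16-bit ALU operations.

    The low halves are added first; any carry out of bit 15 feeds
    into the addition of the high halves. The final carry is dropped,
    so the result wraps modulo 2**32 like real hardware.
    """
    lo = (a & MASK16) + (b & MASK16)                # low 16-bit add
    carry = lo >> 16                                # carry into high half
    hi = ((a >> 16) + (b >> 16) + carry) & MASK16   # high 16-bit add
    return (hi << 16) | (lo & MASK16)

assert add32_on_16bit_alu(0x0001FFFF, 0x00000001) == 0x00020000
```

The same trick, one step narrower, is how an 8-bit ALU can carry out 16-bit register-pair additions.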
References
The 8080 does have a few 16-bit instructions: INX, DCX, and DAD. The Z80 expands on these. They are implemented in precisely the same way as 32-bit operations on the MC68000: microcoded double-precision arithmetic using the normal ALU. (Actually, the 8080 & 8085 block diagrams show a 16-bit address incrementer/decrementer—it is unclear whether this relates only to SP or can operate on all register pairs—so while DAD is a double-precision main-ALU addition in microcode, INX and DCX might not be.) 74.103.131.73 ( talk)
06:43, 22 March 2016 (UTC)

"The founders of Pico had the idea that it was possible to develop a single chip calculator (most calculators at the time used at least 5 ICs). Pico did this and this calculator IC was actually the world's first microprocessor (despite what Intel, or Texas Instruments would like you to believe)." [28] (used as a source at X10 (industry standard)). Integrated circuit says the 4004 is "the world's first microprocessor", while this article qualifies that claim. [I know the difference between a calculator and a PC, i.e. the calculator was not programmable, but was the chip?]
I see now, Pico is already in the article and "lay claim to be one of the first microprocessors or microcontrollers having ROM, RAM and a RISC instruction set on-chip." Having RAM on-chip is a stronger claim than later systems (the 4004, and even most current microprocessors) or earlier systems in the article can make. The RISC claim may or may not be dubious; I doubt it means what it does now. Is "The key team members had originally been tasked by Elliott Automation to create an 8-bit computer in MOS" about the same chip? Then 8-bit vs. 4-bit for the 4004, and I'm curious about size/transistor count. The lowest I've heard for any CPU is about 4000; curious how it compares and if anyone has beaten that.
There is "Category:American inventions" but possibly should say "Category:Scottish inventions"? comp.arch ( talk) 14:46, 20 September 2016 (UTC)
Hello fellow Wikipedians,
I have just modified 3 external links on Microprocessor. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 18:18, 10 June 2017 (UTC)
Is a microprocessor actually abbreviated μP? It certainly isn't an abbreviation in common usage, so if this is some specific jargon it should be labeled as such. -- Delirium 04:51, Dec 12, 2003 (UTC)
The initial definition says a microprocessor is implemented on a single chip, which I have always understood to be an essential feature. However, further down the page there is mention of multi-chip 16-bit "microprocessors", which by this definition cannot exist. — Preceding unsigned comment added by 212.44.25.184 ( talk) 14:17, February 8, 2005 (UTC)
Is there any support for moving the section History of Operating System support for 64 bit microchips somewhere else, like maybe Operating Systems? It doesn't seem to serve much purpose here (other than a thinly veiled Linux good M$ bad dig) — Preceding unsigned comment added by Alf Boggis ( talk • contribs) 15:32, September 1, 2005 (UTC)
How about this The article. 134.250.72.176 — Preceding undated comment added 04:06, October 28, 2005 (UTC)
"National introduced the first 16-bit single-chip microprocessor, the National Semiconductor PACE..." and then a paragraph or so later, "The first single-chip 16-bit microprocessor was TI's TMS 9900..." — Preceding unsigned comment added by 66.41.35.114 ( talk) 19:00, August 10, 2007 (UTC)