This is the talk page for discussing improvements to the History of computing hardware article. This is not a forum for general discussion of the article's subject.
Article policies
Find sources: Google (books · news · scholar · free images · WP refs) · FENS · JSTOR · TWL
Archives: 1, 2
History of computing hardware is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
This article appeared on Wikipedia's Main Page as Today's featured article on June 23, 2004.
Current status: Former featured article
This level-4 vital article is rated B-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Daily pageviews of this article: an interactive graph is available at pageviews.wmcloud.org
Old discussions have been moved to archives - use the navigation box to switch between them. I used the much nicer {{archives}} and {{archivesnav}} templates as found on the Personal computer talk pages to spruce up navigation a little. Remember when creating new archive pages that they must have a space in the title - talk:History of computing hardware/Archive 3 would be the next page, for example. -- Wtshymanski ( talk) 01:35, 25 September 2009 (UTC)
Call me a massive geek, but surely the C64 and Amiga deserve some mention in here. The advancement in personal computers isn't just down to the number of transistors - those computers added some really creative features (particularly with sound hardware) which we now take for granted. Their rise and fall (there's a certain book by a congruent title) is a huge chapter in the history of computing, surely.
In the interests of keeping this article a featured article, might we move the latest contribution on 2nd generation computers to the talk page and work on the English prose before re-instating it to the article page? -- Ancheta Wis ( talk) 14:17, 2 January 2008 (UTC)
computers are an amazing creation of sensation. —Preceding unsigned comment added by Sckater ( talk • contribs) 21:50, 8 March 2008 (UTC)
I was wondering whether the Antikythera mechanism is the first computer, because there are a lot of articles that make that claim. Tomasz Prochownik ( talk) 21:05, 23 April 2008 (UTC)
Fellow editors, User:Ragesoss has noted that we are building back up the citations for this FA. When this article was first formed, the rise in standards for Featured Articles had not yet occurred. Since I have been volunteered for this, there will be an American bias to the footnotes I am contributing; please feel free to contribute your own sources.
Please feel free to step up and add more citations in the form of the following markup: <ref>Your citation here</ref>. You can add this markup anywhere [1] in the article, and our wiki software will push it to the <references/> position in the article page, individually numbered and highlighted when you click on the ^. As an illustration, I placed this markup on the talk page so that new users can even practice on this talk page.
In my opinion, the best source is Bell and Newell (1971) [2], which is already listed in the article. I do not have time to visit the local university library, so my own contributions are from sources which I have on my own bookshelves; this may be appropriate since the seminal period 1945-1950 will probably be viewed as the heyday of the first generation of electronic digital computers, which blossomed in the US. [3], [4], [5], [6], [7], [8], [9] I recognize that there will need to be more citations from the Association for Computing Machinery and the IEEE Transactions, but those will have to come from the editors who are in the Wikiproject on computing. In particular, the Radiation Laboratory of MIT published a series of books, The M.I.T. Radiation Laboratory Series, [10] which are the foundation for computing hardware, in tandem with the Manhattan Project; what is common to these projects is that they involved groups of cooperating contributors. [11] Before the howls of outrage subside, please note that the exact forms of computer hardware had not yet been selected in this period, but since the technologists were already in place for other purposes, it was a small step to the forms of hardware we see today. [12], [13], [14], [15], [16], [17], [18] The forms of hardware could easily have gone in other directions, and our current computers would then have been quite different. [19] [20]
New users (especially those with a CS or EE background), please feel free to contribute your citations. Wikipedia:Five Pillars summarizes the guidelines for editors, and your cheatsheet for markup can be found here. Users can append comments to the foot of this talk page, signed with the signature markup: --~~~~
Casual readers might note that the references which will be added to this article can be purchased quite cheaply on the Internet (typically for a few dollars), which in sum would amount to a nice education in this subject. -- Ancheta Wis ( talk) 09:31, 3 May 2008 (UTC)
We are up to 59 footnotes. You can examine the edit history to see how the citations were embedded in the article, as well as study this section, for examples on how to do it. -- Ancheta Wis ( talk) 10:01, 6 May 2008 (UTC)
User:SandyGeorgia has noted that the citations are expected to have a certain format. Everyone is welcome to improve the citations. -- Ancheta Wis ( talk) 01:42, 7 May 2008 (UTC)
It appears that the footnote macro is space-sensitive. For example <ref name=IBM_SMS/ > works, but <ref name=IBM_SMS/> causes error messages unless a space is added after the trailing slash. To see this, look at this diff. -- Ancheta Wis ( talk) 09:42, 9 May 2008 (UTC)
Sample citation format from User:Wackymacs: [21]
According to Hennessy and Patterson, von Neumann knew about the details of Zuse's floating-point proposal. This suggests that the sentence 'Zuse was largely ignored' should be stricken. Any objections? -- Ancheta Wis ( talk) 10:30, 5 May 2008 (UTC)
Zuse did not implement the floating-point design he patented in 1939 before WWII ended. Von Neumann was aware of Zuse's patent and refused to include it in his Princeton machine, as documented in the seminal paper (Burks, Goldstine and von Neumann, 1946). -- Hennessy and Patterson p. 313, note "A decimal floating point unit was available for the IBM 650, and [binary floating-point hardware was available for] 704, 709, 7090, 7094, ... ". "As a result, everybody had floating point, but every implementation was different."
To this day, floating point operations are less convenient, less reliable, and more difficult to implement (in both hardware and software). - Ancheta Wis ( talk) 08:07, 10 May 2008 (UTC)
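As a small concrete illustration of the pitfalls (a minimal C sketch of my own, assuming IEEE 754 doubles, not any particular historical machine): rounding makes floating-point addition non-associative, so the same three terms summed in different orders disagree.

#include <stdio.h>

int main(void)
{
    /* 1.0 is smaller than the rounding step of 1.0e16, so it is
       lost whenever it is added directly to a value that large. */
    double a = 1.0e16, b = -1.0e16, c = 1.0;
    printf("%g\n", (a + b) + c);   /* prints 1 */
    printf("%g\n", a + (b + c));   /* prints 0 */
    return 0;
}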
This assertion is made about the Colossus in this article. It is also made about the ACE in that article. THERE CAN BE ONLY ONE! Twang ( talk) 18:59, 10 May 2008 (UTC)
Fellow editors, you are welcome to make your contribution to this article. See the sections above for examples on adding citations. Be Bold.
-- Ancheta Wis ( talk) 10:43, 11 May 2008 (UTC)
The article currently states "(Electronic Numerical Integrator and Computer) .... it was 1,000 times faster than its contemporaries." As it is stated that ENIAC was Turing complete, if it had been programmed to break " Tunny" would it have been 1,000 times faster than Colossus? If not then this sentence needs changing. -- PBS ( talk) 10:08, 13 May 2008 (UTC)
Ancheta Wis, you're doing amazing work here - but don't you think the article should have fewer pictures? — Wackymacs ( talk ~ edits) 06:23, 15 May 2008 (UTC)
It is no good adding lots of citations, when half of them are not formatted properly with the citation templates provided. Please see Wikipedia:Citation templates. All web citations should use the Cite web template, and must have an access date. Also, a lot of the current citations look questionable, and some are useless. (For example, the two citations in the lead explaining hardware and software) - Why? Wikipedia has articles on both of these. — Wackymacs ( talk ~ edits) 10:45, 15 May 2008 (UTC)
Replaced the {{cite}} with {{Citation}}. Retained {{Ref patent}} on the recommendation of the Citations people. The notes now use {{harvnb}} Harvard-style references. -- Ancheta Wis ( talk) 06:46, 19 June 2008 (UTC)
For the record I am aware that Lord Bowden's first name is not Lord. But I am forced into this by the strictures of the Citation system while using Harvard references. The Ref patent template also does not appear to play well with the References section. That is the reason that I have the 3 patent citations in a hybrid, one style for the Notes, and the Last, First names preceding the Ref patent template in the References section. -- Ancheta Wis ( talk) 12:12, 19 June 2008 (UTC)
SandyGeorgia, the harvnb templates still need last|year, but I notice that the 'last=' was missing from the Intel and IEEE. I restored the Citation|last=IEEE and then noticed that the Citation|last=Intel was changed as well. How is the Harvard-style referencing method going to work, in this case? -- Ancheta Wis ( talk) 01:38, 2 July 2008 (UTC)
We need a name akin to the concept of first light of an observatory telescope; I propose the denotation first good run, and wish to apply it to Baby's first good run, June 21, 1948, 60 years ago. -- Ancheta Wis ( talk) 23:00, 21 June 2008 (UTC)
It seems like the table titled
"Defining characteristics of some early digital computers of the 1940s (See History of computing hardware)"
has a mistake. In the row about the Harvard Mark I – IBM ASCC, in the column "Turing complete", the link (1998) is clearly copied and pasted from the row about the Z3. I don't know whether the Harvard Mark I was Turing complete, but the reference is wrong for sure. I am not familiar with the markup that references this table (obviously shared across multiple pages) and could not remove the information. Can someone else do it?
Stilgar ( talk) 07:40, 25 June 2008 (UTC)
I don't agree with extending the Rojas conclusion to another machine. Isn't it more complicated? It sounds like a piece of original research that hasn't been published. Zebbie ( talk) 23:30, 22 August 2008 (UTC)
As a separate issue, I think Rojas' conclusion was wrong. Turing's most important contribution to computer science was to pose the "halting problem": simply put, you can't tell how long a program will take to finish. Therefore Turing defined his Turing machine with the conditional branch. Rojas' conclusion, again paraphrased, was: you can write a program without conditionals, but you have to make the tape as long as the program's run time.
1. Rojas is redefining a Turing machine to have no conditionals. I'd argue that is no longer a Turing machine.
2. Rojas' new machine has to know in advance how long the program will run. Turing would argue you cannot know this.
Zebbie ( talk) 23:30, 22 August 2008 (UTC)
As it stands, this still doesn't meet the 2008 FA criteria standards. I just ran the link checker tool on this article, and found some broken links (many are used as references):
http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=History_of_computing_hardware
The broken links will need to be replaced with other reliable sources, preferably books. — Wackymacs ( talk ~ edits) 07:53, 6 July 2008 (UTC)
At the moment, it seems page numbers are being given in the 'References' section instead of in the footnotes where they should be. — Wackymacs ( talk ~ edits) 08:18, 6 July 2008 (UTC)
Why is there a special section for 'American developments' and not one for 'British developments', or any other country? Are Americans special?
--Bias Detector-- 21st July 2008 —Preceding unsigned comment added by 86.9.138.200 ( talk) 16:45, 21 July 2008 (UTC)
Claude Shannon founded digital design. Open any electrical engineering book and you will see what Shannon did. This is a link to his thesis. -- Ancheta Wis ( talk) 10:07, 27 January 2009 (UTC)
This isn't the same as "implementing" a circuit. However ground-breaking his thesis, it provided a proof, not an implementation. Follow the wikilinks. All we have is words to communicate here; we do need to be able to understand what they mean to make progress on this issue. -- TraceyR ( talk) 10:42, 27 January 2009 (UTC)
"In his 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, Claude Elwood Shannon 'proved' that Boolean algebra and binary arithmetic could be used to simplify the arrangement of the electromechanical relays then used in telephone routing switches, then turned the concept upside down and also proved that it should be possible to use arrangements of relays to solve Boolean algebra problems."
Thank you for taking this to the talk page, which I propose be the venue for improving the article: "In 1937, Shannon produced his master's thesis[61] at MIT that implemented Boolean algebra using electronic relays and switches for the first time in history." In this sentence, implemented refers to George Boole's work, which Shannon reduced to practice. Proof was established in the nineteenth century, before Shannon, by Boole. In other words, Shannon implemented Boole, with Boolean logic gates. In turn, successive generations of engineers re-implemented these logic gates in successive, improved technologies, which computing hardware has taken to successively higher levels of abstraction.
As a metaphor, take Jimbo Wales' statement of principle for Wikipedia. All successive editors implement Wales' vision. In the same way, Shannon implemented Boole.
If you have improvements to the article, I propose we work through them on the talk page. -- Ancheta Wis ( talk) 11:18, 27 January 2009 (UTC)
I think I see the disconnect: some things can be viewed as purely academic and theoretical; Boole's system of logic might be viewed in this light. But when Shannon expressed Boole's concepts in hardware (which had been done in an ad-hoc way earlier) he showed AT&T that there was another way to build the PSTN, which at one time was completely composed of humans doing the switching of telephone conversations. Today, of course, this is all automated. So Shannon's accomplishment was essentially to provide an alternative vocabulary for the telephone company's existing practice and mindset, which in 1937 was analog circuitry. -- Ancheta Wis ( talk) 11:34, 27 January 2009 (UTC)
Here is a proposed sentence and reference:
-- Ancheta Wis ( talk) 12:54, 27 January 2009 (UTC)
I need to put in a plug for Emil Post's work. His formulation of the Turing machine is simpler and Post was actually earlier than Turing, but he failed to publish early enough. That is actually the reason I left in the 'and others'. But, c'est la vie. Maybe the Post-Turing machine will gain currency in future Category:Computational models. -- Ancheta Wis ( talk) 18:36, 28 January 2009 (UTC)
Since Stibitz is mentioned in the same paragraph as Shannon, there is a suggestion that Stibitz's work was based on Shannon's thesis. If this is the case, perhaps this should be stated explicitly (and mentioned in the Stibitz article too). If not, maybe a new paragraph is needed. -- TraceyR ( talk) 14:02, 29 January 2009 (UTC)
Many of the section hatnotes read like non sequiturs. Others "belong" in other sections. I don't have the time to sift through them all myself though. – OrangeDog ( talk • edits) 18:37, 29 January 2009 (UTC)
The lead summary states: "Eventually the voltages or currents were standardized, and then digitized". Could someone explain how voltages or currents were digitized. In what way(s) was this breakthrough made? I thought that my PC used 'analogue' power. Many thanks. -- TraceyR ( talk) 07:42, 30 April 2009 (UTC)
I appreciate ArnoldReinhold's edits; they show that the flat memory model is a definite advance on the delay line memory model that early programmers had to deal with; however the current style of programming did not arise from nothing. If the deleted edits were unclear, then we might have to give an example of the contortions that programmers had to go through when solving a problem in the early days. Hardware-independent programming did not exist in the early days. Even today, operating system-independent programming is not a given: the API is typically OS dependent. In the absence of contributions to the article in this vein, consider how one would have to program if the items in memory were to decay before they were reused -- one would be forced to refresh critical data before the delay time had elapsed. -- Ancheta Wis ( talk) 19:01, 24 May 2009 (UTC)
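To make the contortion concrete, here is a toy C sketch (my own illustration, not any real machine's interface) of the timing-dependent access a delay-line programmer had to plan around: a word can only be read when it circulates past the read head, so the cost of a fetch depends on when you ask for it.

#include <stdio.h>

#define LINE_LEN 32                /* words circulating in one delay line */

int line[LINE_LEN];                /* contents of the line */
int head = 0;                      /* index of the word now passing the read head */

/* Busy-wait until word `addr` comes around, then read it. On real
   delay-line and drum machines, programmers laid out code and data
   so the next word needed arrived just as it was wanted
   ("optimum programming"). */
int read_word(int addr)
{
    while (head != addr)
        head = (head + 1) % LINE_LEN;  /* one word-time passes */
    return line[addr];
}

int main(void)
{
    line[7] = 42;
    head = 8;                      /* just missed word 7: wait nearly a full circulation */
    printf("%d\n", read_word(7));
    return 0;
}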
I reached this article looking for a reference to the MOSAIC computer (Ministry of Supply Automatic Integrator and Calculator) and wondered if the following Introduction might be short enough and apposite:
Computing hardware subsumes (1) machines that needed separate manual action to perform each arithmetic operation, (2) punched card machines, and (3) stored program computers. The history of (3) relates largely to (a) the organization of the units to perform input and output, to store data, and to combine it into a complete machine (computer architecture), (b) the electronic components and mechanical devices that comprise these units, and (c) higher levels of organization, up to 21st century supercomputers. Increases in speed and memory capacity, and decreases in cost and size in relation to compute power, are major features of the history.
Five lines instead of 36. The present Introduction could become the first section, headed say Overview, and the pre-stored-program coverage extended to mention the abacus, the National Accounting Machines that "cross footed" under control of a "form bar" that facilitated table construction using difference methods, and the machines of the mid 20th century typified by the Brunswiga (not sure of spelling) and the Marchant. The overlap of punched card and stored program computers, by dint of control panels and then card-programmed computers, could be mentioned. Michael P. Barnett ( talk) 01:47, 24 December 2010 (UTC)
There is currently a slow edit war at IEEE 754-1985. I put down the Z3 as the first working computer, as it is in this article, and it was reverted. I pointed out this article as a better venue to argue matters about history, but they can't be bothered to do that so I'm doing it instead. Discussion at Talk:IEEE 754-1985#Z3 first working computer. 17:50, 8 February 2011 (UTC)
Please put in a note that the idea of punched-card-driven looms originated with the French mechanic Jean-Baptiste Falcon in 1728, although Falcon never succeeded in building one himself. —Preceding unsigned comment added by 91.97.182.235 ( talk) 15:12, 13 February 2011 (UTC)
I propose to rename the analog section in order to preserve the content that was removed.
Alternatively, perhaps a new section with this name might be inserted to contain that content. -- Ancheta Wis ( talk) 11:15, 5 May 2011 (UTC)
We currently label the Mk I as NOT Turing complete - presumably because of a lack of jump instructions. There was some discussion of this on this talk page back in 2008.
It must be noted that:
// Initialization:
typedef unsigned char byte;

int lut[256] = {
    1, 1, 1, 1, 1, 1, 1, ....   // 128 ones.
    0, 0, 0, 0, 0, 0, 0, ....   // 128 zeroes.
};
byte mem[...whatever...] = { ...whatever... };  // The initial state of memory in the SUBLEQ machine.
int PC = 0;                                     // The SUBLEQ machine's program counter.

// Runtime:
while (1)   // (Implemented via a paper tape loop.)
{
    // Read instruction operands from the program counter location.
    int a = mem[PC++];
    int b = mem[PC++];
    int c = mem[PC++];

    // Perform subtraction:
    mem[b] -= mem[a];

    // Use lookup table to extract sign of mem[b] so that:
    //   c is multiplied by 1 and added to the program counter if mem[b] <= 0
    //   c is multiplied by 0 and added to the program counter if mem[b] > 0.
    PC += lut[mem[b] + 128] * c;
}
Ergo, the Harvard Mark I was indeed Turing Complete. This is rather important IMHO. SteveBaker ( talk) 15:26, 3 May 2012 (UTC)
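As a sanity check on the trick above, here is a self-contained C sketch of one branch-free SUBLEQ step (my own toy, not the Mark I's order code). One small correction to the sketch: for a zero result to count as "branch taken" (mem[b] <= 0), the table needs ones at indices 0 through 128, i.e. 129 ones rather than an even 128/128 split.

#include <stdio.h>

int main(void)
{
    /* lut[v + 128] is 1 when the signed 8-bit value v <= 0, else 0. */
    int lut[256];
    for (int i = 0; i < 256; i++)
        lut[i] = (i <= 128) ? 1 : 0;    /* indices 0..128 <=> values -128..0 */

    signed char mem[256] = {0};
    mem[0] = 4; mem[1] = 5; mem[2] = 3; /* one instruction: a=4, b=5, offset c=3 */
    mem[4] = 7; mem[5] = 7;             /* 7 - 7 = 0, so the "branch" is taken  */

    int PC = 0;
    int a = mem[PC++], b = mem[PC++], c = mem[PC++];
    mem[b] -= mem[a];                   /* the only arithmetic operation */
    PC += lut[mem[b] + 128] * c;        /* no conditional anywhere */

    printf("mem[5] = %d, PC = %d\n", mem[5], PC);  /* mem[5] = 0, PC = 6 */
    return 0;
}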
I think the Turing completeness column is useful to our readers as a rough guide to how the technology evolved. The controversial entries should have a footnote that says later researchers have attempted to show the machines in question were Turing complete but those capabilities were not envisioned when the machines were developed and used. -- agr ( talk) 10:39, 9 May 2012 (UTC)
To anon 86.177.118.203: I patched in a phrase in the new footnote 1 which I hope matches your intent. Please feel free to alter my patch to your contribution. -- Ancheta Wis (talk | contribs) 03:25, 25 January 2013 (UTC)
In the same light, I propose to use 'accelerated' rather than 'underpinned' in your contribution because the article makes it clear that there were funding sources other than military contract, in both US and Germany. I do not deny that IC-based computers in military systems (1958-1960s) were materially funded by US (& likely USSR) contracts. -- Ancheta Wis (talk | contribs) 04:06, 25 January 2013 (UTC)
What category (or categories) is appropriate for machines that use integrated circuits, but don't put the entire processor on a single chip? In other words, what category covers what History of computing hardware (1960s–present) calls "Third generation" computers?
In other words, what category goes in the blank of the following?:
-- DavidCary ( talk) 14:55, 23 August 2013 (UTC)
The category: minicomputers covers many of them, but it doesn't cover other multi-chip processors such as the Apollo Guidance Computer, the Cray-1, the first hardware prototype of the Motorola 6800, etc.
Should we start a new category, perhaps category: integrated circuit processors? -- DavidCary ( talk) 14:55, 23 August 2013 (UTC)
Archimedes' method of performing calculations was the use of mechanical balance (think see-saw) of countable items versus the object being measured. This method was used for estimating the number of grains of sand in the universe, etc. (see The Sand Reckoner).
Thus Archimedes' method of calculation was very concrete, as befits his status as engineer, inventor, and physicist. For this reason I propose to add his method to history of computing rather than to this article. I am pretty sure there is already a main article about this. -- Ancheta Wis (talk | contribs) 02:19, 30 September 2013 (UTC)
The lede previously claimed that Zuse was commonly known as *the* "inventor of the computer" and the only citations given are to discussions in blogs. Published histories of computing have variously proposed that the "inventor of the computer" is Babbage (who designed the first programmable computer), Aiken (for the Harvard Mark I, a highly influential electromechanical computer designed and built around the time of Zuse's Z3), Atanasoff (for the first electronic digital computer), Eckert and von Neumann (for the stored program concept), and several other milestones. Zuse's Z3 could certainly support the claim of his being the creator of the first working electromechanical programmable computer, but this does not imply that he is commonly known as the inventor of the computer. Wikipedia articles should not be used to push non-mainstream views.
I have moved this claim down to the section on Zuse's computer for now, but I think that either a separate section discussing the complex issue of who was *the* inventor of the computer should be added, or this claim should be removed (in any case, the claim needs reputable citations, not just blogs). 198.255.141.250 ( talk) 16:33, 22 December 2013 (UTC)
Hi, the article is rather chaotic and unorganized. It's very difficult for a casual reader to make sense of the important developments and stages. There is also a lot of important information that is missing. Noodleki ( talk) 19:19, 7 January 2014 (UTC)
Hi, I understood from the above that you would revert. I think your suggestions equally apply to the version as it stands, although I think software wouldn't necessarily come under this article's purview. Thanks. Noodleki ( talk) 21:20, 8 January 2014 (UTC)
Noodleki, I am waiting for the other editors to respond. Your changes for Babbage fit nicely in the nineteenth century, and I suggest that you add them to that section. However, I do not agree with your characterization of 'chaotic' and suggest to you that there is a logical flow in the article already. It goes a bit far to place as much emphasis on Babbage as your version does, as his design required repeatable manufacturing tolerances beyond the capacities of the nineteenth century. It took another century. __ Ancheta Wis (talk | contribs) 12:00, 12 January 2014 (UTC)
Is there a Wikipedia article dedicated to vacuum tube computers?
I think there's enough material in this article about vacuum tube computers to create an article ( WP:SPINOUT) focused on that category of computers.
When there exists both a Wikipedia category about some topic and a Wikipedia "List of" article about that same topic, there is usually a WP:EPONYMOUS article dedicated to exactly that topic.
For example, there is both a list of transistorized computers article and a category: transistorized computers, so I am glad to see there is also a transistor computer article.
I see there is both a list of vacuum tube computers article and a category: vacuum tube computers, so I am surprised that there is apparently no article dedicated to vacuum tube computers.
When I click on vacuum tube computers, hoping to find an article dedicated to them, today I find it is a redirect to vacuum tube, which has much less information (mostly in vacuum tube#Use in electronic computers) about such machines than this "History of computing hardware" article.
Is there an article that more specifically discusses vacuum tube computers, to which vacuum tube computer and vacuum tube computers should redirect? -- DavidCary ( talk) 18:37, 28 May 2015 (UTC)
Hi,
Does the Wilbur machine (an analog computer, on display in the science museum in Tokyo) fit in the history of (analog) computers, or did it have any significance? -- Butch ( talk) 13:40, 22 November 2015 (UTC)
Probably, a separate article would fare better, along the lines of the Atanasoff–Berry computer, which attacked the same application (systems of linear equations). Or, a contribution to System of linear equations, including both Atanasoff–Berry computer and Wilbur machine would add interest to a math article. -- Ancheta Wis (talk | contribs) 13:14, 23 November 2015 (UTC)
@Butch, I see that Google's quantum computer from D-wave is also a hard-coded device. That is, it embodies some quantum-mechanical experiment. In Google's case, it was quantum annealing. So we are back to the limitations of the Wilbur machine; like the Wilbur machine, the current Google machine is not general purpose, even though it ran 10^8 times faster [2] than a conventional computer working on the same problem, [3] simulated annealing. -- Ancheta Wis (talk | contribs) 15:53, 9 December 2015 (UTC)
Hi, any particular reason Lull's Ars Magna is not included (or at least referenced) here? T 88.89.219.147 ( talk) 23:50, 17 May 2016 (UTC)
There is an undefined reference to "Robinson" in the portion related to Colossus. — Preceding unsigned comment added by 146.18.173.105 ( talk) 19:06, 8 June 2016 (UTC)
Hello fellow Wikipedians,
I have just modified 2 external links on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).
Cheers.— InternetArchiveBot ( Report bug) 22:33, 11 September 2016 (UTC)
There is a picture of this artifact but no mention of it in the text. Such an arrangement is not helpful. Kdammers ( talk) 17:53, 12 September 2016 (UTC)
Done -- Ancheta Wis (talk | contribs) 21:36, 18 September 2016 (UTC)
I assume that MESM, "the first universally programmable computer in continental Europe" (built in present-day Ukraine), should be added to History of computing hardware, before EDVAC. That, or EDVAC removed from that section, since it's unclear how it contributes anything there. Or maybe both.
Ilyak ( talk) 05:24, 13 March 2017 (UTC)
Hello fellow Wikipedians,
I have just modified 3 external links on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes: added a {{dead link}} tag to http://www.ourcomputerheritage.org/wp/
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).
Cheers.— InternetArchiveBot ( Report bug) 10:22, 3 April 2017 (UTC)
Hello fellow Wikipedians,
I have just modified 4 external links on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).
Cheers.— InternetArchiveBot ( Report bug) 07:39, 20 May 2017 (UTC)
I have been curious how IBM had such good knowledge of Zuse. Perhaps a history which details Dehomag can clarify this. -- Ancheta Wis (talk | contribs) 18:12, 23 August 2017 (UTC)
This quotation is a paraphrase of Machiavelli, The Prince, ch. XVIII:
Since we are seeing a revert war, might we consider:
I am being vague because these statements could be misused against the existing order. I for one wish to preserve the stability of the existing order. -- Ancheta Wis (talk | contribs) 07:58, 8 September 2017 (UTC)
See: The Social Construction of Reality. In other words, as social beings, we belong to social systems which can be at war with each other. Can't we rise above the issues that divide us, and join in building up the social systems that unite us? -- Ancheta Wis (talk | contribs) 08:15, 8 September 2017 (UTC)
I paraphrase the preface to The Answers of Ernst von Salomon to the 131 Questions in the Allied Military Government "Fragebogen" (this book has never been out of print in Germany since its first publication). Ernst von Salomon wrote: "As I wrote my answers, which would determine whether I lived or died, whether I would remain imprisoned or go free, I got the sense of a vast alien intelligence that had not the slightest interest in my own well-being ..." 08:31, 8 September 2017 (UTC)
I don't think that "magnetic storage" should be a subsection under "stored program". Magnetic storage isn't necessary for a stored program computer. Bubba73 You talkin' to me? 02:31, 24 September 2017 (UTC)
Turing is known for articulating the idea of a universal computer, but the first description of a universal computer was the lambda calculus, invented by Alonzo Church (who then became Turing's thesis advisor). Doesn't he belong in the same section with Turing? Briankharvey ( talk) 20:50, 16 October 2017 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).
Cheers.— InternetArchiveBot ( Report bug) 02:12, 5 November 2017 (UTC)
Nothing on amateur computing?
john f 2.26.119.204 ( talk) 09:24, 5 December 2017 (UTC)
A NeXT Computer and its object-oriented development tools and libraries were used by Tim Berners-Lee and Robert Cailliau at CERN to develop the world's first web server software, CERN httpd, and also used to write the first web browser, WorldWideWeb. These facts, along with the close association with Steve Jobs, secure the 68030 NeXT a place in history as one of the most significant computers of all time.[citation needed]
This strikes me as opinion, and not necessarily fitting for a topic on computing hardware. Internet history, definitely; however, it is still phrased as opinion. I happen to agree that the NeXT Computer (I believe the NeXTcube) that Tim Berners-Lee used to develop the WWW is historic, and I would add John Carmack's development of Doom on a NeXTstation; but I don't feel this paragraph fits in this article.
Perhaps in the history of the WWW, an article on the history of NeXT, video games, etc., but not in this article.
Communibus locis ( talk) 21:42, 18 January 2018 (UTC)
The article has a book citation to Reconstruction of the Atanasoff-Berry Computer by John Gustafson. I can't find such a book but there is this paper. Is that it? Bubba73 You talkin' to me? 04:22, 8 April 2018 (UTC)
Computers and Automation Magazine
Pictorial Report on the Computer Field:
A PICTORIAL INTRODUCTION TO COMPUTERS - [1], pp. 49-56
A PICTORIAL MANUAL ON COMPUTERS - [2], pp. 10-13, 15-17, 19-24, 28, 30, 32
A PICTORIAL MANUAL ON COMPUTERS, Part 2 - [3], pp. 12-17, 20-22, 24, 26-27
1958 Pictorial Report on the Computer Field - [4], pp. 6, 8-10, 12-14, 16-18, 20-21
1959 PICTORIAL REPORT ON THE COMPUTER FIELD - [5], pp. 8-19
1960 Pictorial Report on the Computer Field - [6], pp. 13-32
1961 PICTORIAL REPORT ON THE COMPUTER FIELD - [7], pp. digital 24-36, analog 41-45, I/O devices 60-69; 72-78, 83-88 (Bernoulli disk rotating storage device - p. 62, IBM 1301 - 69, Semiconductor Network Computer - 85)
1962 PICTORIAL REPORT ON THE COMPUTER FIELD - [8], pp. 26-42, I/O / components/others: 67-73 / 74-79/80-82
1963 PICTORIAL REPORT ON THE COMPUTER FIELD - [9], pp. 26-44
1964 PICTORIAL REPORT ON THE COMPUTER FIELD - [10], pp. 28-36, 37-51 (UNIVAC FLUID COMPUTER - air-operated, SDS 92 IC, Fairchild Planar II)
1965 Pictorial Report on the Computer Field - [11], pp. 18-30, 31-38; IC memories, Floating Floor :)
1966 Pictorial Report on the Computer Field - [12], pp. 22- -- 89.25.210.104 ( talk) 18:21, 19 June 2018 (UTC)
Both men and women contributed, but the works of women have been exaggerated and the sources are inaccurate, based on the words of feminist authors rather than neutral ones. When a job is male-specific we never say "the field was primarily dominated by men", but if women were involved in even the slightest roles we bring up "women were involved"; for jobs primarily assigned to women we say women were more involved. This is an article on computer hardware, not a feminist propaganda article! The source Light, Jennifer S. (July 1999). "When Computers Were Women". Technology and Culture. 40: 455–483. comes from a feminist[citation needed] author rather than neutral research and is unreliable. Respected Person ( talk) 10:16, 14 December 2018 (UTC)
The article has "...but the 'program' was hard wired right into the set up, usually in a patch panel". Is it correct to call a patch panel (plug board) hard-wired, since it is easily changed? See this dictionary. Bubba73 You talkin' to me? 03:11, 24 December 2018 (UTC)
I think that most of Post-1960 (integrated circuit based) section should be moved to History of computing hardware (1960s–present). -- MarMi wiki ( talk) 22:08, 30 December 2018 (UTC)
It is clear that Americans were intimately involved in the use of Colossus during WWII - see: Small, Albert W. (December 1944), The Special Fish Report, The American National Archive (NARA), College Campus, Washington. So I shall revert the recent edit. -- TedColes ( talk) 04:15, 11 January 2019 (UTC)
I would like to discuss this edit: [15]
@ Tom94022: <--ping
It seems to me that computers based on integrated circuits were an important intermediate step between computers based upon discrete transistors and computers based upon microprocessors. I think the section should be restored. -- Guy Macon ( talk) 01:14, 20 January 2019 (UTC)
Integrated circuit computers never left the article; some of the history of integrated circuits did. The article should be aligned with the traditional four generations of electronic computers: tube, transistor, IC (not microprocessor) and (monolithic) microprocessor. @ Guy Macon:'s edit sort of messed this up by lumping three into one, which I will restore. As for the history of the invention of the IC, does it really have much to do with this article? It is very well covered in the integrated circuit article. I will leave the history in until we hear from other editors. Tom94022 ( talk) 07:04, 20 January 2019 (UTC)
The section History_of_computing_hardware#Integrated_circuits really doesn't say anything about how/when ICs got into computers. The Apollo Guidance Computer should be mentioned as one of the first; it used only one type of small-scale IC (double 3-input NOR). Probably some of you know other early computers based on ICs (is that in what was removed?). Dicklyon ( talk) 20:11, 20 January 2019 (UTC)
Per WP:TALKDONTREVERT and WP:BRD I reverted the following rather major removal of sourced material, only to face an editor who chooses to re-revert rather than discuss. [16] [17] [18] This is the same editor who tried to delete a large chunk of material that we discussed in the section above. [19] [20]
Before I go any further, I would like to bring this up for discussion. Should that material be deleted or retained? -- Guy Macon ( talk) 22:22, 20 January 2019 (UTC)
The idea of the integrated circuit was conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952:[141]
With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires.[142] The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers."
The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[143] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[144] In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated."[145] The first customer for the invention was the US Air Force.[146] Noyce also came up with his own idea of an integrated circuit, half a year later than Kilby.[147] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.
It's not clear why we have two articles that overlap, History_of_computing_hardware and History_of_computing_hardware_(1960s–present), but given that's the situation it seems appropriate to me that most of the information about the third and fourth generations of computer hardware belongs in the second article, while this article has summary sections pointing to the sections in the second article as the main article. Along that line, I intend to move much of the microprocessor material to the main article. Comments?
I do think the "generations" belong in both articles and have so edited them into both articles. For this reason I reverted Ancheta Wis' change of the section title. I have no objection to a new section on Current Computers, but shouldn't it be in History_of_computing_hardware_(1960s–present)? It should include Microprocessor Computers but likely includes other computer devices as well. We would need to find a reference to add it to either article, but I am sure there are some. Tom94022 ( talk) 17:45, 21 January 2019 (UTC)
Reference 118 is a link to a patent application. It is used at the end of a sentence saying that magnetic core was dominant until the mid-1970s. This is not in the source. Bubba73 You talkin' to me? 03:00, 22 January 2019 (UTC)
The scope of computing has moved beyond microprocessors; multiple governments beyond the US government are seeking quantum computing as a matter of national security, including Canada, Australia, the Netherlands, the United Kingdom, the European Union, Singapore, Russia, North Korea and Japan. [1] What this means for the article is a change to the post 1960 section name, beyond the microprocessor. [1] -- Ancheta Wis (talk | contribs) 07:02, 23 February 2019 (UTC)
Per Wikipedia:Disambiguation: There are three important aspects to disambiguation: "Making the links for ambiguous terms point to the correct article title. For example, an editor of an astronomy article may have created a link to Mercury, and this should be corrected to point to Mercury (planet)."
The term Modern is vague and meaningless. Modern history covers the period from the 16th to the 21st century. Modernity is also used for the "socio-cultural norms" of the world prior to World War II, and is associated with Modernism as an art movement (late 19th century to early 20th century).
Meanwhile contemporary history refers to the present time period. Dimadick ( talk) 19:24, 5 March 2019 (UTC)
I think not! A recent edit reverted without relevant discussion the removal of material that is not particularly relevant to this article and well covered in the linked article and elsewhere. I'm going to revert it again and hopefully there will be some discussion here and not edit warring. Tom94022 ( talk) 01:24, 13 September 2019 (UTC)
The above subsection is not particularly relevant to the question raised. It doesn't matter whether irrelevant material is copied from within Wikipedia or obtained from reliable sources; it is still not relevant to this article. Similarly, it is not particularly relevant that the disputed material was in the article at some distant time. Nor does the age of the article or prior authors particularly matter. The history of MOSFETs just does not deserve a place in this article. I suppose if a reliable source can be found perhaps a single sentence might be added; something along the lines of, "Modern microprocessors are built upon MOSFET technology." Tom94022 ( talk) 20:10, 14 September 2019 (UTC)
See Talk:Analog computer#Earliest? -- Guy Macon ( talk) 16:23, 6 October 2019 (UTC)
Entire Class of Electro/Mechanical Computers MISSING
There were thousands of types of mechanical & electro-mechanical load/store computers used for hundreds of years.
Most of these used a sled, cart, or feeder robot that would take a sequence control (such as an index number, turns, box or document number, page or slot, etc) and go fetch or access something.
These, by storing values, performed almost any sequence of computations from any library of punched tape, reels, feed tape, mechanical route cards, etc...
There were huge tabulation and computational facilities supporting these little bots, which followed routes on rails or in channels leading anywhere. — Preceding unsigned comment added by 172.58.187.157 ( talk) 19:07, 18 December 2019 (UTC)
The 1962 book Computers: the machines we think with, by D. S. Halacy, Jr, pg. 49, says that the ENIAC was followed by BINAC, MANIAC, JOHNNIAC, UNIVAC, RECOMP, STRETCH, and LARC. I couldn't find anything about RECOMP, but there is Autonetics Recomp II. Was there a computer named RECOMP, before the RECOMP II? Bubba73 You talkin' to me? 21:02, 19 August 2020 (UTC)
Did some more searching: April 1, 1957 "Recomp I, a new portable, high-speed, completely transistorized digital computer" https://www.americanradiohistory.com/hd2/IDX-Site-Technical/Engineering-General/Archive-Electronics-IDX/IDX/50s/57/Electronics-1957-04-OCR-Page-0138.pdf
Theory #1: The Recomp I was introduced in 1957, the Recomp II was introduced in 1958, and they were still selling Recomp Is in 1958.
Theory #2: They called the Recomp I "Recomp" until they decided to build a Recomp II, and at that point started calling the first Recomp a Recomp I.
-- Guy Macon ( talk) 06:12, 20 August 2020 (UTC)
Why do I keep seeing this same exact line "The castle clock, a hydropowered mechanical astronomical clock invented by Al-Jazari in 1206, was the first programmable analog computer.[10][11][12]" posted on all the major computing history Wikipedia articles? For starters, I believe the claim is a bit sensationalized, as the word "programmable" is being used very loosely here. The term programmable is usually meant in the context of being able to provide instructions to a machine so that the machine can adjust its operations accordingly. In this scenario for Al-Jazari's clock (also very loosely associated with a computer, but it performs a computation of sorts, namely, keeping time and such, so I will grant that I suppose), the clock had to be manually recalibrated. Does this qualify as programmable? In addition to this, the actual cited source isn't even correct. The episode in question of Ancient Discoveries of the History Channel is Series 3 Episode 9, and the episode itself (available on YouTube) doesn't even support the claim that Al-Jazari's clock was the first programmable analog computer. The episode actually makes an even stranger claim: that Al-Jazari's clock was a "super computer". I also looked through source 11 and didn't find the claim supported on the page given. What is going on here? 2601:82:200:8B20:0:0:0:3C04 ( talk) 01:24, 16 June 2022 (UTC)
@ TedColes: My edit [24] which you reverted corrected the suggestion in the article that Turing's design was independent of the work done by Mauchly and Eckert at the University of Pennsylvania, as reported by John von Neumann in his "First Draft of a Report on the EDVAC." My correction gave references, which you removed.
The version prior to my edits and the current version after being reverted says:
In 1945 Turing joined the National Physical Laboratory and began his work on developing an electronic stored-program digital computer. His 1945 report 'Proposed Electronic Calculator' was the first specification for such a device.
Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the " von Neumann architecture".
However Turing himself states on page 3 of his 'Proposed Electronic Calculator': [1]
"The present report gives a fairly complete account of the proposed calculator. It is recomended however that it be read in conjunction with J. von Neumann's 'Report on the EDVAC',"
Turing indeed wrote a more fully worked out design, but he does not claim to have written the "first specification for such a device." My edits, which correct the timing without denigrating Turing's contribution in any way, should be restored. -- agr ( talk) 19:57, 27 August 2023 (UTC)
The idea of a universal stored-programme computing machine was promulgated in the USA by von Neumann and in the UK by [Max] Newman, the two mathematicians who, along with Turing himself, were by and large responsible for placing Turing’s abstract universal machine into the hands of electronic engineers. [2]
The theoretical basis for the stored-program computer had been proposed by Alan Turing in his 1936 paper.
Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the "von Neumann architecture". Turing presented a more detailed paper to the National Physical Laboratory (NPL) Executive Committee in 1946, giving the first reasonably complete design of a stored-program computer, a device he called the Automatic Computing Engine (ACE). However, the better-known EDVAC design of John von Neumann, who knew of Turing's theoretical work, received more publicity, despite its incomplete nature and questionable lack of attribution of the sources of some of the ideas.[54] In 1945 Turing joined the National Physical Laboratory and began his work on developing an electronic stored-program digital computer. Turing thought that the speed an ...
Why does the article claim that 'The first commercial computer was the Ferranti Mark 1, built by Ferranti and delivered to the University of Manchester in February 1951,' when the Z4 was already rented to ETH and in operation there in 1950? This in my view clearly makes the Z4 the first commercial computer. The Z4 article even says (with references) that 'In 1950/1951, the Z4 was the only working digital computer in Central Europe, and the second digital computer in the world to be sold or loaned,[1]: 981 beating the Ferranti Mark 1 by five months and the UNIVAC I by ten months, but in turn being beaten by the BINAC (although that never worked at the customer's site[19]).' Calling a computer that never really worked the 'first commercial computer' seems rather misleading, so the first computer working for money is clearly the Zuse Z4. -- 85.169.148.50 ( talk) 22:36, 10 March 2024 (UTC)
This is the
talk page for discussing improvements to the
History of computing hardware article. This is not a forum for general discussion of the article's subject. |
Article policies
|
Find sources: Google ( books · news · scholar · free images · WP refs) · FENS · JSTOR · TWL |
Archives: 1, 2 |
History of computing hardware is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed. | |||||||||||||
This article appeared on Wikipedia's Main Page as Today's featured article on June 23, 2004. | |||||||||||||
| |||||||||||||
Current status: Former featured article |
This
level-4 vital article is rated B-class on Wikipedia's
content assessment scale. It is of interest to the following WikiProjects: | |||||||||||||||||||||||||||||||||||||
|
Daily pageviews of this article
A graph should have been displayed here but
graphs are temporarily disabled. Until they are enabled again, visit the interactive graph at
pageviews.wmcloud.org |
|
||
Old discussions have been moved to archives - use the navigation box to switch between them. I used the much nicer {{archives}} and {{archivesnav}} templates as found on the Personal computer talk pages to spruce up navigation a little. Remember when creating new archive pages that they must have a space in the title - talk:History of computing hardware/Archive 3 would be the next page, for example. -- Wtshymanski ( talk) 01:35, 25 September 2009 (UTC)
Call me a massive geek, but surely the C64 and Amiga deserve some mention in here. The advancement in personal computers isn't just down to the number of transistors - those computers added some really creative features (particularly with sound hardware) which we now take for granted. Their rise and fall (there's a certain book by a congruent title) is a huge chapter in the history of computing, surely..
In the interests of keeping this article a featured article, might we move the latest contribution on 2nd generation computers to the talk page and work on the English prose before re-instating it to the article page? -- Ancheta Wis ( talk) 14:17, 2 January 2008 (UTC)
computers are an amazing creation of sensation. —Preceding unsigned comment added by Sckater ( talk • contribs) 21:50, 8 March 2008 (UTC)
i was wondering whether the Antikythera mechanism is the first computer, cause their are lot articles that make that claim. Tomasz Prochownik ( talk) 21:05, 23 April 2008 (UTC)
Fellow editors, User:Ragesoss has noted that we are building back up the citations for this FA. When this article was first formed, the rise in standards for Featured Articles had not yet occurred. Since I have been volunteered for this, there will be an American bias to the footnotes I am contributing; please feel to contribute your own sources.
Please feel free to step up and add more citations in the form of the following markup: <ref>Your citation here</ref>. You can add this markup anywhere [1] in the article, and our wiki software will push it to the <references/> position in the article page, individually numbered and highlighted when you click on the ^. As an illustration, I placed this markup on the talk page so that new users can even practice on this talk page.
In my opinion, the best source is Bell and Newell (1971) [2], which is already listed in the article. I do not have time to visit the local university library, so my own contributions are from sources which I have on my own bookshelves; this may be appropriate since the seminal period 1945-1950 will probably be viewed as the heyday of first generation of electronic digital computers, which blossomed in the US, 1945-1950. [3], [4], [5], [6], [7], [8], [9] I recognize that there will need to be more citations from the Association for Computing Machinery and the IEEE Transactions, but that will have to come from those editors who are in the Wikiproject on computing. In particular, the Radiation Laboratory of MIT published a series of books The M.I.T. Radiation Laboratory Series [10] which are the foundation for computing hardware, in tandem with the Manhattan Project; what is common to these projects is that they involved groups of cooperating contributors. [11] Before the howls of outrage subside, please note that the exact forms of computer hardware had not yet been selected in this period, but since the technologists were already in place for other purposes, it was a small step to the forms of hardware we see today. [12], [13], [14], [15], [16], [17], [18] The forms of hardware could easily have gone in other directions, and our current computers would have been different from what could have been. [19] [20]
New users (especially those with a CS or EE background), please feel free to contribute your citations. Wikipedia:Five Pillars summarizes the guidelines for editors, and your cheatsheet for markup can be found here. Users can append comments to the foot of this talk page, signed with the signature markup: --~~~~
Casual readers might note that the references which will be added to this article can be purchased quite cheaply on the Internet (typically for a few dollars), which in sum would amount to a nice education in this subject. -- Ancheta Wis ( talk) 09:31, 3 May 2008 (UTC)
We are up to 59 footnotes. You can examine the edit history to see how the citations were embedded in the article, as well as study this section, for examples on how to do it. -- Ancheta Wis ( talk) 10:01, 6 May 2008 (UTC)
User:SandyGeorgia has noted that the citations are expected to have a certain format. Everyone is welcome to improve the citations. -- Ancheta Wis ( talk) 01:42, 7 May 2008 (UTC)
It appears that the footnote macro is space-sensitive. For example <ref name=IBM_SMS/ > works, but <ref name=IBM_SMS/> causes error messages unless a space is added after the trailing slash. To see this, look at this diff -- Ancheta Wis ( talk) 09:42, 9 May 2008 (UTC)
Sample citation format from User:Wackymacs: [21]
According to Hennessy and Patterson, von Neumann knew about the details of Zuse's floating-point proposal. This suggests that the sentence 'Zuse was largely ignored' should be stricken. Any objections? -- Ancheta Wis ( talk) 10:30, 5 May 2008 (UTC)
Zuse patented his floating-point design in 1939 but did not implement it before WWII ended. Von Neumann was aware of Zuse's patent and refused to include it in his Princeton machine, as documented in the seminal paper (Burks, Goldstine and von Neumann, 1946). -- Hennessy and Patterson p. 313, note "A decimal floating point unit was available for the IBM 650, and [binary floating-point hardware was available for] 704, 709, 7090, 7094, ... ". "As a result, everybody had floating point, but every implementation was different."
To this day, floating-point operations are less convenient, less reliable, and more difficult to implement (in both hardware and software). - Ancheta Wis ( talk) 08:07, 10 May 2008 (UTC)
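(A minimal C sketch of the inconvenience, assuming IEEE 754 doubles on the host; this is my own illustration, not an example from Hennessy and Patterson:)

    #include <stdio.h>

    /* Sketch (mine, assuming IEEE 754 doubles): 0.1 has no exact binary
       representation, so ten additions of "one tenth" drift away from 1. */
    int main(void)
    {
        double sum = 0.0;
        for (int i = 0; i < 10; i++)
            sum += 0.1;                     /* ten additions of 0.1 */
        printf("sum      = %.17g\n", sum);  /* typically 0.99999999999999989 */
        printf("sum == 1 : %s\n", sum == 1.0 ? "true" : "false");  /* false */
        return 0;
    }

Ten decimal additions that any desk calculator gets right thus need care in binary floating point - a small instance of the inconvenience noted above.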
This assertion is made about the Colossus in this article. It is also made about the ACE in that article. THERE CAN BE ONLY ONE! Twang ( talk) 18:59, 10 May 2008 (UTC)
Fellow editors, you are welcome to make your contribution to this article. See the sections above for examples on adding citations. Be Bold.
-- Ancheta Wis ( talk) 10:43, 11 May 2008 (UTC)
The article currently states "(Electronic Numerical Integrator and Computer) .... it was 1,000 times faster than its contemporaries." As it is stated that ENIAC was Turing complete, if it had been programmed to break " Tunny", would it have been 1,000 times faster than Colossus? If not, then this sentence needs changing. -- PBS ( talk) 10:08, 13 May 2008 (UTC)
Ancheta Wis, you're doing amazing work here - but don't you think the article should have fewer pictures? — Wackymacs ( talk ~ edits) 06:23, 15 May 2008 (UTC)
It is no good adding lots of citations when half of them are not formatted properly with the citation templates provided. Please see Wikipedia:Citation templates. All web citations should use the Cite web template and must have an access date. Also, a lot of the current citations look questionable, and some are useless (for example, the two citations in the lead explaining hardware and software - why? Wikipedia has articles on both of these). — Wackymacs ( talk ~ edits) 10:45, 15 May 2008 (UTC)
Replaced the {{cite}} with {{Citation}}. Retained {{Ref patent}} on the recommendation of the Citations people. The notes now use {{harvnb}} Harvard-style references. -- Ancheta Wis ( talk) 06:46, 19 June 2008 (UTC)
For the record I am aware that Lord Bowden's first name is not Lord. But I am forced into this by the strictures of the Citation system while using Harvard references. The Ref patent template also does not appear to play well with the References section. That is the reason that I have the 3 patent citations in a hybrid, one style for the Notes, and the Last, First names preceding the Ref patent template in the References section. -- Ancheta Wis ( talk) 12:12, 19 June 2008 (UTC)
SandyGeorgia, the harvnb templates still need last|year, but I notice that the 'last=' was missing from the Intel and IEEE. I restored the Citation|last=IEEE and then noticed that the Citation|last=Intel was changed as well. How is the Harvard-style referencing method going to work, in this case? -- Ancheta Wis ( talk) 01:38, 2 July 2008 (UTC)
We need a name akin to the concept of first light of an observatory telescope; I propose the denotation first good run, and wish to apply it to Baby's first good run, June 21, 1948, 60 years ago. -- Ancheta Wis ( talk) 23:00, 21 June 2008 (UTC)
It seems like the table titled "Defining characteristics of some early digital computers of the 1940s (See History of computing hardware)" has a mistake. In the row about the Harvard Mark I – IBM ASCC, in the column "Turing complete", the link (1998) is clearly copied and pasted from the row about the Z3. I don't know whether the Harvard Mark I was Turing complete, but the reference is wrong for sure. I am not familiar with the markup that references this table (obviously across multiple pages) and could not remove the information. Can someone else do it? Stilgar ( talk) 07:40, 25 June 2008 (UTC)
I don't agree with extending the Rojas conclusion to another machine. Isn't it more complicated? It sounds like a piece of original research that hasn't been published. Zebbie ( talk) 23:30, 22 August 2008 (UTC)
As a separate issue, I think Rojas' conclusion was wrong. Turing's most important contribution to computer science was to postulate the "halting problem": simply put, you can't tell how long a program will take to finish. Therefore Turing defined his Turing machine with the conditional branch. Rojas' conclusion, again paraphrased, was: you can write a program without conditionals, but you have to make the tape as long as the program's run time.
1. Rojas is redefining a Turing machine to have no conditionals; I'd argue that is no longer a Turing machine. 2. Rojas' new machine has to know in advance how long the program will run; Turing would argue you cannot know this.
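(To make the 'multiply by 0 or 1' trick concrete - my own sketch, not Rojas' notation - here is a branchless absolute value in C: both branches are always evaluated, and a 0/1 flag selects between them, which is exactly why the run length must be known in advance.)

    #include <stdio.h>

    /* Illustration (mine, not Rojas' notation) of removing a conditional:
       both "branches" are computed on every run; a 0/1 flag picks the
       result. Every path is always executed, so the program (or tape)
       must be as long as the worst case - the point raised above. */
    int main(void)
    {
        int x = -5;
        int flag = (x <= 0);          /* 1 if x <= 0, otherwise 0 */
        int negated = -x;             /* the would-be "then" branch */
        int kept = x;                 /* the would-be "else" branch */
        int abs_x = flag * negated + (1 - flag) * kept;
        printf("|%d| = %d\n", x, abs_x);   /* prints |-5| = 5 */
        return 0;
    }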
Zebbie ( talk) 23:30, 22 August 2008 (UTC)
As it stands, this still doesn't meet the 2008 FA criteria standards. I just ran the link checker tool on this article, and found some broken links (many are used as references):
http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=History_of_computing_hardware
The broken links will need to be replaced with other reliable sources, preferably books. — Wackymacs ( talk ~ edits) 07:53, 6 July 2008 (UTC)
At the moment, it seems page numbers are being given in the 'References' section instead of in the footnotes where they should be. — Wackymacs ( talk ~ edits) 08:18, 6 July 2008 (UTC)
Why is there a special section for 'American developments' and not one for 'British developments', or any other country? Are Americans special?
--Bias Detector-- 21st July 2008 —Preceding unsigned comment added by 86.9.138.200 ( talk) 16:45, 21 July 2008 (UTC)
Claude Shannon founded digital design. Open any electrical engineering book and you will see what Shannon did. This is a link to his thesis. -- Ancheta Wis ( talk) 10:07, 27 January 2009 (UTC)
This isn't the same as "implementing" a circuit. However ground-breaking his thesis, it provided a proof, not an implementation. Follow the wikilinks. All we have is words to communicate here; we do need to be able to understand what they mean to make progress on this issue. -- TraceyR ( talk) 10:42, 27 January 2009 (UTC)
"In his 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, Claude Elwood Shannon 'proved' that Boolean algebra and binary arithmetic could be used to simplify the arrangement of the electromechanical relays then used in telephone routing switches, then turned the concept upside down and also proved that it should be possible to use arrangements of relays to solve Boolean algebra problems."
Thank you for taking this to the talk page, which I propose be the venue for improving the article: "In 1937, Shannon produced his master's thesis[61] at MIT that implemented Boolean algebra using electronic relays and switches for the first time in history." In this sentence, implemented refers to George Boole's work, which Shannon reduced to practice. Proof was established in the nineteenth century, before Shannon, by Boole. In other words, Shannon implemented Boole, with Boolean logic gates. In turn, successive generations of engineers re-implemented these logic gates in successive, improved technologies, which computing hardware has taken to successively higher levels of abstraction.
As a metaphor, take Jimbo Wales' statement of principle for Wikipedia. All successive editors implement Wales' vision. In the same way, Shannon implemented Boole.
If you have improvements to the article, I propose we work through them on the talk page. -- Ancheta Wis ( talk) 11:18, 27 January 2009 (UTC)
I think I see the disconnect: some things can be viewed as purely academic and theoretical; Boole's system of logic might be viewed in this light. But when Shannon expressed Boole's concepts in hardware (which had been done in an ad-hoc way earlier), he showed AT&T that there was another way to build the PSTN, which at one time was composed entirely of humans doing the switching of telephone conversations. Today, of course, this is all automated. So Shannon's accomplishment was essentially to provide an alternative vocabulary for the existing practice and mindset of the telephone company, which in 1937 was analog circuitry. -- Ancheta Wis ( talk) 11:34, 27 January 2009 (UTC)
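(For newer readers, a minimal sketch - mine, not Shannon's notation - of the relay-to-Boolean correspondence: contacts in series give AND, contacts in parallel give OR, and a normally-closed contact gives NOT; any Boolean function, XOR for instance, then falls out by wiring.)

    #include <stdio.h>

    /* Sketch (mine, not Shannon's notation): the three basic relay
       contact arrangements realize the three Boolean primitives. */
    typedef int bit;                      /* 0 = released, 1 = energized */

    bit series(bit a, bit b)   { return a & b; }  /* series contacts   -> AND */
    bit parallel(bit a, bit b) { return a | b; }  /* parallel contacts -> OR  */
    bit nc_contact(bit a)      { return !a;    }  /* normally closed   -> NOT */

    int main(void)
    {
        /* XOR wired from the primitives: (a AND NOT b) OR (NOT a AND b). */
        for (bit a = 0; a <= 1; a++)
            for (bit b = 0; b <= 1; b++)
                printf("%d XOR %d = %d\n", a, b,
                       parallel(series(a, nc_contact(b)),
                                series(nc_contact(a), b)));
        return 0;
    }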
Here is a proposed sentence and reference:
-- Ancheta Wis ( talk) 12:54, 27 January 2009 (UTC)
I need to put in a plug for Emil Post's work. His formulation of the Turing machine is simpler and Post was actually earlier than Turing, but he failed to publish early enough. That is actually the reason I left in the 'and others'. But, c'est la vie. Maybe the Post-Turing machine will gain currency in future Category:Computational models. -- Ancheta Wis ( talk) 18:36, 28 January 2009 (UTC)
Since Stibitz is mentioned in the same paragraph as Shannon, there is a suggestion that Stibitz's work was based on Shannon's thesis. If this is the case, perhaps this should be stated explicitly (and mentioned in the Stibitz article too). If not, maybe a new paragraph is needed. -- TraceyR ( talk) 14:02, 29 January 2009 (UTC)
Many of the section hatnotes read as non sequiturs. Others "belong" in other sections. I don't have the time to sift through them all myself though. – OrangeDog ( talk • edits) 18:37, 29 January 2009 (UTC)
The lead summary states: "Eventually the voltages or currents were standardized, and then digitized". Could someone explain how voltages or currents were digitized. In what way(s) was this breakthrough made? I thought that my PC used 'analogue' power. Many thanks. -- TraceyR ( talk) 07:42, 30 April 2009 (UTC)
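(Not an authoritative answer, but a sketch of what 'digitized' means at the signal level, assuming 5 V logic purely as an example: each stage compares the incoming voltage against a threshold and regenerates a clean nominal level, so noise does not accumulate from stage to stage. The PC's power is indeed analog; it is the signal levels that are treated digitally.)

    #include <stdio.h>

    /* Sketch (mine, with assumed 5 V logic levels): a noisy analog
       voltage is snapped to one of two nominal levels at every stage,
       which is the essence of "digitizing" voltages. */
    double restore_level(double volts)
    {
        const double threshold = 2.5;          /* assumed midpoint   */
        return volts > threshold ? 5.0 : 0.0;  /* nominal "1" or "0" */
    }

    int main(void)
    {
        double noisy[] = { 0.3, 1.9, 2.6, 4.7 };   /* arbitrary inputs */
        for (int i = 0; i < 4; i++)
            printf("%.1f V -> %.1f V\n", noisy[i], restore_level(noisy[i]));
        return 0;
    }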
I appreciate ArnoldReinhold's edits; they show that the flat memory model is a definite advance on the delay line memory model that early programmers had to deal with; however the current style of programming did not arise from nothing. If the deleted edits were unclear, then we might have to give an example of the contortions that programmers had to go through when solving a problem in the early days. Hardware-independent programming did not exist in the early days. Even today, operating system-independent programming is not a given: the API is typically OS dependent. In the absence of contributions to the article in this vein, consider how one would have to program if the items in memory were to decay before they were reused -- one would be forced to refresh critical data before the delay time had elapsed. -- Ancheta Wis ( talk) 19:01, 24 May 2009 (UTC)
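(A sketch of the contortion, with made-up parameters and no particular machine in mind: a word in a recirculating delay line is readable only at the instant it passes the read head, so every access costs a wait that the programmer had to schedule around.)

    #include <stdio.h>

    /* Sketch (mine, assumed parameters): word i of a recirculating line
       is available only when it comes around, so a read spins until
       then - the scheduling burden described above. */
    #define LINE_LENGTH 32                 /* words circulating in the line */

    int line[LINE_LENGTH];
    int head = 0;                          /* word currently at the read head */

    int read_word(int addr)
    {
        int waited = 0;
        while (head != addr) {             /* wait for the word to come by */
            head = (head + 1) % LINE_LENGTH;
            waited++;
        }
        printf("read addr %d after %d word-times\n", addr, waited);
        return line[addr];
    }

    int main(void)
    {
        line[7] = 42;
        read_word(7);   /* waits 7 word-times */
        read_word(3);   /* waits 28 word-times: almost a full revolution */
        return 0;
    }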
I reached this article looking for a reference to the MOSAIC computer (Ministry of Supply Automatic Integrator and Computer) and wondered if the following Introduction might be short enough and apposite:
Computing hardware subsumes (1) machines that needed separate manual action to perform each arithmetic operation, (2) punched card machines, and (3) stored program computers. The history of (3) relates largely to (a) the organization of the units that perform input and output, store data and combine it into a complete machine (computer architecture), (b) the electronic components and mechanical devices that comprise these units, and (c) higher levels of organization, up to 21st century supercomputers. Increases in speed and memory capacity, and decreases in cost and size in relation to compute power, are major features of the history.
Five lines instead of 36. The present Introduction could become the first section, headed say Overview, and the pre-stored-program coverage extended to mention the abacus, the National Accounting Machines that "cross footed" under control of a "form bar" that facilitated table construction using difference methods, and machines of the mid 20th century typified by the Brunsviga and Marchant calculators. The overlap of punched card and stored program computers, by dint of control panels and then card-programmed computers, could be mentioned. Michael P. Barnett ( talk) 01:47, 24 December 2010 (UTC)
There is currently a slow edit war at IEEE 754-1985. I put down the Z3 as the first working computer, as in this article, and it was reverted. I pointed out this article as a better venue to argue matters about history, but they can't be bothered to do that, so I'm doing it instead. Discussion at Talk:IEEE 754-1985#Z3 first working computer. 17:50, 8 February 2011 (UTC)
Please put in a note that the idea of punched-card-driven looms originated with the French mechanic Jean-Baptiste Falcon in 1728, although Falcon never succeeded in building one himself. —Preceding unsigned comment added by 91.97.182.235 ( talk) 15:12, 13 February 2011 (UTC)
I propose to rename the analog section in order to preserve the content that was removed.
Alternatively, perhaps a new section with this name might be inserted to contain that content. -- Ancheta Wis ( talk) 11:15, 5 May 2011 (UTC)
We currently label the Mk I as NOT Turing complete - presumably because of a lack of jump instructions. There was some discussion of this on this talk page back in 2008.
It must be noted that:
    // Initialization:
    typedef unsigned char byte ;
    int lut [ 256 ] = {
        1, 1, 1, 1, 1, 1, 1, ....   // 129 ones (indices 0..128, i.e. signed values -128..0).
        0, 0, 0, 0, 0, 0, 0, ....   // 127 zeroes (indices 129..255, i.e. signed values 1..127).
    } ;
    byte mem [...whatever...] = { ...whatever... } ; // The initial state of memory in the SUBLEQ machine.
    int PC = 0 ;                                     // The SUBLEQ machine's program counter.

    // Runtime:
    while ( 1 )   // (Implemented via a paper tape loop.)
    {
        // Read instruction operands from the program counter location.
        int a = mem[PC++] ;
        int b = mem[PC++] ;
        int c = mem[PC++] ;
        // Perform subtraction:
        mem[b] -= mem[a] ;
        // Use the lookup table to extract the sign of mem[b], reading the byte as signed, so that:
        //   c is multiplied by 1 and added to the program counter if mem[b] <= 0;
        //   c is multiplied by 0 and added to the program counter if mem[b] > 0.
        PC += lut[(signed char)mem[b] + 128] * c ;
    }
Ergo, the Harvard Mark I was indeed Turing Complete. This is rather important IMHO. SteveBaker ( talk) 15:26, 3 May 2012 (UTC)
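(For concreteness, a one-instruction program for the emulator sketched above - my own example, reusing its byte typedef and its relative-jump convention:)

    /* "subleq 3,3,0": mem[3] -= mem[3] leaves 0; 0 <= 0, so PC += 0 and
       execution simply falls through to address 3. One instruction, and
       cell 3 (initially 42) has been cleared. */
    byte mem[] = { 3, 3, 0,    /* a = 3, b = 3, c = 0 */
                   42 };       /* cell 3: the datum to be zeroed */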
I think the Turing completeness column is useful to our readers as a rough guide to how the technology evolved. The controversial entries should have a footnote that says later researchers have attempted to show the machines in question were Turing complete but those capabilities were not envisioned when the machines were developed and used. -- agr ( talk) 10:39, 9 May 2012 (UTC)
To anon 86.177.118.203: I patched in a phrase in the new footnote 1 which I hope matches your intent. Please feel free to alter my patch to your contribution. -- Ancheta Wis (talk | contribs) 03:25, 25 January 2013 (UTC)
In the same light, I propose to use 'accelerated' rather than 'underpinned' in your contribution because the article makes it clear that there were funding sources other than military contract, in both US and Germany. I do not deny that IC-based computers in military systems (1958-1960s) were materially funded by US (& likely USSR) contracts. -- Ancheta Wis (talk | contribs) 04:06, 25 January 2013 (UTC)
What category (or categories) is appropriate for machines that use integrated circuits, but don't put the entire processor on a single chip? In other words, what category covers what History of computing hardware (1960s–present) calls "Third generation" computers?
In other words, what category goes in the blank of the following?:
-- DavidCary ( talk) 14:55, 23 August 2013 (UTC)
The category: minicomputers covers many of them, but it doesn't cover other multi-chip processors such as the Apollo Guidance Computer, the Cray-1, the first hardware prototype of the Motorola 6800, etc.
Should we start a new category, perhaps category: integrated circuit processors? -- DavidCary ( talk) 14:55, 23 August 2013 (UTC)
Archimedes' method of performing calculations was the use of mechanical balance (think see-saw) of countable items versus the object being measured. This method was used for estimating the number of grains of sand in the universe, etc. (see The Sand Reckoner).
Thus Archimedes' method of calculation was very concrete, as befits his status as engineer, inventor, and physicist. For this reason I propose to add his method to history of computing rather than to this article. I am pretty sure there is already a main article about this. -- Ancheta Wis (talk | contribs) 02:19, 30 September 2013 (UTC)
The lede previously claimed that Zuse was commonly known as *the* "inventor of the computer", and the only citations given are to discussions in blogs. Published histories of computing have variously proposed that the "inventor of the computer" is Babbage (who designed the first programmable computer), Aiken (for the Harvard Mark I, a highly influential electromechanical computer designed and built around the time of Zuse's Z3), Atanasoff (for the first electronic digital computer), Eckert and von Neumann (for the stored program concept), and several other milestones. Zuse's Z3 could certainly support the claim of his being the creator of the first working electromechanical programmable computer, but this does not imply that he is commonly known as the inventor of the computer. Wikipedia articles should not be used to push non-mainstream views.
For now I have moved this claim down to the section on Zuse's computer, but I think that either a separate section discussing the complex issue of who was *the* inventor of the computer should be added, or this claim should be removed (in any case, the claim needs reputable citations, not just blogs). 198.255.141.250 ( talk) 16:33, 22 December 2013 (UTC)
Hi, the article is rather chaotic and unorganized. It's very difficult for a casual reader to make sense of the important developments and stages. There is also a lot of important information that is missing. Noodleki ( talk) 19:19, 7 January 2014 (UTC)
Hi, I understood from the above that you would revert. I think your suggestions equally apply to the version as it stands, although I think software wouldn't necessarily come under this article's purview. Thanks. Noodleki ( talk) 21:20, 8 January 2014 (UTC)
Noodleki, I am waiting for the other editors to respond. Your changes for Babbage fit nicely in the nineteenth c. and I suggest that you add them to that section. However I do not agree with your characterization of 'chaotic' and suggest to you that there is a logical flow in the article already. It goes a bit far to place as much emphasis on Babbage as your version, as his design required repeatable manufacturing tolerances beyond the capacities of the nineteenth c. It took another century. __ Ancheta Wis (talk | contribs) 12:00, 12 January 2014 (UTC)
Is there a Wikipedia article dedicated to vacuum tube computers?
I think there's enough material in this article about vacuum tube computers to create an article ( WP:SPINOUT) focused on that category of computers.
When there exist both a Wikipedia category about some topic and a Wikipedia "List of" article about that same topic, there is usually a WP:EPONYMOUS article dedicated to exactly that topic.
For example, there is both a list of transistorized computers article and a category: transistorized computers, so I am glad to see there is also a transistor computer article.
I see there is both a list of vacuum tube computers article and a category: vacuum tube computers, so I am surprised that there is apparently no article dedicated to vacuum tube computers.
When I click on vacuum tube computers, hoping to find an article dedicated to them, today I find it is a redirect to vacuum tube, which has much less information (mostly in vacuum tube#Use in electronic computers) about such machines than this "History of computing hardware" article.
Is there an article that more specifically discusses vacuum tube computers, to which vacuum tube computer and vacuum tube computers should redirect? -- DavidCary ( talk) 18:37, 28 May 2015 (UTC)
Hi,
Does the Wilbur machine (an analog computer, on display in the science museum in Tokyo) fit in the history of (analog) computers, or did it have any significance? -- Butch ( talk) 13:40, 22 November 2015 (UTC)
Probably, a separate article would fare better, along the lines of the Atanasoff–Berry computer, which attacked the same application (systems of linear equations). Or, a contribution to System of linear equations, including both Atanasoff–Berry computer and Wilbur machine would add interest to a math article. -- Ancheta Wis (talk | contribs) 13:14, 23 November 2015 (UTC)
@Butch, I see that Google's quantum computer from D-Wave is also a hard-coded device. That is, it embodies some quantum-mechanical experiment; in Google's case, it was quantum annealing. So we are back to the limitations of the Wilbur machine; like the Wilbur machine, the current Google machine is not general purpose, even though it ran 10^8 times faster [2] than a conventional computer working on the same problem [3] with simulated annealing. -- Ancheta Wis (talk | contribs) 15:53, 9 December 2015 (UTC)
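(For readers unfamiliar with the term, here is a generic simulated-annealing loop in C - my own toy sketch, not what Google or D-Wave actually ran: uphill moves are occasionally accepted, with probability shrinking as the 'temperature' cools, so the search can escape local minima.)

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* Toy simulated-annealing sketch (mine): minimize a bumpy 1-D cost. */
    double cost(double x) { return x * x + 3.0 * sin(5.0 * x); }

    int main(void)
    {
        srand(1);
        double x = 4.0, T = 2.0;
        for (int step = 0; step < 10000; step++) {
            double trial = x + ((double)rand() / RAND_MAX - 0.5);  /* neighbor */
            double delta = cost(trial) - cost(x);
            if (delta < 0 || exp(-delta / T) > (double)rand() / RAND_MAX)
                x = trial;              /* downhill always; uphill by luck */
            T *= 0.999;                 /* cooling schedule */
        }
        printf("found x = %f, cost = %f\n", x, cost(x));
        return 0;
    }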
References
Hi, any particular reason Lull's Ars Magna is not included (or at least referenced) here? T 88.89.219.147 ( talk) 23:50, 17 May 2016 (UTC)
There is an undefined reference to "Robinson" in the portion related to Colossus. — Preceding unsigned comment added by 146.18.173.105 ( talk) 19:06, 8 June 2016 (UTC)
Hello fellow Wikipedians,
I have just modified 2 external links on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).
Cheers.— InternetArchiveBot ( Report bug) 22:33, 11 September 2016 (UTC)
There is a picture of this artifact but no mention of it in the text. Such an arrangement is not helpful. Kdammers ( talk) 17:53, 12 September 2016 (UTC)
Done -- Ancheta Wis (talk | contribs) 21:36, 18 September 2016 (UTC)
I assume that MESM, "the first universally programmable computer in continental Europe" (that is, in present-day Ukraine), should be added to History of computing hardware, before EDVAC. That, or EDVAC should be removed from that section, since it's unclear how it contributes anything there. Or maybe both.
Ilyak ( talk) 05:24, 13 March 2017 (UTC)
Hello fellow Wikipedians,
I have just modified 3 external links on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
Added a {{dead link}} tag to http://www.ourcomputerheritage.org/wp/
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
Cheers.— InternetArchiveBot ( Report bug) 10:22, 3 April 2017 (UTC)
Hello fellow Wikipedians,
I have just modified 4 external links on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
Cheers.— InternetArchiveBot ( Report bug) 07:39, 20 May 2017 (UTC)
I have been curious how IBM had such good knowledge of Zuse. Perhaps a history which details Dehomag can clarify this. -- Ancheta Wis (talk | contribs) 18:12, 23 August 2017 (UTC)
This quotation is a paraphrase of Machiavelli, The Prince, ch. XVIII:
Since we are seeing a revert war, might we consider:
I am being vague because these statements could be misused against the existing order. I for one wish to preserve the stability of the existing order. -- Ancheta Wis (talk | contribs) 07:58, 8 September 2017 (UTC)
See: The Social Construction of Reality. In other words, as social beings, we belong to social systems which can be at war with each other. Can't we rise above the issues that divide us, and join in building up the social systems that unite us? -- Ancheta Wis (talk | contribs) 08:15, 8 September 2017 (UTC)
I paraphrase the preface to The Answers of Ernst von Salomon to the 131 Questions in the Allied Military Government "Fragebogen" (this book has never been out of print in Germany since its first publication). Ernst von Salomon wrote: "As I wrote my answers, which would determine whether I lived or died, whether I would remain imprisoned or go free, I got the sense of a vast alien intelligence that had not the slightest interest in my own well-being ..." 08:31, 8 September 2017 (UTC)
I don't think that "magnetic storage" should be a subsection under "stored program". Magnetic storage isn't necessary for a stored program computer. Bubba73 You talkin' to me? 02:31, 24 September 2017 (UTC)
Turing is known for articulating the idea of a universal computer, but the first description of a universal computer was the lambda calculus, invented by Alonzo Church (who then became Turing's thesis advisor). Doesn't he belong in the same section with Turing? Briankharvey ( talk) 20:50, 16 October 2017 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on History of computing hardware. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
Cheers.— InternetArchiveBot ( Report bug) 02:12, 5 November 2017 (UTC)
Nothing on amateur computing?
john f 2.26.119.204 ( talk) 09:24, 5 December 2017 (UTC)
A NeXT Computer and its object-oriented development tools and libraries were used by Tim Berners-Lee and Robert Cailliau at CERN to develop the world's first web server software, CERN httpd, and also used to write the first web browser, WorldWideWeb. These facts, along with the close association with Steve Jobs, secure the 68030 NeXT a place in history as one of the most significant computers of all time.[citation needed]
This strikes me as opinion, and not necessarily fitting for a topic on computing hardware. Internet history, definitely; however, it is still phrased as opinion. I happen to agree that the NeXT Computer (I believe the NeXTcube) that Tim Berners-Lee used to develop the WWW is historic, and I would add John Carmack's development of Doom on a NeXTstation, but I don't feel this paragraph fits in this article.
Perhaps the history of the WWW, an article on the history of Next, video games, etc. but not in this article.
Communibus locis ( talk) 21:42, 18 January 2018 (UTC)
The article has a book citation to Reconstruction of the Atanasoff-Berry Computer by John Gustafson. I can't find such a book but there is this paper. Is that it? Bubba73 You talkin' to me? 04:22, 8 April 2018 (UTC)
Computers and Automation Magazine
Pictorial Report on the Computer Field:
A PICTORIAL INTRODUCTION TO COMPUTERS - [1], pp. 49-56
A PICTORIAL MANUAL ON COMPUTERS - [2], pp. 10-13, 15-17, 19-24, 28, 30, 32
A PICTORIAL MANUAL ON COMPUTERS, Part 2 - [3], pp. 12-17, 20-22, 24, 26-27
1958 Pictorial Report on the Computer Field - [4], pp. 6, 8-10, 12-14, 16-18, 20-21
1959 PICTORIAL REPORT ON THE COMPUTER FIELD - [5], pp. 8-19
1960 Pictorial Report on the Computer Field - [6], pp. 13-32
1961 PICTORIAL REPORT ON THE COMPUTER FIELD - [7], pp. digital 24-36, analog 41-45, I/O devices 60-69; 72-78, 83-88 (Bernoulli disk rotating storage device - p. 62, IBM 1301 - 69, Semiconductor Network Computer - 85)
1962 PICTORIAL REPORT ON THE COMPUTER FIELD - [8], pp. 26-42, I/O / components/others: 67-73 / 74-79/80-82
1963 PICTORIAL REPORT ON THE COMPUTER FIELD - [9], pp. 26-44
1964 PICTORIAL REPORT ON THE COMPUTER FIELD - [10], pp. 28-36, 37-51 (UNIVAC FLUID COMPUTER - air-operated, SDS 92 IC, Fairchild Planar II)
1965 Pictorial Report on the Computer Field - [11], pp. 18-30, 31-38; IC memories, Floating Floor :)
1966 Pictorial Report on the Computer Field - [12], pp. 22--- 89.25.210.104 ( talk) 18:21, 19 June 2018 (UTC)
Both men and women contributed, but the works of women have been exaggerated and the sources are inaccurate, based on the words of feminist authors rather than neutral research. When a job is male-specific we never say "the field was primarily dominated by men", but if women were involved in even the slightest roles we bring up "women were involved", and for jobs primarily assigned to women we say women were more involved. This is an article on computer hardware, not a feminist propaganda article! The source Light, Jennifer S. (July 1999). "When Computers Were Women". Technology and Culture. 40: 455–483. comes from a feminist[citation needed] author rather than neutral research and is unreliable. Respected Person ( talk) 10:16, 14 December 2018 (UTC)
The article has "...but the 'program' was hard wired right into the set up, usually in a patch panel". Is it correct to call a patch panel (plug board) hard-wired, since it is easily changed? See this dictionary. Bubba73 You talkin' to me? 03:11, 24 December 2018 (UTC)
I think that most of Post-1960 (integrated circuit based) section should be moved to History of computing hardware (1960s–present). -- MarMi wiki ( talk) 22:08, 30 December 2018 (UTC)
It is clear that Americans were intimately involved in the use of Colossus during WWII - see: "Small, Albert W. (December 1944), The Special Fish Report, The American National Archive (NARA), College Campus Washington". So I shall revert the recent edit. -- TedColes ( talk) 04:15, 11 January 2019 (UTC)
I would like to discuss this edit: [15]
@ Tom94022: <--ping
It seems to me that computers based on integrated circuits were an important intermediate step between computers based upon discrete transistors and computers based upon microprocessors. I think the section should be restored. -- Guy Macon ( talk) 01:14, 20 January 2019 (UTC)
Integrated circuit computers never left the article; some of the history of integrated circuits did. The article should be aligned with the traditional four generations of electronic computers: tube, transistor, IC (not microprocessor) and (monolithic) microprocessor. @ Guy Macon:'s edit sort of messed this up, lumping three into one, which I will restore. As for the history of the invention of the IC, does it really have much to do with this article? It is very well covered in the integrated circuit article. I will leave the history in until we hear from other editors. Tom94022 ( talk) 07:04, 20 January 2019 (UTC)
The section History_of_computing_hardware#Integrated_circuits really doesn't say anything about how/when ICs got into computers. The Apollo Guidance Computer should be mentioned as one of the first; it used only one type of small-scale IC (double 3-input NOR). Probably some of you know other early computers based on ICs (is that in what was removed?). Dicklyon ( talk) 20:11, 20 January 2019 (UTC)
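(A sketch of why a single gate type sufficed - mine, not the AGC's actual logic drawings: NOR is functionally complete, so NOT, OR and AND can all be wired from it, with unused inputs tied to logic 0.)

    #include <stdio.h>

    /* Sketch (mine): building the usual gates from one 3-input NOR,
       spare inputs tied to 0 - illustrating functional completeness. */
    int nor3(int a, int b, int c) { return !(a | b | c); }

    int not_(int a)        { return nor3(a, 0, 0); }
    int or_(int a, int b)  { return not_(nor3(a, b, 0)); }
    int and_(int a, int b) { return nor3(not_(a), not_(b), 0); }

    int main(void)
    {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("a=%d b=%d : NOT a=%d, a OR b=%d, a AND b=%d\n",
                       a, b, not_(a), or_(a, b), and_(a, b));
        return 0;
    }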
Per WP:TALKDONTREVERT and WP:BRD I reverted the following rather major removal of sourced material, only to face an editor who chooses to re-revert rather than discuss. [16] [17] [18] This is the same editor who tried to delete a large chunk of material that we discussed in the section above. [19] [20]
Before I go any further, I would like to bring this up for discussion. Should that material be deleted or retained? -- Guy Macon ( talk) 22:22, 20 January 2019 (UTC)
The idea of the integrated circuit was conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952:[141]
With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires.[142] The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers."
The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[143] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[144]
In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated."[145] The first customer for the invention was the US Air Force.[146]
Noyce also came up with his own idea of an integrated circuit, half a year later than Kilby.[147] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.
It's not clear why we have two articles that overlap, History_of_computing_hardware and History_of_computing_hardware_(1960s–present), but given that's the situation, it seems appropriate to me that most of the information about the third and fourth generations of computer hardware belongs in the second article, while this article has summary sections pointing to the sections in the second article as the main article. To that end, I intend to move much of the microprocessor material to the main article. Comments?
I do think the "generations" belong in both articles and have so edited them into both articles. For this reason I reverted Ancheta Wis' change of the section title. I have no objection to a new section on Current Computers, but shouldn't it be in History_of_computing_hardware_(1960s–present)? It should include microprocessor computers but likely includes other computer devices. We would need to find a reference to add it in either article, but I am sure there are some. Tom94022 ( talk) 17:45, 21 January 2019 (UTC)
Reference 118 is a link to a patent application. It is used at the end of a sentence saying that magnetic core was dominant until the mid-1970s. This is not in the source. Bubba73 You talkin' to me? 03:00, 22 January 2019 (UTC)
References
The scope of computing has moved beyond microprocessors; multiple governments beyond the US are seeking quantum computing as a matter of national security, including Canada, Australia, the Netherlands, the United Kingdom, the European Union, Singapore, Russia, North Korea and Japan. [1] What this means for the article is a change to the post-1960 section name: beyond the microprocessor. [1] -- Ancheta Wis (talk | contribs) 07:02, 23 February 2019 (UTC)
References
Per Wikipedia:Disambiguation: There are three important aspects to disambiguation: "Making the links for ambiguous terms point to the correct article title. For example, an editor of an astronomy article may have created a link to Mercury, and this should be corrected to point to Mercury (planet)."
The term Modern is vague and meaningless. Modern history covers the period from the 16th to the 21st century. Modernity is also used for the "socio-cultural norms" of the world prior to World War II, and is associated with Modernism as an art movement (late 19th century to early 20th century).
Meanwhile contemporary history refers to the present time period. Dimadick ( talk) 19:24, 5 March 2019 (UTC)
I think not! A recent edit reverted without relevant discussion the removal of material that is not particularly relevant to this article and well covered in the linked article and elsewhere. I'm going to revert it again and hopefully there will be some discussion here and not edit warring. Tom94022 ( talk) 01:24, 13 September 2019 (UTC)
The above subsection is not particularly relevant to the question raised. It doesn't matter whether irrelevant material is copied from within Wikipedia or obtained from reliable sources; it is still not relevant to this article. Similarly, it is not particularly relevant that the disputed material was in the article at some distant time. Nor do the age of the article or prior authors particularly matter. The history of MOSFETs just does not deserve a place in this article. I suppose if a reliable source can be found perhaps a single sentence might be added, something along the lines of: "Modern microprocessors are built upon MOSFET technology." Tom94022 ( talk) 20:10, 14 September 2019 (UTC)
See Talk:Analog computer#Earliest? -- Guy Macon ( talk) 16:23, 6 October 2019 (UTC)
Entire Class of Electro/Mechanical Computers MISSING
There were thousands of types of mechanical & electro-mechanical load/store computers used for hundreds of years.
Most of these used a sled, cart, or feeder robot that would take a sequence control (such as an index number, turns, box or document number, page or slot, etc) and go fetch or access something.
These, by storing values, performed almost any sequence of computations from any library of punched tape, reels, feed tape, mechanical route cards, etc...
There were huge tabulation and computational facilities supporting these little bots that followed routes on rails or channels leading anywhere. — Preceding unsigned comment added by 172.58.187.157 ( talk) 19:07, 18 December 2019 (UTC)
The 1962 book Computers: the machines we think with, by D. S. Halacy, Jr, pg. 49, says that the ENIAC was followed by BINAC, MANIAC, JOHNNIAC, UNIVAC, RECOMP, STRETCH, and LARC. I couldn't find anything about RECOMP, but there is Autonetics Recomp II. Was there a computer named RECOMP, before the RECOMP II? Bubba73 You talkin' to me? 21:02, 19 August 2020 (UTC)
Did some more searching: April 1, 1957 "Recomp I, a new portable, high-speed, completely transistorized digital computer" https://www.americanradiohistory.com/hd2/IDX-Site-Technical/Engineering-General/Archive-Electronics-IDX/IDX/50s/57/Electronics-1957-04-OCR-Page-0138.pdf
Theory #1: The Recomp I was introduced in 1957, the Recomp II was introduced in 1958, and they were still selling Recomp Is in 1958.
Theory #2: They called the Recomp I "Recomp" until they decided to build a Recomp II, and at that point started calling the first Recomp a Recomp I.
-- Guy Macon ( talk) 06:12, 20 August 2020 (UTC)
Why do I keep seeing this same exact line "The castle clock, a hydropowered mechanical astronomical clock invented by Al-Jazari in 1206, was the first programmable analog computer.[10][11][12]" posted on all the major computing history Wikipedia articles? For starters, I believe the claim is a bit sensationalized, as the word "programmable" is being used very loosely here. The term programmable is usually meant in the context of being able to provide instructions to a machine so that the machine can adjust its operations accordingly. In the case of Al-Jazari's clock (also very loosely associated with a computer, though it performs a computation of sorts, namely keeping time and such, so I will grant that, I suppose), the clock had to be manually recalibrated. Does this qualify as programmable? In addition, the actual cited source isn't even correct. The episode in question of the History Channel's Ancient Discoveries is Series 3 Episode 9, and the episode itself (available on YouTube) doesn't even support the claim that Al-Jazari's clock was the first programmable analog computer. The episode actually makes an even stranger claim: that Al-Jazari's clock was a "super computer". I also looked through source 11 and didn't find the claim supported on the page given. What is going on here? 2601:82:200:8B20:0:0:0:3C04 ( talk) 01:24, 16 June 2022 (UTC)
@ TedColes: My edit [24] which you reverted corrected the suggestion in the article that Turing's design was independent of the work done by Mauchly and Eckert at the University of Pennsylvania, as reported by John von Neumann in his "First Draft of a Report on the EDVAC." My correction gave references, which you removed.
The version prior to my edits (and the current version, after being reverted) says:
In 1945 Turing joined the National Physical Laboratory and began his work on developing an electronic stored-program digital computer. His 1945 report 'Proposed Electronic Calculator' was the first specification for such a device.
Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the " von Neumann architecture".
However, Turing himself states on page 3 of his 'Proposed Electronic Calculator': [1]
"The present report gives a fairly complete account of the proposed calculator. It is recommended however that it be read in conjunction with J. von Neumann's 'Report on the EDVAC'."
Turing indeed wrote a more fully worked-out design, but he does not claim to have written the "first specification for such a device." My edits, which correct the timing without denigrating Turing's contribution in any way, should be restored. -- agr ( talk) 19:57, 27 August 2023 (UTC)
The idea of a universal stored-programme computing machine was promulgated in the USA by von Neumann and in the UK by [Max] Newman, the two mathematicians who, along with Turing himself, were by and large responsible for placing Turing’s abstract universal machine into the hands of electronic engineers. [2]
The theoretical basis for the stored-program computer had been proposed by Alan Turing in his 1936 paper.
Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the "von Neumann architecture". Turing presented a more detailed paper to the National Physical Laboratory (NPL) Executive Committee in 1946, giving the first reasonably complete design of a stored-program computer, a device he called the Automatic Computing Engine (ACE). However, the better-known EDVAC design of John von Neumann, who knew of Turing's theoretical work, received more publicity, despite its incomplete nature and questionable lack of attribution of the sources of some of the ideas.[54] In 1945 Turing joined the National Physical Laboratory and began his work on developing an electronic stored-program digital computer. Turing thought that the speed an ...
Why does the article claim that 'The first commercial computer was the Ferranti Mark 1, built by Ferranti and delivered to the University of Manchester in February 1951.' when the Z4 was already rented to ETH and in operation there in 1950? This in my view clearly makes the Z4 the first commercial computer. The Z4 article even says (with references) that 'In 1950/1951, the Z4 was the only working digital computer in Central Europe, and the second digital computer in the world to be sold or loaned,[1]: 981 beating the Ferranti Mark 1 by five months and the UNIVAC I by ten months, but in turn being beaten by the BINAC (although that never worked at the customer's site[19]).' Calling a computer that never really worked the 'first commercial computer' seems rather misleading, so the first computer working for money is clearly the Zuse Z4. -- 85.169.148.50 ( talk) 22:36, 10 March 2024 (UTC)