This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
The idea of "generations" of programming languages appears to have arisen as a bit of marketing jargon particularly around the epoch of the so-called "fourth-generation" languages. The proposed distinctions imply that trends in language popularity are progressive rather than being driven by a combination of marketing fads and shifting requirements.
It is increasingly obvious, however, that this is not the case: while there is a broad general trend towards greater abstraction from the hardware, it is not monotonic. Consider, for instance, the decline in popularity of the more abstract language Lisp in favor of the closer-to-hardware C and C++ in the 1980s and '90s. Nor is there a determined trend towards application specificity; see, for instance, the displacement of special-purpose COBOL by general-purpose Java in business applications.
Of course, changes in language popularity are not driven entirely by marketing. COBOL lacks standard libraries to talk to Internet clients; Java has them. As talking to the Internet becomes more important for the problem domain, usage migrates to a language where it is natural: Java. Likewise in other domains: biological science programming, once dominated by Fortran, acquires a need for text processing due to the rising importance of genomics, and begins to migrate to Perl. These changes are not toward greater application specificity, but rather toward closer fit to changing application requirements.
(Indeed, the newly adopted languages often lack underlying application-specific features the old ones have: Java does not have fixed-point decimal numbers, a COBOL feature valuable for business applications.)
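The fixed-point gap is concrete: Java's binary floating-point types cannot represent most decimal fractions exactly, and the usual workaround is the library class java.math.BigDecimal rather than a language-level decimal type as COBOL's PICTURE clause provides. A minimal sketch of the difference:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalDemo {
    public static void main(String[] args) {
        // Binary floating point cannot represent 0.1 or 0.2 exactly:
        System.out.println(0.1 + 0.2);   // prints 0.30000000000000004

        // BigDecimal does exact decimal arithmetic -- the library-level
        // stand-in for COBOL-style fixed-point values:
        BigDecimal a = new BigDecimal("0.10");
        BigDecimal b = new BigDecimal("0.20");
        System.out.println(a.add(b));    // prints 0.30

        // The scale (digits after the point) is explicit, much as a
        // COBOL PIC 9V99 declaration would fix it:
        BigDecimal price = new BigDecimal("19.995")
                .setScale(2, RoundingMode.HALF_UP);
        System.out.println(price);       // prints 20.00
    }
}
```

The point stands either way: the decimal behavior is opt-in library code in Java, not a built-in numeric type as in COBOL.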
What's my point? The idea of successive "generations" of programming languages replacing one another at higher levels of abstraction and application specificity is not historically accurate after, say, 1960. (COBOL, Fortran, and Lisp all existed in 1960.) Wikipedia should not present it uncritically, but rather note it wherever it appears as folk-history and marketing jargon rather than historical reality. -- FOo 15:23, 8 Dec 2003 (UTC)
I'll add another confounder -- I don't think you can separate language from era. We could also read the term as naming generations of programming practice, where the knowledge and the tools advance together and change how everything is done. First-generation programming is wires or toggle switches, so the language is machine language; the second generation brings keyboards and memory, so assembly becomes possible and is used; the third has punched cards or tape and computers with discs that can compile, hence compiled languages. Then come full-screen editing and floppies in the '80s: languages ship with callable libraries that extend them, programming books like Knuth's supply common algorithmic and architectural ideas for procedural work, and we get linguistic-concept languages -- LISP, APL, Ada, object-oriented C++, and so on. Fifth-generation programming could then be considered the cell-phone and app-store era, with collaboration and multi-language CM tools in play, and programming becoming more about glue code between, or extensions of, large foundation products. 71.88.51.20 ( talk) 13:31, 21 July 2013 (UTC)
This article, the other articles in the programming language generation series, and the first comment on this discussion page, all characterize language generations in a way that seems strange to me.
In particular, the idea of calling C a second-generation language is bizarre. Its relatively low level notwithstanding, this flies in the face of how the term has always been used. The mere facts that it is possible (and important!) to write optimizers for C, and that assembly-language programmers shake (or rather, shook) their heads over the inefficiencies of C code generation, would seem to bear this out.
My understanding of the history of this term – but one for which I am as yet unable to find good sources – is as follows:
I present this for what it's worth. Having programmed using languages in all generations, fought the good fight to introduce higher-level languages into second-generation shops, and participated in the development of so-called fourth-generation and fifth-generation languages, I would hope this description isn't too far off.
I agree with the comment above that we need to distinguish the historical use of nth-generation language from neologisms based on programming complexity/hierarchy. However, since the latter seems to be the current trend in practice, perhaps both views need to be presented, on these pages and on a new article about programming language generations that all of these should use as a main article. Trevor Hanson 18:18, 7 October 2007 (UTC)
Something worth considering: It may well be that these terms are ill-defined; that they are inconsistent, self-undermining, and ultimately nonsense.
As I understand it, these terms were largely advanced by software marketing folks. They wanted their own companies' software tools considered to be of a higher "generation" than other people's tools -- particularly database companies pushing "fourth generation languages", meaning "report generators" ... with the implication of higher generation being, of course, that these were newer, more advanced, more powerful, more productive tools.
Naturally, these terms caught on with people who like to think of themselves as expert in the newest, coolest tools. But that's precisely because they aren't very descriptive terms. They're mostly wind. -- FOo 05:23, 8 October 2007 (UTC)
My handful of Eurocents: I agree with the characterization of these terms; they were invented in the 1970s and popular in the 1980s, they haven't really been updated since, and they were never more than a very rough, informal characterization of the languages (over 5000) that already existed at the time. I think it would be best to merge all of these articles into a single one, ("Generations of programming languages"?) and include the above text there. Rp ( talk) 14:57, 28 July 2008 (UTC)
I'm finally getting a bit of time to deal with this material, which worries me because it appears to be a modern attempt to re-define old terminology rather than an accurate description of the way the terms were defined and used in their time. I've been working with programming languages since the early 1960s and lived through most of this. It is my recollection that the terms "first-generation" and "second-generation" programming languages were not in any real use until the "marketing" need arose to indicate that "third-generation" languages were an improvement. Prior to that time, the terms "assembler language" and "high-level language" were in use, with assembler language having the usual meaning: the programmer wrote in a symbolic but more-or-less one-to-one equivalent of the machine's language. I say "more or less" because assemblers like FAP even then had started to include macro facilities. "Higher-level" languages included FORTRAN, COBOL, MAD, IAL, etc. Contrary to some claims occasionally made elsewhere, these languages had constructs for data-type declaration (weakly enforced), subroutines/functions, flow of control, and shared (global) storage, and they were automatically mapped to assembler or machine code without the programmer having to manage issues such as the general layout of memory for data and program, the choice of machine-specific instructions and sequences, the use of machine registers and storage, and the establishment of machine-level linkage conventions.
"Third-generation" languages appeared with ALGOL and PL/I, and were seen as a great improvement because they could be described with context-free grammars, they automatically managed dynamic storage (especially through the use of a "stack"), and therefore permitted recursive programs, they enforced a stronger type-checking discipline (although one that generally allowed for explicit, planned exposure of memory layout) AND introduced many, varied enhancements of capabilities like: array operations, complex interplays of automatic type conversions, task-management, and references to explicitly managed data objects. This generation of languages reached its "peak" with ADA.
LISP, APL, and other "non-commercial" languages were never really included within the "generational" terminology, although LISP was sometimes talked about as "second-generation" because of the era of its introduction. Many practitioners of the main-line saw the way forward as languages that would include data-structure choice, like SETL, and those were called "very-high-level languages".
After that, the story meanders. "Fourth-generation language" was a catch-all term used in an attempt to market languages that integrated general algorithmic languages with other kinds of language, such as database-manipulation languages. But it never caught on as a useful technical or marketing term, and "fifth-generation" languages encoded the promise that non-determinism would be practical, emerging in languages like PROLOG and its descendants and synergizing with the AI-oriented "fifth-generation computing" of its time. Since then, languages seem to be "functional", "object-oriented", or domain-specific rather than "generational".
That's history as I saw it pass, and I have programmed in all those languages (well, ... except IAL) and written compilers or interpreters for many. I've worked with and listened to stories of many of the people involved in creating Fortran, and provided an occasional listening ear to Jean Sammet while she was writing her book on the history of programming languages. I know of no one working in the field before 1970 who would have considered "second-generation" programming languages to be assembler languages. So, I would like to undertake to revise this page (and related pages) along these lines. I'm fairly sure I can find contemporary material to document the progression, but memories are always individual, so if someone has a reference (other than a new article attempting to guess what happened -- and I've seen a couple), I'm happy to try to find a wider view.
I'm not sure that separate pages for the "generations" is the right way to do things, especially with the meandering away from "generations" since 1980, but I don't myself propose to change this overall structure, just the content. CSProfBill ( talk) 14:28, 13 August 2009 (UTC)
Pursuing the suggestions to reorganize this material, I have created a page called Programming language generations, using this and additional material, as explained on its discussion page. If interested, please go there and make further improvements. Thanks CSProfBill ( talk) 14:33, 23 September 2009 (UTC)