
Image request

{{reqimageother|a representative image of computer code (see discussion)}}

I have a little difficulty understanding the feature specification. I may have missed it entirely.

I see the need for artwork to match the standard form (fashion) of other articles. An image representative of 'computer code' is simple: photograph some representative structured code that displays an editing style, say indenting. That is code, and it's boring, hence appropriate?

If the goal is a little larger, to portray for instance the archetypal process of creating a program in a computer programming language using the artistic medium of the image, I would suggest a photo of an actual 80-character screen, possibly green, with individual pixels large enough to be seen, where the one line of text reads "Syntax Error: redo from start". It could, if one could be thought of, include the actual text of a classic, obvious yet confusing problem in, say, BASIC syntax. To me this would encapsulate the constant process of humans trying to battle the computer's rigid interpretation of the human's otherwise reasonable requests. The prior syntax error could also be the humorous "please help" instead of just the more likely but terse "help". The idea that the human has mistaken (anthropomorphised) the computer for someone who can be persuaded with politeness is the point of desperation. If the image context were larger, an undrunk cup of coffee and an empty pizza box would round out the stereotype, as these imply both the multitude of hours already spent and the incompleteness of the task.

Humans or machines?

I think we need to point out that programming languages, like other languages, are for humans to express human ideas in. The unique thing about programming languages is that we can automatically translate these expressions into the ones and zeros that computers use. Still, the primary purpose that should be stressed is that these are human languages, for humans to express solutions in, solutions which are meant to be understandable by other humans. Since most of the cost of software across the useful lifespan of a program is invested in enhancements and maintenance, the human-readability of programs is much more important than their nature as a "technique for expressing instructions to a computer".

Low-level languages such as machine code are also programming languages. All programming languages are in principle both human- and machine-readable, but the relative emphasis varies.

I agree: producing code that can be easily understood by a human is extremely important and must be mentioned prominently in this article. Indeed, producing readable code that other humans (and not merely computers) can easily understand is one of the hallmarks of a good programmer. But - this is accomplished mostly through adding comments in a natural human language to the source at key points, and mostly not through the direct use of the programming language itself.
The original and still primary purpose of a computer programming language is not communication with other humans. A programming language is not a human language in the ordinary sense of the term; even two hardcore professional programmers don't ever go to lunch and talk to each other in Java or C. Natural human languages are far better suited to interpersonal communication. Programming languages were invented for and are used nearly exclusively for the express purpose of allowing humans to easily communicate instructions to computers (and later be able to easily modify those instructions), and not for human-to-human communication. Even programming manuals frequently express algorithms as human-language-influenced pseudocode rather than in a real programming language.
It's perfectly possible (and regrettably common) to write huge complex blocks of code or even entire applications that are unintelligible to anyone but the author, yet which work perfectly well when executed by a computer. A programming language is basically and fundamentally a "technique for expressing instructions to a computer." Doing so in a human-readable fashion is a big plus, of course, but it is not the fundamental purpose of a programming language, and is usually accomplished through appropriately garnishing the code with comments in some natural human language. Kwertii 09:38, 9 May 2004 (UTC)
FWIW, I am a programming language theorist and I would define "programming language" formally in the following manner: a programming language is a decidable formal language equipped with a Turing-complete semantics; a program is a programming language together with a member of that language. (BTW, the page for "Turing-complete" is not really, er, adequate...) This means that a language is a set of finitary strings, for which it is computable whether or not a given string belongs to it, together with a computationally adequate model, for example a mapping from each such string to a λ-term, or Turing tape, or partial recursive numeric function. In my opinion, few programming language researchers would disagree with this definition as applied to formal work. However, if you want to define the popular notion of "programming language" then, yes, you might want to add fuzzy conditions like "human-readable" and so on, and maybe weaken the Turing-completeness condition to admit terminating languages like Charity. -- [ Frank Atanassow], 24 July 2004
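Spelled out in notation (my own restatement of the definition above; the symbols are assumptions of this sketch, not Atanassow's):

    % L: a decidable set of finite strings over some alphabet \Sigma
    % S: a Turing-complete semantics, e.g. mapping each string to a
    %    partial recursive function
    \[
      \text{programming language} \;=\; (L, S),
      \qquad L \subseteq \Sigma^{*} \ \text{decidable},
      \qquad S : L \to \mathsf{PR}\ \text{computationally adequate}.
    \]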
That formal definition is too constraining. It would be a rather odd definition of "programming language" that did not count the simply-typed lambda calculus as a programming language. Conversely, I had a colleague working on a language with an undecidable type system. (The subtyping constraints were flexible enough to allow the user to encode Prolog programs in them.) He had proposed various restrictions to make the type system decidable, but it seems clear to me that his language didn't become a "programming language" only when he imposed those restrictions. So a programming language need not have a Turing-complete semantics, nor must it be a decidable formal language. k.lee 23:09, 25 Jul 2004 (UTC)
And conversely, to determine which is more "primary" or "important"--which is worse, a programming language that humans can't read, or one computers can't read? "Most of the cost of software across the useful lifespan of a program" may be in enhancements and maintenance, but if the program weren't machine-readable, its "useful lifespan" would be zero. --Daniel.
A programming language that humans can't read is clearly worse. There are plenty of useful programs that are not machine-readable. Virtually all work by academic programming language designers begins with the development of "core" languages, which are mathematical constructs first and foremost. The call-by-name lambda calculus is hardly primarily a tool for communicating with computers (if by "computers" you mean those beeping chunks of silicon that sit on people's desks). I find this machine-oriented focus kind of disheartening. Do we not agree with the Dijkstra quote concerning astronomy and telescopes at the top of the computer science article?
"plenty of useful programs that are not machine-readable"--and similarly there are plenty of useful programs that are barely human-readable. That's not a good test.
There is a big difference between "not" and "barely". k.lee 23:09, 25 Jul 2004 (UTC)
The test I have in mind is:
-Examine the total amount of resources that humans have put into designing, studying, and using programming languages. (Or, if you like, examine the returns on that investment.)
-What portion of that investment would have been made, and what portion of those returns would have been received, if no programming language had been machine-readable? I think 3% for both would be a wild overestimate.
-What portion of that investment would have been made, and what portion of those returns would have been received, if no programming language had been more human-readable than, say, FORTRAN? I'd guess 70% of the investment and 20% of the returns, anyway...given how badly even readable programming languages are typically used.
The exact numbers are obviously arguable, but that's the test I had in mind. I agree with Dijkstra, but what computer science is about, and what programming languages are for, are two different things. Computer science is not about machines, but programming languages are for controlling machines, first and foremost; if they could not be used for that they would be relegated to a smallish sub-discipline of mathematics, neither very popular nor very well-funded.
-Daniel.
FORTRAN is already tremendously human-readable, compared to binary machine code or any number of encodings that would be adequate to the purpose of describing computation to machines. Furthermore, I don't see too much programming language research on making languages easier to read by machines. So this argument actually demonstrates the opposite of what you intended it to. k.lee 23:09, 25 Jul 2004 (UTC)
Lastly, Kwertii's comments strike me as unconvincing:
  • It is irrelevant that programmers don't speak to each other in Java. Musicians don't speak to each other in musical notation. Mathematicians don't speak to each other in pure set theory and first-order logic. (Admittedly, some of them come close.) Lawyers do not speak to each other purely in legalese. Sculptors do not speak to each other with a series of little statuettes. There are other forms of human-to-human communication besides the oral use of natural language. Programming languages are one of them.
  • Furthermore, although programs can be written that are often called "unreadable", they are not literally unreadable, merely difficult to read. Nor should we construe the existence of hard to read programs as evidence that the primary purpose of programming languages is to communicate with machines. The English translation of Jacques Derrida's On Grammatology is nigh unreadable, but that does not show that deconstructionist literary theory is not a form of human-to-human communication.
  • It is certainly not the case that programmers communicate with other programmers primarily through comments! In fact, natural language comments are a notoriously bad way to communicate precisely about a program. Code itself is the major form of programmer-to-programmer communication. (See the amicus curiae brief in MPAA v. 2600, which argues a similar point.)
  • Finally, as for Kwertii's claim that the "original" purpose of programming languages was to communicate with machines --- programming languages predate executable programming language implementations by at least two decades. The lambda calculus was invented in the 1930's. FORTRAN was invented in the 1950's. k.lee 09:24, 20 May 2004 (UTC)
Make that one decade. The codes given to some of the first computers in the 1940's, via paper tape and the like, certainly count as programming languages, and were designed to control machines. For that matter, Babbage's punch cards were designed to control machines.
-Daniel.
Augusta Ada Byron King, Countess of Lovelace, born December 10th 1813 ... invented the first computer programming language. -- WikiWikiWeb:AugustaAdaByron. I'm not sure which side of the argument this factoid supports.

producing readable code that other humans (and not merely computers) can easily understand is one of the hallmarks of a good programmer. But - this is accomplished mostly through adding comments in a natural human language to the source at key points, and mostly not through the direct use of the programming language itself.

Many people believe this. Quite a few programmers disagree very, very strongly. We believe that producing readable code is accomplished mostly through renaming, refactoring, etc., so that the name of a variable communicates (to humans) what it is, the name of a method communicates (to humans) what it does, and so on (see the sketch below).

See WikiWikiWeb:TreatCommentsWithSuspicion, WikiWikiWeb:ToNeedComments ("Refactor the code properly and you won't need comments.")

-- DavidCary 23:36, 5 Jul 2004 (UTC)
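To make the renaming point above concrete, a minimal made-up C fragment (not from any real codebase): the second version needs no comment because the names carry the intent.

    /* Before: the comment does the communicating. */
    int f(int *a, int n) {
        int s = 0;                    /* running total of the array cells */
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* After: renaming makes the comment redundant. */
    int sum_of(const int *values, int count) {
        int total = 0;
        for (int i = 0; i < count; i++)
            total += values[i];
        return total;
    }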


As computers grow more complex, our ability to translate solutions into executable binary has become more abstract, enabling us to express solutions in terms of objects, templates, patterns and aspects. Such abstractions enable a more natural translation from human needs, often expressed as "use cases", into executable solutions. It is this trend toward greater abstraction in the expression of programming solutions that enables programmer productivity to double, despite programmers being locked into fixed biological hardware.

If we gaze deep into the crystal ball, we see the logical extension of this trend as computers that are capable of conversing with humans and creating executable binary programs from desires or solutions expressed in pure human language. The shift away from computer-centric aspects of programming languages toward more human-solution-centric aspects will continue to be the defining characteristic of near-future programming languages.

The "holy grail" of programming language development from this viewpoint would be the creation of a transparent interface to a computing substrate that can extract requirements from the user and instantiate an executable solution. Of course, at this level of abstraction, there is no "programming language" any longer, merely a somewhat pedantic conversation required to define the essential complexity of the problem the user wishes to solve.

At any rate, some mention should be made of this shift from computer-centric aspects toward human-centric aspects and how this affects programmer productivity and how it will shape the role of programming languages going forward.

language links

(moved to Talk:List_of_programming_languages)


Possible prejudices re: "mainstream" languages

This article seems to be written largely from the point of view of a programmer in mainstream languages. For example, interactive use is attributed to interpreters, without considering that e.g. many Smalltalk and Lisp systems have native compilers that are used interactively. Sorry for not bothering to work this rant into a considered and balanced edit of the article.

-- han

I disagree. I don't think we need to perpetuate the prejudices of "programmers in mainstream languages" (read C/C++, Java). That would be about as stupid as rewriting the operating systems entry from the point of view of a Windows user.

Anyways, someone who has a copy of 'Programming Language Concepts and Paradigms' handy, an exceedingly comprehensible book on the subject, should rewrite this article. -- Ark

Total rewrite

This entire section needs to be rewritten from scratch. This includes this topic plus those for the various languages and language concept articles. This is going to be a big project, but I think it's important. Computer programming is too much a part of modern life to be half covered in an encyclopedia, so I have to agree with Ark. Rlee0001 05:25 Jul 27, 2002 (PDT)

On another note: I would limit the list of programming languages here to just the main languages and not all the dialects. For example, there are something like 15-20 dialects of BASIC listed in the BASIC programming language page. Instead of listing all of them, one link for the entire language would suffice. If the user wants a dialect, he/she can still get to it from the BASIC page. The same goes for all the languages. Further, I fail to see why people are listing such obscure languages and dialects in an encyclopedia. Some languages have historical or technological significance. Others are just current brand names for half-written freeware with a SourceForge page and no user base. Should "Applesoft BASIC" really get its own topic? What did it do to revolutionize the language? Did it have a particularly large user base? Did it establish any conventions which are widely in use today? If not, it's probably not worthy of its own topic. Even worse are articles like ibasic. This is a BASIC interpreter for the Mac. It has no historical significance: it was just created within the last year by an amateur developer who lives in some small cottage in Sweden somewhere. It gets its own encyclopedia article? Rlee0001

I would propose the following:
  • Make a (short) list of the most significant programming languages in history to put on the Programming language page. Annotate the list to make clear just why these languages are mentioned.
  • Make a new article called List of programming languages, where every single programming language can be mentioned, even dialects. This page can have several different orderings, such as alphabetical, but also by type (functional, OO, etc.) or maybe even a history tree (there's a good book about the history of programming languages by Sebasta, if I'm right; you may use that as a reference).
  • For those dialects/spin-offs/implementations/ports of programming languages that are never going to be more than a single-sentence article: assemble them on the page of the main article (BASIC programming language here) in a section that mentions them or, when this gets to be a long list, make it a separate article.
That's what I think would be best. I'll try and see if I can help you with some of the work you're proposing to do; there are enough other people with knowledge about the subject around, so it should be possible to get something good out of this. Jeronimo 01:56 Jul 29, 2002 (PDT)

---

BTW re: classifying languages by category, many languages belong in more than one category (constraint languages vs. rule-based languages vs. logic languages; and what about functional + OO languages like CLOS?) Just to keep in mind. -- k.lee

Wikibooks has more about this subject:

I have added two Wikibooks links which already have the texts which were suggested - the first link has an alphabetical and category list of languages - the 2nd link points to short introductions. I hope that helps. -- Krischik  T 16:52, 18 September 2005 (UTC)

Rewrite of k.lee

FYI: For some time I've been working on a ground-up rewrite of this article, because its current state does not make me happy. It's not ready to go live, but I've finally posted my current draft in my user space. I welcome comment on my rewrite; also feel free to edit it directly. It's taking me a long time to do the rewrite, but I plan to replace the entire current article eventually. k.lee 02:28, 27 Aug 2003 (UTC)

The link seems to be broken. -- Doradus 11:05, 27 Aug 2003 (UTC)
It appears to me as a red "edit" link rather than a regular blue link. -- Doradus 21:49, 28 Aug 2003 (UTC)
Is there a reason that JavaScript isn't mentioned in the "Commonly Used Languages"? Also the link is red for me too ... reddi 21:58, 28 Aug 2003 (UTC)
Ok, the link works now. -- Doradus 00:03, 1 Sep 2003 (UTC)
I'm not sure I like the rewrite. I haven't read the original to compare it, but the rewrite seems to be at a very awkward level of detail. Anyone with enough background in the area to understand that writeup presumably doesn't need to read it. For instance, the grammar example casually uses the terms "atom" and "symbol" which have very little meaning to those outside the field of computer programming. In fact, the whole section on grammars would be better off in another article (say, on parsing). -- Doradus 00:10, 1 Sep 2003 (UTC)
Actually, I think that when I get to editing this article some more, I'll factor out several sections (e.g., the history of programming languages, and language semantics) into separate articles. I'll keep your suggestion in mind. k.lee 02:07, 2 Sep 2003 (UTC)

The link seems to be working now. :-)

I would like to ask: is there a clear consensus that the original article is unsatisfactory to the extent that it needs a rewrite? TonyClarke 11:38, 27 Aug 2003 (UTC)

Well, I don't know about a consensus, but here are my reasons for wanting to rewrite the article. First, the original article is rather disorganized. Second, it leaps into issues like the representation of data without even saying why programming languages exist in terms that a layperson can understand. Third, the original article does not maintain a sufficient distinction between the design of programming languages and their implementation. Fourth, the article does not give enough attention to formal languages (actually, if you counted all the programming languages ever invented, I suspect formal languages would outnumber "practical" languages). Finally, and most importantly, as the poster at the top of this talk page noted, the article does not give priority to a programming language's role in human-to-human communication --- which all language designers and software engineering researchers, not to mention most working programmers, understand as its most important role. It's possible that you could alter the original article to fix these flaws, but the changes would be radical enough to resemble a ground-up rewrite anyway. BTW I have reused sections from the original article where I thought appropriate. k.lee 02:07, 2 Sep 2003 (UTC)

The current main page definitely needs to be re-considered. While it is quite accurate (it seems to me), it is mostly a summary of the topic using the terminology of the discipline, and so is quite inscrutable to a newcomer. It occurs to me that an encyclopedia needs both a specialist and non-specialist version of the general information articles. The specialists need a means to agree on the theoretical structure of the topic, and the newcomers need to learn about it from scratch.

Removed from subject page:

To Do: this is just an outline to get started; add some descriptive text (or put in '/' links) and add a few representative languages to the descriptions


Rlee0001 01:51 Oct 20, 2002 (UTC)

Numbers of Users?

Do Ruby and Scheme really have several hundred thousand users, as in programmers who use them regularly? I doubt it, but I've been wrong before. Wesley

I believe so. But no one can prove either point anyway. --TakuyaMurata

I think it's probably true. For example, OCaml has at least 10^3 vocal users, probably 10^4 real users and probably 10^5 people who've played with it. However, such things are so difficult to quantify (e.g. look at Tiobe's silly estimates, which see huge bias from big business) and even to define (e.g. should we be talking about the total running time of programs written in different languages in order to combat, for example, the majority of Sourceforge projects "written in C++" that have yet to see an alpha release?) that I don't think such (mis)information belongs on Wikipedia. -- Jon Harrop

Naming conventions?

Hi,

why have we put virtually every programming language on "Foo programming language", and not on "Foo" if "Foo" is reasonably unique? "Programming language" is disambiguation, and that should only be used when there is ambiguity, should it not? -- Eloquence 00:08 Jan 24, 2003 (UTC)

Please see Wikipedia talk:Naming conventions (languages). There's no counterargument against changing to a sensible naming scheme that I'm aware of other than that certain people seemed to take it as a personal affront when it has been suggested in the past. -- Brion 00:12 Jan 24, 2003 (UTC)

Heck, I'd forgotten about that. The convention as stated at Wikipedia:Naming conventions does now say not to add "programming language" if the name of the language is unique, but I've not done any work in moving pages to reflect this new convention yet. I don't have time to start on this right now, but now that I've been reminded about it, I'll get to it when I have time (others, of course, are more than welcome - indeed encouraged, nay begged - to get there before me). -- Camembert

Great. I'll start some moving, although fixing double redirs will be an annoyance and we'll probably lose some page histories .. -- Eloquence 00:19 Jan 24, 2003 (UTC)


Proper classification of Python

Some people, including me, might take exception to Python being classified as procedural with bolt-on OO technology; this has been extensively discussed in the Python community.

Miscellany

I would like to point out that the programming language list above misses Objective-C. Brent Gulanowski 15:54, 15 Oct 2003 (UTC)

Turing completeness and generality

One thing the article appears to miss is the basic elements that all languages must share to be able to express any computable algorithm. I was taught this as "Sequence, Selection, Repetition" but it may be known in a number of ways. I feel this is important, as well as correct attribution to whoever proved it mathematically - was it Turing maybe? I feel that once it is clear that all languages must support these basic elements, then they can be discussed in the abstract without having to say language X has this feature, language Y has this feature, etc. (though that can be added as an extension). GRAHAMUK 23:29, 9 Nov 2003 (UTC)
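For concreteness, the three elements as they appear in a C-family language (a trivial illustration of mine, not taken from the article):

    /* sequence: statements execute one after another;
       selection: the if; repetition: the for loop. */
    int sum_even_below_ten(void) {
        int total = 0;                 /* sequence */
        for (int i = 0; i < 10; i++)   /* repetition */
            if (i % 2 == 0)            /* selection */
                total += i;
        return total;                  /* yields 0+2+4+6+8 = 20 */
    }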

I agree that it would be very nice if the article had a discussion of the fundamental aspects of programming languages. Particularly, I would like to see what makes programming languages real languages. What distinguishes them from markup languages like HTML? I am not sure about Sequence, Selection, Repetition. I believe the most basic elements of languages are control and data abstraction. This theory fits the explanation of assembly language very well. Assembly language supports jump and conditional jump, a very primitive form of control abstraction. It also supports labeling, a primitive form of data abstraction.
Also, I think that the article is too devoted to discussion of data types and data structures. Is it so important to mention strong typing or dynamic type checking, since the datatype article covers such things adequately? Control flow, on the other hand, gets very little attention. Don't we even have to mention if or while? -- Taku 05:34, Nov 11, 2003 (UTC)
Sequence/selection/repetition is far more fundamental to the concept of a programming language than data or control abstraction. I'm not saying that a language with only those features would be a good language, but I am saying that without those features it would not be a programming language at all. See brainfuck - it's a perfectly valid programming language, any computable problem can be expressed in it - it's just not a good one for expressing human ideas. It's at that second level that abstractions are important, but these build on the fundamental requirement for stepwise execution, decision branches and loops. Incidentally, assembly languages are possible without the features you mention, yet still remain valid languages. I can remember using a (poor) 6502 assembler on the Commodore 64 that did not have labels - you had to specify branches using line numbers. But it did work. GRAHAMUK 06:28, 11 Nov 2003 (UTC)
Thinking about this further, this is precisely what separates programming languages from markup languages. I'm not familiar with the full extent of HTML, but as far as I know it lacks the ability to perform branches based on conditions, or the ability to perform repeats. Also, talking of assembly language, it is possible in theory to design a serial processor with only one instruction - "subtract and branch if negative", yet such a processor could still implement any known algorithm, because it obeys the fundamental requirement of seq/sel/rep. This is pretty close to the idea of brainfuck in hardware. I sometimes wonder if such a processor, despite lacking, well, anything much really, could be made to go so fast that it would still be actually pretty good on performance. You could also make it massively pipelined. The ultimate RISC machine... Anyway, I digress, but the point is made, I think. See also turing complete. GRAHAMUK 06:41, 11 Nov 2003 (UTC)
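A minimal sketch of that one-instruction idea in C (a toy interpreter of my own, not any historical machine; the variant shown is "subtract and branch if less than or equal to zero", often called subleq):

    #include <stdio.h>

    /* One instruction = three cells a, b, c:
       mem[b] -= mem[a]; if the result is <= 0, jump to c.
       A negative jump target halts the machine. The loop assumes a
       well-formed program and does no bounds checking. */
    int main(void) {
        /* Program: zero out cell 3, then halt.
           pc=0: a=3, b=3, c=-1  ->  mem[3] -= mem[3]; result 0, jump to -1. */
        int mem[8] = { 3, 3, -1, 42 };
        int pc = 0;
        while (pc >= 0) {
            int a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
            mem[b] -= mem[a];
            pc = (mem[b] <= 0) ? c : pc + 3;
        }
        printf("cell 3 = %d\n", mem[3]);   /* prints 0 */
        return 0;
    }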

Umm, very interesting. I think you are talking about the minimum requirements to make a language capable of simulating the Turing machine. And probably the three criteria Sequence, Selection, Repetition are right. I was thinking of a programming language as a means of abstraction. The Turing machine is the most powerful computer we know today, and we don't need any programming languages or such to perform computable algorithms. Programming languages were, in my view, needed and then invented because human beings need something abstract to make programming easier. Mnemonics in assembly languages are completely meaningless to the computer; they mean something only to us humans. This is why I claimed data and code abstraction are basic elements of programming languages.

But you are right. Brainfuck is generally considered a programming language, even if it misses my picture of programming languages. Similar small languages like PostScript are also among them. In other words, programming languages are not only for humans - or are they? -- Taku 06:40, Nov 13, 2003 (UTC)


Is "sequence/selection/repetition" what they're teaching the kiddies nowadays? :-) Functional programming folks might prefer to say it's abstraction/evaluation/recursion. Turing equivalence can't be used as a precise criterion, because for instance it assumes infinite storage, and classic Fortran requires fixed-size allocation, yet few would say it's not a programming language. I would call Turing-equivalent languages "general-purpose programming languages", while leaving "programming language" as a more general moniker for any linguistic form of expression that instructs a computer, irrespective of generality. Stan 08:23, 13 Nov 2003 (UTC)

"Sequence, selection, and repetition" are not enough to make a language Turing-complete. A finite-state automaton is capable of all of the above (which correspond to the concatenation, union, and Kleene star operator of regular languages). You also need arbitrary memory allocation (or infinite storage, which is the same thing).

Also, Stan's point re: FORTRAN is a good one. The primary distinction between markup and programming languages is one of emphasis. They overlap --- any Turing-complete markup language (e.g., LaTeX) can also be considered a programming language --- but "everyone knows" when something is primarily a programming language or a markup language.

Finally, it's pretty obvious that any non-joke programming language is intended primarily for human consumption. Joke languages like Brainfuck or Unlambda are simply exceptions that prove the rule --- they're designed by humans for human amusement. k.lee 05:27, 17 Nov 2003 (UTC)

Actually Stan I have no idea what they are teaching the "kiddies" these days as you so dismissively put it. I actually picked up the seq/sel/rep thing from a course I did many years ago which was an introduction to microprocessor design. The point was emphasised that as long as the hardware provided these things, then it could run any software "language", and therefore all programming languages mapped to these fundamental concepts at their heart. This stayed with me so it must have made some sort of good sense to me at the time. Now, it's quite likely that from a software perspective, some of these things may be self evident - for example, the fact that each statement of a language is executed in written order and thus one thing logically follows another (Sequence). Programmers take that for granted (though obviously CPU designers need to construct a mechanism to make it happen), so maybe it doesn't need to be stated - but we must write for the proper audience here.
Too bad that they take it for granted :-) What you say is valid for procedural languages, but absolutely not for declarative ones. In the Prolog language, you state a set of "rules of inference". The burden of setting the order of their application falls on the compiler. Once you see what I mean, it is easier to agree that HTML and TeX are as good a programming language, in the sense that they make the computer do what I want. The "magic triads" abstraction/evaluation/recursion, sequence/selection/repetition, encapsulation/.../... are voodoo talk for specific approaches, important for sure, but they must be discussed where they fit. When speaking about PLs in general, one must have a broader POV. Read more good SF, folks :-) Mikkalai

That audience is not other programmers (they know this stuff already); it is the "intelligent layman" who may not realise that that's the case. Without some proper foundations, the abstraction/evaluation/recursion thing is still too "high concept". Another example I clearly remember from my own early steps with programming (I've been a professional programmer for 20 years, so it's a while ago!) is parameter passing - programmers take for granted that parameters map from caller to callee based on the position of the parameter in a list, but I can remember thinking how error-prone that seemed - back then I thought it would be better to "find" the parameter based on its name and ignore its position. Of course, knowing now how a CPU actually implements a subroutine, the "position" thing is clearly far more efficient and sensible, and there are likely other undesirable side effects that name binding might have. The point to make here is that what seems obvious to a programmer may not be at all obvious to someone else, so starting with implicit fundamentals in order to eliminate any misunderstandings seems a good way to go with an article such as this. Given this approach, I'm not sure that referring to Turing completeness is even a good idea - is the intelligent layman that bothered about the mathematics? I suspect most people coming to WP are looking for a solid, precise but not necessarily complete discussion of the subject. If it grabs them sufficiently, they can look into the maths further if they want. GRAHAMUK 12:05, 18 Nov 2003 (UTC)

People learn positional parameter passing in junior high school algebra: if f(x, y) = 3x + 4y, then f(1,2) = 11. It's true that a function in a programming language is not the same thing as a function in pure math, but the notation ought to be familiar enough. It's only because (a) programming is usually taught at such a low, machine-oriented level and (b) math is usually taught badly, period, that so many programmers find the "high-concept" explanation less intuitive. To the educated layperson, who has not been forced into low-level thinking by a typical CS curriculum, abstraction and application may well be as easy to explain as branching and looping. The experiences of the PLT Scheme folks suggest that Scheme (which is, basically, the call-by-value lambda calculus) is easier to teach to undergraduates than C or Java. The only undergraduates who struggle with Scheme are those who learned bad hackery in C in high school, and convinced themselves that this was the only way to program. If we're targeting the article towards laypeople, then there's no reason to avoid conceptual explanations in favor of a machine-oriented explanation.
Also, a Wikipedia article should be as complete as the contributors can make it. If the article grows too long, then it should have a quick summary at the top of the article, followed by the more in-depth discussion; or else it should be broken into sub-articles. But there's no reason to leave something out if it's an important concept, simply because it requires more effort on the part of the reader to grasp.
P.S. Minor pedantic point: you don't need recursion in order to be Turing complete. Abstraction and application suffice; you can build recursion out of abstraction and application, as with the pure lambda calculus's Y combinator. k.lee 18:15, 18 Nov 2003 (UTC)
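For reference, the combinator in question, in standard lambda calculus notation:

    \[
      Y \;=\; \lambda f.\;(\lambda x.\, f\,(x\,x))\,(\lambda x.\, f\,(x\,x)),
      \qquad
      Y\,g \;\to_{\beta}\; g\,(Y\,g)
    \]

That is, Y g is a fixed point of g, which is exactly recursion built from nothing but abstraction and application.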
Hey, I added the ":-)" to indicate clearly that the "kiddie" remark was in jest, oh well. Although I'm probably the most formally qualified WP editor for this article (PhD in languages and all that), my first step would be to learn from my predecessors - review the current EB/Encarta/etc writeups, plus reread the intro sections to the best textbooks, both those aimed at specialists and those aimed at nonspecialists - and get an idea of the strategy that others have used. Since encyclopedias are reference works, there is a certain falloff; everybody reads the first sentence, 50% read the second also, 10% get to the second paragraph, and so forth, with only the deeply interested lasting all the way to the end of the article; so you generally want to transition gradually from generalities ("language is how we tell a computer what to do") to Church-Turing, which has to be mentioned eventually, because it's one of the bases that justify some of our classification of types of languages. Stan 18:43, 18 Nov 2003 (UTC)
I agree about the structure 100% - far too many WP articles dive in with no context-establishing stuff up front. I suppose the rest of it comes down to whether we approach languages bottom-up or top-down. I can see advantages to both approaches. Since I came from a hardware background, bottom-up seemed to work well for me, going from boolean algebra to logic gates to registers to CPU architecture to stored programs to machine code blah blah etc. Other readers will respond better to the top-down approach, going from "high concepts" of languages towards the underlying bits and bytes. Perhaps both approaches need integrating in the article by including two sections. I'm presuming that the top-down approach is preferred by most teachers of the subject these days, but one thing I do notice is that as a result few people (who are not programmers but nevertheless are expected to know the basic principles) can understand or make the mental leap from the language to the chips that implement it in hardware. I've read a lot of vague handwaving arguments to explain it recently while marking some student work at the local uni, so perhaps I've just got a bee in my bonnet about bridging the software/hardware divide in a sensible, clear manner. GRAHAMUK 22:29, 18 Nov 2003 (UTC)
The hardware/software divide should get its own article I think; tricky to explain but worth trying. You could maybe do it as a sort of slice across other articles, xref'ing if the reader doesn't know a particular term. Something like "global = 1;" -> "ld r2,1; st global,r2" -> memory-mapped device -> electricity flowing -> light bulb turning on. If you stuck to the "what" and "how", and leave out the "why", it could be both succinct and illuminating. Stan 08:15, 19 Nov 2003 (UTC)
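A concrete version of that slice in C, for what it's worth (the address and register name are hypothetical; on real hardware they come from the chip's datasheet):

    #include <stdint.h>

    /* Hypothetical memory-mapped device: a store to this address drives
       a pin; 'volatile' stops the compiler optimising the store away. */
    #define LIGHT_REG ((volatile uint8_t *)0x40001000u)

    void light_on(void) {
        *LIGHT_REG = 1;   /* compiles to roughly: ld r2,1 ; st LIGHT_REG,r2 */
    }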

I know you guys know much more about programming languages and theories in computer science than I do, but I was wondering about a historical approach. RISC is a very good article. Although I don't have much expertise in hardware, the article makes a lot of sense to me. The nice thing is that it doesn't give particular examples of RISC like a list of instruction codes, but focuses on why computer science came up with the idea of RISC in historical and technical context, and also gives plenty of practical examples of architectures. I think the same strategy can be applied to this article. In some ways, the article simply gives a summary of concepts which really doesn't make sense unless you knew them before, and sometimes goes into too much detail. For example, I don't see why it is so important to spend a lot of space discussing the type system while some important concepts such as lazy evaluation, side effects and referential transparency are completely omitted.

As I keep repeating, I think it is very important to avoid making the article like a textbook. The RISC article is completely useless if you want to learn an assembly language or how to make a code generator for RISC architectures - leave that to Wikibooks in such cases. Well, just a thought. I am just hoping I am of any help at all. -- Taku 07:15, Nov 19, 2003 (UTC)

Yes, a bit of historical recapitulation is helpful, especially to motivate why there are different languages. This article can't really get much into specific concepts like lazy evaluation though, those have to be pushed to language semantics articles so as to keep the top-level article readable by laypeople. Stan 08:15, 19 Nov 2003 (UTC)
You would not expect to learn an assembly language or how to make a code generator for RISC architectures from Wikipedia. You would get that from a text book! So which is it to be? To my mind, the RISC article is actually a very good encyclopedia article. GRAHAMUK 09:40, 20 Nov 2003 (UTC)

Umm, can we have a slight summary of the different programming paradigms? I think it is important to show why we have come to have several languages and what the difference between them is. I am not suggesting a complete discussion of specific topics like lazy evaluation, but, I don't know how to put it, more like how different languages approach the problem of programming in different ways. I think such a discussion can make the article more focused on making sense to the general public about what programming is like. The imperative approach is not the only one, and surprisingly, even many computer programmers know little about how problems can be solved in many different ways. Many people just learn how to do programming in a particular language like C or LISP and are not sure about programming languages in general. For example, I think it would be nice to see how to reverse a string in many ways as an example, while it is very unnecessary to discuss how arguments are passed or how the type system works. The bottom line is that the article must not be a summary of programming language topics, but should discuss actual problems. I know it is easy to say and hard to achieve; I'm just chatting about my idealistic view. -- Taku 08:40, Nov 20, 2003 (UTC)

I agree that there should be some explanation for why there are many languages - this itself shows that there is no one, true way to program. However, I'm not sure about examples, at least not in this article. There could be a link to a separate article listing the same program in all the various languages if you wanted. The problem is that including examples here of string reversal (or whatever) is EXACTLY turning it into a text book. The article should be about languages, not a tutorial for any particular one (or all of them). Seems to me your suggestions lean more towards the textbook approach, despite declaring that you don't want WP to be one. GRAHAMUK 09:52, 20 Nov 2003 (UTC)


Well, it shows that we have not yet found any one, true way to program. - Doradus 15:40, 20 Nov 2003 (UTC)


I agree with Taku -- if you think something is "too much" for an encyclopedia article, please move it to one of the Wiki Books http://en.wikibooks.org/wiki/IT_bookshelf -- DavidCary 15:36, 26 Jul 2004 (UTC)


(moved to User talk:Dysprosia)

Cut from "History of..."

The following piece is cut out of section "History of programming languages".

<<<

As the cost of computers has dropped significantly and the complexity of computer programs has increased dramatically, development time is now seen as a more costly consideration than computer time.

Newer integrated, visual development environments have brought clear progress. They have reduced expenditure of time, money (and nerves). Regions of the screen that control the program can often be arranged interactively. Code fragments can be invoked just by clicking on a control. The work is also eased by prefabricated components and software libraries with re-usable code, primarily object-oriented.

Object-oriented methodology was introduced to reduce the complexity of programs, making code easier to write and to maintain. However, some argue that programs have, despite this, continued to increase in complexity. Recent languages are emphasising new features, like meta classes, mix-ins, delegation, program patterns and aspects.

See programming paradigm

>>>

All the above is true, but... This rant is good for a pop-sci article in an online magazine, but not for an encyclopedia: chaotic, no *history*, and no *programming languages*. Mikkalai 00:37, 13 Dec 2003 (UTC)


" Computer language" is not synonymous with " programming language". A programming language is a computer language used for programming.- Doradus 00:14, 2 Jan 2004 (UTC)

I agree: computer language has its own article, and it shouldn't be stated that it is a synonym of "programming language", because "computer language" is broader. -- surueña 13:02:51, 2005-09-06 (UTC)

Writing From Scratch

This section, Programming Language, needs a total wash and should be written from scratch. I volunteer myself to devote some time to it. As I am new to this site and am still learning how to edit things, this section will be online within a week and with a new style. yana209 22:01, Jun 22, 2004 (UTC)


Some languages such as MUMPS and is called dynamic recompilation; emulators and other virtual machines exploit this technique for greater performance.

The clause before the semicolon isn't even complete. I'd fix it, but I'm not sure how exactly it should read. - Furrykef 15:17, 9 Sep 2004 (UTC)

fixed. Ancheta Wis 11:42, 29 Jan 2005 (UTC)

Programming languages causing crashes

The reason that I deleted the sentence

"Unfortunately many programming languages cause crashes because the languages themselves are poorly constructed."

is that I don't understand it. I suppose it means that the poor design of some programming languages is causing crashes, which raises the question: which languages is the author referring to? A possible interpretation is that programs written in low-level languages like C are prone to crashes, but that is not poor design in my opinion, but a conscious design choice to prefer speed, ease of compilation and flexibility at the price of allowing more crashes. Putting this sentence in the history section confuses me even more, because that implies that the problem of poor design no longer exists, which is at odds with my interpretation. So, please explain what you mean if you reinsert the above sentence. Thanks, Jitse Niesen ( talk) 16:30, 28 July 2005 (UTC)

You're right it shouldn't be in history section because it is such a fundamental point and the bane of any decent computer scientist. We are swamped with poorly written junk languages (and operating systems) that gain prominence via clever marketing rather than on merit. Yes I'm referring to design and unfortunately don't have time right now to flesh it out (though, again, anyone who knows the field should be able to do so). Treat it as a stub and add to the list. No I'm not referring to C and I recognise it's horses for courses. Thanks for the feedback. And please add to the "stub" rather than delete again. Mccready 01:22, 1 August 2005 (UTC)

Of all the major programming languages, I would only say of C++ and C# that they are poorly constructed. This could hardly be considered many, and it should be discussed in their respective articles, not here. Your statement also lacks any form of argument. -- R.Koot 01:44, 1 August 2005 (UTC)

I would like to add to the discussion here. In fact there aren't many badly constructed languages at all (I know of none, and I've programmed in many). What one can say is that the lower the language (the closer to machine language), the more stress is put on the knowledge of the developer to create a good working program (you need to know your language's dos and don'ts). Take for instance the difference between C++ and Java. Although they are both modern languages, C++ can still cause buffer overruns and underruns, while this is pretty difficult to achieve in Java. Although memory leaks can still appear in Java, they are probably one of the major flaws in C++ programs. It's not the language that's bad, but the code that's written in it! -- Paul Sinnema 09:44, 16 September 2005 (UTC)

Programming languages don't cause crashes. Programmers who write bad code, or faulty compilers, runtimes, etc. do. Dysprosia 08:08, 17 September 2005 (UTC) {{Wikibooks Module|Computer programming|Error handling}}

I beg to differ. Example: X / 0 will crash a C and C++ program but will be a CONSTRAINT_ERROR in Ada. So there are differences in how languages treat error conditions, and it is rightful that we describe that to the reader - perhaps a comparison table might be helpful to the reader in that respect. -- Krischik  T 09:08, 18 September 2005 (UTC)
I beg to differ in turn -- it is you who is causing the division by zero error, not the programming language. What you are measuring above is something completely different. Dysprosia 10:31, 18 September 2005 (UTC)
True. And yes, we are talking about two halves of the same coin - a programming mistake and how the language reacts to it - but then this module is called Programming language and not Programmers. And how about X / Y? Is it still the programmer's fault, or perhaps faulty data delivered to the program? If you say "programmer" then I ask: have you really got an if (Y != 0) in front of every division - or made a full static analysis proving that Y can't be 0 - for all programs you have ever coded?
We live in a time of viruses, worms and Trojan horses. A time where an unprotected Windows XP computer is infected and turned into a spam bot about 4 minutes after being connected to the internet. Programmers are not perfect, and error handling - or its absence - is an important aspect of a programming language and needs to be explained to the reader. Please note: I have never said a programming language is faulty because it has no error handling - that would break the NPOV. -- Krischik  T 12:31, 18 September 2005 (UTC)
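For what that per-division guard looks like in practice, a minimal C sketch (names illustrative, my own): in C, integer division by zero is undefined behaviour (typically a SIGFPE crash), so the test is the programmer's job, whereas a checked language in effect inserts it for you and raises a catchable error such as Ada's CONSTRAINT_ERROR.

    #include <stdio.h>
    #include <stdlib.h>

    /* The guard a checked language's runtime supplies automatically. */
    int checked_div(int x, int y) {
        if (y == 0) {
            fprintf(stderr, "division by zero\n");
            exit(EXIT_FAILURE);
        }
        return x / y;
    }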
It is the programmer's fault for not making a program that is tolerant of faults and errant data, the programmer's fault that they do not perform adequate buffer checking, the programmer's fault for inadequate testing. Blaming Windows XP's insecurity on the programming language that they used is, in my mind, a cop-out. A programming language is just a means of expressing an algorithm -- what has any significance is the behaviour of the compiler or the runtime or the libraries or all of these, when unspecified or illegal behaviour occurs. The compiler is not the programming language.
You can't accept that some programming language might have better features than your beloved C, can you? You can't stand that expressing an algorithm (incl. the needed error handling) might be easier in another language, can you? A typical case of "if my beloved programming language does not have feature X, then feature X is bad, evil, sent from hell - or whatever other nasty place your religion has to offer". You think I am unfair - well, your violent defence of C's greatest weakness leads to no other conclusion. Or why should a programmer not know about or choose a programming language which makes error handling easier? And yes: I have 15 years of C/C++ programming experience; I know what I am talking about - but then, I also have experience in Pascal, Modula-2 and Ada - programming languages where buffer overruns are virtually unknown. -- Krischik  T 09:44, 19 September 2005 (UTC)
You ought not to get into insults and making nonsense claims that I have thought you "unfair" and that I have made some "violent defence" towards C (I've barely mentioned the name of the language in this thread). Think about the actual matter at hand. If you actually think about this carefully, you will understand where I am coming from: a programming language is only a specification on how to translate ideas into something more low-level. A specification does not cause crashes. Misunderstanding the specification does. If I tell you not to cross the street when the light is red, and you cross it anyway and get hit by a car, whose fault is it? Dysprosia
The person who walks of course. But then: there must be a traffic light in the first place. -- Krischik  T 11:47, 19 September 2005 (UTC)
It is still possible to cross a street safely without a light. Dysprosia 05:07, 20 September 2005 (UTC)
One person crossing one road: yes. But can you imagine a city like New York, London or Paris without traffic lights? Traffic lights were invented because traffic became too complex for the old "look right, look left" technique of crossing the road. (BTW: the Ada motto is "Ada, the language for a complex world".) --- Krischik  T 10:45, 21 September 2005 (UTC)
However, it is still possible. Regardless, what you need to also understand is that while runtime checks are fine and dandy, the overhead they incur is simply unacceptable for certain applications. When you're running some application where a bit of performance lag isn't an issue, then fine. But if you're writing hardware applications or time-critical applications, performing checks may unacceptably slow things down. The lack or disabling of runtime error checking/handling is not necessarily a bad thing. Again, the language is not to blame for this (for example, you can get C compilers which integrate bounds checking, but slow the application down -- you have to check every single array access), but of the compilers or the execution environment.
Which is what I've been maintaining all along. Dysprosia 09:18, 22 September 2005 (UTC)
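A sketch of the per-access cost being discussed (illustrative C of mine, not the output of any particular bounds-checking compiler): every array access gains one extra compare and branch, and disabling the check is a one-flag decision.

    #include <assert.h>
    #include <stddef.h>

    /* What a bounds-checking compiler emits around each access; the
       assert() vanishes when compiled with -DNDEBUG, which is the
       "disabling of runtime error checking" trade-off in miniature. */
    int checked_get(const int *buf, size_t len, size_t i) {
        assert(i < len);
        return buf[i];
    }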
And since you mention it, I do check return values when I call C library functions, amongst other things.
All return values - all parameters - all buffer sizes - all integer ranges - all type conversions - always - never ever forgetting it even once? Because that is what you have to measure up against when you compare yourself to a compiler which provides all that by default. -- Krischik  T 09:44, 19 September 2005 (UTC)
So you understand now that it is not the programming language that provides typechecking et al.? You said yourself "a compiler which provides..." (Regardless, understanding types is not very difficult. There are other more strongly typed languages than C, and there are even dynamically typed programming languages. It is not difficult to understand and keep in mind.) Dysprosia
And so should you. Dysprosia 22:23, 18 September 2005 (UTC)
Actually: no, I don't have to check return values - my current assignment is done in Ada. -- Krischik  T 09:44, 19 September 2005 (UTC)
When you program in C, you'd better. Do you catch exceptions and/or perform error checks when you program in Ada or do you just ignore them? Dysprosia 10:10, 19 September 2005 (UTC)
Depends - the great strength of exceptions is that I only need to handle the exceptions I am interested in and let the application framework worry about the rest. Sure, the application framework is done by programmers as well, just like the compiler. And I can still forget an exception I should be interested in - but it does not happen as often as in C. Net result: I start the debugger only twice a month. -- Krischik  T 11:47, 19 September 2005 (UTC)
Thus Ada is not immune to your having to perform error correction at some point. No programming language is. (This is beside the nub of the matter, though.) Dysprosia 05:07, 20 September 2005 (UTC)
Sure, can't argue with that. -- Krischik  T 10:45, 21 September 2005 (UTC)

programming language popularity: the TIOBE Programming Community Index

Should we really have a link to the TIOBE Programming Community Index, a biased statistic, without telling the readers in which way the index is flawed?

TIOBE's company statement is: We offer out-of-the-box solutions for the programming languages C, C++, C# and Java. — which obviously means they want those languages to look "good" in their statistics.

We should either explain the flaw or remove the link. Of course it is tricky to keep an NPOV when explaining a flaw in some way.

How is the Index flawed?

The search query '+"<language> programming" -tv -channel' is used to calculate the TPC Index.

Since you are probably reading this discussion because you are a language advocate of some sort, you can just google for '+"<language> programming" +tv +channel', with <language> being your favorite programming language, and then decide if those pages were rightfully excluded from the index.

My first Google hit for my favorite language is: This association is aimed at promoting ADA programming language to the software... TV channel, producer) on any type of platform (OpenTV, MediaHighway). ... — rightfully excluded? I don't think so.

I removed the link because you are not allowed to link to your own website, not because I'm advocating some language. Please assume good faith. Cheers, -- R.Koot 14:01, 14 August 2005 (UTC)
Hmmm, there are several links to some of my pages on Wikipedia and Wikibooks. But they have never been added for vanity but only because they fit the topic at hand. If it were any different, the other contributors would have removed them. And in that respect I assume good faith. -- Krischik 14:21, 14 August 2005 (UTC)


popular programming languages

The article " buffer overflow" currently claims that

" As of 2005, the most popular languages generally are C and its derivative, C++."

While I suspect this is correct, I wonder how that author found out?

Is it even possible to rank programming languages according to "popularity" (or in some other, more objective way) in a NPOV way? If so, should we discuss "popularity" here in the programming language article, or split it off into a popular programming language article? See C2: Programming Language Usage Statistics. -- DavidCary 05:19, 11 November 2005 (UTC)

Rewrite has gone on long enough

I have given User:K.lee two weeks to put up or give up with his rewrite. His "rewrite" approach is anti-collaborative and he has been claiming that his rewrite is pending for over two years. The "reqeustrewrite" template is only used for k.lee's claim on this article. Wikipedia itself is not much older than that, which means that no one else has had a real and equal "turn" at this article, since most of the work for the past two years will be lost when/if k.lee ever commits his version. Fplay 19:55, 9 December 2005 (UTC)

k.lee e-mails that the requstrewrite tag can be removed immediately. Fplay 20:14, 9 December 2005 (UTC)

FORTRAN

Minor note, but the spelling was changed from "Fortran" to "FORTRAN" since the article was referring to the first version of the language, which was indeed spelled that way. The current accepted convention (see the Fortran page) is as follows: FORTRAN, FORTRAN II, FORTRAN IV, FORTRAN 66 and FORTRAN 77 are in upper-case, with the new versions (such as Fortran 90) in lower-case "Fortran", as per their convention. The FORTRAN spelling is a very important issue for FORTRAN programmers worldwide.


New introduction

I am willing to agree that my first stab at an introductory paragraph for "programming language" might not be ideal. However, the current first paragraph is likely to be incomprehensible to all but the most knowledgeable of people.

The first paragraph should provide a brief summary/definition for a reasonably intelligent person who knows nothing about computers. Later material can get technical and dense.

Let's work out some good wording that encapsulates this rather nebulous entity.

Derek farn 14:15, 15 February 2006 (UTC) (Copied here from my talk page -- TuukkaH 16:31, 15 February 2006 (UTC))

I'm sorry for impolitely reverting your introductory paragraph; I was thinking of the edit history of another programming-related article. I'll restore it here for further discussion:

A programming language is a language designed to allow programmers to specify a sequence of operations to be performed (usually by a computer). The syntax and semantics of programming languages are much more restricted than natural languages. The written, human accessible, form of a programming language is known as source code and may be translated by a compiler into a form that can be executed by the cpu of a computer.

What I like about your version is that it's easy to read and probably also to understand. What I like about the earlier version is that it doesn't oversimplify, and it has links to other relevant articles. To me, the important things to tell in the introduction include:
  • The purpose of programming languages is to allow people to describe computer programs on a level where they can be made executable.
  • Programs are described by the data they act on and the algorithms they employ.
  • My pet point, with some nice links: "Programming languages are a type of computer languages, excluding pseudocode which is exclusively for human communication."
-- TuukkaH 16:31, 15 February 2006 (UTC)

I'm not sure I would have made the programming/computer language distinction. Yes, HTML is a markup language, but some people think it is a programming language. Who are we to disagree? Do we require programming languages to be Turing complete? Is calling HTML a computer language just a way of deflecting criticism for not calling it a programming language?

We can somewhere down the article mention that some people think that HTML is a programming language, but we need not include this topic in the introduction. HTML doesn't allow one to describe algorithms so it clearly isn't a programming language. HTML and programming languages allow humans to communicate with computers so they're computer languages and mentioning this puts the topic in its place. -- TuukkaH 19:23, 16 February 2006 (UTC)

Anyway, back to the problem at hand. I now appreciate that my definition was overly restrictive. Some languages require programmers to specify what, not how. For instance, SQL requires a set of conditions to be specified, not the nuts and bolts of finding the data. What about Prolog, which consists of clauses (OK, most implementations have extensions that support a more imperative style)? How about:

A programming language is a language designed to allow people to create programs that control the actions of a machine (which is often a computer). The syntax and semantics of programming languages are much more restrictive than natural languages. The written, human accessible, form of a programming language is known as source code and is often translated into a different form before being executed on a machine (in the case of a computer, by the cpu).
Programming languages take many forms. Many in common use require programmers to write source code that implements the algorithms used to produce the required actions (so-called imperative programming). Other languages require programmers to specify the what rather than the how (so-called declarative programming). For instance, users of structured query language specify the conditions that must be met by data in a database for it to be returned as the result of a query, and the various support programs work out how to actually retrieve that data.
Programming languages differ from languages such as markup languages in that they are Turing complete.

Derek farn 01:34, 16 February 2006 (UTC)
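To make the imperative/declarative contrast concrete, a minimal sketch in Python (the rows data is made up for the example); the second form states the "what" and leaves the "how" to the language machinery, roughly as SQL's WHERE clause does:

    rows = [{"name": "Ada", "city": "Helsinki"},
            {"name": "Grace", "city": "Arlington"}]

    # imperative: spell out how to walk the data and collect matches
    matches = []
    for row in rows:
        if row["city"] == "Helsinki":
            matches.append(row["name"])

    # declarative in flavor: state the condition, let the comprehension do the walking
    matches = [row["name"] for row in rows if row["city"] == "Helsinki"]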

This feels like a runaway introduction, see for example Wikipedia:Lead_section. I don't feel we should go into details such as syntax, semantics, source code, cpu, imperative vs. declarative, let alone Turing-completeness. -- TuukkaH 19:23, 16 February 2006 (UTC)

Yes, it is getting a bit bloated. I think that Turing completeness is a good way of distinguishing programming languages from other kinds of languages (I don't understand your later comment about infinite execution; there are various mathematical formalisms used when discussing properties of languages; I am using the term as a way of putting a minimum limit on the expressive power of a programming language). Or are we going to duck this issue entirely?

Could you check Turing completeness and machine that always halts? My comment below referred to the fact that if you require programming languages to be Turing complete you artificially leave out programming languages that are somehow able to limit programs to always halt. I say we duck this issue entirely because it's about a mathematical theory which isn't directly applicable in practice, in programming. We can duck for example by restricting programming languages to those that are used to describe data structures and algorithms. -- TuukkaH 09:31, 17 February 2006 (UTC)

People are familiar with what a natural language is, so let's make use of this knowledge.

A programming language is a language designed to allow people to create programs that control the actions of a machine (which is often a computer). The syntax and semantics of programming languages are much more restrictive than natural languages. The written, human accessible, form of a programming language is known as source code and is often translated into a different form before being executed on a machine.
Programming languages take many forms, the two major divisions being imperative programming and declarative programming. Some languages are intended to be used within specific domains (e.g., banking), while others are intended for more general usage.
Languages differ in their expressive power, and a language powerful enough to be Turing complete would be regarded as a programming language. Markup languages such as pure HTML are not Turing complete, but can be made so with the addition of extensions such as PHP or JavaScript.

Derek farn 00:28, 17 February 2006 (UTC)

Turing complete is not enough

I don't think that being Turing complete is enough to make a programming language. The language should also be used for general programming. That's why at Wikibooks we draw the line between Wikibooks:Programming languages bookshelf and Wikibooks:Domain-specific languages bookshelf. A "domain-specific language" may as well be Turing complete, but it is not used for general programming, only in a specific domain. E.g. PostScript is considered Turing complete, however I would not consider it a programming language as it is not used for general programming - or has anybody seen a text-editor or an Excel clone written in PostScript?

-- Krischik  T 07:11, 16 February 2006 (UTC)

A "domain-specific language" can very well be a programming language but not necessarily one suitable for general purposes. The article on PostScript has links to game of life, a webserver, fractals, barcodes, HTML renderer, raytracer. Even if a language isn't the best tool for the job it can still be a programming language. -- TuukkaH 19:23, 16 February 2006 (UTC)
Now that I find interesting - especially the webserver part. And unlike the academic babble below, this could change my mind about PostScript. In which case I have chosen a bad example. -- Krischik  T 12:00, 17 February 2006 (UTC)

Yes it is

What difference does it make what a language is actually used for? And what exactly is general programming?

PostScript is very much like Forth; would you say that Forth is not a programming language?

Forth is used by humans to create programs for computers - so certainly it is a programming language. -- Krischik  T 11:53, 17 February 2006 (UTC)

Most languages are only used by a handful of people. Does that mean they are not programming languages because they are only used for specific tasks (whatever it is that the handful of people write with them)?

To take your example, I know people who work with printers and formatting software who spend large amounts of time writing code in PostScript. I even have a program that prints out a calendar that is written in PostScript. Just because lots of people choose not to write their software in PostScript does not stop it being a programming language.

Derek farn 12:46, 16 February 2006 (UTC)

In your edit comment you claim that a programming language is equivalent to Turing complete. There are a lot of programming tasks that don't need infinite execution, so if a language would otherwise be a programming language but it has only finite execution you wouldn't call it a programming language. Now I wouldn't call it a Turing-complete programming language. To take another example, primitive lambda calculus is a Turing-complete model of computation but it lacks data types and IO. I don't really know if it should be classified as a programming language but I think it's better to keep the models of computation separate from the introduction of programming language. -- TuukkaH 19:23, 16 February 2006 (UTC)
Actually: How many programming languages are Turing complete? int in C is either 16, 32 or 64 bit - hardly infinite. C demands that there is an integer type intptr_t to which a pointer can be cast - there goes support for infinite memory. "Turing complete" is a theoretical concept for academic use only. -- Krischik  T 11:53, 17 February 2006 (UTC)
The assertion that PostScript is not a programming language is utterly nonsensical. What is it, if it is not a programming language?
A page description language. Because that is what it is used for in 99.99% of all cases. Even the calendar program mentioned above is only that - it prints a calendar - it is not an Outlook clone. If it starts having a database to keep appointments then it approaches programming. -- Krischik  T 11:53, 17 February 2006 (UTC)
Look. Do you even know about the PostScript language at all? It sounds like you think that PostScript's capabilities are solely devoted to graphics output. Yes, it does this well, but it is not its only functionality.
Furthermore, it can be argued that nearly all programming languages can in some sense be page description languages. What do you think a GUI display does when it prints a calendar?
Just because PostScript does not have the regular idea of user I/O that other programming languages have does not mean the language is suddenly "not" a programming language. Dysprosia
I never thought that a programming language needs "user I/O". Many embedded programs or batch processing programs do well without. -- Krischik  T 13:01, 17 February 2006 (UTC)
So what on earth are you basing your argument upon that PostScript is not a programming language? You seem to use some sort of arbitrary definition that because some programming language doesn't "keep appointments" or is an "excel clone", it's not a programming language? Dysprosia 13:17, 17 February 2006 (UTC)
Does a programming language have to have curses or GUI capabilities to be considered a programming language? Of course not. Just because PostScript does not have a lot of user-interaction libraries available to it doesn't mean it fails to be a programming language. If PostScript had such libraries, it would be rather easy to create a "text-editor" or "excel clone"! Dysprosia 09:49, 17 February 2006 (UTC)
I think we are all in agreement that PostScript is a programming language, and even a Turing-complete one, and you can even write different kinds of programs in it in practice. Perhaps you meant this answer to Krischik above, he was the one who didn't want to think of PostScript as a programming language? -- TuukkaH 10:37, 17 February 2006 (UTC)
Of course. Dysprosia 11:05, 17 February 2006 (UTC)
No, I am not in agreement - I still think it is a page description language. -- Krischik  T 11:53, 17 February 2006 (UTC)
And Lisp is a list processing language and Visual Basic a painting program. — Ruud 13:29, 17 February 2006 (UTC)

Definition

I propose the following definition, distilled from many sources throughout my education and career. This definition disposes of the qualitative assessments of languages, and focuses on quantifiable features:

General purpose programming language
A general purpose programming language must meet three criteria: it must be Turing complete; it must have at least a prototype compiler/translator/interpreter implementation; it must be in use, or targeted for use, independent of any particular application software.

Examples of general purpose programming languages:

C
Clearly satisfies all three requirements.
Perl 6
While its implementations are currently shaky at best, they exist, and it otherwise meets 1 & 3.
JavaScript
It is rare that JavaScript is used outside of a Web browser, but it is used that way.
PostScript
Implementations of PostScript as a stand-alone language have existed since the 80s, though they are rarely used outside of research and QA for embedded applications.

Examples of non-general purpose programming languages:

SQL
Misses on item 1
pseudocode
The typical pseudocode used in computer science books and papers has no formal implementation that I am aware of.
vi macros
Misses on item 3 (yes, vi macros are Turing complete).

If we can agree to this definition, then we could agree to limit the scope of the article to such languages, and then create articles to collect the exceptions. - Harmil 13:48, 6 March 2006 (UTC)

SQL is not Turing complete

SQL has no looping construct. Ok, the support software behind it contains lots of loops, but that is not the same thing.

I have always thought of SQL as a programming language, but on reflection I guess it should be called something like a database query language and, along with markup languages, not be included in the list of programming languages. The SQL Standard is not maintained by SC22, the ISO committee responsible for programming languages.

Derek farn 13:08, 16 February 2006 (UTC)

Have you had a look at PL/SQL? Krischik  T 13:03, 17 February 2006 (UTC)

PL/SQL adds programming language features to SQL (because it does not have any), hence the name Programming Language/SQL.

Derek farn 14:17, 17 February 2006 (UTC)

Doesn't the acronym "SQL" stand for "Structured Query Language"? I would not call it a programming language in its own right, but possibly a tool for other languages. -- BBM 22:33, 16 May 2006 (UTC)

Removed comments

I just removed the following comments from the end of "History of programming languages" (reformatted to shorten lines), as their presence (with newlines in between) was creating excessive vertical whitespace, and in any case they really belonged here. Hairy Dude 05:14, 2 February 2006 (UTC)

<!--- Chaotic and not to the point of the section, i.e., "history of
comp lang." I move this piece in "Talk" according to wikipedia principle:
better no article than bad article --->

<!--- Changed "teached" to "taught." I also disagree with the claim
that Java was the first programming language taught in universities.
Languages like FORTRAN, Cobol, C, etc., were all extensively taught
before Java came on the scene. shrao@acm.org, 2005-02-02 --->

<!-- The assertion: "Java...became...the first programming language
taught at the universities" is intended to convey that Java has become
the programming language of choice for 100 level language classes in
university curricula. It's a rather badly worded sentence. Changed "first"
to "initial." tim@fourstonesExpressions.com, 2005-03-24 -->

<!-- The sentence is still poorly worded; changed "the initial" to
"an introductory". danb@cs.utexas.edu, 2005-10-04 -->

"Millions" of what?

http://hopl.murdoch.edu.au/ reads "This site lists 8276 languages... It has delivered more than 1,720,000 programming languages descriptions in the last 14 months". (I think this latter bit is their way of saying web-hits.) Ewlyahoocom 20:15, 11 March 2006 (UTC)

I think it was deliberately done to confuse people. — Ruud 20:43, 11 March 2006 (UTC)
I'm pretty sure it was a good faith mistake. Haven't we all discovered some new bit of information, rushed to add it to Wikipedia, only to find out later (usually only a few minutes) that we misread it? Hmmm... maybe we'll have to keep our eye on you. Ewlyahoocom 20:48, 11 March 2006 (UTC)
Oy... what happened to WP:AGF? — Ruud 20:57, 11 March 2006 (UTC)
When the editor in question writes "I think it was deliberatly", then one no longer has to assume, yes? However... if I may assume that you're joking, then you may assume that I'm joking, too. Ewlyahoocom 21:11, 11 March 2006 (UTC)
My "it being done" referred to HOPL stating such a large number without specifying what "delivered" means, not that my addition was done to confuse people. — Ruud 21:18, 11 March 2006 (UTC)
Oh, I see! HAHAHA! Wait a sec... where's your good faith for those guys!? HAHAHAHA! :-) Ewlyahoocom 05:49, 12 March 2006 (UTC)

Article image

Is there a better, more representative image of computer code than the current one?

What would it ideally depict? Certainly not HTML, as at present. Perhaps a simple function in a typical, popular programming language such as C. Or something more readable such as Python. -- TuukkaH 12:32, 16 January 2006 (UTC)
Instead of cropping the picture to make it fit, maybe it's better to have a scaled-down image that conveys the shape and layout of some chunk of code, without necessarily trying to be legible in detail.
As for the language, I'm sure everyone has their favorite (also Python, in my case). But for Wikipedia, the language's historical recognition, heritage, and influence are probably more important factors than current popularity. To me, the two most obvious contenders are C, and Lisp/Scheme. -- Piet Delport 16:58, 16 January 2006 (UTC)
I don't have a problem with C/C++ or LISP. However, to keep the source code language-agnostic I'd propose using a pseudo-language that just shows the block structure layout of programming languages, which is universal to all languages. -- Capi crimm 04:30, 6 March 2006 (UTC)
Pseudocode would feel a bit like giving the Cola article an artificial " Acme Cola" image. -- Piet Delport 10:43, 8 March 2006 (UTC)
To be accurate, block structure layout is not universal to all languages. – Zawersh 02:13, 5 June 2006 (UTC)
What about a segment of MediaWiki PHP code? MediaWiki powers Wikipedia, so I think it is appropriate, should one specific language have need to be used. -- BBM 22:42, 16 May 2006 (UTC)
My inner programmer cringes at the thought of being represented by PHP... but the idea has merit. -- Piet Delport 07:22, 17 May 2006 (UTC)
I've added the reqimageother template above this page, if you guys don't mind. I agree that this article deserves something better. -- Face 09:47, 2 May 2006 (UTC)
P.S.: Maybe we can do something with this too?
What is the cryptic "High-Level Language" block supposed to represent, though? -- Piet Delport 14:41, 2 May 2006 (UTC)
What about the misleading steps/indentation? If we use something like that, we need to do it differently. The indentation gives the impression that higher-level languages are somehow less than lower-level ones, even to the point that hardware is better than software. How about a ladder diagram instead? Jaxad0127 00:10, 6 June 2006 (UTC)

What about this one about the history of programming languages: Image:Historie.png? If not as main image, at least could be used in the History section. ManuelGR 22:44, 14 May 2006 (UTC)

The German text probably rules it out speedily, but beyond that, the selection of languages strikes me as unbalanced: Smalltalk gets no less than five entries, while many major languages are left out. There are also some... creative arrows, such as the Prolog -> Lisp one. -- Piet Delport 01:20, 15 May 2006 (UTC)

Assessment comment

The comment(s) below were originally left at Talk:Programming language/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

Needs fixing of a few {{fact}}s. Titoxd(?!?) 17:01, 11 September 2006 (UTC)

Last edited at 17:01, 11 September 2006 (UTC). Substituted at 21:56, 3 May 2016 (UTC)

Humans or machines?

It's perfectly possible (and regrettably common) to write huge complex blocks of code or even entire applications that are unintelligible to anyone but the author, yet which work perfectly well when executed by a computer. A programming language is basically and fundamentally a "technique for expressing instructions to a computer." Doing so in a human-readable fashion is a big plus, of course, but it is not the fundamental purpose of a programming language, and is usually accomplished through appropriately garnishing the code with comments in some natural human language. Kwertii 09:38, 9 May 2004 (UTC)
FWIW, I am a programming language theorist and I would define "programming language" formally in the following manner: a programming language is a decidable formal language equipped with a Turing-complete semantics; a program is a programming language together with a member of that language. (BTW, the page for "Turing-complete" is not really, er, adequate...) This means that a language is a set of finitary strings, for which it is computable whether or not a given string belongs to it, together with a computationally adequate model, for example a mapping from each such string to a λ-term, or Turing tape, or partial recursive numeric function. In my opinion, few programming language researchers would disagree with this definition as applied to formal work. However, if you want to define the popular notion of "programming language" then, yes, you might want to add fuzzy conditions like "human-readable" and so on, and maybe weaken the Turing-completeness condition to admit terminating languages like Charity. -- [ Frank Atanassow], 24 July 2004
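One possible transcription of that definition into symbols (my reading of the comment above, not a settled formulation):

    \[
      L \subseteq \Sigma^{*} \quad\text{with } \chi_L \text{ computable (a decidable formal language)},
    \]
    \[
      \llbracket \cdot \rrbracket \colon L \to \mathcal{PR}, \qquad
      \{\, \llbracket p \rrbracket : p \in L \,\} = \mathcal{PR} \quad\text{(Turing-complete semantics)},
    \]
    where \(\mathcal{PR}\) is the set of partial recursive functions, and a program is then a pair \((L, p)\) with \(p \in L\).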
That formal definition is too constraining. It would be a rather odd definition of "programming language" that did not count the simply-typed lambda calculus as a programming language. Conversely, I had a colleague working on a language with an undecidable type system. (The subtyping constraints were flexible enough to allow the user to encode Prolog programs in them.) He had proposed various restrictions to make the type system decidable, but it seems clear to me that his language didn't become a "programming language" only when he imposed those restrictions. So a programming language need not have a Turing-complete semantics, nor must it be a decidable formal language. k.lee 23:09, 25 Jul 2004 (UTC)
And conversely, to determine which is more "primary" or "important"--which is worse, a programming language that humans can't read, or one computers can't read? "Most of the cost of software across the useful lifespan of a program" may be in enhancements and maintenance, but if the program weren't machine-readable, its "useful lifespan" would be zero. --Daniel.
A programming language that humans can't read is clearly worse. There are plenty of useful programs that are not machine-readable. Virtually all work by academic programming language designers begins with the development of "core" languages, which are mathematical constructs first and foremost. The call-by-name lambda calculus is hardly primarily a tool for communicating with computers (if by "computers" you mean those beeping chunks of silicon that sit on people's desks). I find this machine-oriented focus kind of disheartening. Do we not agree with the Dijkstra quote concerning astronomy and telescopes at the top of the computer science article?
"plenty of useful programs that are not machine-readable"--and similarly there are plenty of useful programs that are barely human-readable. That's not a good test.
There is a big difference between "not" and "barely". k.lee 23:09, 25 Jul 2004 (UTC)
The test I have in mind is:
-Examine the total amount of resources that humans have put into designing, studying, and using programming languages. (Or, if you like, examine the returns on that investment.)
-What portion of that investment would have been made, and what portion of those returns would have been received, if no programming language had been machine-readable? I think 3% for both would be a wild overestimate.
-What portion of that investment would have been made, and what portion of those returns would have been received, if no programming language had been more human-readable than, say, FORTRAN? I'd guess 70% of the investment and 20% of the returns, anyway...given how badly even readable programming languages are typically used.
The exact numbers are obviously arguable, but that's the test I had in mind. I agree with Dijkstra, but what computer science is about, and what programming languages are for, are two different things. Computer science is not about machines, but programming languages are for controlling machines, first and foremost; if they could not be used for that they would be relegated to a smallish sub-discipline of mathematics, neither very popular nor very well-funded.
-Daniel.
FORTRAN is already tremendously human-readable, compared to binary machine code or any number of encodings that would be adequate to the purpose of describing computation to machines. Furthermore, I don't see too much programming language research on making languages easier to read by machines. So this argument actually demonstrates the opposite of what you intended it to. k.lee 23:09, 25 Jul 2004 (UTC)
Lastly, Kwertii's comments strike me as unconvincing:
  • It is irrelevant that programmers don't speak to each other in Java. Musicians don't speak to each other in musical notation. Mathematicians don't speak to each other in pure set theory and first-order logic. (Admittedly, some of them come close.) Lawyers do not speak to each other purely in legalese. Sculptors do not speak to each other with a series of little statuettes. There are other forms of human-to-human communication besides the oral use of natural language. Programming languages are one of them.
  • Furthermore, although programs can be written that are often called "unreadable", they are not literally unreadable, merely difficult to read. Nor should we construe the existence of hard to read programs as evidence that the primary purpose of programming languages is to communicate with machines. The English translation of Jacques Derrida's On Grammatology is nigh unreadable, but that does not show that deconstructionist literary theory is not a form of human-to-human communication.
  • It is certainly not the case that programmers communicate with other programmers primarily through comments! In fact, natural language comments are a notoriously bad way to communicate precisely about a program. Code itself is the major form of programmer-to-programmer communication. (See the amicus curiae brief in MPAA v. 2600, which argues a similar point.)
  • Finally, as for Kwertii's claim that the "original" purpose of programming languages was to communicate with machines --- programming languages predate executable programming language implementations by at least two decades. The lambda calculus was invented in the 1930's. FORTRAN was invented in the 1950's. k.lee 09:24, 20 May 2004 (UTC)
Make that one decade. The codes given to some of the first computers in the 1940's, via paper tape and the like, certainly count as programming languages, and were designed to control machines. For that matter, Babbage's punch cards were designed to control machines.
-Daniel.
Augusta Ada Byron King, Countess of Lovelace, born December 10th 1813 ... invented the first computer programming language. -- WikiWikiWeb:AugustaAdaByron. I'm not sure which side of the argument this factoid supports.

producing readable code that other humans (and not merely computers) can easily understand is one of the hallmarks of a good programmer. But - this is accomplished mostly through adding comments in a natural human language to the source at key points, and mostly not through the direct use of the programming language itself.

Many people believe this. Quite a few programmers disagree very, very strongly. We believe that producing readable code is accomplished mostly through renaming, refactoring, etc., so that the name of a variable communicates (to humans) what it is, the name of a method communicates (to humans) what it does, etc.

See WikiWikiWeb:TreatCommentsWithSuspicion, WikiWikiWeb:ToNeedComments ("Refactor the code properly and you won't need comments.")

-- DavidCary 23:36, 5 Jul 2004 (UTC)


As computers grow more complex, our ability to translate solutions into executable binary has become more abstract, enabling us to express solutions in terms of objects, templates, patterns and aspects. Such abstractions enable a more natural translation from human needs, often expressed as "use cases", into executable solutions. It is this trend toward greater abstraction in the expression of programming solutions that enables programmer productivity to double, despite programmers being locked into fixed biological hardware.

If we gaze deep into the crystal ball, we see the logical extension of this trend as computers that are capable of conversing with humans and creating executable binary programs from desires or solutions expressed in pure human language. The shift away from computer-centric aspects of programming languages toward more human-solution-centric aspects will continue to be the defining characteristic of near-future programming languages.

The "holy grail" of programming language development from this viewpoint would be the creation of a transparent interface to a computing substrate that can extract requirements from the user and instantiate an executable solution. Of course, at this level of abstraction, there is no "programming language" any longer, merely a somewhat pedantic conversation required to define the essential complexity of the problem the user wishes to solve.

At any rate, some mention should be made of this shift from computer-centric aspects toward human-centric aspects and how this affects programmer productivity and how it will shape the role of programming languages going forward.

language links

(moved to Talk:List_of_programming_languages)


Possible prejudices re: "mainstream" languages

This article seems to be written largely from the point of view of a programmer in mainstream languages. For example, interactive use is attributed to interpreters, without considering that, e.g., many Smalltalk and Lisp systems have native compilers that are used interactively. Sorry for not bothering to work this rant into a considered and balanced edit of the article.

-- han

I disagree. I don't think we need to perpetuate the prejudices of "programmers in mainstream languages" (read C/C++, Java). That would be about as stupid as rewriting the operating systems entry from the point of view of a windows user.

Anyway, someone who has a copy of 'Programming Language Concepts and Paradigms' handy, an exceedingly comprehensible book on the subject, should rewrite this article. -- Ark

Total rewrite

This entire section needs to be rewritten from scratch. This includes this topic plus those for the various languages and language concept articles. This is going to be a big project but I think it's important. Computer programming is too much a part of modern life to be half covered in an encyclopedia, so I have to agree with Ark. Rlee0001 05:25 Jul 27, 2002 (PDT)

On another note: I would limit the list of programming languages here to just the main languages and not all the dialects. For example, there are something like 15-20 dialects of BASIC listed in the BASIC programming language page. Instead of listing all of them, one link for the entire language would suffice. If the user wants a dialect, he/she can still get to it from the BASIC page. Same goes for all the languages. Further, I fail to see why people are listing such obscure languages and dialects in an encyclopedia. Some languages have historical or technological significance. Others are just current brand names for half-written freeware with a SourceForge page and no user base. Should "Applesoft BASIC" really get its own topic? What did it do to revolutionize the language? Did it have a particularly large user base? Did it establish any conventions which are widely in use today? If not, it's probably not worthy of its own topic. Even worse are articles like ibasic. This is a BASIC interpreter for the Mac. It has no historical significance: it was just created within the last year by an amateur developer who lives in some small cottage in Sweden somewhere. It gets its own encyclopedia article? Rlee0001

I would propose the following:
  • Make a (short) list of the most significant programming languages in history to put on the Programming language page. Annotate the list to make clear just why these languages are mentioned.
  • Make a new article called List of programming languages, where every single programming language can be mentioned, even dialects. This page can have several different orders, such as alphabetical, but also by type (functional, OO, etc.) or maybe even a history tree (there's a good book about the history of programming languages by Sebesta, if I'm right; you may use that as a reference).
  • For those dialects/spin-offs/implementations/ports of programming languages that are never going to be more than a single-sentence article: assemble them on the page of the main article (BASIC programming language here) and make a section where you mention them or, when this is getting to be a long list, make it a separate article.
That's what I think would be best. I'll try and see if I can help you with some of the work you're proposing to do; there are enough other people with knowledge about the subject around, so it should be possible to get something good out of this. Jeronimo 01:56 Jul 29, 2002 (PDT)

---

BTW re: classifying languages by category, many languages belong in more than one category (constraint languages vs. rule-based languages vs. logic languages; and what about functional + OO languages like CLOS?) Just to keep in mind. -- k.lee

Wikibooks has more about this subject:

I have added two Wikibook links which already have the texts which were suggested - the first link has an alphabetical and category list of languages - the 2nd link points to short introductions. I hope that helps. -- Krischik  T 16:52, 18 September 2005 (UTC)

Rewrite of k.lee

FYI: For some time I've been working on a ground-up rewrite of this article, because its current state does not make me happy. It's not ready to go live, but I've finally posted my current draft in my user space. I welcome comment on my rewrite; also feel free to edit it directly. It's taking me a long time to do the rewrite, but I plan to replace the entire current article eventually. k.lee 02:28, 27 Aug 2003 (UTC)

The link seems to be broken. -- Doradus 11:05, 27 Aug 2003 (UTC)
It appears to me as a red "edit" link rather than a regular blue link. -- Doradus 21:49, 28 Aug 2003 (UTC)
is there a reason that javascript isn't mentioned in the "Commonly Used Languages" section? Also the link is red for me too ... reddi 21:58, 28 Aug 2003 (UTC)
Ok, the link works now. -- Doradus 00:03, 1 Sep 2003 (UTC)
I'm not sure I like the rewrite. I haven't read the original to compare it, but the rewrite seems to be at a very awkward level of detail. Anyone with enough background in the area to understand that writeup presumably doesn't need to read it. For instance, the grammar example casually uses the terms "atom" and "symbol" which have very little meaning to those outside the field of computer programming. In fact, the whole section on grammars would be better off in another article (say, on parsing). -- Doradus 00:10, 1 Sep 2003 (UTC)
Actually, I think that when I get to editing this article some more, I'll factor out several sections (e.g., the history of programming languages, and language semantics) into separate articles. I'll keep your suggestion in mind. k.lee 02:07, 2 Sep 2003 (UTC)

The link seems to be working now. :-)

I would like to ask: is there a clear consensus that the original article is unsatisfactory to the extent that it needs a re-write? TonyClarke 11:38, 27 Aug 2003 (UTC)

Well, I don't know about a consensus, but here are my reasons for wanting to rewrite the article. First, the original article is rather disorganized. Second, it leaps into issues like the representation of data without even saying why programming languages exist in terms that a layperson can understand. Third, the original article does not maintain a sufficient distinction between the design of programming languages and their implementation. Fourth, the article does not give enough attention to formal languages (actually, if you counted all the programming languages ever invented, I suspect formal languages would outnumber "practical" languages). Finally, and most importantly, as the poster at the top of this talk page noted, the article does not give priority to a programming language's role in human-to-human communication --- which all language designers and software engineering researchers, not to mention most working programmers, understand as its most important role. It's possible that you could alter the original article to fix these flaws, but the changes would be radical enough to resemble a ground-up rewrite anyway. BTW I have reused sections from the original article where I thought appropriate. k.lee 02:07, 2 Sep 2003 (UTC)

The current main page definitely needs to be re-considered. While it is quite accurate (it seems to me), it is mostly a summary of the topic using the terminology of the discipline, and so is quite inscrutable to a newcomer. It occurs to me that an encyclopedia needs both a specialist and non-specialist version of the general information articles. The specialists need a means to agree on the theoretical structure of the topic, and the newcomers need to learn about it from scratch.

Removed from subject page:

To Do: this is just an outline to get started; add some descriptive text (or put in '/' links) and add a few representative languages to the descriptions


Rlee0001 01:51 Oct 20, 2002 (UTC)

Numbers of Users?

Do Ruby and Scheme really have several hundred thousand users, as in programmers who use them regularly? I doubt it, but I've been wrong before. Wesley

I believe so. But no one can prove either point anyway. --TakuyaMurata

I think it's probably true. For example, OCaml has at least 10^3 vocal users, probably 10^4 real users and probably 10^5 people who've played with it. However, such things are so difficult to quantify (e.g. look at Tiobe's silly estimates, which see huge bias from big business) and even to define (e.g. should we be talking about the total running time of programs written in different languages in order to combat, for example, the majority of Sourceforge projects "written in C++" that have yet to see an alpha release?) that I don't think such (mis)information belongs on Wikipedia. -- Jon Harrop

Naming conventions?

Hi,

why have we put virtually every programming language on "Foo programming language", and not on "Foo" if "Foo" is reasonably unique? "Programming language" is disambiguation, and that should only be used when there is ambiguity, should it not? -- Eloquence 00:08 Jan 24, 2003 (UTC)

Please see Wikipedia talk:Naming conventions (languages). There's no counterargument against changing to a sensible naming scheme that I'm aware of other than that certain people seemed to take it as a personal affront when it has been suggested in the past. -- Brion 00:12 Jan 24, 2003 (UTC)

Heck, I'd forgotten about that. The convention as stated at Wikipedia:Naming conventions does now say not to add "programming language" if the name of the language is unique, but I've not done any work in moving pages to reflect this new convention yet. I don't have time to start on this right now, but now that I've been reminded about it, I'll get to it when I have time (others, of course, are more than welcome - indeed encouraged, nay begged - to get there before me). -- Camembert

Great. I'll start some moving, although fixing double redirs will be an annoyance and we'll probably lose some page histories .. -- Eloquence 00:19 Jan 24, 2003 (UTC)


Proper classification of Python

Some people, including me, might take exception to Python being classified as procedural with bolt-on OO technology; this has been extensively discussed in the Python community.

Miscellany

I would like to point out that the programming language list above misses Objective-C. Brent Gulanowski 15:54, 15 Oct 2003 (UTC)

Turing completeness and generality

One thing the article appears to miss is the basic elements that all languages must share to be able to express any computable algorithm. I was taught this as "Sequence, Selection, Repetition" but it may be known in a number of ways. I feel this is important, as well as correct attribution to whoever proved it mathematically - was it Turing maybe? I feel that once it is clear that all languages must support this basic elements, then they can be discussed in the abstract without having to say language X has this feature, language Y has this feature, etc. (though that can be added as an extension). GRAHAMUK 23:29, 9 Nov 2003 (UTC)

I agree that it would be very nice if the article had a discussion of the fundamental aspects of programming languages. Particularly I would like to see what makes programming languages real languages. What distinguishes them from markup languages like HTML? I am not sure about Sequence, Selection, Repetition. I believe the most basic elements of languages are control and data abstraction. This theory fits the explanation of assembly language very well. Assembly language supports jump and conditional jump, a very primitive form of control abstraction. It also supports labeling, a primitive form of data abstraction.
Also I think that the article is too much devoted to discussion of data types and data structures. Is it so important to mention strong typing or dynamic type checking, since the datatype article covers such adequately? Control flow, on the other hand, gets very little. Don't we have to even mention if or while? -- Taku 05:34, Nov 11, 2003 (UTC)
Sequence/selection/repetition is far more fundamental to the concept of a programming language than data or control abstraction. I'm not saying that a language with only those features would be a good language, but I am saying that without those features it would not be a programming language at all. See brainfuck - it's a perfectly valid programming language, any computable problem can be expressed in it - it's just not a good one for expressing human ideas. It's at that second level that abstractions are important, but these are building on the fundamental requirement for stepwise execution, decision branches and loops. Incidentally, assembly languages are possible without the features you mention, yet still remain valid languages. I can remember using a (poor) 6502 assembler on the Commodore 64 that did not have labels - you had to specify branches using line numbers. But it did work. GRAHAMUK 06:28, 11 Nov 2003 (UTC)
Thinking about this further, this is precisely what separates programming languages from markup languages. I'm not familiar with the full extent of HTML, but as far as I know it lacks the ability to perform branches based on conditions, or the ability to perform repeats. Also, talking of assembly language, it is possible in theory to design a serial processor with only one instruction - "subtract and branch if negative", yet such a processor could still implement any known algorithm, because it obeys the fundamental requirement of seq/sel/rep. This is pretty close to the idea of brainfuck in hardware. I sometimes wonder if such a processor, despite lacking, well, anything much really, could be made to go so fast that it would still be actually pretty good on performance. You could also make it massively pipelined. The ultimate RISC machine... Anyway, I digress, but the point is made, I think. See also turing complete. GRAHAMUK 06:41, 11 Nov 2003 (UTC)
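For concreteness, a minimal sketch in Python of a computation built from exactly those three elements: statements in order (sequence), a conditional branch (selection) and a loop (repetition) - in the same subtract-and-compare spirit as the one-instruction processor described above:

    def gcd(a, b):
        while b != 0:       # repetition
            if a < b:       # selection
                a, b = b, a
            a = a - b       # sequence: swap if needed, subtract, loop again
        return a

    print(gcd(12, 18))  # prints 6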

Umm, very interesting. I think you are talking about the minimum requirement to make a language capable of simulating the Turing machine. And probably the three criteria Sequence, Selection, Repetition are right. I was thinking of a programming language as a means of abstraction. The Turing machine is the most powerful computer we know today, and we don't need any programming languages or such to perform computable algorithms. Programming languages were in my view needed and then invented because human beings need something abstract to make programming easier. Mnemonics in assembly languages are completely meaningless to the computer but matter only to us humans. This is why I claimed data and code abstraction are basic elements of a programming language.

But you are right. Brainfuck is generally considered a programming language, even if it misses my picture of programming languages. Similar small languages like PostScript are also among them. In other words, programming languages are not only for humans - or are they? -- Taku 06:40, Nov 13, 2003 (UTC)


Is "sequence/selection/repetition" what they're teaching the kiddies nowadays? :-) Functional programming folks might prefer to say it's abstraction/evaluation/recursion. Turing equivalence can't be used as a precise criterion, because for instance it assumes infinite storage, and classic Fortran requires fixed-size allocation, yet few would say it's not a programming language. I would call Turing-equivalent languages "general-purpose programming languages", while leaving "programming language" as a more general moniker for any linguistic form of expression that instructs a computer, irrespective of generality. Stan 08:23, 13 Nov 2003 (UTC)

"Sequence, selection, and repetition" are not enough to make a language Turing-complete. A finite-state automaton is capable of all of the above (which correspond to the concatenation, union, and Kleene star operator of regular languages). You also need arbitrary memory allocation (or infinite storage, which is the same thing).

Also, Stan's point re: FORTRAN is a good one. The primary distinction between markup and programming languages is one of emphasis. They overlap --- any Turing-complete markup language (e.g., LaTeX) can also be considered a programming language --- but "everyone knows" when something is primarily a programming language or a markup language.

Finally, it's pretty obvious that any non-joke programming language is intended primarily for human consumption. Joke languages like Brainfuck or Unlambda are simply exceptions that prove the rule --- they're designed by humans for human amusement. k.lee 05:27, 17 Nov 2003 (UTC)

Actually, Stan, I have no idea what they are teaching the "kiddies" these days, as you so dismissively put it. I actually picked up the seq/sel/rep thing from a course I did many years ago which was an introduction to microprocessor design. The point was emphasised that as long as the hardware provided these things, then it could run any software "language", and therefore all programming languages mapped to these fundamental concepts at their heart. This stayed with me, so it must have made some sort of good sense to me at the time. Now, it's quite likely that from a software perspective some of these things may be self-evident - for example, the fact that each statement of a language is executed in written order and thus one thing logically follows another (Sequence). Programmers take that for granted (though obviously CPU designers need to construct a mechanism to make it happen), so maybe it doesn't need to be stated - but we must write for the proper audience here.
Too bad that they take it for granted :-) What you say is valid for procedural languages, but absolutely not for declarative ones. In the Prolog language, you state the set of "rules of inference". The burden of setting the order of their application is on the compiler. Once you see what I mean, it is easier to agree that HTML and TeX are as good a programming language in the sense that they make the computer do what I want. The "magic triads" abstraction/evaluation/recursion, sequence/selection/repetition, encapsulation/.../... are voodoo talk for specific approaches, important for sure, but they must be discussed where they fit. When speaking about PL in general, one must have a broader POV. Read more good SF, folks :-) Mikkalai

That audience is not other programmers - they know this stuff already - it is the "intelligent layman" who may not realise that that's the case. Without some proper foundations, the abstraction/evaluation/recursion thing is still too "high concept". Another example I clearly remember from my own early steps with programming (I've been a professional programmer for 20 years, so it's a while ago!) is parameter passing - programmers take for granted that parameters map from caller to callee based on the position of the parameter in a list, but I can remember thinking how error-prone that seemed - back then I thought it would be better to "find" the parameter based on its name and ignore its position. Of course, knowing now how a CPU actually implements a subroutine, the "position" thing is clearly far more efficient and sensible, and there are likely other undesirable side effects that name binding might have. The point to make here is that to the uninitiated, what seems obvious to a programmer may not be at all obvious to someone else, so starting with implicit fundamentals in order to eliminate any misunderstandings seems a good way to go with an article such as this. Given this approach, I'm not sure that referring to Turing completeness is even a good idea - is the intelligent layman that bothered about the mathematics? I suspect most people coming to WP are looking for a solid, precise but not necessarily complete discussion of the subject. If it grabs them sufficiently, they can look into the maths further if they want. GRAHAMUK 12:05, 18 Nov 2003 (UTC)
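As an aside, the "find the parameter based on its name" idea does exist in several languages (Ada's named parameter association, Python's keyword arguments). A minimal sketch in Python, with a made-up transfer function, showing both call styles:

    def transfer(amount, source, destination):
        return "%s: %s -> %s" % (amount, source, destination)

    # positional: arguments are mapped by position - compact, but easy to get wrong
    transfer(100, "savings", "checking")

    # by name: position no longer matters; the names carry the mapping
    transfer(amount=100, destination="checking", source="savings")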

People learn positional parameter passing in junior high school algebra: if f(x, y) = 3x + 4y, then f(1,2) = 11. It's true that a function in a programming language is not the same thing as a function in pure math, but the notation ought to be familiar enough. It's only because (a) programming is usually taught at such a low, machine-oriented level and (b) math is usually taught badly, period, that so many programmers find the "high-concept" explanation less intuitive. To the educated layperson, who has not been forced into low-level thinking by a typical CS curriculum, abstraction and application may well be as easy to explain as branching and looping. The experiences of the PLT Scheme folks suggest that Scheme (which is, basically, the call-by-value lambda calculus) is easier to teach to undergraduates than C or Java. The only undergraduates who struggle with Scheme are those who learned bad hackery in C in high school, and convinced themselves that this was the only way to program. If we're targeting the article towards laypeople, then there's no reason to avoid conceptual explanations in favor of a machine-oriented explanation.
Also, a Wikipedia article should be as complete as the contributors can make it. If the article grows too long, then it should have a quick summary at the top of the article, followed by the more in-depth discussion; or else it should be broken into sub-articles. But there's no reason to leave something out if it's an important concept, simply because it requires more effort on the part of the reader to grasp.
P.S. Minor pedantic point: you don't need recursion in order to be Turing complete. Abstraction and application suffice; you can build recursion out of abstraction and application, as with the pure lambda calculus's y-combinator. k.lee 18:15, 18 Nov 2003 (UTC)
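For reference, the construction alluded to above can be written out. In the pure lambda calculus the fixed-point combinator is (in LaTeX notation):

    Y \;=\; \lambda f.\,(\lambda x.\,f\,(x\,x))\,(\lambda x.\,f\,(x\,x)),
    \qquad Y\,g \;=_{\beta}\; g\,(Y\,g)

so a "recursive" call is obtained from nothing but abstraction and application.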
Hey, I added the ":-)" to indicate clearly that the "kiddie" remark was in jest, oh well. Although I'm probably the most formally qualified WP editor for this article (PhD in languages and all that), my first step would be to learn from my predecessors - review the current EB/Encarta/etc writeups, plus reread the intro sections to the best textbooks, both those aimed at specialists and those aimed at nonspecialists - and get an idea of the strategy that others have used. Since encyclopedias are reference works, there is a certain falloff; everybody reads the first sentence, 50% read the second also, 10% get to the second paragraph, and so forth, with only the deeply interested lasting all the way to the end of the article; so you generally want to transition gradually from generalities ("language is how we tell a computer what to do") to Church-Turing, which has to be mentioned eventually, because it's one of the bases that justify some of our classification of types of languages. Stan 18:43, 18 Nov 2003 (UTC)
I agree about the structure 100% - far too many WP articles dive in with no context-establishing stuff up front. I suppose the rest of it comes down to whether we approach languages bottom-up or top-down. I can see advantages to both approaches. Since I came from a hardware background, bottom-up seemed to work well for me, going from boolean algebra to logic gates to registers to CPU architecture to stored programs to machine code blah blah etc. Other readers will respond better to the top-down approach, going from "high concepts" of languages towards the underlying bits and bytes. Perhaps both approaches need integrating in the article by including two sections. I'm presuming that the top-down approach is preferred by most teachers of the subject these days, but one thing I do notice is that as a result few people (who are not programmers but nevertheless are expected to know the basic principles) can understand or make the mental leap from the language to the chips that implement it in hardware. I've read a lot of vague handwaving arguments to explain it recently while marking some student work at the local uni, so perhaps I've just got a bee in my bonnet about bridging the software/hardware divide in a sensible, clear manner. GRAHAMUK 22:29, 18 Nov 2003 (UTC)
The hardware/software divide should get its own article I think; tricky to explain but worth trying. You could maybe do it as a sort of slice across other articles, xref'ing if the reader doesn't know a particular term. Something like "global = 1;" -> "ld r2,1; st global,r2" -> memory-mapped device -> electricity flowing -> light bulb turning on. If you stuck to the "what" and "how", and left out the "why", it could be both succinct and illuminating. Stan 08:15, 19 Nov 2003 (UTC)
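A minimal C sketch of that slice, assuming a made-up memory-mapped register address (0x4000) standing in for the light bulb; the real address and the generated assembly depend entirely on the platform, and on a hosted OS this exact address would fault, so this is a bare-metal illustration only:

    #include <stdint.h>

    /* Assumed address of a memory-mapped output register -- purely
       illustrative, chosen for the example. */
    #define LIGHT_REG ((volatile uint8_t *)0x4000)

    int main(void) {
        /* The C-level "global = 1;" ... */
        *LIGHT_REG = 1;  /* ...compiles to a load and a store (roughly
                            "ld r2,1; st LIGHT_REG,r2"); the store drives
                            the device line and the bulb turns on. */
        return 0;
    }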

I know you guys know much more about programming languages and theories in computer science than I do, but I was wondering about a historical approach. RISC is a very good article. Although I don't have much expertise in hardware, the article makes a lot of sense to me. The nice thing is that it doesn't give particular examples of RISC like a list of instruction codes, but focuses on why computer science came up with the idea of RISC, in historical and technical context, and also gives plenty of practical examples of architectures. I think the same strategy can be applied to this article. In some ways, the article simply gives a summary of concepts which really doesn't make sense unless you knew it before, and sometimes goes into too much detail. For example, I don't see why it is so important to spend a lot of space discussing the type system while some important concepts such as lazy evaluation, side effects and referential transparency are completely omitted.

As I keep repeating, I think it is very important to avoid making the article read like a textbook. The RISC article is completely useless if you want to learn an assembly language or how to make a code generator for RISC architectures; that is what a Wikibook is for. Well, just a thought. I hope I am of some help at all. -- Taku 07:15, Nov 19, 2003 (UTC)

Yes, a bit of historical recapitulation is helpful, especially to motivate why there are different languages. This article can't really get much into specific concepts like lazy evaluation though, those have to be pushed to language semantics articles so as to keep the top-level article readable by laypeople. Stan 08:15, 19 Nov 2003 (UTC)
You would not expect to learn an assembly language or how to make a code generator for RISC architectures from Wikipedia. You would get that from a text book! So which is it to be? To my mind, the RISC article is actually a very good encyclopedia article. GRAHAMUK 09:40, 20 Nov 2003 (UTC)

Umm, can we have a short summary of the different programming paradigms? I think it is important to show why we have come to have several languages and what the differences between them are. I am not suggesting we have a complete discussion of specific topics like lazy evaluation, but - I don't know quite how to put it - more like how different languages approach the problem of programming in different ways. I think such a discussion could make the article more focused on conveying to the general public what programming is like. The imperative approach is not the only one, and surprisingly, even many computer programmers know little about the many different ways problems can be solved in programming. Many people just learn how to program in a particular language like C or LISP and are not sure about programming languages in general. For example, I think it would be nice to see how to reverse a string in many ways as an example, while it is quite unnecessary to discuss how arguments are passed or how the type system works. The bottom line is that the article must not be a summary of programming language topics, but should discuss actual problems. I know it is easy to say and hard to achieve; I am just chatting about my idealistic view. -- Taku 08:40, Nov 20, 2003 (UTC)
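The "many ways" point holds even within one language; a minimal C sketch of the same string reversal done two ways, iteratively and recursively:

    #include <stdio.h>
    #include <string.h>

    /* Imperative style: walk two indices toward the middle, swapping. */
    void reverse_iter(char *s) {
        for (size_t i = 0, j = strlen(s); i + 1 < j; i++, j--) {
            char t = s[i];
            s[i] = s[j - 1];
            s[j - 1] = t;
        }
    }

    /* Recursive style: swap the outermost pair, then recurse inward
       (hi is one past the last index). */
    void reverse_rec(char *s, size_t lo, size_t hi) {
        if (lo + 1 >= hi) return;
        char t = s[lo]; s[lo] = s[hi - 1]; s[hi - 1] = t;
        reverse_rec(s, lo + 1, hi - 1);
    }

    int main(void) {
        char a[] = "encyclopedia", b[] = "encyclopedia";
        reverse_iter(a);
        reverse_rec(b, 0, strlen(b));
        printf("%s %s\n", a, b);  /* both print "aidepolcycne" */
        return 0;
    }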

I agree that there should be some explanation for why there are many languages - this itself shows that there is no one, true way to program. However, I'm not sure about examples, at least not in this article. There could be a link to a separate article listing the same program in all the various languages if you wanted. The problem is that including examples here of string reversal (or whatever) is EXACTLY turning it into a textbook. The article should be about languages, not a tutorial for any particular one (or all of them). Seems to me your suggestions lean more towards the textbook approach, despite declaring that you don't want WP to be one. GRAHAMUK 09:52, 20 Nov 2003 (UTC)


Well, it shows that we have not yet found any one, true way to program. - Doradus 15:40, 20 Nov 2003 (UTC)


I agree with Taku -- if you think something is "too much" for an encyclopedia article, please move it to one of the Wiki Books http://en.wikibooks.org/wiki/IT_bookshelf -- DavidCary 15:36, 26 Jul 2004 (UTC)


(moved to User talk:Dysprosia)

Cut from "History of..."

The following piece is cut out of section "History of programming languages".

<<<

As the cost of computers has dropped significantly and the complexity of computer programs has increased dramatically, development time is now seen as a more costly consideration than computer time.

Newer integrated, visual development environments have brought clear progress. They have reduced expenditure of time, money (and nerves). Regions of the screen that control the program can often be arranged interactively. Code fragments can be invoked just by clicking on a control. The work is also eased by prefabricated components and software libraries with re-usable code, primarily object-oriented.

Object-oriented methodology was introduced to reduce the complexity of programs, making code easier to write and to maintain. However, some argue that programs have, despite this, continued to increase in complexity. Recent languages are emphasising new features, like meta classes, mix-ins, delegation, program patterns and aspects.

See programming paradigm

>>>

All the above is true, but... This rant is good for a pop-sci article in an online magazine, but not for an encyclopedia: chaotic, no *history*, and no *programming languages*. Mikkalai 00:37, 13 Dec 2003 (UTC)


" Computer language" is not synonymous with " programming language". A programming language is a computer language used for programming.- Doradus 00:14, 2 Jan 2004 (UTC)

I agree; computer language has its own article, and it shouldn't be stated that it is a synonym of "programming language", because "computer language" is broader. -- surueña 13:02:51, 2005-09-06 (UTC)

Writing From Scratch

This section, Programming Language, needs a total wash and to be written from scratch. I volunteer myself to devote some time to it. As I am new to this site and I am learning how to edit things, this section will be online within a week and with a new style. yana209 22:01, Jun 22, 2004 (UTC)


Some languages such as MUMPS and is called dynamic recompilation; emulators and other virtual machines exploit this technique for greater performance.

The clause before the semicolon isn't even complete. I'd fix it, but I'm not sure how exactly it should read. - Furrykef 15:17, 9 Sep 2004 (UTC)

fixed. Ancheta Wis 11:42, 29 Jan 2005 (UTC)

Programming languages causing crashes

The reason that I deleted the sentence

"Unfortunately many programming languages cause crashes because the languages themselves are poorly constructed."

is that I don't understand it. I suppose it means that the poor design of some programming languages is causing crashes, which raises the question: which languages is the author referring to? A possible interpretation is that programs written in low-level languages like C are prone to crashes, but that is not poor design in my opinion, but a conscious design choice to prefer speed, ease of compilation and flexibility at the price of allowing more crashes. Putting this sentence in the history section confuses me even more, because that implies that the problem of poor design no longer exists, which is at odds with my interpretation. So, please explain what you mean if you reinsert the above sentence. Thanks, Jitse Niesen ( talk) 16:30, 28 July 2005 (UTC)

You're right, it shouldn't be in the history section, because it is such a fundamental point and the bane of any decent computer scientist. We are swamped with poorly written junk languages (and operating systems) that gain prominence via clever marketing rather than on merit. Yes, I'm referring to design, and unfortunately don't have time right now to flesh it out (though, again, anyone who knows the field should be able to do so). Treat it as a stub and add to the list. No, I'm not referring to C, and I recognise it's horses for courses. Thanks for the feedback. And please add to the "stub" rather than delete again. Mccready 01:22, 1 August 2005 (UTC)

Of all the major programming languages, I would say only of C++ and C# that they are poorly constructed. This could hardly be considered many, and should be discussed in their respective articles, not here. Your statement also lacks any form of argument. -- R.Koot 01:44, 1 August 2005 (UTC)

I would like to add to the discussion here. In fact there aren't many badly constructed languages at all (I know of none, and I've programmed in many). What one can say is that the lower the language (the closer to machine language), the more stress is put on the knowledge of the developer to create a good working program (you need to know your language's dos and don'ts). Take for instance the difference between C++ and Java. Although they are both modern languages, C++ can still suffer buffer overruns and underruns, while these are pretty difficult to achieve in Java; buffer problems are probably one of the major flaws in C++ programs (although memory leaks can still appear in Java). It's not the language that's bad, but the code that's written in it! -- Paul Sinnema 09:44, 16 September 2005 (UTC)

Programming languages don't cause crashes. Programmers who write bad code, or faulty compilers, runtimes, etc. do. Dysprosia 08:08, 17 September 2005 (UTC) {{Wikibooks Module|Computer programming|Error handling}}

I beg to differ. Example: X / 0 will crash a C or C++ program but will raise a CONSTRAINT_ERROR in Ada. So there are differences in how languages treat error conditions, and it is rightful that we describe that to the reader - perhaps a comparison table might be helpful to the reader in that respect. -- Krischik  T 09:08, 18 September 2005 (UTC)
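To make the X / 0 example concrete: in C, integer division by zero is undefined behaviour (on most platforms it raises SIGFPE and kills the process), so the guard below is a check the programmer must remember to write by hand. This is a minimal sketch, not a recommendation of any particular error-handling style:

    #include <stdio.h>

    /* Hand-written guard: C mandates no built-in equivalent of Ada's
       CONSTRAINT_ERROR for division by zero. */
    int safe_div(int x, int y, int *ok) {
        if (y == 0) { *ok = 0; return 0; }
        *ok = 1;
        return x / y;
    }

    int main(void) {
        int ok;
        int q = safe_div(10, 0, &ok);
        if (ok)
            printf("%d\n", q);
        else
            printf("division by zero caught\n");
        return 0;
    }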
I beg to differ in turn -- it is you who is causing the division by zero error, not the programming language. What you are measuring above is something completely different. Dysprosia 10:31, 18 September 2005 (UTC)
True. And yes, we are talking about two halves of the same coin - a programming mistake and how the language reacts to it - but then this module is called Programming language and not Programmers. And how about X / Y? Is it still the programmer's fault, or perhaps faulty data delivered to the program? If you say "programmer", then I ask: have you really got an if (Y != 0) in front of every division - or made a full static analysis proving that Y can't be 0 - for all programs you have ever coded?
We live in a time of viruses, worms and Trojan horses. A time where an unprotected Windows XP computer is infected and turned into a spam bot about 4 minutes after being connected to the internet. Programmers are not perfect, and error handling - or the absence thereof - is an important aspect of a programming language and needs to be explained to the reader. Please note: I have never said a programming language is faulty because it has no error handling - that would break the NPOV. -- Krischik  T 12:31, 18 September 2005 (UTC)
It is the programmer's fault for not making a program that is tolerant of faults and errant data, the programmer's fault that they do not perform adequate buffer checking, the programmer's fault for inadequate testing. Blaming Windows XP's insecurity on the programming language that they used is, in my mind, a cop-out. A programming language is just a means of expressing an algorithm -- what has any significance is the behaviour of the compiler or the runtime or the libraries or all of these, when unspecified or illegal behaviour occurs. The compiler is not the programming language.
You just can't accept that some programming language has better features than your beloved C, can you? You can't stand that expressing an algorithm (incl. the needed error handling) might be easier in another language, can you? A typical case of "if my beloved programming language does not have feature X, then feature X is bad, evil, sent from hell - or whatever other nasty place your religion has to offer". You think I am unfair - well, your violent defence of C's greatest weakness leads to no other conclusion. Or why should a programmer not know about or choose a programming language which makes error handling easier? And yes: I have 15 years of C/C++ programming experience; I know what I am talking about - but then, I also have experience in Pascal, Modula-2 and Ada - programming languages where buffer overruns are virtually unknown. -- Krischik  T 09:44, 19 September 2005 (UTC)
You ought not to get into insults and making nonsense claims that I have thought you "unfair" and that I have made some "violent defence" towards C (I've barely mentioned the name of the language in this thread). Think about the actual matter at hand. If you actually think about this carefully, you will understand where I am coming from: a programming language is only a specification on how to translate ideas into something more low-level. A specification does not cause crashes. Misunderstanding the specification does. If I tell you not to cross the street when the light is red, and you cross it anyway and get hit by a car, whose fault is it? Dysprosia
The person who walks of course. But then: there must be a traffic light in the first place. -- Krischik  T 11:47, 19 September 2005 (UTC)
It is still possible to cross a street safely without a light. Dysprosia 05:07, 20 September 2005 (UTC)
One person crossing one road: yes. But can you imagine a city like New York, London or Paris without traffic lights? Traffic lights were invented because traffic became too complex for the old "look right, look left" technique of crossing the road. (BTW: the Ada motto is "Ada, the language for a complex world".) --- Krischik  T 10:45, 21 September 2005 (UTC)
However, it is still possible. Regardless, what you need to also understand is that while runtime checks are fine and dandy, the overhead they incur is simply unacceptable for certain applications. When you're running some application where a bit of performance lag isn't an issue, then fine. But if you're writing hardware applications or time-critical applications, performing checks may unacceptably slow things down. The lack or disabling of runtime error checking/handling is not necessarily a bad thing. Again, the language is not to blame for this (for example, you can get C compilers which integrate bounds checking, but slow the application down -- you have to check every single array access), but of the compilers or the execution environment.
Which is what I've been maintaining all along. Dysprosia 09:18, 22 September 2005 (UTC)
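A minimal sketch of the overhead being argued about, assuming a hand-rolled checked accessor in C (a compiler that integrates bounds checking effectively inserts the same test at every access):

    #include <stdio.h>
    #include <stdlib.h>

    /* Raw access: one load, no safety net -- an out-of-range index is
       silent corruption or a crash. */
    static int get_raw(const int *a, size_t i) {
        return a[i];
    }

    /* Checked access: every read also pays for a comparison and a
       branch -- exactly the per-access cost at issue. */
    static int get_checked(const int *a, size_t len, size_t i) {
        if (i >= len) {
            fprintf(stderr, "index %zu out of range\n", i);
            abort();
        }
        return a[i];
    }

    int main(void) {
        int a[4] = {1, 2, 3, 4};
        printf("%d\n", get_raw(a, 2));         /* fine */
        printf("%d\n", get_checked(a, 4, 2));  /* fine, but costs a test */
        printf("%d\n", get_checked(a, 4, 7));  /* caught: aborts instead of corrupting */
        return 0;
    }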
And since you mention it, I do check return values when I call C library functions, amongst other things.
All return values - all parameters - all buffer sizes - all integer ranges - all type conversions - always - never ever forgetting it even once? Because that is what you have to measure up against when you compare yourself to a compiler which provides all that by default. -- Krischik  T 09:44, 19 September 2005 (UTC)
So you understand now that it is not the programming language that provides typechecking et al.? You said yourself "a compiler which provides..." (Regardless, understanding types is not very difficult. There are other, more strongly typed languages than C, and there are even dynamically typed programming languages. It is not difficult to understand and keep in mind.) Dysprosia
And so should you. Dysprosia 22:23, 18 September 2005 (UTC)
Actually: no, I don't have to check return values - my current assignment is done in Ada. -- Krischik  T 09:44, 19 September 2005 (UTC)
When you program in C, you'd better. Do you catch exceptions and/or perform error checks when you program in Ada or do you just ignore them? Dysprosia 10:10, 19 September 2005 (UTC)
Depends - the great strength of exceptions is that I only need to handle the exceptions I am interested in and let the application framework worry about the rest. Sure, the application framework is written by programmers as well, just like the compiler. And I can still forget an exception I should be interested in - but it does not happen as often as in C. Net result: I start the debugger only twice a month. -- Krischik  T 11:47, 19 September 2005 (UTC)
Thus Ada is not immune to you having to perform error correction at some point. No programming language is. (This is beside the nub of the matter though.) Dysprosia 05:07, 20 September 2005 (UTC)
Sure, can't argue with that. -- Krischik  T 10:45, 21 September 2005 (UTC)

programming language popularity: the TIOBE Programming Community Index

Should we really have a link to the TIOBE Programming Community Index, a biased statistic, without telling the readers in which way the index is flawed?

TIOBE's company statement is: We offer out-of-the-box solutions for the programming languages C, C++, C# and Java. — which obviously means they want those languages to look "good" in their statistics.

We should either explain the flaw or remove the link. Of course it is tricky to keep an NPOV when explaining a flaw in some way.

How is the Index flawed?

The search query '+"<language> programming" -tv -channel' is used to calculate the TPC Index.

Since you are probably reading this discussion because you are a language advocate of some sort, you can just google for '+"<language> programming" +tv +channel' with <language> being your favorite programming language - and then decide if those pages were rightfully excluded from the index.

My first google hit for my favorite language is: This association is aimed at promoting ADA programming language to the software... TV channel, producer) on any type of platform (OpenTV, MediaHighway). ... — rightfully excluded — don't think so.

I removed the link because you are not allowed to link to your own website, not because I'm advocating some language. Please assume good faith. Cheers, -- R.Koot 14:01, 14 August 2005 (UTC)
Hmmm, there are several links to some of my pages on Wikipedia and Wikibooks. But they have never been added for vanity, only because they fit the topic at hand. If it were any different, the other contributors would have removed them. And in that respect I assume good faith. -- Krischik 14:21, 14 August 2005 (UTC)


popular programming languages

The article " buffer overflow" currently claims that

" As of 2005, the most popular languages generally are C and its derivative, C++."

While I suspect this is correct, I wonder how that author found out?

Is it even possible to rank programming languages according to "popularity" (or in some other, more objective way) in a NPOV way? If so, should we discuss "popularity" here in the programming language article, or split it off into a popular programming language article? See C2: Programming Language Usage Statistics. -- DavidCary 05:19, 11 November 2005 (UTC)

Rewrite has gone on long enough

I have given User:K.lee two weeks to put up or give up with his rewrite. His "rewrite" approach is anti-collaborative and he has been claiming that his rewrite is pending for over two years. The "requestrewrite" template is only used for k.lee's claim for this article. Wikipedia itself is not much older than that, which means that no one else has had a real and equal "turn" at this article, since most of the work for the past two years will be lost when/if k.lee ever commits his version. Fplay 19:55, 9 December 2005 (UTC)

k.lee e-mails that the requestrewrite tag can be removed immediately. Fplay 20:14, 9 December 2005 (UTC)

FORTRAN

Minor note, but the spelling was changed from "Fortran" to "FORTRAN" since the article was referring to the first version of the language, which was indeed spelled that way. The current accepted convention (see the Fortran page) is as follows: FORTRAN, FORTRAN II, FORTRAN IV, FORTRAN 66 and FORTRAN 77 are in upper case, while the newer versions (such as Fortran 90) are written "Fortran", as per their convention. The FORTRAN spelling is a very important issue for FORTRAN programmers worldwide.


New introduction

I am willing to agree that my first stab at an introductory paragraph for "programming language" might not be ideal. However, the current first paragraph is likely to be incomprehensible to all but the most knowledgeable of people.

The first paragraph should provide a brief summary/definition for a reasonably intelligent person who knows nothing about computers. Later material can get technical and dense.

Let's work out some good wording that encapsulates this rather nebulous entity.

Derek farn 14:15, 15 February 2006 (UTC) (Copied here from my talk page -- TuukkaH 16:31, 15 February 2006 (UTC))

I'm sorry for impolitely reverting your introductory paragraph, I was thinking of the edit history of another programming-related article. I'll restore it here for further discussion:

A programming language is a language designed to allow programmers to specify a sequence of operations to be performed (usually by a computer). The syntax and semantics of programming languages are much more restricted than natural languages. The written, human accessible, form of a programming language is known as source code and may be translated by a compiler into a form that can be executed by the cpu of a computer.

What I like about your version is that it's easy to read and probably also to understand. What I like about the earlier version is that it doesn't oversimplify, and it has links to other relevant articles. To me, the important things to tell in the introduction include:
  • The purpose of programming languages is to allow people to describe computer programs on a level where they can be made executable.
  • Programs are described by the data they act on and the algorithms they employ.
  • My pet idea, with some nice links: "Programming languages are a type of computer languages, excluding pseudocode which is exclusively for human communication."
-- TuukkaH 16:31, 15 February 2006 (UTC)

I'm not sure I would have made the programming/computer language distinction. Yes, html is a markup language, but some people think it is a programming language. Who are we to disagree? Do we require programming languages to be Turing complete? Is calling html a computer language just a way of deflecting criticism for not calling it a programming language?

We can somewhere down the article mention that some people think that HTML is a programming language, but we need not include this topic in the introduction. HTML doesn't allow one to describe algorithms so it clearly isn't a programming language. HTML and programming languages allow humans to communicate with computers so they're computer languages and mentioning this puts the topic in its place. -- TuukkaH 19:23, 16 February 2006 (UTC)

Anyway, back to the problem at hand. I now appreciate that my definition was overly restrictive. Some languages require programmers to specify what, not how. For instance, SQL requires a set of conditions to be specified, not the nuts and bolts of finding the data. What about Prolog which consists of clauses (ok, most implementations have extensions that support a more imperative style). How about:

A programming language is a language designed to allow people to create programs that control the actions of a machine (which is often a computer). The syntax and semantics of programming languages are much more restrictive than natural languages. The written, human accessible, form of a programming language is known as source code and is often translated into a different form before being executed on a machine (in the case of a computer, by the cpu).
Programming languages take many forms. Many in common use require programmers to write source code that implements the algorithms used to produce the required actions (so-called imperative programming). Other languages require programmers to specify the what rather than the how (so-called declarative programming). For instance, users of structured query language specify the conditions that must be met by data in a database for it to be returned as the result of a query, and various supporting programs work out how to actually retrieve that data.
Programming languages differ from languages such as markup languages in that they are Turing complete.

Derek farn 01:34, 16 February 2006 (UTC)

This feels like a runaway introduction, see for example Wikipedia:Lead_section. I don't feel we should go into details such as syntax, semantics, source code, cpu, imperative vs. declarative, let alone Turing-completeness. -- TuukkaH 19:23, 16 February 2006 (UTC)

Yes, it is getting a bit bloated. I think that Turing completeness is a good way of distinguishing programming languages from other kinds of languages (I don't understand your later comment about infinite execution; there are various mathematical formalisms used when discussing properties of languages; I am using the term as a way of putting a minimum limit on the expressive power of a programming language). Or are we going to duck this issue entirely?

Could you check Turing completeness and machine that always halts? My comment below referred to the fact that if you require programming languages to be Turing complete you artificially leave out programming languages that are somehow able to limit programs to always halt. I say we duck this issue entirely because it's about a mathematical theory which isn't directly applicable in practice, in programming. We can duck for example by restricting programming languages to those that are used to describe data structures and algorithms. -- TuukkaH 09:31, 17 February 2006 (UTC)

People are familiar with what a natural language is, so let's make use of this knowledge.

A programming language is a language designed to allow people to create programs that control the actions of a machine (which is often a computer). The syntax and semantics of programming languages are much more restrictive than natural languages. The written, human accessible, form of a programming language is known as source code and is often translated into a different form before being executed on a machine.
Programming languages take many forms, the two major divisions being imperative programming and declarative programming. Some languages are intended to be used within specific domains (eg, banking), while others are intended for more general usage.
Languages differ in their expressive power, and a language powerful enough to be Turing complete would be regarded as a programming language. Markup languages such as pure html are not Turing complete, but can be made so with the addition of extensions such as PHP or JavaScript.

Derek farn 00:28, 17 February 2006 (UTC)

Turing complete is not enough

I don't think that being Turing complete is enough to make a programming language. The language should also be used for general programming. That's why at Wikibooks we draw the line between the Wikibooks:Programming languages bookshelf and the Wikibooks:Domain-specific languages bookshelf. A "domain-specific language" may well be Turing complete, but it is not used for general programming, only in a specific domain. E.g. PostScript is considered Turing complete; however, I would not consider it a programming language as it is not used for general programming - or has anybody seen a text editor or an Excel clone written in PostScript?

-- Krischik  T 07:11, 16 February 2006 (UTC)

A "domain-specific language" can very well be a programming language but not necessary one suitable for general purposes. The article on PostScript has links to game of life, a webserver, fractals, barcodes, HTML renderer, raytracer. Even if a language isn't the best tool for the job it can still be a programming language. -- TuukkaH 19:23, 16 February 2006 (UTC)
Now that I find interesting - especially the webserver part. And unlike the academic brabble below, this could change my mind about PostScript. In which case I have chosen a bad example. -- Krischik  T 12:00, 17 February 2006 (UTC)

Yes it is

What difference does it make what a language is actually used for? And what exactly is general programming?

Postscript is very much like Forth, would you say that Forth is not a programming language?

Forth is used by humans to create programs for computers - so certainly it is a programming language. -- Krischik  T 11:53, 17 February 2006 (UTC)

Most languages are only used by a handful of people. Does that mean they are not programming languages because they are only used for specific tasks (whatever it is that the handful of people write with them)?

To take your example, I know people who work with printers and formatting software who spend large amounts of time writing code in postscript. I even have a program that prints out a calendar that is written in postscript. Just because lots of people choose not to write their software in postscript does not stop it being a programming language.

Derek farn 12:46, 16 February 2006 (UTC)

In your edit comment you claim that a programming language is equivalent to being Turing complete. There are a lot of programming tasks that don't need infinite execution, so if a language would otherwise be a programming language but has only finite execution, you wouldn't call it a programming language. Now I wouldn't call it a Turing-complete programming language. To take another example, the primitive lambda calculus is a Turing-complete model of computation but it lacks data types and IO. I don't really know if it should be classified as a programming language, but I think it's better to keep the models of computation separate from the introduction of programming language. -- TuukkaH 19:23, 16 February 2006 (UTC)
Actually: how many programming languages are Turing complete? int in C is either 16, 32 or 64 bits - hardly infinite. C demands that there is an integer type intptr_t to which a pointer can be cast - there goes support for infinite memory. "Turing complete" is a theoretical concept for academic use only. -- Krischik  T 11:53, 17 February 2006 (UTC)
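A minimal C sketch of the finiteness point; the exact numbers printed are platform-dependent, which is the assumption here:

    #include <stdio.h>
    #include <limits.h>
    #include <stdint.h>

    int main(void) {
        /* int has a fixed, finite range -- not the unbounded storage an
           idealised Turing machine assumes. */
        printf("int range: %d .. %d\n", INT_MIN, INT_MAX);
        /* intptr_t pins pointers to a finite integer type, so the
           addressable memory is finite too. */
        printf("pointer bits: %zu\n", sizeof(intptr_t) * CHAR_BIT);
        return 0;
    }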
The assertion that PostScript is not a programming language is utterly nonsensical. What is it, if it is not a programming language?
A page description language. Because that is what it is used for in 99.99% of all cases. Even the calendar program mentioned above is only that - it prints a calendar - it is not an Outlook clone. If it started keeping a database of appointments, then it would approach programming. -- Krischik  T 11:53, 17 February 2006 (UTC)
Look. Do you even know about the PostScript language at all? It sounds like you think that PostScript's capabilities are solely devoted to graphics output. Yes, it does this well, but it is not its only functionality.
Furthermore, it can be argued that nearly all programming languages can in some sense be page description languages. What do you think a GUI display does when it prints a calendar?
Just because PostScript does not have the regular idea of user I/O that other programming languages have does not mean the language is suddenly "not" a programming language. Dysprosia
I never thought that a programming language needs "user I/O". Many embedded programs or batch processing programs do well without. -- Krischik  T 13:01, 17 February 2006 (UTC)
So what on earth are you basing your argument upon that PostScript is not a programming language? You seem to use some sort of arbitrary definition that because some programming language doesn't "keep appointments" or is an "excel clone", it's not a programming language? Dysprosia 13:17, 17 February 2006 (UTC)
Does a programming language have to have curses or GUI capabilities to be considered a programming language? Of course not. Just because PostScript does not have a lot of user-interaction libraries available to it doesn't mean it fails to be a programming language. If PostScript had such libraries, it would be rather easy to create a "text-editor" or "excel clone"! Dysprosia 09:49, 17 February 2006 (UTC)
I think we are all in agreement that PostScript is a programming language, and even a Turing-complete one, and you can even write different kinds of programs in it in practice. Perhaps you meant this answer to Krischik above, he was the one who didn't want to think of PostScript as a programming language? -- TuukkaH 10:37, 17 February 2006 (UTC)
Of course. Dysprosia 11:05, 17 February 2006 (UTC)
No, I am not in agreement - I still think it is a page description language. -- Krischik  T 11:53, 17 February 2006 (UTC)
And Lisp is a list processing language and Visual Basic a painting program. — Ruud 13:29, 17 February 2006 (UTC)

Definition

I propose the following definition, distilled from many sources throughout my education and career. This definition disposes of the qualitative assessments of languages, and focuses on quantifiable features:

General purpose programming language
A general purpose programming language must meet three criteria: it must be Turing complete; it must have at least a prototype compiler/translator/interpreter implementation; it must be in use, or targeted for use, independent of any particular application software.

Examples of general purpose programming languages:

C
Clearly satisfies all three requirements.
Perl 6
While its implementations are currently shaky at best, they exist, and it otherwise meets 1 & 3.
JavaScript
It is rare that JavaScript is used outside of a Web browser, but it is used that way.
PostScript
Implementations of PostScript as a stand-alone language have existed since the 80s, though they are rarely used outside of research and QA for embedded applications.

Examples of non-general purpose programming languages:

SQL
Misses on item 1
pseudocode
The typical pseudocode used in computer science books and papers has no formal implementation that I am aware of.
vi macros
Misses on item 3 (yes, vi macros are Turing complete).

If we can agree to this definition, then we could agree to limit the scope of the article to such languages, and then create articles to collect the exceptions. - Harmil 13:48, 6 March 2006 (UTC)

SQL is not Turing complete

SQL has no looping construct. Ok, the support software behind it contains lots of loops, but that is not the same thing.

I have always thought of SQL as a programming language, but on reflection I guess it should be called something like a database query language and, along with markup languages, not be included in the list of programming languages. The SQL standard is not maintained by SC22, the ISO committee responsible for programming languages.

Derek farn 13:08, 16 February 2006 (UTC)

Have you had a look at PL/SQL? Krischik  T 13:03, 17 February 2006 (UTC)

PL/SQL adds programming language features to SQL (because it does not have any), hence the name Programming Language/SQL.

Derek farn 14:17, 17 February 2006 (UTC)

Doesn't the acronym "SQL" stand for "Structured Query Language"? I would not call it a programming language in its own right, but possibly a tool for other languages. -- BBM 22:33, 16 May 2006 (UTC)

Removed comments

I just removed the following comments from the end of "History of programming languages" (reformatted to shorten lines), as their presence (with newlines in between) was creating excessive vertical whitespace, and in any case they really belonged here. Hairy Dude 05:14, 2 February 2006 (UTC)

<!--- Chaotic and not to the point of the section, i.e., "history of
comp lang." I move this piece in "Talk" according to wikipedia principle:
better no article than bad article --->

<!--- Changed "teached" to "taught." I also disagree with the claim
that Java was the first programming language taught in universities.
Languages like FORTRAN, Cobol, C, etc., were all extensively taught
before Java came on the scene. shrao@acm.org, 2005-02-02 --->

<!-- The assertion: "Java...became...the first programming language
taught at the universities" is intended to convey that Java has become
the programming language of choice for 100 level language classes in
university curricula. It's a rather badly worded sentence. Changed "first"
to "initial." tim@fourstonesExpressions.com, 2005-03-24 -->

<!-- The sentence is still poorly worded; changed "the initial" to
"an introductory". danb@cs.utexas.edu, 2005-10-04 -->

"Millions" of what?

http://hopl.murdoch.edu.au/ reads "This site lists 8276 languages... It has delivered more than 1,720,000 programming languages descriptions in the last 14 months". (I think this latter bit is their way of saying web-hits.) Ewlyahoocom 20:15, 11 March 2006 (UTC)

I think it was deliberately done to confuse people. — Ruud 20:43, 11 March 2006 (UTC)
I'm pretty sure it was a good faith mistake. Haven't we all discovered some new bit of information, rushed to add it to Wikipedia, only to find out later (usually only a few minutes) that we misread it? Hmmm... maybe we'll have to keep our eye on you. Ewlyahoocom 20:48, 11 March 2006 (UTC)
Oy... what happened to WP:AGF? — Ruud 20:57, 11 March 2006 (UTC)
When the editor in question writes "I think it was deliberatly", then one no longer has to assume, yes? However... if I may assume that you're joking, then you may assume that I'm joking, too. Ewlyahoocom 21:11, 11 March 2006 (UTC)
My "it being done" refered to HOPL stating such a large number without sopecifing what "delivered" mean, not that my addition was done to confuse people. — Ruud 21:18, 11 March 2006 (UTC)
Oh, I see! HAHAHA! Wait a sec... where's your good faith for those guys!? HAHAHAHA! :-) Ewlyahoocom 05:49, 12 March 2006 (UTC)

Article image

Is there a better, more representative image of computer code than the current one?

What would it ideally depict? Certainly not HTML like currently. Perhaps a simple function in a typical, popular programming language such as C. Or something more readable such as Python. -- TuukkaH 12:32, 16 January 2006 (UTC)
Instead of cropping the picture to make it fit, maybe it's better to have a scaled-down image that conveys the shape and layout of some chunk of code, without necessarily trying to be legible in detail.
As for the language, i'm sure everyone has their favorite (also Python, in my case). But for Wikipedia, the language's historical recognition, heritage, and influence are probably more important factors than current popularity. To me, the two most obvious contenders are C, and Lisp/Scheme. -- Piet Delport 16:58, 16 January 2006 (UTC)
I don't have a problem with C/C++ or LISP. However, to keep the source code language-agnostic, I'd propose using a pseudo-language that just shows the block structure layout of programming languages, which is universal to all languages. -- Capi crimm 04:30, 6 March 2006 (UTC)
Pseudocode would feel a bit like giving the Cola article an artificial " Acme Cola" image. -- Piet Delport 10:43, 8 March 2006 (UTC)
To be accurate, block structure layout is not universal to all languages. – Zawersh 02:13, 5 June 2006 (UTC)
What about a segment of MediaWiki PHP code? MediaWiki powers Wikipedia, so I think it is appropriate, should one specific language have need to be used. -- BBM 22:42, 16 May 2006 (UTC)
My inner programmer cringes at the thought of being represented by PHP... but the idea has merit. -- Piet Delport 07:22, 17 May 2006 (UTC)
I've added the reqimageother template above this page if you guys wouldn't mind. I agree that this article deserves something better. -- Face 09:47, 2 May 2006 (UTC)
P.S.: Maybe we can do something with this too?
What is the cryptic "High-Level Language" block supposed to represent, though? -- Piet Delport 14:41, 2 May 2006 (UTC)
What about the misleading steps/indentation? If we use something like that, we need to do it differently. The indentation gives the impression that higher-level languages are somehow less than lower-level ones, even to the point that hardware is better than software. How about a ladder diagram instead. Jaxad0127 00:10, 6 June 2006 (UTC)

What about this one about the history of programming languages: Image:Historie.png? If not as main image, at least could be used in the History section. ManuelGR 22:44, 14 May 2006 (UTC)

The German text probably rules it out speedily, but beyond that, the selection of languages strikes me as unbalanced: Smalltalk gets no less than five entries, while many major languages are left out. There are also some... creative arrows, such as the Prolog -> Lisp one. -- Piet Delport 01:20, 15 May 2006 (UTC)

Assessment comment

The comment(s) below were originally left at Talk:Programming language/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

Needs fixing of a few {{ fact}}s. Tito xd( ?!?) 17:01, 11 September 2006 (UTC)

Last edited at 17:01, 11 September 2006 (UTC). Substituted at 21:56, 3 May 2016 (UTC)

