This page is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
{{reqimageother|a representative image of computer code (see discussion)}}
I have a little difficulty understanding the feature specification. I may have missed it entirely.
I see the need for artwork to match the standard form (fashion) of other articles. An image representative of 'computer code' is simple: photograph some representative structured code that displays an editing style of, say, indenting. That is code, and it's boring, hence appropriate?
If the goal is a little larger - to, for instance, portray the archetypal process of creating a program in a computer programming language using the artistic medium of the image - I would suggest a photo of an actual 80-character screen, possibly green, with individual pixels large enough to be seen, where the one line of text would read "Syntax Error: redo from start". It could, if one could be thought of, include actual text that was a classic, obvious yet confusing problem in, say, BASIC syntax. This would to me encapsulate the constant process of humans trying to battle the computer's rigid interpretation of the human's otherwise reasonable requests. The prior syntax error could also be the humorous "please help" instead of just the more likely but terse "help". This idea that the human has mistaken (anthropomorphised) the computer for someone that can be persuaded with politeness is the point of desperation. If the image context were to be larger, an undrunk cup of coffee and an empty pizza box would round out the stereotype, as these imply both the multitude of hours spent already and yet the incompleteness of the task.
I think we need to point out that programming languages, like other languages, are for humans to express human ideas in. The unique thing about programming languages is that we can automatically translate these expressions into the ones and zeros that computers use. Still, the primary purpose that should be stressed is that these are human languages, for humans to express solutions in which are meant to be understandable by other humans. Since most of the cost of software across the useful lifespan of a program is invested in enhancements and maintenance, the human-readability of programs is much more important than their nature as a "technique for expressing instructions to a computer".
Low-level languages such as machine code are also programming languages. All programming languages are in principle both human- and machine-readable, but the relative emphasis varies.
Producing readable code that other humans (and not merely computers) can easily understand is one of the hallmarks of a good programmer. But this is accomplished mostly through adding comments in a natural human language to the source at key points, and mostly not through the direct use of the programming language itself.
Many people believe this. Quite a few programmers disagree very, very strongly. We believe that producing readable code is achieved mostly through renaming, refactoring, etc., so that the name of a variable communicates (to humans) what it is, the name of a method communicates (to humans) what it does, etc.
See WikiWikiWeb:TreatCommentsWithSuspicion, WikiWikiWeb:ToNeedComments ("Refactor the code properly and you won't need comments.")
-- DavidCary 23:36, 5 Jul 2004 (UTC)
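The renaming/refactoring point above can be sketched in a few lines (Python used purely for illustration; the functions and names are invented, not from any real codebase). The first version needs a comment to be understood; the second communicates the same thing through its identifiers alone.

```python
def calc(p, r):
    # Opaque names: a comment is required to explain that this
    # computes simple interest from a principal and a rate in percent.
    return p * r / 100

def simple_interest(principal, rate_percent):
    # Same computation; the names themselves now carry the meaning.
    return principal * rate_percent / 100
```

Both functions compute the identical result; only the readability differs, which is the refactoring argument in miniature.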
As computers grow more complex, our ability to translate solutions into executable binary has become more abstract, enabling us to express solutions in terms of objects, templates, patterns and aspects. Such abstractions enable a more natural translation from human needs, often expressed as "use cases", into executable solutions. It is this trend toward greater abstraction in the expression of programming solutions that enables programmer productivity to double, despite programmers being locked into fixed biological hardware.
If we gaze deep into the crystal ball, we see the logical extension of this trend as computers that are capable of conversing with humans and creating executable binary programs from desires or solutions expressed in pure human language. The shift away from computer-centric aspects of programming languages toward more human-solution-centric aspects will continue to be the defining characteristic of near-future programming languages.
The "holy grail" of programming language development from this viewpoint would be the creation of a transparent interface to a computing substrate that can extract requirements from the user and instantiate an executable solution. Of course, at this level of abstraction, there is no "programming language" any longer, merely a somewhat pedantic conversation required to define the essential complexity of the problem the user wishes to solve.
At any rate, some mention should be made of this shift from computer-centric aspects toward human-centric aspects and how this affects programmer productivity and how it will shape the role of programming languages going forward.
(moved to Talk:List_of_programming_languages)
This article seems to be written largely from the point of view of a programmer in mainstream languages. For example, interactive use is attributed to interpreters, without considering that, e.g., many Smalltalk and Lisp systems have native compilers that are used interactively. Sorry for not bothering to work this rant into a considered and balanced edit of the article.
-- han
I disagree. I don't think we need to perpetuate the prejudices of "programmers in mainstream languages" (read C/C++, Java). That would be about as stupid as rewriting the operating systems entry from the point of view of a Windows user.
Anyways, someone who has a copy of 'Programming Language Concepts and Paradigms' handy, an exceedingly comprehensible book on the subject, should rewrite this article. -- Ark
This entire section needs to be rewritten from scratch. This includes this topic plus those for the various languages and language concept articles. This is going to be a big project but I think it's important. Computer programming is too much a part of modern life to be half covered in an encyclopedia, so I have to agree with Ark. Rlee0001 05:25 Jul 27, 2002 (PDT)
On another note: I would limit the list of programming languages here to just the main languages and not all the dialects. For example, there are something like 15-20 dialects of BASIC listed in the BASIC programming language page. Instead of listing all of them, one link for the entire language would suffice. If the user wants a dialect, he/she can still get to it from the BASIC page. Same goes for all the languages. Further, I fail to see why people are listing such obscure languages and dialects in an encyclopedia. Some languages have historical or technological significance. Others are just current brand names for half-written freeware with a SourceForge page and no user base. Should "Applesoft BASIC" really get its own topic? What did it do to revolutionize the language? Did it have a particularly large user base? Did it establish any conventions which are widely in use today? If not, it's probably not worthy of its own topic. Even worse are articles like ibasic. This is a BASIC interpreter for the Mac. It has no historical significance: it was just created within the last year by an amateur developer who lives in some small cottage in Sweden somewhere. It gets its own encyclopedia article? Rlee0001
---
I have added two Wikibook links which already have the texts that were suggested - the first link has an alphabetical and category list of languages - the 2nd link points to short introductions. I hope that helps. -- Krischik T 16:52, 18 September 2005 (UTC)
FYI: For some time I've been working on a ground-up rewrite of this article, because its current state does not make me happy. It's not ready to go live, but I've finally posted my current draft in my user space. I welcome comment on my rewrite; also feel free to edit it directly. It's taking me a long time to do the rewrite, but I plan to replace the entire current article eventually. k.lee 02:28, 27 Aug 2003 (UTC)
The link seems to be working now. :-)
I would like to ask: is there a clear consensus that the original article is unsatisfactory to the extent that it needs a re-write? TonyClarke 11:38, 27 Aug 2003 (UTC)
The current main page definitely needs to be re-considered. While it is quite accurate (it seems to me), it is mostly a summary of the topic using the terminology of the discipline, and so is quite inscrutable to a newcomer. It occurs to me that an encyclopedia needs both a specialist and non-specialist version of the general information articles. The specialists need a means to agree on the theoretical structure of the topic, and the newcomers need to learn about it from scratch.
To Do: this is just an outline to get started; add some descriptive text (or put in '/' links) and add a few representative languages to the descriptions
Rlee0001 01:51 Oct 20, 2002 (UTC)
Do Ruby and Scheme really have several hundred thousand users, as in programmers who use them regularly? I doubt it, but I've been wrong before. Wesley
I believe so. But no one can prove either point anyway. --TakuyaMurata
I think it's probably true. For example, OCaml has at least 10^3 vocal users, probably 10^4 real users and probably 10^5 people who've played with it. However, such things are so difficult to quantify (e.g. look at Tiobe's silly estimates, which see huge bias from big business) and even to define (e.g. should we be talking about the total running time of programs written in different languages in order to combat, for example, the majority of Sourceforge projects "written in C++" that have yet to see an alpha release?) that I don't think such (mis)information belongs on Wikipedia. -- Jon Harrop
Hi,
why have we put virtually every programming language on "Foo programming language", and not on "Foo" if "Foo" is reasonably unique? "Programming language" is disambiguation, and that should only be used when there is ambiguity, should it not? -- Eloquence 00:08 Jan 24, 2003 (UTC)
Heck, I'd forgotten about that. The convention as stated at Wikipedia:Naming conventions does now say not to add "programming language" if the name of the language is unique, but I've not done any work in moving pages to reflect this new convention yet. I don't have time to start on this right now, but now that I've been reminded about it, I'll get to it when I have time (others, of course, are more than welcome - indeed encouraged, nay begged - to get there before me). -- Camembert
Great. I'll start some moving, although fixing double redirs will be an annoyance and we'll probably lose some page histories .. -- Eloquence 00:19 Jan 24, 2003 (UTC)
Some people, including me, might take exception to Python being classified as procedural with bolt-on OO technology; this has been extensively discussed in the Python community.
I would like to point out that the programming language list above misses Objective-C. Brent Gulanowski 15:54, 15 Oct 2003 (UTC)
One thing the article appears to miss is the basic elements that all languages must share to be able to express any computable algorithm. I was taught this as "Sequence, Selection, Repetition" but it may be known in a number of ways. I feel this is important, as well as correct attribution to whoever proved it mathematically - was it Turing maybe? I feel that once it is clear that all languages must support these basic elements, then they can be discussed in the abstract without having to say language X has this feature, language Y has this feature, etc. (though that can be added as an extension). GRAHAMUK 23:29, 9 Nov 2003 (UTC)
Umm, very interesting. I think you are talking about the minimum requirement to make a language capable of simulating the Turing machine. And probably the three criteria Sequence, Selection, Repetition are right. I was thinking of a programming language as a means of abstraction. The Turing machine is the most powerful computer we know today, and we don't need any programming languages or such to perform computable algorithms. Programming languages were, in my view, needed and then invented because human beings need something abstract to make programming easier. Mnemonics in assembly languages are completely meaningless to the computer; they matter only to us humans. This is why I claimed data and code abstraction are basic elements of programming languages.
But you are right. Brainfuck is generally considered a programming language, even if it doesn't fit my picture of programming languages. Similar small languages like PostScript are also among them. In other words, programming languages are not only for humans - or are they? -- Taku 06:40, Nov 13, 2003 (UTC)
Is "sequence/selection/repetition" what they're teaching the kiddies nowadays? :-) Functional programming folks might prefer to say it's abstraction/evaluation/recursion. Turing equivalence can't be used as a precise criterion, because for instance it assumes infinite storage, and classic Fortran requires fixed-size allocation, yet few would say it's not a programming language. I would call Turing-equivalent languages "general-purpose programming languages", while leaving "programming language" as a more general moniker for any linguistic form of expression that instructs a computer, irrespective of generality. Stan 08:23, 13 Nov 2003 (UTC)
"Sequence, selection, and repetition" are not enough to make a language Turing-complete. A finite-state automaton is capable of all of the above (which correspond to the concatenation, union, and Kleene star operator of regular languages). You also need arbitrary memory allocation (or infinite storage, which is the same thing).
Also, Stan's point re: FORTRAN is a good one. The primary distinction between markup and programming languages is one of emphasis. They overlap --- any Turing-complete markup language (e.g., LaTeX) can also be considered a programming language --- but "everyone knows" when something is primarily a programming language or a markup language.
Finally, it's pretty obvious that any non-joke programming language is intended primarily for human consumption. Joke languages like Brainfuck or Unlambda are simply exceptions that prove the rule --- they're designed by humans for human amusement. k.lee 05:27, 17 Nov 2003 (UTC)
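As a small illustration of the ingredients discussed above (Python, an invented example): Euclid's algorithm uses sequence, selection, and repetition over a fixed number of variables, while the bracket checker below it relies on the extra ingredient a finite-state automaton lacks - a counter that can grow without bound.

```python
def gcd(a, b):
    while b != 0:          # repetition
        a, b = b, a % b    # sequence: statements executed in order
    if a < 0:              # selection: normalize sign for negative inputs
        a = -a
    return a

def balanced(s):
    """Check that parentheses nest properly - not a regular language."""
    depth = 0                  # unbounded counter: beyond finite-state power
    for ch in s:               # repetition
        if ch == "(":          # selection
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:      # a closer with no matching opener
                return False
    return depth == 0
```

The first function needs only finitely many values in play at once; the second needs a count that grows with the input, which is exactly the distinction between sequence/selection/repetition alone and full Turing-style memory.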
That audience is not other programmers (they know this stuff already); it is the "intelligent layman" who may not realise that that's the case. Without some proper foundations, the abstraction/evaluation/recursion thing is still too "high concept". Another example I clearly remember from my own early steps with programming (I've been a professional programmer for 20 years, so it's a while ago!) is parameter passing - programmers take for granted that parameters map from caller to callee based on the position of the parameter in a list, but I can remember thinking how error-prone that seemed - back then I thought it would be better to "find" the parameter based on its name and ignore its position. Of course, knowing now how a CPU actually implements a subroutine, the "position" thing is clearly far more efficient and sensible, and there are likely other undesirable side effects that name binding might have. The point to make here is that what seems obvious to a programmer may not be at all obvious to the uninitiated, so starting with implicit fundamentals in order to eliminate any misunderstandings seems a good way to go with an article such as this. Given this approach, I'm not sure that referring to Turing completeness is even a good idea - is the intelligent layman that bothered about the mathematics? I suspect most people coming to WP are looking for a solid, precise but not necessarily complete discussion of the subject. If it grabs them sufficiently, they can look into the maths further if they want. GRAHAMUK 12:05, 18 Nov 2003 (UTC)
I know you guys know much more about programming languages and theories in computer science than I do, but I was wondering about a historical approach. RISC is a very good article. Although I don't have much expertise in hardware, the article makes a lot of sense to me. The nice thing is that it doesn't just give particular examples of RISC, like a list of instruction codes, but focuses on why computer science came up with the idea of RISC in historical and technical context, and it also gives plenty of practical examples of architectures. I think the same strategy can be applied to this article. In some ways, the article simply gives a summary of concepts, which really doesn't make sense unless you know them already, and sometimes goes into too much detail. For example, I don't see why it is so important to spend a lot of space discussing the type system while some important concepts such as lazy evaluation, side effects and referential transparency are completely omitted.
As I keep repeating, I think it is very important to avoid making the article read like a textbook. The RISC article is completely useless if you want to learn an assembly language and how to make a code generator for RISC architectures; that is what Wikibooks is for. Well, just a thought. I am just hoping I am of any help at all. -- Taku 07:15, Nov 19, 2003 (UTC)
Umm, can we have a slight summary of the different programming paradigms? I think it is important to show why we have come to have several languages and what the difference between them is. I am not suggesting a complete discussion of specific topics like lazy evaluation but, I don't know how to put it, more like how different languages approach the problem of programming in different ways. I think such a discussion could make the article more focused on conveying to the general public what programming is like. The imperative approach is not the only one, and surprisingly many computer programmers know little about how problems are solved in programming in many different ways. Many people just learn how to do programming in a particular language like C or LISP and are not sure about programming languages in general. For example, I think it would be nice to see how to reverse a string in many ways as an example, while it is quite unnecessary to discuss how arguments are passed or how the type system works. The bottom line is that the article must not be a summary of programming language topics, but should discuss actual problems. I know it is easy to say and hard to achieve; just chatting about my idealistic view. -- Taku 08:40, Nov 20, 2003 (UTC)
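The string-reversal example asked for above might look like this in one language (Python here, standing in as an assumed example for "many languages"): the same problem solved imperatively, recursively, and with a built-in declarative idiom.

```python
def reverse_imperative(s):
    out = ""
    for ch in s:          # build the result one character at a time
        out = ch + out    # prepend, so the last character ends up first
    return out

def reverse_recursive(s):
    if s == "":
        return ""
    return reverse_recursive(s[1:]) + s[0]  # reverse the tail, append the head

def reverse_slicing(s):
    return s[::-1]        # slice with step -1 walks the string backwards
```

Three different ways of thinking about the same tiny problem, which is exactly the paradigm contrast the comment above asks the article to convey.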
I agree with
Taku -- if you think something is "too much" for an encyclopedia article, please move it to one of the Wiki Books
http://en.wikibooks.org/wiki/IT_bookshelf --
DavidCary 15:36, 26 Jul 2004 (UTC)
(moved to User talk:Dysprosia)
The following piece is cut out of section "History of programming languages".
As the cost of computers has dropped significantly and the complexity of computer programs has increased dramatically, development time is now seen as a more costly consideration than computer time.
Newer integrated, visual development environments have brought clear progress. They have reduced expenditure of time, money (and nerves). Regions of the screen that control the program can often be arranged interactively. Code fragments can be invoked just by clicking on a control. The work is also eased by prefabricated components and software libraries with re-usable code, primarily object-oriented.
Object-oriented methodology was introduced to reduce the complexity of programs, making code easier to write and to maintain. However, some argue that programs have, despite this, continued to increase in complexity. Recent languages are emphasising new features, like meta classes, mix-ins, delegation, program patterns and aspects.
All the above is true, but... This rant is good for a pop-sci article in an online magazine, but not for encyclopedia: chaotic, no *history*, and no *programming languages* Mikkalai 00:37, 13 Dec 2003 (UTC)
" Computer language" is not synonymous with " programming language". A programming language is a computer language used for programming.- Doradus 00:14, 2 Jan 2004 (UTC)
This section, Programming Language, needs a total wash and to be written from scratch. I volunteer myself to devote some time to it. As I am new to this site and I am learning how to edit things, this section will be online within a week and with a new style yana209 22:01, Jun 22, 2004 (UTC)
Some languages such as MUMPS and is called dynamic recompilation; emulators and other virtual machines exploit this technique for greater performance.
The clause before the semicolon isn't even complete. I'd fix it, but I'm not sure how exactly it should read. - Furrykef 15:17, 9 Sep 2004 (UTC)
The reason that I deleted the sentence
is that I don't understand it. I suppose it means that the poor design of some programming languages is causing crashes, which raises the question: which languages is the author referring to? A possible interpretation is that programs written in low-level languages like C are prone to crashes, but that is not poor design in my opinion, but a conscious design choice to prefer speed, ease of compilation and flexibility at the price of allowing more crashes. Putting this sentence in the history section confuses me even more, because that implies that the problem of poor design no longer exists, which is at odds with my interpretation. So, please explain what you mean if you reinsert the above sentence. Thanks, Jitse Niesen ( talk) 16:30, 28 July 2005 (UTC)
You're right it shouldn't be in history section because it is such a fundamental point and the bane of any decent computer scientist. We are swamped with poorly written junk languages (and operating systems) that gain prominence via clever marketing rather than on merit. Yes I'm referring to design and unfortunately don't have time right now to flesh it out (though, again, anyone who knows the field should be able to do so). Treat it as a stub and add to the list. No I'm not referring to C and I recognise it's horses for courses. Thanks for the feedback. And please add to the "stub" rather than delete again. Mccready 01:22, 1 August 2005 (UTC)
Of all major programming languages I would only say of C++ and C#, that they are poorly constructed. This could hardly be considered many, and should be discussed in their respective articles, not here. Your statement also lacks any form of argument. -- R.Koot 01:44, 1 August 2005 (UTC)
I would like to add to the discussion here. In fact there aren't many badly constructed languages at all (I know none, and I've programmed in many). What one can say is: the lower the language (the closer to machine language), the more stress is put on the knowledge of the developer to create a good working program (you need to know your language's dos and don'ts). Take for instance the difference between C++ and Java. Although they are both modern languages, C++ can still cause buffer overruns and underruns, while this is pretty difficult to achieve in Java. Although memory leaks can still appear in Java, buffer problems are probably one of the major flaws in C++ programs. It's not the language that's bad, but the code that's written in it! -- Paul Sinnema 09:44, 16 September 2005 (UTC)
Programming languages don't cause crashes. Programmers who write bad code, or faulty compilers, runtimes, etc. do. Dysprosia 08:08, 17 September 2005 (UTC) {{Wikibooks Module|Computer programming|Error handling}}
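Whether a "crash" is the language's fault or the programmer's depends partly on what the language itself defines: in C, integer division by zero is undefined behaviour, while Ada raises CONSTRAINT_ERROR. A minimal sketch in Python (chosen only for brevity; the function names are invented) of the two defensive styles a programmer can use:

```python
def guarded_div(x, y):
    # Guard style: test the precondition before dividing.
    if y != 0:
        return x / y
    return None

def checked_div(x, y):
    # Exception style: let the language signal the error, then handle it.
    # Python, like Ada, defines the error condition rather than crashing.
    try:
        return x / y
    except ZeroDivisionError:
        return None
```

The second style is only possible because the language specifies what division by zero does; where the behaviour is undefined, the guard is the programmer's sole defence.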
X / 0 will crash a C or C++ program but will raise a CONSTRAINT_ERROR in Ada. So there are differences in how languages treat error conditions, and it is rightful that we describe that to the reader - perhaps a comparison table might be helpful to the reader in that respect. -- Krischik T 09:08, 18 September 2005 (UTC)
X / Y. Is that still the programmer's fault, or perhaps faulty data delivered to the program? If you say "programmer", then I ask: have you really got an if (Y != 0) in front of every division - or made a full static analysis proving that Y can't be 0 - for all programs you have ever coded?
Should we really have TIOBE Programming Community Index, a link to a biased statistic, without telling the readers in which way the index is flawed?
TIOBE's company statement is: We offer out-of-the-box solutions for the programming languages C, C++, C# and Java. — which obviously means they want those languages to look "good" in their statistics.
We should either explain the flaw or remove the link. Of course it is tricky to keep an NPOV when explaining a flaw in some way.
The search query '+"<language> programming" -tv -channel' is used to calculate the TPC Index.
Since you are probably reading this discussion because you are a language advocate of some sort, you can just google for '+"<language> programming" +tv +channel' with <language> being your favorite programming language - and then decide if those pages were rightfully excluded from the index.
My first google hit for my favorite language is: This association is aimed at promoting ADA programming language to the software... TV channel, producer) on any type of platform (OpenTV, MediaHighway). ... — rightfully excluded? I don't think so.
The article " buffer overflow" currently claims that
While I suspect this is correct, I wonder how that author found out?
Is it even possible to rank programming languages according to "popularity" (or in some other, more objective way) in a NPOV way? If so, should we discuss "popularity" here in the programming language article, or split it off into a popular programming language article? See C2: Programming Language Usage Statistics. -- DavidCary 05:19, 11 November 2005 (UTC)
I have given User:K.lee two weeks to put up or give up with his rewrite. His "rewrite" approach is anti-collaborative and he has been claiming that his rewrite is pending for over two years. The "reqeustrewrite" template is only used for k.lee's claim on this article. Wikipedia itself is not much older than that, which means that no one else has had a real and equal "turn" at this article, since most of the work from the past two years will be lost when/if k.lee ever commits his version. Fplay 19:55, 9 December 2005 (UTC)
Minor note, but the spelling was changed from "Fortran" to "FORTRAN" since the article was referring to the first version of the language which was indeed spelled that way. The current accepted convention (see the Fortran page) is as follows: FORTRAN, FORTRAN II, FORTRAN IV, FORTRAN 66, FORTRAN 77 are in upper-case, with the new versions (such as Fortran 90) are in lower-case "Fortran" as per their convention. The FORTRAN spelling is a very important issue for FORTRAN programmers worldwide.
I am willing to agree that my first stab at an introductory paragraph for "programming language" might not be ideal. However, the current first paragraph is likely to be incomprehensible to all but the most knowledgable of people.
The first paragraph should provide a brief summary/definition for a reasonably intelligent person who knows nothing about computers. Later material can get technical and dense.
Let's work out some good wording that encapsulates this rather nebulous entity.
Derek farn 14:15, 15 February 2006 (UTC) (Copied here from my talk page -- TuukkaH 16:31, 15 February 2006 (UTC))
A programming language is a language designed to allow programmers to specify a sequence of operations to be performed (usually by a computer). The syntax and semantics of programming languages are much more restricted than those of natural languages. The written, human-accessible form of a programming language is known as source code and may be translated by a compiler into a form that can be executed by the CPU of a computer.
I'm not sure I would have made the programming/computer language distinction. Yes, HTML is a markup language, but some people think it is a programming language. Who are we to disagree? Do we require programming languages to be Turing complete? Is calling HTML a computer language just a way of deflecting criticism for not calling it a programming language?
Anyway, back to the problem at hand. I now appreciate that my definition was overly restrictive. Some languages require programmers to specify what, not how. For instance, SQL requires a set of conditions to be specified, not the nuts and bolts of finding the data. What about Prolog which consists of clauses (ok, most implementations have extensions that support a more imperative style). How about:
Derek farn 01:34, 16 February 2006 (UTC)
Yes, it is getting a bit bloated. I think that Turing completeness is a good way of distinguishing programming languages from other kinds of languages (I don't understand your later comment about infinite execution; there are various mathematical formalisms used when discussing properties of languages; I am using the term as a way of putting a minimum limit on the expressive power of a programming language). Or are we going to duck this issue entirely?
People are familiar with what a natural language is, so let's make use of this knowledge.
Derek farn 00:28, 17 February 2006 (UTC)
I don't think that being Turing complete is enough to make a programming language. The language should also be used for general programming. That's why at Wikibooks we draw the line between the Wikibooks:Programming languages bookshelf and the Wikibooks:Domain-specific languages bookshelf. A "domain-specific language" may well be Turing complete, but it is not used for general programming, only in a specific domain. E.g. PostScript is considered Turing complete; however, I would not consider it a programming language as it is not used for general programming - or has anybody seen a text editor or an Excel clone written in PostScript?
-- Krischik T 07:11, 16 February 2006 (UTC)
What difference does it make what a language is actually used for? And what exactly is general programming?
PostScript is very much like Forth; would you say that Forth is not a programming language?
Most languages are only used by a handful of people. Does that mean they are not programming languages because they are only used for specific tasks (whatever it is that the handful of people write with them)?
To take your example, I know people who work with printers and formatting software who spend large amounts of time writing code in PostScript. I even have a program that prints out a calendar that is written in PostScript. Just because lots of people choose not to write their software in PostScript does not stop it being a programming language.
Derek farn 12:46, 16 February 2006 (UTC)
I propose the following definition, distilled from many sources throughout my education and career. This definition disposes of the qualitative assessments of languages, and focuses on quantifiable features:
Examples of general purpose programming languages:
Examples of non-general purpose programming languages:
If we can agree to this definition, then we could agree to limit the scope of the article to such languages, and then create articles to collect the exceptions. - Harmil 13:48, 6 March 2006 (UTC)
SQL has no looping construct. Ok, the support software behind it contains lots of loops, but that is not the same thing.
I have always thought of SQL as a programming language, but on reflection I guess it should be called something like a database query language and, along with markup languages, not be included in the list of programming languages. The SQL Standard is not maintained by SC22, the ISO committee responsible for programming languages.
Derek farn 13:08, 16 February 2006 (UTC)
PL/SQL adds programming language features to SQL (because it does not have any), hence the name Programming Language/SQL.
Derek farn 14:17, 17 February 2006 (UTC)
Doesn't the acronym "SQL" stand for "Structured Query Language"? I would not call it a programming language in its own right, but possibly a tool for other languages. -- BBM 22:33, 16 May 2006 (UTC)
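The what-versus-how distinction running through this thread can be sketched with Python's built-in sqlite3 module (an illustrative example; the table and data are invented). The SQL query states what result is wanted, with no loop appearing anywhere in it, while the plain-Python version spells out how to compute the same totals with sequence, selection, and repetition.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 10.0), ("bob", 5.0), ("alice", 7.5)])

# Declarative (what): any looping is hidden inside the database engine.
declarative = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"))

# Imperative (how): the aggregation written out step by step.
imperative = {}
for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
    imperative[customer] = imperative.get(customer, 0.0) + amount
```

Both produce the same per-customer totals; whether the SQL half counts as "programming" is precisely the question the thread above debates.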
I just removed the following comments from the end of "History of programming languages" (reformatted to shorten lines), as their presence (with newlines in between) was creating excessive vertical whitespace, and in any case they really belonged here. Hairy Dude 05:14, 2 February 2006 (UTC)
<!--- Chaotic and not to the point of the section, i.e., "history of comp lang." I move this piece in "Talk" according to wikipedia principle: better no article than bad article --->
<!--- Changed "teached" to "taught." I also disagree with the claim that Java was the first programming language taught in universities. Languages like FORTRAN, Cobol, C, etc., were all extensively taught before Java came on the scene. shrao@acm.org, 2005-02-02 --->
<!-- The assertion: "Java...became...the first programming language taught at the universities" is intended to convey that Java has become the programming language of choice for 100 level language classes in university curricula. It's a rather badly worded sentence. Changed "first" to "initial." tim@fourstonesExpressions.com, 2005-03-24 -->
<!-- The sentence is still poorly worded; changed "the initial" to "an introductory". danb@cs.utexas.edu, 2005-10-04 -->
http://hopl.murdoch.edu.au/ reads "This site lists 8276 languages... It has delivered more than 1,720,000 programming languages descriptions in the last 14 months". (I think this latter bit is their way of saying web-hits.) Ewlyahoocom 20:15, 11 March 2006 (UTC)
Is there a better, more representative image of computer code than the current one?
What about this one about the history of programming languages: Image:Historie.png? If not as main image, at least could be used in the History section. ManuelGR 22:44, 14 May 2006 (UTC)
The comment(s) below were originally left at Talk:Programming language/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Needs fixing of a few {{fact}}s. Titoxd(?!?) 17:01, 11 September 2006 (UTC)
Last edited at 17:01, 11 September 2006 (UTC). Substituted at 21:56, 3 May 2016 (UTC)
Low-level languages such as machine code are also programming languages. All programming languages are in principle both human- and machine-readable, but the relative emphasis varies.
Producing readable code that other humans (and not merely computers) can easily understand is one of the hallmarks of a good programmer. But this is accomplished mostly through adding comments in a natural human language to the source at key points, and mostly not through the direct use of the programming language itself.
Many people believe this. Quite a few programmers disagree very, very strongly. We believe that producing readable code is done mostly through renaming, refactoring, etc., so that the name of a variable communicates (to humans) what it is, the name of a method communicates (to humans) what it does, and so on.
See WikiWikiWeb:TreatCommentsWithSuspicion, WikiWikiWeb:ToNeedComments ("Refactor the code properly and you won't need comments.")
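A minimal sketch of the "rename instead of comment" idea above (both functions are invented for illustration): the second version conveys through its identifiers what the first needs a comment for.

```python
# Comment-dependent version: the names say nothing, so a comment must.
def f(xs):
    # compute the average of the non-negative numbers
    ys = [x for x in xs if x >= 0]
    return sum(ys) / len(ys)

# Refactored version: the names carry the meaning on their own.
def average_of_non_negatives(numbers):
    non_negatives = [n for n in numbers if n >= 0]
    return sum(non_negatives) / len(non_negatives)

print(average_of_non_negatives([3, -1, 5]))  # 4.0
```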
-- DavidCary 23:36, 5 Jul 2004 (UTC)
As computers grow more complex, our ability to translate solutions into executable binary has become more abstract, enabling us to express solutions in terms of objects, templates, patterns and aspects. Such abstractions enable a more natural translation from human needs, often expressed as "use cases", into executable solutions. It is this trend toward greater abstraction in the expression of programming solutions that enables programmer productivity to double, despite programmers being locked into fixed biological hardware.
If we gaze deep into the crystal ball, we see the logical extension of this trend as computers that are capable of conversing with humans and creating executable binary programs from desires or solutions expressed in pure human language. The shift away from computer-centric aspects of programming languages toward more human-solution-centric aspects will continue to be the defining characteristic of near-future programming languages.
The "holy grail" of programming language development from this viewpoint would be the creation of a transparent interface to a computing substrate that can extract requirements from the user and instantiate an executable solution. Of course, at this level of abstraction, there is no "programming language" any longer, merely a somewhat pedantic conversation required to define the essential complexity of the problem the user wishes to solve.
At any rate, some mention should be made of this shift from computer-centric aspects toward human-centric aspects and how this affects programmer productivity and how it will shape the role of programming languages going forward.
(moved to Talk:List_of_programming_languages)
This article seems to be written largely from the point of view of a programmer in mainstream languages. For example, interactive use is attributed to interpreters, without considering that eg. many Smalltalk and Lisp systems have native compilers that are used interactively. Sorry for not bothering to work this rant into a considered and balanced edit of the article.
-- han
I disagree. I don't think we need to perpetuate the prejudices of "programmers in mainstream languages" (read C/C++, Java). That would be about as stupid as rewriting the operating systems entry from the point of view of a Windows user.
Anyways, someone who has a copy of 'Programming Language Concepts and Paradigms' handy, an exceedingly comprehensible book on the subject, should rewrite this article. -- Ark
This entire section needs to be rewritten from scratch. This includes this topic plus those for the various languages and language concept articles. This is going to be a big project but I think it's important. Computer programming is too much a part of modern life to be half covered in an encyclopedia, so I have to agree with Ark. Rlee0001 05:25 Jul 27, 2002 (PDT)
On another note: I would limit the list of programming languages here to just the main languages and not all the dialects. For example, there are something like 15-20 dialects of BASIC listed in the BASIC programming language page. Instead of listing all of them, one link for the entire language would suffice. If the user wants a dialect, he/she can still get to it from the BASIC page. Same goes for all the languages. Further, I fail to see why people are listing such obscure languages and dialects in an encyclopedia. Some languages have historical or technological significance. Others are just current brand names for half-written freeware with a SourceForge page and no user base. Should "Applesoft BASIC" really get its own topic? What did it do to revolutionize the language? Did it have a particularly large user base? Did it establish any conventions which are widely in use today? If not, it's probably not worthy of its own topic. Even worse are articles like ibasic. This is a BASIC interpreter for the Mac. It has no historical significance: it was just created within the last year by an amateur developer who lives in some small cottage in Sweden somewhere. It gets its own encyclopedia article? Rlee0001
---
I have added two Wikibook links which already have the texts that were suggested - the first link has an alphabetical and category list of languages - the 2nd link points to short introductions. I hope that helps. -- Krischik T 16:52, 18 September 2005 (UTC)
FYI: For some time I've been working on a ground-up rewrite of this article, because its current state does not make me happy. It's not ready to go live, but I've finally posted my current draft in my user space. I welcome comment on my rewrite; also feel free to edit it directly. It's taking me a long time to do the rewrite, but I plan to replace the entire current article eventually. k.lee 02:28, 27 Aug 2003 (UTC)
The link seems to be working now. :-)
I would like to ask: is there a clear consensus that the original article is unsatisfactory to the extent that it needs a re-write? TonyClarke 11:38, 27 Aug 2003 (UTC)
The current main page definitely needs to be re-considered. While it is quite accurate (it seems to me), it is mostly a summary of the topic using the terminology of the discipline, and so is quite inscrutable to a newcomer. It occurs to me that an encyclopedia needs both a specialist and non-specialist version of the general information articles. The specialists need a means to agree on the theoretical structure of the topic, and the newcomers need to learn about it from scratch.
To Do: this is just an outline to get started; add some descriptive text (or put in '/' links) and add a few representative languages to the descriptions
Rlee0001 01:51 Oct 20, 2002 (UTC)
Do Ruby and Scheme really have several hundred thousand users, as in programmers who use them regularly? I doubt it, but I've been wrong before. Wesley
I believe so. But no one can prove either point anyway. --TakuyaMurata
I think it's probably true. For example, OCaml has at least 10^3 vocal users, probably 10^4 real users and probably 10^5 people who've played with it. However, such things are so difficult to quantify (e.g. look at Tiobe's silly estimates, which see huge bias from big business) and even to define (e.g. should we be talking about the total running time of programs written in different languages in order to combat, for example, the majority of Sourceforge projects "written in C++" that have yet to see an alpha release?) that I don't think such (mis)information belongs on Wikipedia. -- Jon Harrop
Hi,
why have we put virtually every programming language on "Foo programming language", and not on "Foo" if "Foo" is reasonably unique? "Programming language" is disambiguation, and that should only be used when there is ambiguity, should it not? -- Eloquence 00:08 Jan 24, 2003 (UTC)
Heck, I'd forgotten about that. The convention as stated at Wikipedia:Naming conventions does now say not to add "programming language" if the name of the language is unique, but I've not done any work in moving pages to reflect this new convention yet. I don't have time to start on this right now, but now that I've been reminded about it, I'll get to it when I have time (others, of course, are more than welcome - indeed encouraged, nay begged - to get there before me). -- Camembert
Great. I'll start some moving, although fixing double redirs will be an annoyance and we'll probably lose some page histories .. -- Eloquence 00:19 Jan 24, 2003 (UTC)
Some people, including me, might take exception to Python being classified as procedural with bolt-on OO technology; this has been extensively discussed in the Python community.
I would like to point out that the programming language list above misses Objective-C. Brent Gulanowski 15:54, 15 Oct 2003 (UTC)
One thing the article appears to miss is the basic elements that all languages must share to be able to express any computable algorithm. I was taught this as "Sequence, Selection, Repetition" but it may be known in a number of ways. I feel this is important, as well as correct attribution to whoever proved it mathematically - was it Turing maybe? I feel that once it is clear that all languages must support these basic elements, then they can be discussed in the abstract without having to say language X has this feature, language Y has this feature, etc. (though that can be added as an extension). GRAHAMUK 23:29, 9 Nov 2003 (UTC)
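For illustration, all three elements appear in a few lines of Python (an invented snippet, not from the article): statements run one after another (sequence), the if chooses between alternatives (selection), and the while repeats a block (repetition).

```python
def count_odds(numbers):
    count = 0                    # sequence: one statement after another
    i = 0
    while i < len(numbers):      # repetition: the loop body repeats
        if numbers[i] % 2 != 0:  # selection: choose whether to count
            count += 1
        i += 1
    return count

print(count_odds([1, 2, 3, 4, 5]))  # 3
```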
Umm, very interesting. I think you are talking about the minimum requirement to make a language capable of simulating the Turing machine, and probably the three criteria Sequence, Selection, Repetition are right. I was thinking of a programming language as a means of abstraction. The Turing machine is the most powerful computer we know today, and we don't need any programming languages or such to perform computable algorithms. Programming languages were, in my view, needed and then invented because human beings need something abstract to make programming easier. Mnemonics in assembly languages are completely meaningless to the computer but meaningful only to us humans. This is why I claimed data and code abstraction are basic elements of programming languages.
But you are right. Brainfuck is generally considered a programming language, even if it doesn't fit my picture of programming languages. Similar small languages like PostScript are also among them. In other words, programming languages are not only for humans - or are they? -- Taku 06:40, Nov 13, 2003 (UTC)
Is "sequence/selection/repetition" what they're teaching the kiddies nowadays? :-) Functional programming folks might prefer to say it's abstraction/evaluation/recursion. Turing equivalence can't be used as a precise criterion, because for instance it assumes infinite storage, and classic Fortran requires fixed-size allocation, yet few would say it's not a programming language. I would call Turing-equivalent languages "general-purpose programming languages", while leaving "programming language" as a more general moniker for any linguistic form of expression that instructs a computer, irrespective of generality. Stan 08:23, 13 Nov 2003 (UTC)
"Sequence, selection, and repetition" are not enough to make a language Turing-complete. A finite-state automaton is capable of all of the above (which correspond to the concatenation, union, and Kleene star operator of regular languages). You also need arbitrary memory allocation (or infinite storage, which is the same thing).
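The memory requirement can be sketched in Python (the pattern and function below are invented for the example): a fixed regular expression, equivalent to a finite-state automaton, only recognises balanced parentheses up to a nesting depth wired into the pattern, while a checker with a single counter (i.e. memory) handles any depth.

```python
import re

# A fixed regular expression can only handle nesting up to a depth
# baked into the pattern -- this one copes with at most depth 2.
depth2 = re.compile(r"^(\((\(\))*\))*$")

# A checker with one counter (unbounded memory) handles any depth.
def balanced(s):
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
        else:
            return False
    return depth == 0

print(bool(depth2.match("(())")))    # True: within the wired-in depth
print(bool(depth2.match("((()))")))  # False: deeper than the pattern allows
print(balanced("((()))"))            # True: the counter scales to any depth
```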
Also, Stan's point re: FORTRAN is a good one. The primary distinction between markup and programming languages is one of emphasis. They overlap --- any Turing-complete markup language (e.g., LaTeX) can also be considered a programming language --- but "everyone knows" when something is primarily a programming language or a markup language.
Finally, it's pretty obvious that any non-joke programming language is intended primarily for human consumption. Joke languages like Brainfuck or Unlambda are simply exceptions that prove the rule --- they're designed by humans for human amusement. k.lee 05:27, 17 Nov 2003 (UTC)
That audience is not other programmers (they know this stuff already); it is the "intelligent layman" who may not realise that that's the case. Without some proper foundations, the abstraction/evaluation/recursion thing is still too "high concept". Another example I clearly remember from my own early steps with programming (I've been a professional programmer for 20 years, so it's a while ago!) is parameter passing - programmers take for granted that parameters map from caller to callee based on the position of the parameter in a list, but I can remember thinking how error-prone that seemed - back then I thought it would be better to "find" the parameter based on its name and ignore its position. Of course, knowing now how a CPU actually implements a subroutine, the "position" thing is clearly far more efficient and sensible, and there are likely other undesirable side effects that name binding might have. The point to make here is that what seems obvious to a programmer may not be at all obvious to someone else, so starting with implicit fundamentals in order to eliminate any misunderstandings seems a good way to go with an article such as this. Given this approach, I'm not sure that referring to Turing completeness is even a good idea - is the intelligent layman that bothered about the mathematics? I suspect most people coming to WP are looking for a solid, precise but not necessarily complete discussion of the subject. If it grabs them sufficiently, they can look into the maths further if they want. GRAHAMUK 12:05, 18 Nov 2003 (UTC)
I know you guys know much more about programming languages and theories in computer science than I do, but I was wondering about a historical approach. RISC is a very good article. Although I don't have much expertise in hardware, the article makes a lot of sense to me. The nice thing is that it doesn't just give particular examples of RISC like a list of instruction codes, but focuses on why computer science came up with the idea of RISC in historical and technical context, and also gives plenty of practical examples of architectures. I think the same strategy can be applied to this article. In some ways, the article simply gives a summary of concepts which really doesn't make sense unless you know them already, and sometimes goes into too much detail. For example, I don't see why it is so important to spend a lot of space discussing the type system while some important concepts such as lazy evaluation, side effects and referential transparency are completely omitted.
As I keep repeating, I think it is very important to avoid making the article like a textbook. The RISC article is completely useless if you want to learn an assembly language or how to make a code generator for RISC architectures. Let alone a Wikibook for such a case. Well, just a thought. I am just hoping I am of any help at all. -- Taku 07:15, Nov 19, 2003 (UTC)
Umm, can we have a slight summary of the different programming paradigms? I think it is important to show why we have come to have several languages and what the difference between them is. I am not suggesting a complete discussion of specific topics like lazy evaluation, but - I don't know how to put it - more like how different languages approach the problem of programming in different ways. I think such a discussion can make the article more focused on conveying to the general public what programming is like. The imperative approach is not the only one, and surprisingly, even many computer programmers know little about how problems are solved in programming in many different ways. Many people just learn how to do programming in a particular language like C or LISP and are not sure about programming languages in general. For example, I think it would be nice to see how to reverse a string in many ways as an example, while it is quite unnecessary to discuss how arguments are passed or how a type system works. The bottom line is that the article must not be a summary of programming language topics, but should discuss actual problems. I know it is easy to say and hard to achieve; just chatting about my idealistic view. -- Taku 08:40, Nov 20, 2003 (UTC)
I agree with Taku -- if you think something is "too much" for an encyclopedia article, please move it to one of the Wiki Books: http://en.wikibooks.org/wiki/IT_bookshelf -- DavidCary 15:36, 26 Jul 2004 (UTC)
(moved to User talk:Dysprosia)
The following piece is cut out of section "History of programming languages".
As the cost of computers has dropped significantly and the complexity of computer programs has increased dramatically, development time is now seen as a more costly consideration than computer time.
Newer integrated, visual development environments have brought clear progress. They have reduced expenditure of time, money (and nerves). Regions of the screen that control the program can often be arranged interactively. Code fragments can be invoked just by clicking on a control. The work is also eased by prefabricated components and software libraries with re-usable code, primarily object-oriented.
Object-oriented methodology was introduced to reduce the complexity of programs, making code easier to write and to maintain. However, some argue that programs have, despite this, continued to increase in complexity. Recent languages are emphasising new features, like meta classes, mix-ins, delegation, program patterns and aspects.
All the above is true, but... This rant is good for a pop-sci article in an online magazine, but not for encyclopedia: chaotic, no *history*, and no *programming languages* Mikkalai 00:37, 13 Dec 2003 (UTC)
" Computer language" is not synonymous with " programming language". A programming language is a computer language used for programming.- Doradus 00:14, 2 Jan 2004 (UTC)
This section, Programming Language, needs a total wash and to be written from scratch. I volunteer myself to devote some time to it. As I am new to this site and I am learning how to edit things, this section will be online within a week and with a new style. yana209 22:01, Jun 22, 2004 (UTC)
Some languages such as MUMPS and is called dynamic recompilation; emulators and other virtual machines exploit this technique for greater performance.
The clause before the semicolon isn't even complete. I'd fix it, but I'm not sure how exactly it should read. - Furrykef 15:17, 9 Sep 2004 (UTC)
The reason that I deleted the sentence
is that I don't understand it. I suppose it means that the poor design of some programming languages is causing crashes, which raises the question: which languages is the author referring to? A possible interpretation is that programs written in low-level languages like C are prone to crashes, but that is not poor design in my opinion, but a conscious design choice to prefer speed, ease of compilation and flexibility at the price of allowing more crashes. Putting this sentence in the history section confuses me even more, because that implies that the problem of poor design no longer exists, which is at odds with my interpretation. So, please explain what you mean if you reinsert the above sentence. Thanks, Jitse Niesen ( talk) 16:30, 28 July 2005 (UTC)
You're right, it shouldn't be in the history section, because it is such a fundamental point and the bane of any decent computer scientist. We are swamped with poorly written junk languages (and operating systems) that gain prominence via clever marketing rather than on merit. Yes, I'm referring to design, and unfortunately I don't have time right now to flesh it out (though, again, anyone who knows the field should be able to do so). Treat it as a stub and add to the list. No, I'm not referring to C, and I recognise it's horses for courses. Thanks for the feedback. And please add to the "stub" rather than delete again. Mccready 01:22, 1 August 2005 (UTC)
Of all the major programming languages, I would only say of C++ and C# that they are poorly constructed. This could hardly be considered many, and it should be discussed in their respective articles, not here. Your statement also lacks any form of argument. -- R.Koot 01:44, 1 August 2005 (UTC)
I would like to add to the discussion here. In fact, there aren't many badly constructed languages at all (I know none, and I've programmed in many). What one can say is that the lower the language (the closer to machine language), the more stress is put on the knowledge of the developer to create a good working program (you need to know your language's dos and don'ts). Take for instance the difference between C++ and Java. Although they are both modern languages, C++ can still cause buffer overruns and underruns, while this is pretty difficult to achieve in Java. Although memory leaks can still appear in Java, this is probably one of the major flaws in C++ programs. It's not the language that's bad, but the code that's written in it! -- Paul Sinnema 09:44, 16 September 2005 (UTC)
Programming languages don't cause crashes. Programmers who write bad code, or faulty compilers, runtimes, etc. do. Dysprosia 08:08, 17 September 2005 (UTC)
X / 0 will crash a C or C++ program but will raise a CONSTRAINT_ERROR in Ada. So there are differences in how languages treat error conditions, and it is right that we describe that to the reader - perhaps a comparison table might be helpful in that respect. -- Krischik T 09:08, 18 September 2005 (UTC)
And what about X / Y? Is that still the programmer's fault, or perhaps faulty data delivered to the program? If you say "programmer", then I ask: have you really got an if (Y != 0) in front of every division - or made a full static analysis proving that Y can't be 0 - for all the programs you have ever coded?
Should we really have TIOBE Programming Community Index, a link to a biased statistic, without telling the readers in which way the index is flawed?
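As a sketch of the contrast, using Python (which, like Ada, raises a catchable error on division by zero, whereas C and C++ leave integer division by zero undefined and typically crash): the guard asked about above looks like this. safe_div is an invented helper, not from the discussion.

```python
# In Python, X / 0 raises a catchable ZeroDivisionError rather than
# crashing the process.
def safe_div(x, y):
    # The guard asked about above: check before dividing.
    if y != 0:
        return x / y
    return None  # caller decides how to handle the bad input

try:
    1 / 0
except ZeroDivisionError as e:
    caught = type(e).__name__

print(caught)           # ZeroDivisionError
print(safe_div(10, 2))  # 5.0
print(safe_div(10, 0))  # None
```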
TIOBE's company statement is: We offer out-of-the-box solutions for the programming languages C, C++, C# and Java. — which obviously means they want those languages to look "good" in their statistics.
We should either explain the flaw or remove the link. Of course, it is tricky to keep an NPOV when explaining a flaw in some way.
The search query '+"<language> programming" -tv -channel' is used to calculate the TPC Index.
Since you are probably reading this discussion because you are a language advocate of some sort, you can just google for '+"<language> programming" +tv +channel' with <language> being your favorite programming language - and then decide if those pages were rightfully excluded from the index.
My first google hit for my favorite language is: This association is aimed at promoting ADA programming language to the software... TV channel, producer) on any type of platform (OpenTV, MediaHighway). ... — rightfully excluded? I don't think so.
The article " buffer overflow" currently claims that
While I suspect this is correct, I wonder how that author found out?
Is it even possible to rank programming languages according to "popularity" (or in some other, more objective way) in a NPOV way? If so, should we discuss "popularity" here in the programming language article, or split it off into a popular programming language article? See C2: Programming Language Usage Statistics. -- DavidCary 05:19, 11 November 2005 (UTC)
I have given User:K.lee two weeks to put up or give up with his rewrite. His "rewrite" approach is anti-collaborative, and he has been claiming that his rewrite is pending for over two years. The "reqeustrewrite" template is only used for k.lee's claim on this article. Wikipedia itself is not much older than that, which means that no one else has had a real and equal "turn" at this article, since most of the work of the past two years will be lost when/if k.lee ever commits his version. Fplay 19:55, 9 December 2005 (UTC)
Minor note, but the spelling was changed from "Fortran" to "FORTRAN" since the article was referring to the first version of the language which was indeed spelled that way. The current accepted convention (see the Fortran page) is as follows: FORTRAN, FORTRAN II, FORTRAN IV, FORTRAN 66, FORTRAN 77 are in upper-case, with the new versions (such as Fortran 90) are in lower-case "Fortran" as per their convention. The FORTRAN spelling is a very important issue for FORTRAN programmers worldwide.
I am willing to agree that my first stab at an introductory paragraph for "programming language" might not be ideal. However, the current first paragraph is likely to be incomprehensible to all but the most knowledgeable of people.
The first paragraph should provide a brief summary/definition for a reasonably intelligent person who knows nothing about computers. Later material can get technical and dense.
Let's work out some good wording that encapsulates this rather nebulous entity.
Derek farn 14:15, 15 February 2006 (UTC) (Copied here from my talk page -- TuukkaH 16:31, 15 February 2006 (UTC))
A programming language is a language designed to allow programmers to specify a sequence of operations to be performed (usually by a computer). The syntax and semantics of programming languages are much more restricted than those of natural languages. The written, human-accessible form of a programming language is known as source code and may be translated by a compiler into a form that can be executed by the CPU of a computer.
I'm not sure I would have made the programming/computer language distinction. Yes, HTML is a markup language, but some people think it is a programming language. Who are we to disagree? Do we require programming languages to be Turing complete? Is calling HTML a computer language just a way of deflecting criticism for not calling it a programming language?
Anyway, back to the problem at hand. I now appreciate that my definition was overly restrictive. Some languages require programmers to specify what, not how. For instance, SQL requires a set of conditions to be specified, not the nuts and bolts of finding the data. What about Prolog, which consists of clauses (ok, most implementations have extensions that support a more imperative style)? How about:
Derek farn 01:34, 16 February 2006 (UTC)
Yes, it is getting a bit bloated. I think that Turing completeness is a good way of distinguishing programming languages from other kinds of languages. (I don't understand your later comment about infinite execution; there are various mathematical formalisms used when discussing properties of languages; I am using the term as a way of putting a minimum limit on the expressive power of a programming language.) Or are we going to duck this issue entirely?
People are familiar with what a natural language is, so let's make use of this knowledge.
Derek farn 00:28, 17 February 2006 (UTC)
I don't think that being Turing complete is enough to make a programming language. The language should also be used for general programming. That's why at Wikibooks we draw the line between the Wikibooks:Programming languages bookshelf and the Wikibooks:Domain-specific languages bookshelf. A "domain-specific language" may well be Turing complete, but it is not used for general programming, only in a specific domain. E.g. PostScript is considered Turing complete; however, I would not consider it a programming language as it is not used for general programming - or has anybody seen a text editor or an Excel clone written in PostScript?
-- Krischik T 07:11, 16 February 2006 (UTC)
What difference does it make what a language is actually used for? And what exactly is general programming?
PostScript is very much like Forth; would you say that Forth is not a programming language?
Most languages are only used by a handful of people. Does that mean they are not programming languages because they are only used for specific tasks (whatever it is that the handful of people write with them)?
To take your example, I know people who work with printers and formatting software who spend large amounts of time writing code in PostScript. I even have a program, written in PostScript, that prints out a calendar. Just because lots of people choose not to write their software in PostScript does not stop it being a programming language.
Derek farn 12:46, 16 February 2006 (UTC)
I propose the following definition, distilled from many sources throughout my education and career. This definition disposes of the qualitative assessments of languages, and focuses on quantifiable features:
Examples of general purpose programming languages:
Examples of non-general purpose programming languages:
If we can agree to this definition, then we could agree to limit the scope of the article to such languages, and then create articles to collect the exceptions. - Harmil 13:48, 6 March 2006 (UTC)
SQL has no looping construct. OK, the support software behind it contains lots of loops, but that is not the same thing.
I have always thought of SQL as a programming language, but on reflection I guess it should be called something like a database query language and, along with markup languages, not be included in the list of programming languages. The SQL standard is not maintained by SC22, the ISO committee responsible for programming languages.
Derek farn 13:08, 16 February 2006 (UTC)
PL/SQL adds programming language features to SQL (because it does not have any), hence the name Programming Language/SQL.
Derek farn 14:17, 17 February 2006 (UTC)
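The point of the exchange above can be sketched concretely: a plain SQL query states *what* rows are wanted, with no looping construct of its own, and any explicit iteration happens in a host language. The sketch below uses Python's standard sqlite3 module; the table and column names are invented for illustration:

```python
# Declarative SQL vs. imperative host-language iteration. Sketch only;
# the 'employees' table and its contents are made up for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("Ann", 52000), ("Bob", 48000), ("Cho", 61000)],
)

# Declarative: a set of conditions, not a search procedure. The engine
# decides how to find the rows.
rows = conn.execute(
    "SELECT name FROM employees WHERE salary > 50000 ORDER BY name"
).fetchall()

# Imperative: the loop lives outside SQL, in the host language -- the
# role PL/SQL's LOOP/IF constructs play inside Oracle databases.
for (name,) in rows:
    print(name)
conn.close()
```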
Doesn't the acronym "SQL" stand for "Structured Query Language"? I would not call it a programming language in its own right, but possibly a tool for other languages. -- BBM 22:33, 16 May 2006 (UTC)
I just removed the following comments from the end of "History of programming languages" (reformatted to shorten lines), as their presence (with newlines in between) was creating excessive vertical whitespace, and in any case they really belonged here. Hairy Dude 05:14, 2 February 2006 (UTC)
<!--- Chaotic and not to the point of the section, i.e., "history of comp lang." I move this piece in "Talk" according to wikipedia principle: better no article than bad article --->
<!--- Changed "teached" to "taught." I also disagree with the claim that Java was the first programming language taught in universities. Languages like FORTRAN, Cobol, C, etc., were all extensively taught before Java came on the scene. shrao@acm.org, 2005-02-02 --->
<!-- The assertion: "Java...became...the first programming language taught at the universities" is intended to convey that Java has become the programming language of choice for 100 level language classes in university curricula. It's a rather badly worded sentence. Changed "first" to "initial." tim@fourstonesExpressions.com, 2005-03-24 -->
<!-- The sentence is still poorly worded; changed "the initial" to "an introductory". danb@cs.utexas.edu, 2005-10-04 -->
http://hopl.murdoch.edu.au/ reads "This site lists 8276 languages... It has delivered more than 1,720,000 programming languages descriptions in the last 14 months". (I think this latter bit is their way of saying web-hits.) Ewlyahoocom 20:15, 11 March 2006 (UTC)
Is there a better, more representative image of computer code than the current one?
What about this one about the history of programming languages: Image:Historie.png? If not as main image, at least could be used in the History section. ManuelGR 22:44, 14 May 2006 (UTC)
The comment(s) below were originally left at Talk:Programming language/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Needs fixing of a few {{fact}}s. Titoxd(?!?) 17:01, 11 September 2006 (UTC)
Last edited at 17:01, 11 September 2006 (UTC). Substituted at 21:56, 3 May 2016 (UTC)