This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 | Archive 4 | Archive 5 | → | Archive 10
I don't see how the image contributes to the article. It gives the misleading impression that many C programming mistakes manifest themselves in those symptoms, which is not the case. Therefore I removed the image. Neilc 00:55, 29 January 2006 (UTC)
As the originator of the image, I take exception (pun intended) to its removal and the reason given. I contend that the image perfectly illustrates what I think of as a feature of C, rather than a criticism - in C, if you don't program robustly, you don't get robust programs. I've been programming in C since 1988, and taught the language at college for three years, and while I have moved on to the beautiful abstractions and environment protection of C++ and .NET, I still like to code test-routines in good old vanilla C - if I can get it to work, and work every time, in the unforgiving world of C, I know it's gonna perform at the higher level. I got the image just yesterday, when I miscast a double to a long pointer and ended up "pointing" at screen memory - but hey, sometimes you want to "point" to screen memory! C enthusiasts love the language's brutal, down-to-earth honesty, and part of the fascination is that programming in C is difficult - but people don't build
Cutty Sarks out of matchsticks because it's easy, and they don't build houses of cards without ever expecting them to fall; sometimes it's fun to bike without a crash helmet, and that's what C can be like. But with dedication and attention to detail, C forces you to program robustly, exposes you to spectacular falls, and you become a better programmer for it. (Even in C++, I never use automatic garbage collection, and I always handle my own exceptions - I don't want the system to tidy up after me; I take responsibility for that myself, as any good craftsman does.)
In my experience of teaching C, I often found that students would initially baulk at C's rigour, but many ended up loving it - C's a hard task-master, but if you program diligently, the results can be highly rewarding.
I concede that the tag-line I added to the image may have been a bit flippant, and if it gave the impression that such symptoms are common, then that is misleading - in 18 years, I've only seen such a crash twice. But I contend that the image adds an important idea to the article - in C, if you program sloppily, you get sloppy programs.
So I am going to restore the image, with a better explanation. Please respect the wishes of an experienced C programmer, and Wikipedia inclusionist. Camillus (talk) 11:37, 29 January 2006 (UTC)
>Parameters that are always passed to functions by value, never by reference
It is typical in C programming, and promoted by the C syntax, to pass pointers to functions. I agree that pointers are passed as parameters, and by value; however, functions typically do end up with pointers. A function may therefore manipulate the value of variables, arrays and structures which have been passed to it via pointer parameters, which is the point at issue. I suggest this as more concise and accurate:
Parameters and references are always passed to functions by value.
What??? This whole item is silly and wrong. This isn't C++, it's C. Pointers are references: I make a struct. I pass the address of the struct to a function. What about this isn't pass by reference?
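A minimal sketch of the distinction both posts are circling (names here are illustrative, not from the article): the pointer itself is copied by value, but dereferencing the copy still reaches the caller's object.

    #include <stdio.h>

    struct point { int x, y; };

    /* p is a copy of the caller's pointer (pass by value), but
       dereferencing it reaches the caller's struct. */
    void move_right(struct point *p) {
        p->x += 1;    /* changes the caller's struct */
        p = NULL;     /* only reassigns the local copy; caller unaffected */
    }

    int main(void) {
        struct point pt = {0, 0};
        move_right(&pt);
        printf("%d\n", pt.x);    /* prints 1 */
        return 0;
    }

In other words, C itself has only pass-by-value, but passing a pointer's value gives the practical effect of pass-by-reference.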
Some features that C lacks that are found in other languages include: >Automatic garbage collection
Local variables defined in a function are deallocated when the function exits, unless declared static. Thus C implementations do automatically free memory without the programmer needing to explicitly deallocate it - unless the memory has been allocated dynamically using malloc(), in which case explicit deallocation is required. I suggest:
Automatic garbage collection of dynamically allocated memory
In Java, for example, you use the new keyword to create an object, and Java will free it once it is no longer referenced. You can do the same thing in C++, but you have to explicitly free each object. Local variables defined within a function are automatic variables, not dynamically created variables.
wrp103 (Bill Pringle) - Talk 01:38, 13 September 2005 (UTC)
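A small sketch of the automatic-versus-dynamic distinction discussed above (the buffer size is arbitrary):

    #include <stdlib.h>

    void demo(void) {
        int local = 42;            /* automatic: storage reclaimed when demo() returns */
        static int calls = 0;      /* static: lives for the whole program run */
        int *heap = malloc(100 * sizeof *heap);  /* dynamic: must be freed by hand */
        if (heap != NULL) {
            heap[0] = local + ++calls;
            free(heap);            /* no garbage collector will do this for us */
        }
    }   /* 'local' and the pointer 'heap' vanish here; the heap block
           would leak without the free() above */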
I propose a reorganization of the series on the C programming language. I put forth the following four general article headings:
I think that the current situation is a mess and needs reform. Comments welcome, of course.— Kbolino 06:00, Apr 7, 2005 (UTC)
Any reorganization should probably reflect that an encyclopedia article on a programming language might have several goals, addressing different kinds of readers, but that being in any way a tutorial (or even a reference manual) is probably *not* one of those goals. I can think of three sections, all worthy of inclusion (i.e. as subsections):
There's definitely a place for a few examples, if a reader wants to be able to recognize a C program when he sees one. But I doubt that a full syntactic or semantic definition of the entire language is necessary or appropriate. Steve Summit 05:53, 13 July 2005 (UTC)
Am I the only one that thinks this article needs a serious trim? Large parts of it should be split onto their own page; the main page should be about 20% of what it currently is.
Akihabara 09:23, 22 September 2005 (UTC)
Can somebody please expand on the O(1) requirement on operators? O(1) as a function of what? Is a C implementation on x86, for example, non-compliant if the expression 'x >>= c;' is O(c)? (since x86 'shl reg, cl' is O(cl)) How about mult/div? --Oyd11
I would try to explain it if I could find it, but I can't. Does this comment refer to some obsolete text? Steve Summit
Do you think it's good to use Evolution for writing down the changes, or will K and R C suffice? Thanks, Uriyan
Actually I don't think either topic deserves a sub-page. I think a section on the main page would do. -- drj
Numbering from 0 is not eccentricity. It's how computers think. -- Taw
I don't even think that's an eccentricity, since it's fairly common in programming to do it that way. --Alan D
Java uses the same numbering scheme. Possibly because there is a zero-based array of month names. Although numbering from zero in both C and Java is more of a convenience for the routines that perform array handling than anything else. If your array starts at location AC00, the address of the 0th element is AC00, the address of element 1 is AC00 + element_length, the address of the 2nd element is AC00 + 2 * element_length, and so on. I prefer to start at element 1, but we're all pretty much stuck with the convention. (Perhaps I'll step up and start writing about software engineering, something I actually have some expertise in.) Ed Poor
Example date (works for C and Java): May 29
Month=4, day=29
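For what it's worth, C's own standard library uses exactly this split: in struct tm, the tm_mon field is zero-based (0-11) while tm_mday is one-based (1-31), which makes the month directly usable as an index into a table of names. A minimal sketch:

    #include <stdio.h>
    #include <time.h>

    static const char *month_name[12] = {
        "January", "February", "March", "April", "May", "June",
        "July", "August", "September", "October", "November", "December"
    };

    int main(void) {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);
        /* tm_mon is 0-11, so it indexes month_name directly;
           tm_mday is 1-31 and prints as-is. */
        printf("%s %d\n", month_name[t->tm_mon], t->tm_mday);
        return 0;
    }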
Why is the month shifted but not the day of the month? Besides, if the excuse is "the month is not a number", you deserve a slap in the face from your Korean (or Japanese or Chinese) secretary; ask her about it! In all these languages, May is literally "five month". (I think.) -- Juuitchan
Yes, but you'd translate it into English as 'month five' if you didn't want to use the name - 五月 is how you'd write it in Chinese characters.
If you want to blame someone for the system in use, I'd suggest starting with the Jesuits, who, as the scientific wing of the Catholic Church, spread clockwork and steel cannons across much of Asia. They also brought their date systems with them, which is why a 24-hour clock and a seven-day week are pretty much universally accepted.
The difference between day and month in terms of indexing reduces to the difference between nominal and cardinal values on the one hand and ordinal values on the other.
The system we use has nominal values for months, and ordinal values for days and years. It should probably use ordinal values for the lot. Note, though, that Korean (and I expect Japanese and Chinese) uses cardinal values, rather than ordinal values, for the month: O-Ueol (五月) vs. O-Beon-JJae-Ueol (五番째月), except that normally the Chinese characters aren't used in the second case.
This was probably far more than you wanted to know, but the point is that given that there are trivial mappings between these forms, the particular representation format chosen isn't that big a deal.
If you want to see _real_ problems with computational notions of time, I'd refer you to 'A Long, Painful History of Time' (Erik Naggum [1]).
-- 203.231.161.129
You know, some research shows the idea of clockwork and units of 12 comes from a much older basis, even if promotion of that method was performed by others: this page examines the history of the 24-hour analog dial.
-- laundrypowder
The claim that C is the dominant microcomputer applications language is now somewhat dubious, IMHO. In the Windows world, it's probably a toss-up between Microsoft's C++ and Visual Basic, I'd guess. C still rules for embedded systems (that is, the ones not written in assembler), in the Unix world (particularly for apps that don't have a GUI), and among people who can't be bothered remembering C++'s arcane semantics for multiple inheritance and operator overloading :) -- Robert Merkel
Recent stats of what percentage of Red Hat's code is written in which programming language clearly show that C is dominant, at least on Unices. It's very probable that C++ is much more popular in the Windows world, but I seriously doubt that many apps are written in VB. -- Taw
It would be nice to have links to an online manual and online tutorials. There probably are some under the GNU Free Documentation License. --Hirzel
comp.lang.c (a wonderful resource for C, btw -- some very competent people there) recommends Tom Torfs' tutorial at http://www.geocities.com/tom_torfs/c.html and Steve Summit's class notes at http://www.eskimo.com/~scs/cclass/cclass.html.
Honestly, most online tutorials for C are terrible and demonstrate that the author has little clue about the actual C standard. -- mgmei
Moved from article:
I'm not sure what 203.231.161.129 means about "virtual machinehood", but I can see that under this definition, x86 assembly language could be considered a high level language due to the existence of emulators. -- Tim Starling 07:54 26 Jun 2003 (UTC)
Modern x86 implementations are more and more moving toward RISC cores running virtual machines (in microcode or whatever) which provide x86-compatible instruction sets. Hyperthreading, for example, is a case of attempting to exploit such an underlying architecture without affecting the definition of the x86 machine (i.e., mapping many registers to the x86's few, so that you can increase parallelism).
In this regard the x86 architecture is shifting toward defining a virtual machine, much like the JVM, rather than specifying a hardware CPU. Not that there is a meaningful distinction in any case.
Likewise, we can see C programs running in a virtual machine defined by the C standard, and supported by the runtime structure of the binary produced.
High and low level are fundamentally ideological terms, and have almost no objective meaning, nor objective definition whatsoever. So far the only meaningful definition of level that I've found has been in terms of 'the ability to express invariant structure', with more being higher level. Note that by this definition, Python ends up being only slightly higher than assembly, since it has almost no ability to define invariant structure.
I wish people would stop using these terms, anyhow, as they are very silly.
-- 203.231.161.129