This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Hi, everyone. I have been trying to avoid this article owing to the character of the talk-page discussion, in regard to certain individuals who are using foul language and others who are trying to push “original” conceptions onto this page. For the record, Nonsuch, Jheald, and Yevgeny are correct in their viewpoint and I commend them for their efforts. As for User:Jim62sch, who said “Helllooooo???!!!! It really doesn't fucking matter”, you are disqualified from any further intellectual discourse. As for everyone else, let me remind you that the point of Wikipedia is to publish established information. Presently, we have accumulated over 200 kilobytes of talk-page discussion, all because of the personal views of one person, whom I won’t name, who states that he has previously taught entropy to thousands of humanities students even though, as he states, he didn’t really know what it was; that he has never read any of the original manuscripts of Sadi Carnot, Emile Clapeyron, or Rudolf Clausius; and that he is poor, or in his own words a “baby”, at math and statistics, but that after reading this article by Kelvin:
He suddenly figured it all out. The point of a talk page is not to debate someone’s personal theories.
I do not want to get involved in this debate, but I happened to glance at the entropy page a few days ago and I noticed that the un-named individual is referenced nine times in the entropy article, as well as in two external links, all essentially going to the views of one person. There is even a “religious-viewpoint” entropy link in the external links. Now, I am no expert on entropy, but I do own about a dozen books on entropy:
And the unnamed person is not mentioned in these books, nor is his “personal” theory of energy dispersal. Because of this, I would favor the recommendation of User Par, who suggests removing all of the energy dispersal parts. However, although the un-named person has never published his “energy dispersal” views in either book form or peer-reviewed article form, I do remember having seen his name mentioned in one chemistry book, somewhere.
Thus, to remedy this awkward situation, I am going to be bold and move all of this “energy dispersal” stuff, as well as all of the related talk page stuff, to its own page, i.e. entropy (energy dispersal) and Talk:Entropy (energy dispersal), and cite the unknown person as the primary author of the concept. In this manner, the introductory or novice reader can get the core, basic, historical, published, sourced entropy viewpoint on the main page and go to the “see also” section for sideline “energy dispersal” theories such as this one. Please do not revert this change; this unnecessary original research discussion has gone on long enough. Thanks: -- Sadi Carnot 12:51, 6 October 2006 (UTC)
Gentlemen, I have been working to build this entropy page for some time. My only interest is to see basic information on this page. I do not wish to get into an argument with anyone. My change was only to move Lambert's entropy theories (which are self-sourced by his own websites) to their own page. Please let my revisions sit for a week so we can see what everyone else thinks according to a vote. If, after we vote, the editors here favor keeping whole sections devoted to Lambert's theories on "energy dispersal" here, with 9 links to his websites as references, then that is fine with me. Please let the changes sit for seven days and then we can revert or keep according to consensus. Thanks: -- Sadi Carnot 12:50, 7 October 2006 (UTC)
To aid Sadi, PJacobi, Nonsuch and others in assessing the breadth of the acceptance of my views within the chemistry area of the scientific community, I am assembling the list of published books, textbooks, and peer-reviewed journals that refer to my writing about energy dispersal, entropy, and the second law. It will be posted here by 23:00 GMT 8 October. FrankLambert 21:17, 7 October 2006 (UTC)
People following this page might like to express their views in the AfD, one way or the other. Jheald 03:55, 8 October 2006 (UTC).
Thanks for the heads-up on this AfD, Jheald. Your vote to keep the article seems right to me, and most of the other contributors appear to support this position so the page should not be in danger. ... dave souza, talk 19:01, 8 October 2006 (UTC)
To resolve this issue, I am going to type up the correct presentation of entropy by Nobelist Gilbert Lewis and Merle Randall from their famous 1923 textbook Thermodynamics and the Free Energy of Chemical Substances. According to both chemistry historian Henry Leicester, from his The Historical Background of Chemistry (1956), and Nobelist Ilya Prigogine, from his 1998 textbook Modern Thermodynamics, before 1923 chemists did not make use of “entropy” but instead used the concept of chemical affinity to calculate the driving force of chemical reactions. According to these sources, Lewis and Randall’s influential textbook led to the replacement of the term “affinity” by the term “free energy” in the English-speaking world.
Hence, all modern-day chemistry textbooks are based on Lewis and Randall’s description of entropy, which, building on the work of Rudolf Clausius and Willard Gibbs, they define as a “scale of irreversibility” that quantitatively measures the irreversible "work energy", or "degree of degradation", of the system in an irreversible process. This is the correct perspective. I will type up the full presentation, from their textbook, over the next few days (sorry, I ran out of time today). -- Sadi Carnot 18:01, 8 October 2006 (UTC)
see [2] :
Jim, the page is only locked so that we can all come to an agreement and reasonably resolve this issue. People have been trying to push Frank Lambert's personal theories, which do not accord with thermodynamics textbooks, into this article since June of '06. This is a long-standing issue. -- Sadi Carnot 18:17, 8 October 2006 (UTC)
Concerning this edit: [3] "Rv non-notable person's theory deleted again" This makes no sense: either this person is not notable and we should not have an article on him (but we do), or he is notable. Pjacobi, please explain your reasoning - thanks. KillerChihuahua ?!? 20:03, 8 October 2006 (UTC)
[8], [9], [10], [11], [12], [13], [14] . Oh yeah, there are more -- I'm just "warming up". ;)
There are two authors of a popular scientific book who used my ideas as a starting point for a couple of pages and ascribed that start to me. Far more important, the names of some 36 textbook authors (who have been so convinced of the validity of my approach to entropy that they have risked their reputations on it) and their text titles are listed below.
Popular Scientific Book
“Into the Cool – Energy Flow, Thermodynamics, and Life” by Eric D. Schneider and Dorion Sagan; University of Chicago Press, 2005, 362 pages, ISBN 0-226-73936-8
Textbooks
“Chemistry, The Molecular Science”, Second Edition J. W. Moore, C. L. Stanistski, P. C. Jurs; Thomson Learning, 2005, 1248 pages, ISBN 0-534-42201-2
“Chemistry, The Molecular Nature of Matter and Change”, Fourth Edition M. S. Silberberg; McGraw-Hill, 2006, 1183 pages, ISBN 0-07-255820-2
“Conceptual Chemistry”, Second Edition J. Suchocki; Benjamin Cummings, 2004, 706 pages, ISBN 0-8053-3228-6
“Chemistry, Matter and Its Changes”, Fourth Edition J. E. Brady and F. Senese; John Wiley, 2004, 1256 pages, ISBN 0-471-21517-1
“General Chemistry”, Eighth Edition D. D. Ebbing and S. D. Gammon; Houghton-Mifflin, 2005, 1200 pages, ISBN 0-618-39941-0
“Chemistry: A General Chemistry Project of the American Chemical Society”, First Edition J. Bell, et al.; W. H. Freeman, 2005, 820 pages, ISBN 0-7167-3126-6
“Atkins’ Physical Chemistry”, Eighth Edition P. Atkins and J. de Paula; W. H. Freeman, 1072 pages, ISBN 0-7167-8759-8
“Physical Chemistry for the Life Sciences”, First Edition P. Atkins and J. de Paula; W. H. Freeman, 699 pages, ISBN 0-7167-8628-1
“Chemistry: The Central Science”, Tenth Edition T. L. Brown, H. E. LeMay, and B. E. Bursten; Prentice Hall, 2006, 1248 pages, ISBN 0-13-109686-9
I do not have copies of the seven other US chemistry textbooks that have changed their approach to entropy on the basis of my articles. If the above list is not convincing and such further evidence would be conclusive, I would be glad to supply it. However, the basic information (lacking only specific page numbers) is at www.entropysite.com/#whatsnew at December 2005 under the following authors/titles:
My major recent articles are all accessible under "articles" at www.entropysite.com . Their locations in issues of the Journal of Chemical Education ("JCE") are given below.
The following list is of peer-reviewed articles that cited one or more of mine.
I have already started a "thermodynamicists" category list of established thermodynamicists; Constantin Caratheodory is a good classic example: his theories are respected and written up in hundreds of books and articles. -- Sadi Carnot 13:24, 9 October 2006 (UTC)
I have just noticed this current fracas on the topic of entropy. While I haven't yet had a chance to closely familiarize myself with the explanatory merits (if any) of Frank Lambert's approach, I need to say I am disturbed by the treatment he appears to be receiving from some editors. Although WP guidelines discourage self-aggrandizement through rules like WP:NOR and guidelines or policies such as WP:BIO, WP:OWN, etc., it is plain that the authors of verified sources (WP:VER) have a right to participate in the formation of articles. WP:NPOV#Undue_weight plainly allows for the inclusion of notable minority views in proportion to their comparative notability. While Mr. Lambert's approach does not at first blush square with classical thermodynamics (again, I haven't familiarized myself yet), I see absolutely no reason why it should be deleted out of hand, as was done recently.
It seems to me the appropriate approach here would be to include this material in a reasonable summary form, linking to another main article if it's too lengthy to explain in a brief section, and note for the reader of the article that it's "an alternative modern approach", or some other way of accurately expressing its positioning in the marketplace of ideas (per WP:VER of course). ... Kenosis 16:00, 9 October 2006 (UTC)
Comment: for everyone to note, two of those who want to keep Lambert's article and related theories (e.g. "entropy=dispersion not disorder"), i.e. User:Jim62sch and User talk:Dave souza, seem, based on their edit and comment histories, to be doing this only for intelligent design purposes, whatever those are. On the second law of thermodynamics talk page, for example, User:Dave souza has stated that the second law of thermodynamics as stated by Rudolf Clausius in 1854, i.e. “heat cannot of itself pass from a colder body to a hotter body”, is now incorrect due to recent talk page discussions. He seems to think that Lambert's website theories are the correct ones rather than Clausius's, and he is the one who started this mess by adding 9 ref links to Lambert's website. Whatever the case, Souza’s views and edits are not scientific.
Moreover, today I looked through my collection of entropy-related textbooks (70 thermodynamics textbooks, 7 physics textbooks, 5 biochemistry textbooks, 4 chemistry textbooks, 3 physical chemistry textbooks, and others) and the “energy dispersal” concept is not there, and neither is Frank Lambert. This is entirely a sideline website theory that happens to have good search rankings because the author has bought up all of the URLs related to entropy, e.g. ‘entropysimple.com’, ‘entropysite.com’, ‘2ndlaw.com’, ‘secondlaw.com’, etc., which he uses to push his own personal theories. Now, there is certainly nothing wrong with that. But self-published website theories do not establish scientific correctness, nor are they grounds for inclusion in the Wikipedia entropy article, which is where all this mess stems from. -- Sadi Carnot 10:18, 9 October 2006 (UTC)
I note that in your second paragraph you actually address the article, which is good - but as there is a list of textbooks above, it appears your collection may be out of date. KillerChihuahua ?!? 11:27, 9 October 2006 (UTC)
It appears to me that part of the problem here is over the common meaning of the word "disorder" or "disordered". Despite the use of the word to describe entropy, despite the involvement of chaotic factors and difficulties defining the boundary characteristics of closed systems, entropy is a process that is as orderly and quantifiable as are "permeability", "diffusion", "homeostasis", "absorption", "osmosis", "convection" and other such dynamics. It seems to me the use of analogies (such as were used in the recently deleted section) does not threaten the order of the cosmos or even the order of WP, nor should it threaten editors of the current article. Unlike the article on the Second law of thermodynamics, this article need not, in my judgment, be necessarily limited to formulaic and/or logical positivist explanations of the concept of entropy. Surely there must be room to gain a stable consensus on agreeable ways to explain this stuff to the reader in an analogical way, per WP:NOR, WP:VER, and WP:NPOV#undue_weight of course. ... Kenosis 16:34, 9 October 2006 (UTC)
Yup, been there done that, got tired of it and stopped editing 2LOT for about six months. See Talk:Second law of thermodynamics/creationism for the old discussion. Wade wanted it in, I didn't, vote on the straw poll was about split. Even Wade (a creationist and ID proponent) knew it was bogus, so there shouldn't be any editing wars about that, at least. And now I move we close this section and get back to the Entropy article here. KillerChihuahua ?!? 19:50, 9 October 2006 (UTC)
We have one mention of Lieb and two of Caratheodory as suggested additions to the article. Anyone else?
I find it difficult, as I must, to refrain from vigorous and unacceptable language. Please consider it interspersed between each word of the following.
My goal in spending a great deal of time on this Talk:Entropy list in July, and now, was only to have the Wikipedia Entropy section written so as to aid beginning students and the general public who might access Entropy. My contributions to the field, as you now can see in detail at "Non-notable?" (written due to YOUR DEMAND, not my choice), have been adopted by the majority of authors of US general chemistry texts as the way entropy should be presented to beginners. (The change also in Atkins is not trivial.)
However, those views that were patiently and slowly developed in July and recently were not considered/deliberated seriously by Sadi Carnot, Kats and the three or four others -- a very small number of individuals, none with experience in educating chemistry students -- who have dominated the Talk:Entropy page since I first saw it in July.
I care NOTHING about my name being mentioned ANYWHERE! In a series of statements from Carnot, Pjacobi and others recently, anything I wrote at the time or in the past was denigrated because it was my POV or that I was not notable because my views had not "come from others". THAT is the only reason that I took a day -- I do not type rapidly, I am now 88 -- to prepare the formal list of texts and literature citations.
And yet -- is it conceivable? -- Sadi Carnot, after that detailed list of my peer-reviewed articles and the amazingly long list of textbooks that have changed in only four years, writes of me "....who has never published...for a peer-review.." Sadi demonstrates here that he does not read carefully. His ten year investment in reading [in the same careless manner he approaches reading my presentations? I hope not.] has resulted in his sense of ownership of the Entropy article in Wikipedia. My peer-review has not been only three seminal articles. It consists of 36 established textbook authors, my peers and superiors, whose jaundiced eyes are far more reliable, I believe, as to the educational worth of any statement about entropy than those of a wannabe Sadi Carnot.
Forget me and my name. What I think is important is that this approach involving energy dispersal be made an initial part of the Entropy page, because young students are impatient -- some even desperate -- when they hit Wikipedia for a quick explanation of entropy. They are NOT, in general, going to do a scholarly scan of the whole article and then go to other links. I believe the same is true for most of the general public who will be accessing Entropy in Wikipedia. As I've said to some of you, "Does Wikipedia serve the learner as well as the learned?"
Forget me. Omit my name completely from anywhere. Literally. I do NOT care. But I DO care for what has been my life's devotion. So, for the sake of millions of students now and in the future, please don't omit an approach to entropy that is now seen as valid by so many text authors. FrankLambert 17:23, 9 October 2006 (UTC)
As I said above, I trust the personal frustrations can be put aside, and a stable consensus achieved about how to integrate a few useful analogical explanations into the article. Speaking for myself to all involved editors, in my judgment this article needn't be a totally positivist, technical explanation (as would be more arguable in the article on the second law, though even there laypersons may deserve a good common-sense analogy or two to help them along). Certainly there must be some reasonable way to accomplish a workable combination of technical accuracy and reasonable analogies that would be understandable to the average layperson. ... Kenosis 18:06, 9 October 2006 (UTC)
First, it is not the credibility of the theory that is at issue, but the usefulness to the reader of the WP article of the proposed explanation drawn from Lambert's example of a growing contemporary pedagogical approach, that of describing entropy as "energy dispersal" rather than as "disorder".
Second, there is no contradiction at all between describing entropy as "energy dispersal" and the mixing example. The mixing example is a combination of two basic concerns, that of the two substances mixing physically and that of their respective energy seeking equilibrium among the two original sets of particles now mixed. ... Kenosis 03:02, 10 October 2006
Energy dispersal is quite well quantified by the formula S = k_B ln Ω, where Ω is the number of microscopic configurations and k_B is Boltzmann's constant. It is quantified as S in a thermodynamic system (the current state of energy dispersal throughout a defined environment), and changes in energy dispersal are defined by delta-S (ΔS, which is a derivative function of S). ... Kenosis 21:22, 10 October 2006 (UTC)
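The Boltzmann relation invoked here (S equal to Boltzmann's constant times the natural log of the number of microscopic configurations Ω) can be illustrated with a short sketch; the microstate counts below are invented purely for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value)

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega), entropy for Omega accessible microstates."""
    return k_B * math.log(omega)

# Doubling the number of accessible microstates adds k_B*ln(2) to S,
# no matter how large Omega already is -- entropy is logarithmic in Omega.
delta = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
```

The logarithm is what makes entropy additive: multiplying microstate counts of independent subsystems adds their entropies.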
Folks, Please keep in mind that there are plenty of systems which have no concept or definition of energy or temperature, but for which one can explicitly and precisely define entropy. This is generally the case for information theory ( information entropy), and for many cases studied in dynamical systems ( Kolmogorov-Sinai entropy). Insofar as energy is a kind of constant of motion that is kind-of-like conjugate to time, then maybe the change of entropy over time is kind-of-like energy dispersal... Hmmm. I'm not really aware of any way to take the formal mathematical definition for a system with dissipation ( wandering set), and turn it into some theorem involving entropy. But this could be an interesting topic to explore ... Is entropy kind-of-like the dissipation of constants of motion over time? Hmmm. linas 06:24, 10 October 2006 (UTC)
I've removed the following sentence, because its phrasing and placement in the section was confusing, if not incorrect. It was impossible to tell what the "converse" was here. ... Kenosis 20:15, 10 October 2006 (UTC)
I replaced the reference to "entropy of mixing" at the end of that section, explaining that it's a specialized case of entropy described by its own formula. ... Kenosis 20:25, 10 October 2006 (UTC)
The definition of physical entropy (S = k ln(W)) takes into account both "thermodynamic entropy" and "configurational entropy". It's an unfortunate choice of names - it implies that configurational entropy is somehow "outside" of thermodynamics, which it is not. Jheald's thought experiment was an excellent way of seeing that. Also, you are disagreeing with Frank Lambert when you call configurational entropy a dispersal of particles and thermodynamic entropy a dispersal of energy. According to that reference provided by Dave Souza, his position is that even configurational entropy constitutes a dispersal of energy (with which I disagree). PAR 01:25, 11 October 2006 (UTC)
Regarding the insistence on stating that there is a net "decrease" of entropy in the hypothetical room in which ice is placed into a glass of water, please let's at least get the basics straight. When a room (a closed system) gives up some of its available heat energy to a glass of ice water within it, the room's entropy has increased between the time the ice was placed into the glass and the time the contents of the glass have returned to room temperature. That is, there is a net loss of available energy left to dissipate within that room. That is a most basic principle of entropy. The Second Law tells us that this room, having introduced the added component of the ice, will seek such an increase of entropy until it arrives at a steady-state maximum value of dispersal of its available heat within its boundaries. ... Kenosis 00:46, 11 October 2006 (UTC)
"the entropy of the system of ice and water has increased more than the entropy of the surrounding room has increased."
"the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased."
Moreover, several statements made just above betray a very fundamental misunderstanding of what entropy is: User:PAR's statements "Any object which simply grows colder, loses entropy. If it is cooled to absolute zero, its entropy is zero. (or really small anyway). The room grows colder, therefore its entropy decreases." are simply incorrect, and confuse entropy (the quantity S) with heat itself. ... Kenosis 01:28, 11 October 2006 (UTC)
No! The formula is:
ΔS = Q/T (for heat Q transferred reversibly at constant temperature T); or
dS = δQ/T for an infinitesimal reversible transfer, where the SI unit of entropy is "joule per kelvin" (J·K−1), which is the same as the unit of heat capacity.
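As a worked number for ΔS = Q/T at constant temperature (a minimal sketch; the molar heat of fusion used is the standard handbook value for ice):

```python
# Melting one mole of ice reversibly at its normal melting point:
# the entropy gained by the water is the heat absorbed divided by T.
Q_fusion = 6010.0   # J/mol, molar enthalpy of fusion of ice (standard value)
T_melt = 273.15     # K, normal melting point of ice

dS_fusion = Q_fusion / T_melt  # J/(mol*K) -- same units as molar heat capacity
```

The result, about 22 J/(mol·K), carries the joule-per-kelvin unit noted above.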
PAR, I believe you're confusing entropy with heat itself. Entropy is the dispersal of heat energy throughout a medium or into surrounding media. Delta-S is the change in dispersal of heat energy across a medium or into surrounding media, and not the change in temperature. ... Kenosis 02:39, 11 October 2006 (UTC)
Ok, regarding the sentence, just so we don't get mixed up, we agree that there are three systems here:
I think we can agree that R is a closed system. I'm ok with that. We agree that r and g are open systems, right? There's energy flowing from the room r to the ice water g as the ice water melts, so they are both open systems. I am saying that as the situation goes to equilibrium, the entropy of g increases, the entropy of r decreases, and the entropy of R, which is the sum of both of the above entropies, increases, as demanded by the second law, since R is closed. Maybe the sentence could read:
"the entropy of the system of ice and water has increased more than the entropy of the room and ice water together have INcreased."
or maybe:
"the entropy of the system of ice and water has increased more than the entropy of the room (excluding the ice water) has DEcreased."
PAR 03:14, 11 October 2006 (UTC)
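PAR's bookkeeping for the three systems can be put into numbers. This is a hedged sketch: Q, T_room, and T_ice below are assumed values for illustration, not figures from the discussion:

```python
# Heat Q flows from the warm room r into the glass of ice water g.
Q = 1000.0        # J, heat transferred (assumed for illustration)
T_room = 293.15   # K, temperature of the room r (assumed)
T_ice = 273.15    # K, temperature of the melting ice water g

dS_g = Q / T_ice     # entropy gained by the ice water (heat in at the lower T)
dS_r = -Q / T_room   # entropy lost by the room excluding the glass (heat out at higher T)
dS_R = dS_g + dS_r   # total for the closed system R = r + g

# Because T_ice < T_room, dS_g exceeds |dS_r|, so dS_R > 0:
# g increases, r decreases, and the closed system R increases, per the second law.
```

The sign pattern is exactly PAR's wording: the ice water's gain outweighs the room's loss.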
Has nobody noticed that putting a particular theory on a separate page is completely in contradiction to NPOV? The only way to do something of the sort within the rules might be to link to an external source. It is really just the same as Origin of species (Biology) and origin of species (biblical) DGG 05:03, 13 October 2006 (UTC)
Professor emeritus Harvey Leff (Am. J. Phys. 1996, 64, 1261-1271; “Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing”, IOP – now Taylor Francis, 2003) has read the many requests that PAR posted to Talk:Entropy for a mathematical quantification of the spatial dispersal of molecules in mixing. Leff stated that, although noble in principle, a more detailed quantitative treatment of molecular behavior in mixing is probably impossible, but also unnecessary.
Levine (“Physical Chemistry”, 5th Edition, 2002, McGraw-Hill; p. 90) states: “The entropy of a perfect gas mixture is equal to the sum of the entropies each pure gas would have if it alone occupied the volume of the mixture” (temperature unchanged). Meyer (J. Chem. Educ. 1987, 64, 676) stated, ”…processes generally described as ‘mixing’, that is, a combination of two different gases …has absolutely nothing to do with the mixing itself of either” (italics in the original). Craig ("Entropy Analysis", 1992, VCH Publishers; p. 92) says about the "entropy of mixing" (quotes are his), "It is the spreading out of the molecules in space that is crucial, not the intermixing of them."
When A and B are mixed, their groups of different molecules can move to spread their “A” or “B” internal energy throughout the larger combined volume. From classic considerations of quantum mechanics of a moving particle in a box, energy levels of any Boltzmann distribution in a larger box are closer or ‘denser’ than in a smaller volume. Thus, if energy levels are closer in the final volume of A and B, their respective initial internal energies now have many more energy levels for their individual molecular energies (alternatively, ‘for their moving molecules having particular energies’). The result is an increase in accessible microstates for A and for B in the final mixture at equilibrium – just as there is the same increase in either A or B alone expanding to that final volume.
Using semipermeable membranes, a possible reversible process for changing the unmixed initial to the mixed final state is illustrated in Levine (Fig. 3.9). The macro equation of ΔS = R ln V2/V1 , for one mole of either A or B, is ΔS = R ln 2V1 / V1 = R ln 2 = 5.76 J/K. Thus, the total entropy increase for A and B in the final mixture is 11.5 J/K. The conceptual meaning of this entropy increase is identical to that of any substance’s internal energy becoming more dispersed by its molecules colliding and moving into a greater volume of three-dimensional space, if it is not constrained.
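The macro equation in this paragraph can be reproduced directly; this sketch just re-does the stated arithmetic for one mole each of A and B:

```python
import math

R = 8.314462618  # J/(mol*K), molar gas constant

V1 = 1.0                                 # initial volume of A (or of B); units cancel in the ratio
dS_one_gas = R * math.log(2 * V1 / V1)   # dS = R ln(V2/V1) with V2 = 2*V1, i.e. R ln 2
dS_mixing = 2 * dS_one_gas               # A and B each expand into the doubled volume
```

Each gas contributes R ln 2 ≈ 5.76 J/K, giving the 11.5 J/K total quoted above.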
"... gulp ..." ... Kenosis 14:39, 17 October 2006 (UTC)
"The entropy of a perfect gas mixture is equal to the sum of the entropies each pure gas would have if it alone occupied the volume of the mixture” (temperature unchanged)"
"It is the spreading out of the molecules in space that is crucial, not the intermixing of them."
"When A and B are mixed, their groups of different molecules can move to spread their “A” or “B” internal energy throughout the larger combined volume."
Sorry, I’ve been away and have to go quickly. Norman Craig (text: Entropy Analysis) says: “Entropy in thermodynamics always means entropy change.” S lacks meaning without its reference to a substance/system at T. S0 refers to a change from 0 K – the dispersal of thermal energy from the surroundings/T to a substance beginning at 0 K for incremental values of T up to 298.15 K. Specifically, the process involves calculation from 0-10 K, then summation from 10 K to 298.15 K of incremental/reversible measurement of heat transferred/T (shown in most phys chem. texts as Cp/T versus T, with the area under the curve to 298.15 K as the summation of ∫ Cp/T dT). S0 of course is the most telling example of the importance of viewing energy dispersal as integral to any consideration of entropy change. FrankLambert 23:06, 17 October 2006 (UTC)
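The S0 procedure Lambert describes (extrapolate below 10 K, then sum ∫Cp/T dT up to 298.15 K) can be sketched numerically. The heat capacity here is invented for illustration: a Debye T³ law below 10 K and a constant Cp above it, not data for any real substance:

```python
import math

a = 1e-4  # J/(mol*K^4), assumed Debye coefficient (illustrative only)

# 0 -> 10 K: Cp = a*T^3, so S(10 K) = integral of a*T^2 dT = a*(10^3)/3.
S_below_10 = a * 10.0**3 / 3.0

def Cp(T):
    return 25.0  # J/(mol*K), assumed constant above 10 K (illustrative)

# 10 K -> 298.15 K: trapezoid-rule summation of Cp/T dT.
T_lo, T_hi, n = 10.0, 298.15, 10000
h = (T_hi - T_lo) / n
S_above_10 = sum(
    0.5 * h * (Cp(T_lo + i * h) / (T_lo + i * h)
               + Cp(T_lo + (i + 1) * h) / (T_lo + (i + 1) * h))
    for i in range(n)
)

S0 = S_below_10 + S_above_10            # standard entropy at 298.15 K for this toy substance
S_exact = 25.0 * math.log(T_hi / T_lo)  # closed form for the constant-Cp stretch
```

The numerical sum matches the closed-form R-style integral, mirroring the area-under-Cp/T-versus-T construction described in physical chemistry texts.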
When there is unhappiness about a piece of writing, I find it useful to ask the questions, "Who will read this?" and "Who is the intended reader?" Doing this for this article, it seems to me that very few people will read it usefully. There will be a group of well-trained physicists who might have a look at it, but they will learn very little because they already know it. Someone who knows nothing about entropy, but wants to know, will read the first paragraph and run away screaming. I think the same is likely to be true of chemistry students (I have taught Physical Chemistry for many years, although my expertise is in quantum chemistry). I suspect that more students come across entropy in chemistry classes than in physics classes. It has been suggested, and I note a recent more detailed discussion on the citizendium forums, that all articles should appeal to everybody by being written at three levels - first a very general introduction for someone who knows nothing about the topic, second a discussion appropriate to a beginning undergraduate meeting the topic for the first time, and third a detailed discussion of the whole topic. It has to be said that this article fails these criteria.
What is to be done? I welcome input from others, as I only have a few ideas. First, there should be an introductory paragraph that contains no equations and no mathematics and that says what entropy is and how it fits into the general picture. Second, the article and this discussion are full of very rigorous statements and equations. This is the way physicists think, and rightly so. Of course they have a place. However, chemists often do not think in this way, and I suggest we need to add some less rigorous points that appeal to chemists and perhaps to the general reader. Of course we can prefix them with a statement such as "a rough way of looking at this is ..". The suggestions of Frank Lambert fit in here. I do not think that energy dispersal is fully general; I prefer to think of entropy change on mixing as dispersal of matter. Nevertheless it is a good way of getting across what entropy is all about, and it is proving useful to chemists. It needs to be added, because quite frankly this article is very rigorous and exact but gives no real idea of what entropy is all about. -- Bduke 23:09, 17 October 2006 (UTC)
I am surprised that the regular editors of this article have not responded. Have you thought about who actually reads this article and who should read it? Do you realise just how off-putting it is to the newcomer to entropy who just wants a simple introduction to the concept?
A new introduction is needed. I suggest that we add a new first paragraph that might read something like this:-
Could we delete this proposal here? -- Bduke 00:15, 24 October 2006 (UTC)
I looked at this again, and replaced the first paragraph of the intro with the following, pending the weigh-in of other knowledgeable editors:
Thus, to remedy this awkward situation, I am going to be bold and move all of this “energy dispersal” stuff, as well as all of the related talk page stuff, to its own page, i.e. entropy (energy dispersal) and Talk:Entropy (energy dispersal), and cite the unknown person as the primary author of the concept. In this manner, the introductory or novice reader can get the core, basic, historical, published, sourced entropy viewpoint on the main page and go to the “see also” section for other such sideline “energy dispersal” theories. Please do not revert this change; this unnecessary original research discussion has gone on long enough. Thanks: -- Sadi Carnot 12:51, 6 October 2006 (UTC)
Gentlemen, I have been working to build this entropy page for some time. My only interest is to see basic information on this page. I do not wish to dig into an argument with anyone. My change was only to move Lambert's entropy theories (which are self-sourced by his own websites) to their own page. Please let my revisions sit for a week to see what everyone else thinks according to vote. If, after we vote, the editors here favor keeping whole sections devoted to Lambert's theories on "energy dispersal" here, with 9 links to his websites as references, then fine with me. Please let the changes sit for seven days and then we can revert or keep according to consensus. Thanks: -- Sadi Carnot 12:50, 7 October 2006 (UTC)
To aid Sadi, PJacobi, Nonsuch and others to assess the breadth of the acceptance of my views in the scientific community's area of chemistry, I am assembling the list of published books, textbooks, and peer-reviewed journals that refer to my writing about energy dispersal and entropy and the second law. It will be posted here by 23:00 GMT 8 October. FrankLambert 21:17, 7 October 2006 (UTC)
People following this page might like to express their views in the AfD, one way or the other. Jheald 03:55, 8 October 2006 (UTC).
Thanks for the heads-up on this AfD, Jheald. Your vote to keep the article seems right to me, and most of the other contributors appear to support this position so the page should not be in danger. ... dave souza, talk 19:01, 8 October 2006 (UTC)
To resolve this issue, I am going to type up the correct presentation of entropy by Nobelist Gilbert Lewis and Merle Randall from their famous 1923 textbook Thermodynamics and the Free Energy of Chemical Substances. According to both chemistry historian Henry Leicester, from his The Historical Background of Chemistry (1956), and Nobelist Ilya Prigogine, from his 1998 textbook Modern Thermodynamics, before 1923 chemists did not make use of “entropy” but instead used the concept of chemical affinity to calculate the driving force of chemical reactions. According to these sources, Lewis and Randall’s influential textbook led to the replacement of the term “affinity” by the term “free energy” in the English-speaking world.
Hence, all modern-day chemistry textbooks are based on Lewis and Randall’s description of entropy, which they define as, based on the work of Rudolf Clausius and Willard Gibbs, a “scale of irreversibility” that quantitatively measures the irreversible "work energy" or the "degree of degradation" of the system in the irreversible process. This is the correct perspective. I will type up the full presentation, from their textbook, over the next few days (Sorry, I ran out of time today). -- Sadi Carnot 18:01, 8 October 2006 (UTC)
see [2]:
Jim, the page is only locked so that we can all come to an agreement in order to reasonably solve this issue. People have been trying to push Frank Lambert's personal theories, which do not accord with thermodynamics textbooks, into this article since June of '06. This is a long-standing issue. -- Sadi Carnot 18:17, 8 October 2006 (UTC)
Concerning this edit: [3] "Rv non-notable person's theory deleted again" This makes no sense. Either we should not have an article on this person (and we do have one), or he is notable. Pjacobi, please explain your reasoning - thanks. KillerChihuahua ?!? 20:03, 8 October 2006 (UTC)
[8], [9], [10], [11], [12], [13], [14] . Oh yeah, there are more -- I'm just "warming up". ;)
There are two authors of a popular scientific book who used my ideas as a starting point for a couple of pages and ascribed that start to me. Far more important, the names of some 36 textbook authors (who have been so convinced of the validity of my approach to entropy that they have risked their reputations on it) and their text titles are listed below.
Popular Scientific Book
“Into the Cool – Energy Flow, Thermodynamics, and Life” by Eric D. Schneider and Dorion Sagan; University of Chicago Press, 2005, 362 pages, ISBN 0-226-73936-8
Textbooks
“Chemistry, The Molecular Science”, Second Edition J. W. Moore, C. L. Stanitski, P. C. Jurs; Thomson Learning, 2005, 1248 pages, ISBN 0-534-42201-2
“Chemistry, The Molecular Nature of Matter and Change”, Fourth Edition M. S. Silberberg; McGraw-Hill, 2006, 1183 pages, ISBN 0-07-255820-2
“Conceptual Chemistry”, Second Edition J. Suchocki; Benjamin Cummings, 2004, 706 pages, ISBN 0-8053-3228-6
“Chemistry, Matter and Its Changes”, Fourth Edition J. E. Brady and F. Senese; John Wiley, 2004, 1256 pages, ISBN 0-471-21517-1
“General Chemistry”, Eighth Edition D. D. Ebbing and S. D. Gammon; Houghton-Mifflin, 2005, 1200 pages, ISBN 0-618-399410
“Chemistry: A General Chemistry Project of the American Chemical Society”, First Edition J. Bell, et al.; W. H. Freeman, 2005, 820 pages, ISBN 0-7167-3126-6
“Atkins’ Physical Chemistry”, Eighth Edition P. Atkins and J. de Paula; W. H. Freeman, 1072 pages, ISBN 0-7167-8759-8
“Physical Chemistry for the Life Sciences”, First Edition P. Atkins and J. de Paula; W. H. Freeman, 699 pages, ISBN 0-7167-8628-1
“Chemistry: The Central Science”, Tenth Edition T. L. Brown, H. E. LeMay, and B. E. Bursten; Prentice Hall, 2006, 1248 pages, ISBN 0-13-109686-9
I do not have copies of the seven other US chemistry textbooks that have changed their approach to entropy on the basis of my articles. If the above list is not convincing and such further evidence would be conclusive, I would be glad to supply it. However, the basic information (lacking only specific page numbers) is at www.entropysite.com/#whatsnew at December 2005 under the following authors/titles:
My major recent articles are all accessible under "articles" at www.entropysite.com . Their locations in issues of the Journal of Chemical Education ("JCE") are given below.
The following list is of peer-reviewed articles that cited one or more of mine.
I already have a category "thermodynamicists" list started of established thermodynamicists; Constantin Caratheodory is a good classic example, his theories are respected and written up in 100s of books and articles. -- Sadi Carnot 13:24, 9 October 2006 (UTC)
I have just noticed this current fracas on the topic of entropy. While I haven't yet had a chance to closely familiarize myself with the explanatory merits (if any) of Frank Lambert's approach, I need to say I am disturbed by the treatment he appears to be receiving from some editors. Although WP guidelines discourage self-aggrandizement through rules like WP:NOR and guidelines or policies such as WP:BIO, WP:OWN, etc., it is plain that views backed by verifiable sources (WP:VER) have a right to figure in the formation of articles. WP:NPOV#Undue_weight plainly allows for the inclusion of notable minority views in proportion to their comparative notability. While Mr. Lambert's approach does not at first blush square with classical thermodynamics (again, I haven't familiarized myself yet), I see absolutely no reason why it should be deleted out of hand as was done recently.
It seems to me the appropriate approach here would be to include this material in a reasonable summary form, linking to another main article if it's too lengthy to explain in a brief section, and note for the reader of the article that it's "an alternative modern approach", or some other way of accurately expressing its positioning in the marketplace of ideas (per WP:VER of course). . ... Kenosis 16:00, 9 October 2006 (UTC)
Comment: for everyone to note, two of those who want to keep Lambert's article and related theories (e.g. "entropy=dispersion not disorder"), i.e. User:Jim62sch and User talk:Dave souza, seem to be doing this, based on their edit and comment history, only for intelligent design purposes, whatever those are. On the second law of thermodynamics talk page, for example, User:Dave souza has stated that the second law of thermodynamics as stated by Rudolf Clausius in 1854, i.e. “heat cannot of itself pass from a colder body to a hotter body”, is now incorrect due to recent talk page discussions. He seems to think that Lambert's website theories are the correct ones rather than Clausius's, and he is the one who started this mess by adding 9 ref links to Lambert's website. Whatever the case, Souza’s views and edits are not scientific.
Moreover, today I looked through my collection of entropy-related textbooks (70 thermodynamics textbooks, 7 physics textbooks, 5 biochemistry textbooks, 4 chemistry textbooks, 3 physical chemistry textbooks, and others) and the “energy dispersal” concept is not there, and neither is Frank Lambert. This is entirely a sideline website theory that happens to have good search rankings because the author has bought up all of the URLs related to entropy, e.g. ‘entropysimple.com’, ‘entropysite.com’, ‘2ndlaw.com’, ‘secondlaw.com’, etc., and he uses these to push out his own personal theories. Now, there is certainly nothing wrong with that. But self-published website theories do not establish scientific correctness, nor do they justify inclusion in the Wikipedia entropy article, which is where all this mess is stemming from. -- Sadi Carnot 10:18, 9 October 2006 (UTC)
I note that in your second paragraph you actually address the article, which is good - but as there is a list of textbooks above, it appears your collection may be out of date. KillerChihuahua ?!? 11:27, 9 October 2006 (UTC)
It appears to me that part of the problem here is over the common meaning of the word "disorder" or "disordered". Despite the use of the word to describe entropy, despite the involvement of chaotic factors and difficulties defining the boundary characteristics of closed systems, entropy is a process that is as orderly and quantifiable as are "permeability", "diffusion", "homeostasis", "absorption", "osmosis", "convection" and other such dynamics. It seems to me the use of analogies (such as were used in the recently deleted section) does not threaten the order of the cosmos or even the order of WP, nor should it threaten editors of the current article. Unlike the article on the Second law of thermodynamics, this article need not, in my judgment, be necessarily limited to formulaic and/or logical positivist explanations of the concept of entropy. Surely there must be room to gain a stable consensus on agreeable ways to explain this stuff to the reader in an analogical way, per WP:NOR, WP:VER, and WP:NPOV#undue_weight of course. ... Kenosis 16:34, 9 October 2006 (UTC)
Yup, been there done that, got tired of it and stopped editing 2LOT for about six months. See Talk:Second law of thermodynamics/creationism for the old discussion. Wade wanted it in, I didn't, vote on the straw poll was about split. Even Wade (a creationist and ID proponent) knew it was bogus, so there shouldn't be any editing wars about that, at least. And now I move we close this section and get back to the Entropy article here. KillerChihuahua ?!? 19:50, 9 October 2006 (UTC)
We have one mention of Lieb and two of Caratheodory as suggested additions to the article. Anyone else?
I find it difficult, as I must, to refrain from vigorous and unacceptable language. Please consider it interspersed between each word of the following.
My goal in spending a great deal of time on this Talk:Entropy list in July, and now, was only to have the Wikipedia Entropy section written so as to aid beginning students and the general public who might access Entropy. My contributions to the field, as you now can see in detail at "Non-notable?"(written due to YOUR DEMAND, not my choice) have been adopted by the majority of authors of US general chemistry texts to be the way entropy should be presented to beginners. (The change also in Atkins is not trivial.)
However, those views that were patiently and slowly developed in July and recently were not considered/deliberated seriously by Sadi Carnot, Kats and the three or four others -- a very small number of individuals, none with experience in educating chemistry students -- who have dominated the Talk:Entropy page since I first saw it in July.
I care NOTHING about my name being mentioned ANYWHERE! In a series of statements from Carnot, Pjacobi and others recently, anything I wrote at the time or in the past was denigrated because it was my POV or that I was not notable because my views had not "come from others". THAT is the only reason that I took a day -- I do not type rapidly, I am now 88 -- to prepare the formal list of texts and literature citations.
And yet -- is it conceivable? -- Sadi Carnot, after that detailed list of my peer-reviewed articles and the amazingly long list of textbooks that have changed in only four years, writes of me "....who has never published...for a peer-review.." Sadi demonstrates here that he does not read carefully. His ten year investment in reading [in the same careless manner he approaches reading my presentations? I hope not.] has resulted in his sense of ownership of the Entropy article in Wikipedia. My peer-review has not been only three seminal articles. It consists of 36 established textbook authors, my peers and superiors, whose jaundiced eyes are far more reliable, I believe, as to the educational worth of any statement about entropy than those of a wannabe Sadi Carnot.
Forget me and my name. What I think is important is that this approach involving energy dispersal be made an initial part of the Entropy page because young students are impatient -- some even desperate -- when they hit Wikipedia for a quick explanation of entropy. They are NOT, in general, going to do a scholarly scan of the whole article and then go to other links. I believe the same is true for most of the general public who will be accessing Entropy in Wikipedia. As I've said to some of you, "Does Wikipedia serve the learner as well as the learned?"
Forget me. Omit my name completely from anywhere. Literally. I do NOT care. But I DO care for what has been my life's devotion. So, for the sake of millions of students now and in the future, please don't omit an approach to entropy that is now seen as valid by so many text authors. FrankLambert 17:23, 9 October 2006 (UTC)
As I said above, I trust the personal frustrations can be put aside, and a stable consensus achieved about how to integrate a few useful analogic explanations into the article. Speaking for myself to all involved editors, in my judgment this article needn't be a totally positivist, technical explanation (as would be more arguable in the article on the second law, though even there laypersons may deserve a good common-sense analogy or two to help them along). Certainly there must be some reasonable way to accomplish a workable combination of technical accuracy and reasonable analogies that would be understandable to the average layperson. ... Kenosis 18:06, 9 October 2006 (UTC)
First, it is not the credibility of the theory that is at issue, but the usefulness of the explanation to the reader of the WP article as proposed to be drawn from Lambert's example of a growing contemporary pedagogical approach, that of describing entropy as "energy dispersal" rather than as "disorder".
Second, there is no contradiction at all between describing entropy as "energy dispersal" and the mixing example. The mixing example is a combination of two basic concerns, that of the two substances mixing physically and that of their respective energy seeking equilibrium among the two original sets of particles now mixed. ... Kenosis 03:02, 10 October 2006
Energy dispersal is quite well quantified by the formula S = k_B ln Ω, where Ω is the number of microscopic configurations and k_B is Boltzmann's constant. It is quantified as S in a thermodynamic system (the current state of energy dispersal throughout a defined environment), and changes in energy dispersal are given by delta-S (ΔS, the change in S). ... Kenosis 21:22, 10 October 2006 (UTC)
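As a side note, the Boltzmann relation S = k_B ln Ω mentioned in the comment above is easy to evaluate directly. The sketch below is purely illustrative (the function name and the toy Ω values are my own choices, not anything from the discussion); k_B is the standard CODATA value.

```python
import math

# Boltzmann's entropy formula S = k_B * ln(Omega), where Omega is the
# number of accessible microstates. k_B is in joules per kelvin.
K_B = 1.380649e-23  # J/K

def boltzmann_entropy(omega):
    """Entropy in J/K of a system with `omega` equally likely microstates."""
    return K_B * math.log(omega)

# A single microstate (Omega = 1) gives zero entropy; doubling Omega
# adds k_B * ln 2, one "bit" of physical entropy.
s1 = boltzmann_entropy(1)   # 0.0
s2 = boltzmann_entropy(2)   # k_B * ln 2
```

Note that ΔS between two macrostates is then k_B ln(Ω₂/Ω₁), which is why doubling the number of configurations always adds the same fixed increment.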
Folks, Please keep in mind that there are plenty of systems which have no concept or definition of energy or temperature, but for which one can explicitly and precisely define entropy. This is generally the case for information theory ( information entropy), and for many cases studied in dynamical systems ( Kolmogorov-Sinai entropy). Insofar as energy is a kind of constant of motion that is kind-of-like conjugate to time, then maybe the change of entropy over time is kind-of-like energy dispersal... Hmmm. I'm not really aware of any way to take the formal mathematical definition for a system with dissipation ( wandering set), and turn it into some theorem involving entropy. But this could be an interesting topic to explore ... Is entropy kind-of-like the dissipation of constants of motion over time? Hmmm. linas 06:24, 10 October 2006 (UTC)
I've removed the following sentence, because its phrasing and placement in the section was confusing if not incorrect. It was impossible to tell what the "converse" was here. ... Kenosis 20:15, 10 October 2006 (UTC)
I replaced the reference to "entropy of mixing" at the end of that section, explaining that it's a specialized case of entropy described by its own formula. ... Kenosis 20:25, 10 October 2006 (UTC)
The definition of physical entropy (S=k ln(W)) takes into account both "thermodynamic entropy" and "configurational entropy". It's an unfortunate choice of names - it implies that configurational entropy is somehow "outside" of thermodynamics, which it is not. Jheald's thought experiment was an excellent way of seeing that. Also, you are disagreeing with Frank Lambert when you call configurational entropy a dispersal of particles and thermodynamic entropy a dispersal of energy. According to that reference provided by Dave Souza, his position is that even configurational entropy constitutes a dispersal of energy (with which I disagree). PAR 01:25, 11 October 2006 (UTC)
Regarding the insistence on stating that there is a net "decrease" of entropy in the hypothetical room in which ice is placed into a glass of water, please let's at least get the basics straight. When a room (a closed system) gives up some of its available heat energy to a glass of ice water within it, the room's entropy has increased in between the time the ice was placed into the glass and the time that the contents of the glass has returned to room temperature. That is, there is a net loss of available energy left to dissipate within that room. That is a most basic principle of entropy. The Second Law tells us that this room, having introduced the added component of the ice, will seek such an increase of entropy until it arrives at a steady-state maximum value of dispersal of its available heat within its boundaries. ... Kenosis 00:46, 11 October 2006 (UTC)
"the entropy of the system of ice and water has increased more than the entropy of the surrounding room has increased."
"the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased."
Moreover, several statements made just above betray a very fundamental misunderstanding of what entropy is: User:PAR's statements "Any object which simply grows colder, loses entropy. If it is cooled to absolute zero, its entropy is zero. (or really small anyway). The room grows colder, therefore its entropy decreases." are simply incorrect, and confuse entropy (the quantity S) with heat itself. ... Kenosis 01:28, 11 October 2006 (UTC)
No! The formula is:
ΔS = q_rev / T ; or
dS = δq_rev / T , where the SI unit of entropy is " joule per kelvin" (J·K−1), which is the same as the unit of heat capacity.
PAR, I believe you're confusing entropy with heat itself. Entropy is the dispersal of heat energy throughout a medium or into surrounding media. Delta-S is the change in dispersal of heat energy across a medium or into surrounding media, and not the change in temperature. ... Kenosis 02:39, 11 October 2006 (UTC)
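A minimal numerical sketch of the Clausius entropy-change formula ΔS = q_rev/T may help keep the quantities straight here. The scenario (10 g of ice melting reversibly at 0 °C) is my own illustrative choice; the latent heat of fusion is the standard textbook value of about 334 J/g.

```python
# Entropy change of melting 10 g of ice reversibly at its melting point,
# using Delta-S = q_rev / T (valid because T is constant during melting).
L_FUSION = 334.0   # J/g, latent heat of fusion of water (standard value)
T_MELT = 273.15    # K, melting point of ice

mass = 10.0                # g (illustrative choice)
q_rev = mass * L_FUSION    # heat absorbed reversibly, in joules
delta_s = q_rev / T_MELT   # entropy change of the ice/water, in J/K
# delta_s is about 12.2 J/K: entropy has units of energy per temperature,
# which is why it cannot be identified with heat (joules) itself.
```

The point the units make: heat is measured in joules, entropy in joules per kelvin, so the two quantities can never be conflated.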
Ok, regarding the sentence, just so we don't get mixed up, we agree that there are three systems here: the glass of ice water (g), the room apart from the glass (r), and the room and glass together (R).
I think we can agree that R is a closed system. I'm ok with that. We agree that r and g are open systems, right? There's energy flowing from the room r to the ice water g as the ice water melts, so they are both open systems. I am saying that as the situation goes to equilibrium, the entropy of g increases, the entropy of r decreases, and the entropy of R, which is the sum of both of the above entropies, increases, as demanded by the second law, since R is closed. Maybe the sentence could read:
"the entropy of the system of ice and water has increased more than the entropy of the room and ice water together have INcreased."
or maybe:
"the entropy of the system of ice and water has increased more than the entropy of the room (excluding the ice water) has DEcreased."
PAR 03:14, 11 October 2006 (UTC)
Has nobody noticed that putting a particular theory on a separate page is completely in contradiction to NPOV? The only way to do something of the sort within the rules might be to link to an external source. It is really just the same as Origin of species (Biology) and origin of species (biblical) DGG 05:03, 13 October 2006 (UTC)
Professor emeritus Harvey Leff (Am. J. Phys. 1996, 64, 1261-1271; “Maxwell’s Demon 2, Entropy, Classical and Quantum Information, Computing” (IOP – now Taylor Francis, 2003) has read the many requests that PAR posted to Talk:Entropy for a mathematical quantification of the spatial dispersal of molecules in mixing. Leff stated that, although noble in principle, a more detailed quantitative treatment of molecular behavior in mixing is probably impossible, but also unnecessary.
Levine (“Physical Chemistry”, 5th Edition, 2002, McGraw-Hill; p. 90) states: “The entropy of a perfect gas mixture is equal to the sum of the entropies each pure gas would have if it alone occupied the volume of the mixture” (temperature unchanged). Meyer (J. Chem. Educ. 1987, 64, 676 ) stated, ”…processes generally described as ‘mixing’, that is, a combination of two different gases …has absolutely nothing to do with the mixing itself of either” (italics in the original). Craig ("Entropy Analysis", 1992, VCH Publishers; p. 92) says about the "entropy of mixing" (quotes are his), "It is the spreading out of the molecules in space that is crucial, not the intermixing of them."
When A and B are mixed, their groups of different molecules can move to spread their “A” or “B” internal energy throughout the larger combined volume. From classic considerations of quantum mechanics of a moving particle in a box, energy levels of any Boltzmann distribution in a larger box are closer or ‘denser’ than in a smaller volume. Thus, if energy levels are closer in the final volume of A and B, their respective initial internal energies now have many more energy levels for their individual molecular energies (alternatively, ‘for their moving molecules having particular energies’). The result is an increase in accessible microstates for A and for B in the final mixture at equilibrium – just as there is the same increase in either A or B alone expanding to that final volume.
Using semipermeable membranes, a possible reversible process for changing the unmixed initial to the mixed final state is illustrated in Levine (Fig. 3.9). The macro equation of ΔS = R ln V2/V1 , for one mole of either A or B, is ΔS = R ln 2V1 / V1 = R ln 2 = 5.76 J/K. Thus, the total entropy increase for A and B in the final mixture is 11.5 J/K. The conceptual meaning of this entropy increase is identical to that of any substance’s internal energy becoming more dispersed by its molecules colliding and moving into a greater volume of three-dimensional space, if it is not constrained.
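The arithmetic in the paragraph above is straightforward to reproduce. The sketch below simply re-derives the quoted numbers from ΔS = R ln(V2/V1) with each gas doubling its volume; nothing here goes beyond the figures already given.

```python
import math

# Entropy of mixing as free expansion: Delta-S = R * ln(V2/V1) per mole,
# with each gas (A and B) doubling its volume, so V2/V1 = 2.
R = 8.314  # J/(mol*K), gas constant

ds_one_gas = R * math.log(2)  # one mole of A (or B) expanding into 2*V1
ds_total = 2 * ds_one_gas     # A and B together in the final mixture

# ds_one_gas is about 5.76 J/K and ds_total about 11.5 J/K,
# matching the values quoted from the Levine illustration.
```

Note that the same 5.76 J/K would result if either gas alone expanded into the combined volume, which is exactly the "spreading out, not intermixing" point made by Craig.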
"... gulp ..." ... Kenosis 14:39, 17 October 2006 (UTC)
"The entropy of a perfect gas mixture is equal to the sum of the entropies each pure gas would have if it alone occupied the volume of the mixture” (temperature unchanged)"
"It is the spreading out of the molecules in space that is crucial, not the intermixing of them."
"When A and B are mixed, their groups of different molecules can move to spread their “A” or “B” internal energy throughout the larger combined volume."
Sorry, I’ve been away and have to go quickly. Norman Craig (text: Entropy Analysis) says: “Entropy in thermodynamics always means entropy change.” S lacks meaning without its reference to a substance/system at T. S0 refers to a change from 0 K – the dispersal of thermal energy from the surroundings/T to a substance beginning at 0 K for incremental values of T up to 298.15 K. Specifically, the process involves calculation from 0-10 K, then summation from 10 K to 298.15 K of incremental/reversible measurement of heat transferred/T (shown in most phys chem. texts as Cp/T versus T, with the area under the curve to 298.15 K as the summation of ∫ Cp/T dT). S0 of course is the most telling example of the importance of viewing energy dispersal as integral to any consideration of entropy change. FrankLambert 23:06, 17 October 2006 (UTC)
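The standard-entropy procedure described in the comment above (summing incremental q_rev/T, i.e. the area under a Cp/T versus T curve up to 298.15 K) can be sketched numerically. The Cp(T) model below is purely hypothetical -- a Debye-like T³ law capped at a constant, with made-up coefficients -- and is meant only to show the shape of the calculation, not any real substance's data.

```python
# Sketch of S0 = integral of (Cp / T) dT from near 0 K to 298.15 K,
# evaluated with the trapezoid rule.

def cp(temp):
    """Hypothetical heat capacity in J/(mol*K): T^3 law capped at 25.
    The 1e-4 coefficient and 25 J/(mol*K) cap are illustrative only."""
    return min(1e-4 * temp**3, 25.0)

def s0(t_max=298.15, n=10000):
    """Trapezoid-rule estimate of the integral of Cp/T from ~0 K to t_max."""
    t0 = 1e-3  # start just above 0 K to avoid dividing by zero
    h = (t_max - t0) / n
    total = 0.5 * (cp(t0) / t0 + cp(t_max) / t_max)
    for i in range(1, n):
        t = t0 + i * h
        total += cp(t) / t
    return total * h
```

Because the integrand Cp/T is positive, S0 grows monotonically with the upper temperature limit, which is the quantitative sense in which heating always disperses more energy into the substance.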
When there is unhappiness about a piece of writing, I find it useful to ask the questions, "Who will read this?" and "Who is the intended reader?". Doing this about this article, it seems to me that very few people will read it usefully. There will be a group of well-trained physicists who might have a look at it, but they will learn very little because they already know it. Someone who knows nothing about entropy, but wants to know, will read the first paragraph and run away screaming. I think the same is likely to be true about chemistry students (I have taught Physical Chemistry for many years although my expertise is in quantum chemistry). I suspect that more students come across entropy in chemistry classes than in physics classes. It has been suggested, and I note a recent more detailed discussion on the citizendium forums, that all articles should appeal to everybody by being written at three levels - first a very general introduction for someone who knows nothing about the topic, second a discussion appropriate to a beginning undergraduate meeting the topic for the first time, and third a detailed discussion of the whole topic. It has to be said that this article fails these criteria.
What is to be done? I welcome input from others as I only have a few ideas. First, there should be an introductory paragraph that contains no equations and no mathematics that says what entropy is and how it fits into the general picture. Second, the article and this discussion are full of very rigorous statements and equations. This is the way physicists think, and rightly so. Of course they have a place. However, chemists often do not think in this way and I suggest we need to add some more vague points that appeal to chemists and perhaps to the general reader. Of course we can prefix them with a statement such as "a rough way of looking at this is ..". The suggestions of Frank Lambert fit in here. I do not think that energy dispersal is general. I prefer to think of entropy change on mixing as dispersal of matter. Nevertheless it is a good way of getting across what entropy is all about and it is proving useful to chemists. It needs to be added, because quite frankly this article is very rigorous and exact but gives no real idea of what entropy is all about. -- Bduke 23:09, 17 October 2006 (UTC)
I am surprised that the regular editors of this article have not responded. Have you thought about who actually reads this article and who should read it? Do you realise just how off-putting it is to the newcomer to entropy who just wants a simple introduction to the concept?
A new introduction is needed. I suggest that we add a new first paragraph that might read something like this:-
Could we delete this proposal here? -- Bduke 00:15, 24 October 2006 (UTC)
I looked at this again, and replaced the first paragraph of the intro with the following, pending the weigh-in of other knowledgeable editors:
I'm afraid it is. That is, "confusing". What is wanted in the introduction is some notion of what entropy is for or what it is about, before we start to define what it is. I do not understand your reasons for removing the first sentence. Why "never start with 'while'"? Would something like this be better:-
I am not so sure that conciseness is a virtue here. We want to ease someone into the article by giving them an idea where it is all going. Let us try to imagine someone who has no real idea of what entropy is about or what it is. -- Bduke 08:24, 24 October 2006 (UTC)