This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
In reference to this: "the amount of energy transferred by heating is denoted by δQ rather than dQ, because Q is not a state function while the entropy S is."
That seems really unclear to me. What is the significance of Q not being a state function? Also, it seems like Q *is* a state function; after all, it doesn't matter how the heat got to the system, as long as it is there, right? The δ notation still confuses me, as I've taken plenty of math and have never seen a δ used in algebra or calculus. Fresheneesz 10:55, 4 April 2006 (UTC)
Heat cannot be a state function because the first law of thermodynamics defines it as a transfer of energy, rather than a stored quantity. Internal energy is stored; the term heat describes energy transfers induced by temperature differences.
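To make the state-function point concrete, here is a minimal Python sketch (the gas, temperature, and volumes are my own illustrative choices): between the same two end states of an ideal gas, the heat Q differs from path to path, while the entropy change, evaluated from δQ_rev/T along a reversible path, does not.

```python
import math

R = 8.314          # gas constant, J/(mol·K)
n, T = 1.0, 300.0  # 1 mol of ideal gas at 300 K (illustrative values)

# Path 1: reversible isothermal expansion from V to 2V.
# The gas absorbs heat Q = nRT ln(V2/V1) while doing work on the piston.
Q_path1 = n * R * T * math.log(2)

# Path 2: free expansion into a vacuum between the SAME end states.
# No work is done and no heat flows at all.
Q_path2 = 0.0

# The entropy change depends only on the end states (a state function);
# it is the integral of dQ_rev/T along any reversible path between them.
dS = n * R * math.log(2)

print(Q_path1)  # ≈ 1728.8 J : heat depends on the path taken
print(Q_path2)  # 0.0 J     : same end states, completely different Q
print(dS)       # ≈ 5.76 J/K: the same ΔS for both paths
```

This is exactly why the infinitesimal is written δQ (an inexact differential) while dS is exact.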
In the Units section, perhaps one should also include the statistical definition of entropy, σ = ln g, and maybe even the extended one. -- rhevin 19:36, 23 May 2006 (UTC)
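For readers following along, a small Python sketch of that dimensionless definition (the two-state spin system is my own illustrative example, not something from the article): σ = ln g for a multiplicity g, and multiplying by k_B recovers entropy in conventional units.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def sigma(N, n):
    """Dimensionless entropy sigma = ln g for N two-state spins, n of them 'up'."""
    g = math.comb(N, n)  # multiplicity: number of microstates of the macrostate
    return math.log(g)

# sigma is largest for the evenly split (most probable) macrostate:
print(sigma(100, 50))        # ≈ 66.8
print(sigma(100, 10))        # smaller: far fewer microstates
print(k_B * sigma(100, 50))  # ≈ 9.2e-22 J/K, conventional units via S = k_B·sigma
```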
In the article, it states, "Entropy is the measure of disorder. It is a key physical variable in describing a thermodynamic system," and I'm left perplexed as to how these two statements exist in the same conceptual context when they are directly preceded by a formula that neither quantifies nor qualifies "disorder." In fact, while entropy is a key physical variable that describes how heat moves between physical systems, "disorder" is not - in all my years of thermodynamics, from my classes in grad school to my career as a physicist, I have never measured or calculated "disorder." Putting the word disorder in this article is not a reflection of the variable entropy as it is defined in the Second Law of Thermodynamics. Entropy as disorder only applies to information theory, not physics. What is meant here is multiplicity, or degenerate states, not "disorder." This isn't a POV issue; it's a matter of appropriately describing what is common scientific knowledge. The formula provided describes entropy as it is: energy flow at a specified temperature. If you are going to describe entropy in terms of the multiplicity of degenerate states, then that formula should be provided as well, and properly described - not cast in terms of this misleading term, "disorder." For a really helpful guide for non-physicists or non-chemists, go to http://www.entropysimple.com/content.htm . I plan on making changes to this article to address these confusions soon. Any feedback would be greatly appreciated, as the quality of this article is very important to me. Best regards, Astrobayes 06:55, 7 June 2006 (UTC)
I just finished adding parts to the history section. The whole page seems to be too unwieldy. I propose breaking up the article as follows:
In this manner, the main entropy page will have mini-articles (with {{ see:main}} links attached). See the thermodynamics and gravitation articles for example; these are ones I’ve helped break up in the past. If anyone wants to do this, by all means do so; this is just a suggested outline. I’ll let this idea marinate for a while.-- Sadi Carnot 04:20, 5 April 2006 (UTC)
Done with clean-up/break-up of article. Almost nothing was deleted; only moved around. The main article is now only 5 printable pages (instead of about 15) and gives a nice overview. I hope everyone likes the result.-- Sadi Carnot 03:40, 11 April 2006 (UTC)
The History section is a little unclear. Both Sadi and Lazare Carnot are mentioned, but the timing doesn't quite mesh. I don't know much about the subject, but to me it just doesn't fit together.
There doesn't seem to be any WP reference to Carnot's theorem (not this one). Perhaps all that needs to happen is that this part/sentence is removed. I'll leave it to someone with more knowledge of scientific history to decide. Frelke 06:19, 7 April 2006 (UTC)
Does reverting "Entropy" increase its entropy? Oneismany 05:29, 4 May 2006 (UTC)
“Please put this discussion in chronological order!!!”
I removed the following paragraph as being misleading at best:
Stars are not more ordered than the gas from which they condensed because stars get very hot as they collapse. The sentence on evolution isn't wrong, but it doesn't demonstrate that disorder and entropy are different. Nonsuch 16:09, 9 May 2006 (UTC)
I'm cutting this paragraph, not least because IMO to get into the detail of this discussion is inappropriate for the introduction above the contents list. IMO the introduction is clearer, and gives a more balanced first introduction to the concept of entropy, without this paragraph; and IMO even without the paragraph, it is already right at the limit of how long it desirably should be.
Also, IMO we already indicate how the intuitive notion of "disorder" or "mixedupness" is to be more precisely understood, namely as
That is a definition of how mixed up we consider the system to be. -- Jheald 09:13, 11 May 2006 (UTC).
Also I reverted the last sentence of this paragraph:
Unfortunately not true. Additional assumptions are necessary. Nonsuch 16:09, 9 May 2006 (UTC)
Dude, I look forward to reading your proof of the 2nd law, starting from the classical dynamics of an isolated system and indistinguishability, in a mainstream physics journal. In the meantime you're talking utter nonsense. There is no such proof, at least none that is widely accepted. One runs into all sorts of deep philosophical and physical difficulties. At some point you have to assume molecular chaos, or coarse graining, or stochastic dynamics, or some procedure that throws away information. And remember that your purported proof has to be resilient against attacks by Maxwell's demons and other intelligent opponents. This is not a trivial problem. Nonsuch 03:17, 11 May 2006 (UTC)
Not true. My many definitions are now at the heart of many Wikipedia articles, which before me were incoherent mumbo-jumbo without clear definitions. I am proud that my expertise can improve the quality of Wikipedia, which is quite poor (especially compared to real encyclopedias, like the old but good Soviet encyclopedia, Britannica, or the encyclopedia of physics I have).
By the way, many of my definitions (later removed by Nonsuch and others) are exactly the same as in these encyclopedias; I recently checked. These encyclopedias were written not by amateurs, as Wikipedia is, but by experts in their fields; that is why they (printed encyclopedias) are so good and concise. Especially when I compare my definitions with good textbooks or the US encyclopedia of physics. I intentionally do not quote these credible sources until enough ill-justified reverts accumulate and long discussions go nowhere, exactly due to the removal of my clear definitions and their replacement with nonsense mumbo-jumbo.
Do I make spelling and sometimes grammatical errors? Yes, of course I do, because English is not my native language. Despite that, I have published (and continue to publish) plenty of scientific papers in English; scientific journals accept them just fine. What surprises me is that you and many others who claim English as their native language also make a lot of spelling and grammar mistakes(!). Do you want me to quote yours? Just let me know (in general, I do not like to get involved in personal fights and consider edit wars a waste of time, but if you provoke me I may be forced to). So, feel free to correct my grammar and spelling instead of reverting back to undefined nonsense. That is the exact spirit of Wikipedia: to jointly increase knowledge rather than blindly delete it. Sincerely, Enormousdude 16:16, 11 May 2006 (UTC)
Guys, please first clearly DEFINE your terms to avoid endless (and sometimes incoherent) mumbling. Start with the definition of order/disorder (you are welcome to read my correct but again-deleted statement about the relationship between entropy and disorder). Sincerely,
Enormousdude
16:16, 11 May 2006 (UTC)
Finally, a sort of consensus on the horizon. That is exactly what I meant by adding the edit which says that entropy and disorder are different things. Indeed, you see: there is no accurate definition of disorder. Obviously, without such a definition one cannot say that an increase of entropy results in an increase of disorder (or vice versa). My example about collapsing gas shows that if one uses an ill-defined "intuitive order", then such "order" may appear to be sometimes decreasing and sometimes increasing as entropy increases. And all kinds of misconceptions about the second law, the arrow of time, etc. in relation to disorder arise simply as a consequence of the lack of an accurate definition of disorder. Indeed, suppose a gas which was filling space (say, a container a few light years across) gravitationally collapses into a very small volume (say, a star a few light seconds across). Which system is more "ordered" then: the first, where the gas was evenly distributed over the entire container, or the second, in which practically all the gas occupies only 1×10^-22 of the same container? What does your "intuitive disorder definition" tell you?
Sincerely, Enormousdude 22:06, 11 May 2006 (UTC)
Ice melting - a classic example of entropy increasing. Is it really? Can't one contrive reversible melting simply by putting ice in water at 0 deg C? Surely gas expanding into a vacuum is a much better example? LeBofSportif 19:48, 12 May 2006 (UTC)
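As a rough check on the reversible-melting idea (the molar heat of fusion below is the standard literature value, assumed on my part): at the melting point the two phases coexist, so dS = q_rev/T applies directly to the phase change.

```python
# At 0 °C ice and water coexist in equilibrium, so melting can be carried
# out reversibly, and dS = q_rev / T applies directly to the system.
dH_fus = 6010.0  # molar enthalpy of fusion of ice, J/mol (literature value)
T_m = 273.15     # normal melting point of ice, K

dS_fus = dH_fus / T_m
print(dS_fus)    # ≈ 22.0 J/(K·mol): reversible, yet the system's entropy rises
```

So reversibility does not disqualify the example: the ice-plus-water sample still gains about 22 J/K per mole melted; in the reversible case the surroundings lose exactly the compensating entropy.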
"Randomness and disorder" has once again been added to the definition. I must stress that we cannot leave "disorder" and "randomness" undefined in this context, simply because the informal use of such words does not in any way, shape or form describe anything about entropy. Although I deeply despise the attachment of the three words "disorder", "randomness" and "entropy", I will simply add a formal definition of randomness and disorder into the intro. Fresheneesz 07:16, 31 May 2006 (UTC)
This page says that the book The Time Machine is based on entropy, but there's no mention of that on the page for that book. Fresheneesz 09:02, 1 June 2006 (UTC)
According to this site http://www.entropysimple.com/content.htm entropy is not related to information entropy and is not disorder. Since the first paragraph of the article directly contradicts this site, I stopped reading in utter disgust. I almost threw up. It was worse than goatse man. —Preceding unsigned comment added by 64.40.60.1 ( talk • contribs) 18:29, 16 June 2006
According to the IEEE, the use of Negentropy and similar terms is now strongly deprecated.
Such words compound the conceptual difficulty people have with the meanings of entropy, with what is effectively a conceptual double negative -- starting with entropy, i.e. the information you don't have, going to negentropy is then the information you don't not have.
Almost always this is not a good way to go, and such terms just get you into larger and larger knots. Schroedinger used negentropy in quite a specific way, for the order that could be built up inside a living system, possible because of the entropy increase being dumped outside the cell. But Brillouin's use was a mess, very confusing.
These coinages have never taken off, and deserve to wither. Negentropy perhaps rates a mention in "see also". The rest really just deserve the ash heap of history -- *most* definitely they do not deserve high-profile special highlighting up at the top of what is meant to be a very simple very mainstream overview article.
-- Jheald 05:09, 20 June 2006 (UTC).
Although the concept of entropy is primarily a thermodynamic construct, it has since expanded into many different fields of study, such as: statistical mechanics, statistical thermodynamics, thermal physics, information theory, psychodynamics, economics, business management, and evolution, to name a few.
Clausius coined the term based on the Greek entrepein, "to turn", suggesting energy turned to waste.
You could be right, but I am quite certain that there are more chemistry students who want to know about entropy than any other type of student. Also, why don't you merge these two articles into one, i.e. [1]+[2]="New name"? -- Sadi Carnot 17:06, 20 June 2006 (UTC)
It is a pleasure to read, since it is pure physics brought to an easy reader's understanding. I was bad at physics and would have hoped my teachers had read this so they would actually teach this matter well. Thanks ;) Lincher 01:10, 22 June 2006 (UTC)
From the Nats article:
"When the Shannon entropy is written using a natural logarithm,
it is implicitly giving a number measured in nats."
It should be noted that units so defined, that are 'implicit' (and this is not uncommon; there are many in the natural sciences), are artificial units. For the Shannon entropy above, H, to have units as fundamental quanta of a physical characteristic of some system, the variable must itself have units. But p_i is a probability, which is always unitless. A more obvious problem with the nat is that p_i is the argument of a logarithm, and the arguments of operators cannot have physical units. Another example of this is the Boltzmann factor: e^(-E) on its own is not a physical quantity, because E has units of energy; the product of the Boltzmann constant k_B and the temperature T likewise gives units of energy, joules. You cannot take the exponent of something with units, just as you cannot take the log of something with units. You could artificially define the units of such a quantity, but they are physically meaningless. Instead, what you see in physics is the quantity e^(-hν/(k_B T)), where h is Planck's constant and ν the frequency of photon emission of some physical system, the product of which is an energy, with units of joules again. Now you have the exponent of an energy over an energy, which is a unitless argument. The quantity now has a physical significance. The same situation is true for the Shannon entropy. The nat is thus an artificial unit, and is therefore unphysical from the standpoint of fundamental physics. This is not to say it is not useful in some disciplines (in particular signal processing).
Those who confuse the two should not feel bad. Confusing artificial, or constructed, units with physical units is common among non-scientists, and even in first-year physics classes I have to constantly encourage my students to watch their units so they do not carry quantities with units into the arguments of operators such as ln, log, exp, and so on. I remind them that only unitless quantities can be carried through these operators, and that in thermodynamics it is the Boltzmann constant that properly scales statistical quantities. (In other words, the nat doesn't refer to anything physical, i.e. "real", and you must therefore multiply it by the Boltzmann constant to get a physical quantity.) It should also be noted that it is quite common for idiosyncratic scientists and mathematicians to name often-used quantities for the ease of talking about them, and this happens with both physical and non-physical (i.e. probabilistic) quantities. I hope this helps give some perspective here. There are a lot of good discussions that can come from this Entropy article. As an instructor of physics for years, I have seen many of these arguments and I would enjoy discussing them all. Here is a great resource if you want to learn more! :) Cheers, Astrobayes 21:44, 23 June 2006 (UTC)
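A brief Python sketch of the base-and-units point (the probability distribution is an arbitrary example of mine): the Shannon sum is a pure number, "nat" versus "bit" merely records which logarithm base was used, and only a factor of k_B attaches physical units.

```python
import math

def shannon_entropy_nats(p):
    """H = -sum(p_i ln p_i) for a discrete distribution, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]          # unitless probabilities (sum to 1)
H_nats = shannon_entropy_nats(p)
H_bits = H_nats / math.log(2)  # change of base: 1 nat = 1/ln(2) bits

# H is dimensionless; multiplying by Boltzmann's constant is what
# produces a thermodynamic entropy carrying physical units:
k_B = 1.380649e-23             # J/K
S_physical = k_B * H_nats      # J/K

print(H_nats)  # ≈ 1.0397 (nats)
print(H_bits)  # 1.5 (bits): same quantity, different logarithm base
```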
Paul, you made several semantic edits which change the context and tone of the article, but you added no citations for any of them. Your edits, if properly cited, could provide a valuable educational addition to the article. If not cited however, they seem an awful lot like POV. Astrobayes 20:24, 22 June 2006 (UTC)
Dear Astrobayes, if I cited Einstein, would that be his POV? I think a good article should at the very least consider feasible alternatives and not tout one POV. To consider what happens in an infinite universe, would be a sensible hedging of one's bets. Paul venter 22:12, 29 June 2006 (UTC)
At the bottom of the Overview section, it reads that the second law does not apply to the infinite Universe, yet further down the article, in 'The Second Law' section, it reads that the entropy of the Universe does tend to increase. Maybe I'm missing something, but this may need clarifying in order to properly state whether or not the Second Law applies to the universe.
The jury is still out on whether the universe we live in is finite in size or infinite. If it is infinite, the Second Law does not apply; if finite, it would. Any statement about the entropy of "the Universe" increasing is premature and speculative. My bets are with the infinite - you choose. Paul venter 21:28, 29 June 2006 (UTC)
This article and Entropysimple.com are up for deletion. They merely contain 4 external links to web pages about entropy. I have added them to this article. If the regulars here like them, they can stay. If not they will go. The two articles concerned are most likely to be deleted and rightly so. -- Bduke 01:07, 27 June 2006 (UTC)
I removed this text: "(Note that this is a simplification: entropy can relate to all kinds of multiplicity of possible microstates. However, with a bit of a stretch, almost all of that multiplicity can be considered as due to different possible energy distributions)." It was not only confusing, as it was not attached to a particular reference or section, but the statements it made both before and after its most recent edits were given no verifiable source. It sounded an awful lot like POV. If it were to be put back into the article, a verifiable source (by Wiki's standards) where that quote can be found should be included... and it should be placed in an appropriate section. A comment such as that which appears at the end of an article is quite odd. Astrobayes 22:43, 29 June 2006 (UTC)
Coming…The Full Monty on Disorder
Learning about Wikipedia and the power of the ‘users’ in the past few days is a Kafka-esque experience! In the real world of science – as with non-anonymous ‘users’ here – an author knows that his work will be read very critically by competent individuals before publication. (But I find that, not mentioned by any of you, the two ‘articles’ referring to my sites were submitted by a high school youngster! This is an amusing indication that I reach an audience of beginners over whose heads you shoot, but also a reason for its deletion. I would not have defended it for the five minutes I did – if I had known how to trace ‘users’ as page originators.)
A far more important shock is that an anonymous ‘user’ can not only misread but then misrepresent my serious writing, without any check with me prior to a public pronouncement. I’m referring specifically to Jheald’s off-base interpretation of four textbooks’ error as a challenge to my entropy approach. (As I glance back, this is just the last of a host of other outworn nonsense, unacceptable to chemists, that now really must be challenged – although all his points have been rejected by current chemistry texts or recent/coming journal articles.)
Click on http://www.entropysite.com/#whatsnew , scroll to December 2005 and down to the texts listed as 11 through 13, and their common serious misstatement. Those texts tell students literally that “Matter tends to disperse”. Period. How does matter, how do molecules or atoms disperse? No association, no involvement with or dependence on molecular kinetic movement or consequently their ‘internal’ energy? That is prima facie ridiculous and a return to the era of phlogiston and alchemy. That is why I downgraded them as inadequate and not recommended. In contrast and in error, Jheald says [they, the texts/authors] “go out of their way to specifically flag up that distribution of energy isn’t ‘quite’ the whole story.” The ‘whole story’, Jheald and other ‘users’, is that those authors have screwed up! Matter can only disperse if molecular energy is involved. There's no 'quite' other whiffle dust causing it!
I’m surprised Jheald is surprised by mixtures (alloys as one type) having residual entropy at 0 K. Only a perfect crystal is defined as having zero entropy there. I'd enjoy discussing by private email with Jheald (to save others from trivia) the solution to the classic residual-entropy 'problem' substances, such as CO, FClO3 and water: it is the existence of multiple Boltzmann energy distributions that are frozen in at the equilibrium fusion point. (Journal of Chemical Education, by Evguenii Kozliak, publication in press, due out in 3-5 months.)
Ah me. I have now finally read over the whole W entropy chapter -- evidently strongly influenced by Jheald. I’ll have to write a “Full Monty” to demonstrate to you others, if not convince you to join, the trend in all chemistry – including the latest text to delete that sad old “entropy is disorder”, Atkins’ worldwide best-selling physical chemistry. Why should Wikipedia ‘Entropy’ be a relic of the 19th century? FrankLambert 19:25, 30 June 2006 (UTC)
I hasten to thank you for discovering Styer (2000), and mentioning his conclusions to your W colleagues! Please glance at the start of my second paragraph in “Disorder…” and the first citation there: http://www.entropysite.com/cracked_crutch.html (He provided the physics support to me, and I used four of his illustrations. He later contacted me to back him when Ohio was changing its HS exam requirements to include ‘disorder’ in its question re entropy!) I disagree only with his suggestion of an additional ‘parameter’ for entropy: freedom. No way – students and we have suffered enough due to this kind of non-scientific obstacle (‘disorder’) to understanding entropy!
I would also invite you to look at the seminal article of Harvey Leff (physicist and international authority on Maxwell’s demon) in Am. J. Phys. 1996, 64, 1261-1271. I discovered his paper only two days before I submitted the ms. for “Entropy Is Simple, Qualitatively” (which introduced the view of entropy change as a measure of energy dispersal rather than ‘disorder’: http://www.entropysite.com/entropy_is_simple/index.html ). Because of Leff's parallel views, I could smoothly and exactly cite points in his physics approach of “sharing and spreading of energy” to support the qualitative ideas of entropy change that I proposed and that have become accepted in chemistry textbooks. Finding that Leff and I live only ten minutes from each other, we have become scientific friends and associates. We are collaborating on a brief article for Nature. Last month he addressed a meeting of research physicists in San Diego on his spreading/sharing concept of entropy change, and a lengthy paper will be published in the report of that meeting in a few months.
You all would be well advised to read even Leff's 1996 article; it's an early statement of the physicist's counterpart to my low-level qualitative "entropy is a measure of the dispersal of motional molecular energy" for chemistry.
Now to what I intended to use as an Introduction. Thanks very much for your ref to reading that I had not known about, Wikipedia:Neutral_point_of_view. It certainly poses a double problem for me. You see, first, I am a Recovering Disordic, an RD. Just as an RA’s nerves are over-stimulated by only a whiff from the bar adjacent to a hotel lobby, the lengthy discussion in this Talk and the embedded disorder in the Entropy page unduly excite me. Second, I have an equally serious difficulty, but opposite, because it swamps me with serotonin: my POV that began and developed starting some eight years ago has become accepted dogma in chemistry. You in the entropy group seem not to understand that this has occurred. I am writing you about what WAS my POV and is now peer-accepted and in print. Here’s a brief of my sad story:
Never liking thermo, except for the power of the Gibbs free energy relationship, and never having to teach it, I was asked to develop a chem course for non-science majors that would have more meaningful content and interest than any watered-down general chem. I came up with “Enfolding Entropy”, essentially a baby thermo course with minimal math. Marvelous fun for years. Good kids. In a very thorough ‘entropy via order-disorder’ treatment that led to its (mis)application to human affairs in the last week or even two, I felt a bit uncomfortable. But not much. I put that aside because the students really could grasp why it required energy to produce hydrogen from water, the problems in ammonia synthesis, etc., etc.
After my second retirement, from a fascinating non-academic career, I had time to think. I began to wonder why Clausius’ q/T had begun to be applied to shuffled cards and, to start, I checked some history: Clausius, Boltzmann, Planck, and Gibbs, then Shannon and Jaynes, and things began to clear. I strongly recommend it to each of you.
My first paper, on shuffled cards as unrelated to entropy change, caused the deletion of this silly example; my POV of 1999 is now the shared wisdom of all but one chemistry textbook published since 2004, whereas in 1999 it was the principal illustration of entropy increase in most general chemistry textbooks. In 2002, my POV discovery was that ‘disorder’ was a valiant try by Boltzmann but an untenable view because of his limitation to 1898 knowledge (prior to the Third Law, quantum mechanics, and a full understanding of the energetic behavior of molecules).
As I have told you thrice, that seemingly personal POV of 2002 has, by 2006, resulted in the deletion of “entropy is related to disorder” from the great majority of new chemistry texts – including those in physical chemistry (although I can claim my direct influence only to two phys chems). This is perhaps an unprecedented rate of change in an ‘accepted’ concept in texts. It is not my famous name or power in the establishment that caused a total of 25 or so authors and hundreds of text reviewers to change their views! I am just the lucky guy who belled a giant cat that most chemist-authors had known was dead but wanted somebody else to say so.
My POV of 2002, that entropy is a measure of the dispersal of molecular motional energy, has been equally well received and published in the texts mentioned in www.entropysite.com. It is no longer just MY "POV"!
The newest developments are in the area of the coincidence of configurational entropy from classical statistical mechanics with energy dispersal: an article accepted for publication dealing with the residual entropy of CO, etc.; one (in the second round of review) urging the deletion of ‘positional entropy’ from general chemistry, allowing emphasis only on ‘thermal’ entropy; and an article (Hanson, J. Chem. Educ., 83, 2006, 581, April issue; check eqs. 2 and 5, and read the quietly explosive conclusions beneath eq. 5 – and Hanson had opposed me from the start in 2002!). My colleague, E. Kozliak at the University of N. Dakota, has a fundamental theoretical paper in preparation showing why and when configurational entropy coincides with results from energy-dispersal views.
Boltzmann was brilliant, probably a genius, far ahead of his time in theory but not flawless and, most important for us moderns to realize, still very limited by the science of his era. (Some interesting but minor details that are not too widely known: even though he died in 1906, there is no evidence that he ever saw, and thus he certainly never calculated entropy values via, Planck’s 1900 equation S = (R/N) ln W, published in an article but not printed in a book until 1906 as S = k_B ln W, which is carved on Boltzmann’s tomb. Planck’s nobility in allowing R/N to be called ‘Boltzmann’s constant’, k_B, was uncharacteristic of most scientists of that day, as well as now!)
The important question is: what are the bases for Boltzmann’s introduction of order-to-disorder as a key to understanding spontaneous entropy change? That 1898 idea came from two to three pages of conceptual description, a common-language summary, that follow over 400 pages of detailed theory in Brush’s translation of Boltzmann’s 1896-1898 “Lectures on Gas Theory” (U. of California Press, 1964). The key paragraph should be quoted in full; the preceding and following phrases and sentences primarily repeat or support it (disappointingly) without important technical or essential additional details:
“In order to explain the fact that the calculations based on this assumption correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered – and therefore very improbable – state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)
That slight, innocent paragraph of a sincere man – written before modern understanding of q(rev)/T via knowledge of the details of molecular behavior, quantum mechanics, or the Third Law – that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. To it, you and uncounted thousands of others owe the endless hours you have spent in thought and argument. Never having read it, you believed that somewhere there was some profound basis. Somewhere. There isn’t. Boltzmann was the source and no one challenged him; his words have just been accepted for a century. There is NO scientific basis for seeing entropy as involving order and disorder. Boltzmann’s conclusion was fallacious due to his primitive concept of “increased randomness” in a system at equilibrium (what we now recognize as a maximal number of accessible quantum states under the system’s constraints), naively coupled with an even more primitive idea: there must be an opposite, namely, if the final state is disordered, the initial state must be ordered.
To the contrary, scientifically: if the final state has additional accessible quantum microstates – not to be confused with the stat-mech locations in phase space that some of you flip back and forth to calling microstates – then the initial state merely has fewer accessible quantum microstates. The most obvious example, so often mistakenly used in elementary chemistry texts prior to my corrections, is ice melting to water. Sparkling, orderly crystalline ice to disorderly, mobile liquid water. That’s the visual and that’s the crystallographic conclusion, but it is not the conclusion from entropy calculations.
The S° for liquid water at 273 K is 63.34 J/K·mol. Thus, via S = k_B ln W, W = 10^1,991,000,000,000,000,000,000,000 (quantum microstates). But for crystalline ice at 273 K, with an S° of 41.34 J/K·mol, W = 10^1,299,000,000,000,000,000,000,000. Does this show that ice is “orderly” compared to liquid water? Of course not. Ice simply has fewer accessible quantum microstates. And that is the universal modern criterion.
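Those exponents can be checked directly from S = k_B ln W, i.e. log10 W = S°/(k_B ln 10); a quick Python check (using the CODATA value of k_B) reproduces them to roughly the precision quoted:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def log10_W(S_molar):
    """Base-10 exponent of W in S = k_B ln W, for one mole's entropy S (J/K)."""
    return S_molar / (k_B * math.log(10))

print(log10_W(63.34))  # ≈ 1.99e24: liquid water at 273 K, W ≈ 10^(1.99e24)
print(log10_W(41.34))  # ≈ 1.30e24: ice at 273 K, W ≈ 10^(1.30e24)
```

Both numbers are astronomically large; the argument turns only on water having the larger exponent, i.e. more accessible microstates than ice.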
Gibbs’ “mixed-upness” is totally irrelevant to ‘order-disorder’ or any other discussion. It comes from a posthumous fragment of writing, unconnected with any detailed argument and not supporting any concept proposed by Gibbs.
Aware of the thinking of Lazare Carnot and Kelvin, among others, Clausius enormously increased the power and applicability of their general idea of heat becoming dissipated in a process by including the essential factor of the temperature at which the energy dissipation/dispersal or spreading out occurred, via dS = dq(rev)/T. Thermodynamic entropy cannot ever be separated from energy being ‘transformed’ (as Clausius viewed it) or somehow dispersed (in modern words), divided by T.
Even though Joule, Kelvin, Clausius and Maxwell thought about thermodynamics in terms of molecular motion, lacking knowledge of molecules other than bits such as velocity and mean free paths, Clausius’ conclusions could only be framed in/from macro parameters. The detailed nature of what was occurring in dq as a general symbol for a change in the internal energy of a system, was ignored for a century because of the remarkable success of classical statistical mechanics in predicting the magnitude of entropy change in many processes.
For example, such spontaneous events as gas expansion into a vacuum or the mixing of gases or liquids could only be treated by Clausian entropy by finding the quantity of work required to restore the system to its initial state (equaling –q for the forward process). In contrast, by allocating ‘cells’ or ‘microstates of location in phase space’ for molecules, or all possible combinations of several molecules, statistical-mechanical combinatorial methods could directly count and predict the entropy change in a forward process of expansion or mixing.
It appeared that, because only the number of locations of cells was being counted (and the temperature was unchanged), this ‘configurational or positional’ entropy was unrelated to Clausian ‘thermal entropy’ and apparently did not involve molecular energy. In any equation, the initial energy was unchanged and thus dropped out of the Sackur–Tetrode or ln(W_final/W_initial) equations. But conceptually there is an obvious error – the initial motional energy of the molecules in an expansion, or in forming a mixture with another component’s molecules, is becoming more spread out in a three-dimensional volume and more dispersed in changing to a new equilibrium. (This is precisely what happens to the energy of a hotter surroundings plus a cooler system. The initial total energy of system plus surroundings is not changed, but some motional energy of the hotter surroundings’ molecules has become spread out to those in the cooler system, raising it to some temperature that is an equilibrium of the universe of surroundings and system.)
The cause of the error also becomes obvious once stated quantitatively: in phase space, a cell’s ‘location in space’ is also its ‘energy in that space’, and thus any combinatorial measure of the increased number of locations for a gas expanding to fill a greater volume is exactly the increased number for the dispersal of that gas’s initial motional energy. Both configurational entropy and thermal entropy measure what Clausius, Kelvin and Carnot tried to express with macro parameters for spontaneous processes – defined in modern terms as “changes occur in systems to yield an equilibrium wherein the motional energy of the molecules in a system is maximally dispersed at the final temperature, in the sense of having the maximal number of accessible quantum microstates for the system’s energy at that temperature”.
I once wrote “…violently energetic, speeding gas molecules will spontaneously move into an evacuated chamber or mix with another gas…summarized as increases in entropy.” Oberlin thermodynamicist Norman Craig corrected me: “…in saying that, you are ‘smuggling in’ entropy. Movement of molecules does not cause an entropy increase even though [it] enables an entropy increase.”
He is right. Two factors are necessary for thermodynamic entropy change. An increase in thermodynamic entropy is enabled in chemistry by the motional energy of molecules (that, in chemical reactions, can arise from bond energy change). However, entropy increase is only actualized if the process makes available a larger number of arrangements for the system’s motional energy, a greater probability of an increased distribution for the system’s energy.
The two factors, energy and probability, are both necessary for thermodynamic entropy change but neither is sufficient alone.
From this it can be seen that ‘configurational or positional’ entropy – when thought of only as ‘a measure of the probability of locations’ – is a smuggling-in of thermodynamic entropy: the essential enabling factor that makes actual physical/chemical change in matter possible, and not just an exercise on paper or a computer screen, is missing. A student – or an experienced person, as you all demonstrate – can be misled into thinking of it as 'just due to arrangements in space'.
The misnomer of “entropy” should always be in quotation marks if the subject of thermodynamic entropy is present in the same article. The reason is clear (as stated above and repeated here):
Two factors are necessary for thermodynamic entropy change. An increase in thermodynamic entropy is enabled in chemistry by the motional energy of molecules (that, in chemical reactions, can arise from bond energy change). However, entropy increase is only actualized if the process makes available a larger number of arrangements for the system’s motional energy, a greater probability of an increased distribution for the system’s energy.
Landauer? Check his "Minimal Energy Requirements IN COMMUNICATION", Science, 272, 1996, 1914–1918. I'll ask Leff about this next week, but I am only concerned about thermodynamic entropy and, as stated above, the gulf between thermo entropy and info "entropy" is absolute and unbridgeable. von Neumann has robbed you all of weeks of your lives or even more. Some cruel joke, that. FrankLambert 21:32, 1 July 2006 (UTC)
I have challenged Nonsuch, and urge others, to tell me what modern (2004 and after) university chemistry textbooks, including physical chemistry (the text that most emphasizes physics, but excluding biochem; they have to be sloppy), still use 'disorder' in introducing entropy to students. The question is important because (1) it is not my POV; it's objective data re the deletion of 'disorder' as related to entropy in chemistry instruction; and (2) entropy and the second law are far more central in beginning chemistry than in beginning physics. (Compare the pages devoted in each to the second law, the entropy concept(s) and calculations, and the use of the Gibbs free energy relationship with calculations.)
Although you and I were thoroughly taught that "entropy increase is 'disorder' increase", after learning of its origin in Boltzmann's innocent but unscientific statement from 1898, we can be free. The revolution in chemistry texts is real, objective, here now. A modern encyclopedia should certainly be as modern as elementary instruction about a topic! What other goal? Obsolete stuff? "Energy dispersal"? Kats (and others) perhaps have not read my description of entropy change as simply what lies behind the basic dq/T – it represents the dispersal/spreading out of the motional energy of molecules (heat, to Clausius) to a greater volume of space, divided by T (whether heating [surr. + sys., always], gas expansion, or fluid mixing, i.e. all types presented to beginners). That is initial and superficial, BUT it can be readily communicated even to 'concrete-minded' students [or Wikipedia readers] almost instantly because, in chem, they have been told about fast-moving molecules (hey, they're going a thousand miles an hour on average at ordinary temps, five times faster than NASCARs, and continually colliding with each other, etc., etc.). But then, the advantage of this low-level approach is that it can seamlessly lead to the next level, scientifically.
That next level: the quantization of all energies – molecular energies on quantized levels specifically (simple QM deductions: more energies spread out on higher Boltzmann-distribution levels in hotter systems, denser energy levels in systems with increased volume, etc.) – the idea of QM microstates (NOT 'cell-model' stat mech 'microstates'!), and the realization that an increase in entropy is related to a greater number of accessible QM microstates – what Leff calls "temporal spreading of energy": more choices of a different microstate at the next instant in time.
Compare that to the crap of 'disorder'. If you stick with 'disorder', you deserve the reward of being mired, don't you? :-) FrankLambert 06:19, 3 July 2006 (UTC)
Since the debate on Entropy and Disorder isn't really going anywhere, I thought I would try a different tack, and make abject appeals to authority. (Not my favorite approach.) Frank's appeal to authority is that recent physical chemistry textbooks use the term "disorder" far less than they used to. I find this unconvincing since introductory physical chemistry textbooks really suck at explaining the deeper principles. (I curse Atkins to this day.) In response, let me quote Callen, arguably one of the finest books on thermodynamics yet written:
Or how about Feynman's Lectures in Physics, another very good book :
Or "Thermal Physics" by Finn, another good, modern thermodynamics textbook.
Whatever. Nonsuch 00:51, 6 July 2006 (UTC)
Nonsuch, first put in your contacts so you can read what you wrote, a powerful support for one principle that I have been pleading for: 'disorder' is a dead obstacle between entropy and a measure of the number of microstates in a system. Look:
Old Callen (1985) wisely said (you tell me): "...entropy is the quant. measure of...the relevant distribution of the sys. over its permissible microstates".
And even Feynman gets it right sometimes (!!) in his famous quotation: "The logarithm of that number of ways [the number of ways that the insides can be arranged, so that from the outside it looks the same] is the entropy."
How many times have I told that to Jheald, the information "entropy" maven who is working the wrong side of the street from his day job at the Information "entropy" Wiki page! I'm not arguing from authority, I'm arguing from history – you guys never read that lead paragraph of Boltzmann's that was the tiny, non-scientific cause of the whole mess for a hundred years. You're bound up in your obeisance to unsubstantiated dogma that was given you by people you respect, but who were as uncritical of what THEY were taught as you. I'm arguing from success: 'disorder' is dead – and you're the hapless rear guard in its defense, just as all 19th-century leaders had to die before the ideas of younger people were accepted (tetrahedral carbon, Boltzmann's theory, etc., etc., etc.). With 'disorder' gone, what is the replacement?
You forget that you are writing for a wide range of backgrounds and abilities in Wikipedia about thermodynamic entropy. FIRST, should come the simple presentation, from which you build to the more sophisticated view of quantized energy states and then microstates. Only fourth, fifth or tenth should you hint about other interpretations of entropy and refer to your details in info "entropy".
As I've told you, and repeat (because you continue to lose your contacts), first should come the simple examples of the second law, spontaneous events; THEN go to what's happening on the molecular level (tell 'em about those incredibly speeding, banging molecules – hey, this is chemistry in thermodynamics, not exclusively physics!); then apply that to gas expansion into a vacuum, the old two-bulb deal, then perhaps food dye in water or cream in coffee. Molecular motional energy disperses!
Only then begin to develop energetics, microstates, etc.... for the boffo climax that we all know about – THIS is what entropy IS. FrankLambert 14:59, 6 July 2006 (UTC)
I am not knowledgeable re the bit size of words, sentences or other forms of information and communication. From too long ago, I recall some 'baby calculations' of 6 letters to form a word, but couldn't find details in my files. Yesterday, glancing at Pitzer ('Thermodynamics', 3rd edition, 1995, McGraw-Hill), I was reminded of an important question that I have wanted to ask of a person competent in such matters, as some of you are. Here's the background:
Pitzer was interested in the number of QM microstates as close to 0 K as experimentally feasible, i.e., the W, where S(lowest possible) = k ln W. Essentially, he assumed that the smallest change in S would be 0.001 J/K mol (his S/R equalling 0.0001). I took from his development a different thought: the least change in entropy that can practically be considered/calculated would be S = 0.001 J/K mol, even though one never sees practical values of entropy tabulated beyond 0.01 J/K mol. The math-trivial solution leads to a W = 10 ^ 3,000,000,000,000,000,000, a rather modest number for W, compared to standard state or reaction entropies for a mole of a substance.
My information-bits question: "Is this figure of ca. 10^10^18 in the range of bits in information content or information communication?" The usual entropy change in chemical reactions is, of course, in the 10^10^24 range. Again, "Are there numbers in info content/communication calculations that are in this range?"
My question is important in evaluating a phrase in this and other discussions of entropy increase as involving an increased "loss of information". Glancing at the example that I gave you – a few feet ago! in this Talk:Entropy discussion – of ice and water, do you see how it is beyond human comprehension – except by numbers of microstates, W – to evaluate the 'amount of information'? That for ice is 10 ^ 1,299,000,000,000,000,000,000,000 microstates. Now, do you instantly understand the 'information that is lost' for the water at the same temperature, with its 10 ^ 1,991,000,000,000,000,000,000,000 microstates? It isn't 'appearances' or 'disorder' or 'information' that is vital in evaluating thermodynamic entropy change in science – it's numbers.
Do you now see how inadequate (a nice word) are descriptions such as 'information' and 'lost information' if you want to discuss thermodynamic entropy change scientifically? Numbers, quantitation via numbers of accessible microstates -- coming readily via the approach to understanding entropy that I am bringing to you, and a major reason for its rapid and wide acceptance in chemistry -- is the only way to cope with 'information' in thermodynamic entropy change. FrankLambert 16:48, 3 July 2006 (UTC)
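For what it's worth, the conversion Frank asks about is mechanical: one bit corresponds to kB ln 2 of thermodynamic entropy, so a molar entropy converts to bits by dividing by kB ln 2 (or, per molecule, by R ln 2). A sketch using the S° figures quoted earlier in this discussion (function names are mine):

```python
import math

# Convert molar entropy (J/K·mol) to information-theoretic bits,
# via the correspondence: 1 bit of information <-> kB*ln(2) of entropy.
kB = 1.380649e-23   # J/K
NA = 6.02214076e23  # 1/mol
R  = kB * NA        # ~8.314 J/(K·mol)

def bits_per_mole(S):
    return S / (kB * math.log(2))

def bits_per_molecule(S):
    return S / (R * math.log(2))

print(f"water: {bits_per_mole(63.34):.2e} bits/mol, "
      f"{bits_per_molecule(63.34):.1f} bits/molecule")
print(f"ice:   {bits_per_mole(41.34):.2e} bits/mol, "
      f"{bits_per_molecule(41.34):.1f} bits/molecule")
```

So a mole of water carries on the order of 10^24 bits, but only about 11 bits per molecule – the 'few bits per molecule' scale that single-molecule experiments actually probe.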
FrankLambert said "the least change in entropy that can practically be considered/calculated would be S = 0.001 J/K mol". This is patently wrong. Thermodynamic entropy is only macroscopically large when you consider macroscopic numbers of molecules. But the biochemists (and molecular biologists and biophysicists) whose understanding of thermodynamics you deride are obliged to consider the thermodynamics of single molecules. Experimentally accessible entropies are on the order of a few bits per molecule.
Nonsuch 21:56, 4 July 2006 (UTC)
This article's size is out of hand and, sadly, we have not arrived at a place where the fundamentals of science speak louder than misleading concepts such as "informational entropy." As I stated far above, if you read the peer-reviewed literature on informational Entropy you'll find invented and renamed definitions whose creators have molded them into something appearing like thermodynamic Entropy (I provided a link to a peer-reviewed source above describing this). Mathematical integrals are invented to embody statistical distributions, and Entropy is redefined to conform with them. Entropy, however, is heat that moves from one system to another, at a given temperature. Period. Anything else you're calling Entropy is not Entropy - it's something else. Now we definitely have more experts in physics and chemistry (i.e., individuals who are most qualified to improve the quality of this article) watching this article, so it is my hope to see a future for this article that does not include POV edits by individuals not considering the whole of scientific research on this subject. The readers of Wiki deserve a fair and honest explanation of this important concept, and calling Entropy "disorder" is akin to saying the Bohr model of the atom is correct in light of modern physics. It's really naive of us to continue equating them. This article deserves better, and I am still planning on making a significant contribution to it by providing an overwhelming wealth of educational resources showing that in the physical world, equating Entropy with disorder is not only unwise, it is a concept that fails actual experiment. And we must bear in mind that if in science a single experiment negates a thesis, the thesis must be amended. Cheers, Astrobayes 09:53, 6 July 2006 (UTC)
Frank has made quite an impassioned appeal above (see eg under Authority, Shmathority) that the article should be made much more chatty, much more introductory, and much more focussed on getting over the idea of entropy to beginners, through a much more extensive exploration of the idea of energy naturally dispersing.
It may well be the case that such an article would be a useful addition to Wikipedia. But can I suggest that if someone does want to create it, they create it as a new breakout article from the current page. Don't scrap the current page to do it.
Because I think SadiCarnot did a very useful thing back in April, when he took what was then a very long, involved and discursive article, split most of it out onto separate pages, and put the article into essentially its current shape.
What the article represents now is a quite terse, quite tight, "helicopter view" pull-together in a nutshell; giving a quite short map of some of the different ways (all valid, all related) in which entropy can be analysed, pictured, discussed and understood.
These include, for example
Now I'm sure that there are all sorts of improvements that could be made. But at the moment, if that is the article's aim, I think it actually delivers it quite well. IMO, the many different mutually supporting ways that entropy can be thought about are presented so that they actually do cohere quite well; and the interested reader can rapidly navigate to more specific articles which explore particular facets in more detail.
I'm sure the article will continue to evolve and improve. But for myself, I wouldn't like to see the range of it being reduced; nor would I like to see the wordcount getting much longer. IMO, it has a particular purpose, and the shape and structure as it is at the moment helps to achieve it quite well.
I don't have a problem with somebody wanting to write a more gentle, discursive introduction for beginning chemistry students, either as a "breakout" article from the central page, or as an external link. But I do have a problem with articles that claim theirs is the only way to think about entropy. I think that is a problem with some of Frank's articles; and I think that some of the conflict that poor Astrobayes is in stands as quite a warning of what it can lead to.
The truth is that it is useful to be able to think about entropy in different ways; to know how (and why) they are equivalent; and to be able to move seamlessly between them. That's my 2 cents, anyway. Jheald 09:58, 7 July 2006 (UTC).
Fresheneesz, I don't think that giving examples of how brilliant scientists described entropy in their brilliant prose, in support of using similar phrasing here, counts as original research. There are two related points at issue: one of terminology (roughly, is "disorder" a useful word to use when trying to explain what entropy is) and a deeper, but related, dispute over what exactly entropy is anyways.
Frank, thank you for your reply to the card shuffling example. I think I may more or less understand your POV. To summarize, you think that (thermodynamic) entropy is the dispersal of energy, whereas I think entropy is a 'measure' of the dispersal of energy, or more generally a measure of information (or roughly mix-up-ness, disorder, or randomness), there being examples in thermodynamics of entropy being carried by non-energetic degrees of freedom. These different viewpoints lead to the disagreement over describing entropy as "disorder" since that description, omitting any reference to energy, isn't compatible with thinking about entropy as "dispersal of energy"? [User; Nonsuch...]
You correctly identified the limitation of the card unshuffling example, namely: aren't all orderings of the cards equivalent? So let me modify the example in order to remove that deficiency. Instead of one pack, I hand you 2 randomly shuffled packs and ask you to put the second pack in the same order as the first pack. This procedure clearly reduces the information entropy of the cards. [User: Nonsuch...]
To compensate I have to increase the entropy of some other part of the universe, perhaps adding energy to a heat bath to increase the thermodynamic entropy. The only fundamental difference between molecules and cards in this example is one of scale. The thermodynamic change due to unshuffling is tiny compared to any macroscopic scale. Nonsuch 16:41, 7 July 2006 (UTC)
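That "one of scale" claim is easy to make quantitative with the Landauer bound (a sketch under my own assumptions: room temperature, one standard 52-card deck):

```python
import math

# Unshuffling one 52-card deck erases log2(52!) bits of information.
# By the Landauer bound, the minimum heat dumped into the bath at
# temperature T to compensate is kB * T * ln(52!).
kB = 1.380649e-23  # J/K
T  = 300.0         # K, assumed room temperature

ln_52_fact = math.lgamma(53)          # ln(52!) via the log-gamma function
bits = ln_52_fact / math.log(2)       # ~226 bits of information
dS   = kB * ln_52_fact                # entropy decrease of the cards, J/K
heat = T * dS                         # minimum compensating heat, J

print(f"{bits:.0f} bits, dS = {dS:.2e} J/K, q_min = {heat:.2e} J")
```

The minimum heat works out to well under a femtojoule – utterly negligible on any macroscopic scale, exactly as stated above.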
First, I owe an apology to Nonsuch and to Jheald for speaking harshly to them when I felt that they had not taken the trouble to consider what they or I said. That's not my prerogative nor an objective judgment of their diligence. (I greatly appreciate their courtesy in not responding in kind.) I get overly excited by any discussion involving entropy. The correct presentation of a basic understanding of thermodynamic entropy to undergraduates in chemistry has been my primary focus for about 8 years. My good luck in discerning what is now accepted by most chemistry text authors has been as amazing to me as it is unknown to those not in chemistry instruction.
Second, I have written overly lengthy pieces on this Talk page. That was certainly not from a desire to write [I have papers to do before I pass on!!], but to explain clearly a modern view of thermodynamic entropy in chemistry as measuring the spreading out of energy from being more localized to becoming more dispersed, spatially and temporally. [The latter, 'temporally', ties the idea to Leff's interpretation of a system constantly changing, instant by instant, from one accessible microstate to another as an 'entropy dance' over time.] One of the modern view's many advantages, I believe, is that even concrete-thinking students can readily 'get' the idea of active molecules spreading out in space – in both 'thermal' and 'configurational' entropy – while that 'half of the whole picture' seamlessly fits into the next level of understanding: quantization of energy levels, microstates, etc.
The profound newest view – that entropy is a two-factor function (please click on the CalPoly seminar [3]) – explicitly clarifies the difference between thermodynamic entropy and information 'entropy', in my opinion. (It has passed the first barrier of article peer review, with only one dissenter now being countered.)
With Jheald, I agree that it is time for a focus on the actual Entropy page. Because I believe that naive readers will be the most frequent visitors, but NOT the only visitors of course, the initial introduction should be for them -- but NOT in any chatty style. (Sorry; the suggestion made in the Talk above was fast-typed chatty, but no way should it be such in the Entropy page.) I'll work on a 'possible' late today, of course subject to review. FrankLambert 18:12, 7 July 2006 (UTC)
The problem with this whole discussion is that (some of) the respondents are not fully reading what other people write. To Nonsuch: I did not cite any of Frank's work as a peer-reviewed source. Look above - far above - in this page and you'll find a link I left in a discussion (which everyone ignored because they were all too busy making their individual points to consider others) that highlights an example of physical systems whose Entropy increases as disorder decreases; it was written by the chemist Brian B. Laird, not Frank Lambert, and was published in the peer-reviewed Journal of Chemical Education. How did you miss this? You were a bit busy making your point, I guess, to read what I wrote. I understand, though: after reading all of your comments I can see you feel strongly about your conception of Entropy, and you feel what you say has important authority above the rest of us. But please, read what others write before you say something that illuminates that you are neither reading nor seriously considering what everyone else is saying here. That you missed this link to the citation above shows this clearly. However, there are many of us who are specialists in this field, and this article unevenly presents informational statistical disorder over Entropy (by which I mean energy transfer at a specified temperature) - and those Wikians who have expertise on this subject should not have their expertise denigrated because a few individuals have adopted this article and refuse to let the edits of others remain. There are those of us who have gone to school for years studying these subjects, so please consider that there may be individuals in this world who, in addition to you, also have important contributions to add to an encyclopedic article such as this. I am curious, what is your educational background that gives you such authority on this subject?
Knowing your educational background would really be illuminating for this discussion and would perhaps offer us all a perspective from which we could approach discussions with you (e.g. someone with a mathematics degree approaches topics differently than someone with a degree in computer science or history). I would love to be here every day, but I work nearly 50 hours a week at a DoE research facility and I don't have time to get online to repeat what I've said before and to spend my time arguing about something which is common scientific knowledge. I think it is time for a peer review of this article itself. And we're wasting time in these discussions rather than making bold edits on the article. Cheers, Astrobayes 20:41, 7 July 2006 (UTC)
I just finished adding parts to the history section. The whole page seems to be too unwieldy. I propose breaking up the article as follows:
In this manner, the main entropy page will have mini-articles (with {{ see:main}} links attached). See: the thermodynamics and gravitation articles for example; these are ones I’ve helped to break up in the past. If anyone wants to do this, by all means do so; this is just a suggested outline. I’ll let this idea marinate for a while.-- Sadi Carnot 04:20, 5 April 2006 (UTC)
Done with clean-up/break-up of article. Almost nothing was deleted; only moved around. The main article is now only 5 printable pages (instead of about 15) and gives a nice overview. I hope everyone likes the result.-- Sadi Carnot 03:40, 11 April 2006 (UTC)
The History section is a little unclear. Both Sadi and Lazare are mentioned but the timing doesn't quite mesh.
I don't know much about the subject, but to me it just doesn't quite mesh.
There doesn't seem to be any WP reference to Carnot's theorem (not this one). Perhaps all that needs to happen is that this part/sentence is removed. I'll leave it to someone with more scientific history knowledge to decide. Frelke 06:19, 7 April 2006 (UTC)
Does reverting "Entropy" increase its entropy? Oneismany 05:29, 4 May 2006 (UTC)
“Please put this discussion in chronological order!!!”
I removed the following paragraph as being misleading at best:
Stars are not more ordered than the gas from which they condensed because stars get very hot as they collapse. The sentence on evolution isn't wrong, but it doesn't demonstrate that disorder and entropy are different. Nonsuch 16:09, 9 May 2006 (UTC)
I'm cutting this paragraph, not least because IMO to get into the detail of this discussion is inappropriate for the introduction above the contents list. IMO the introduction is clearer, and gives a more balanced first introduction to the concept of entropy without this paragraph; and IMO even without the paragraph, it is already right at the limit for being as long as it desirably should be.
Also, IMO we already indicate how the intuitive notion of "disorder" or "mixedupness" is to be more precisely understood, namely as
That is a definition of how mixed up we consider the system to be. -- Jheald 09:13, 11 May 2006 (UTC).
Also I reverted the last sentence of this paragraph:
Unfortunately not true. Additional assumptions are necessary. Nonsuch 16:09, 9 May 2006 (UTC)
Dude, I look forward to reading your proof of the 2nd law, starting from the classical dynamics of an isolated system and indistinguishability, in a mainstream physics journal. In the meantime you're talking utter nonsense. There is no such proof, at least none that is widely accepted. One runs into all sorts of deep philosophical and physical difficulties. At some point you have to assume molecular chaos, or coarse graining, or stochastic dynamics, or some procedure that throws away information. And remember that your purported proof has to be resilient against attacks by Maxwell demons and other intelligent opponents. This is not a trivial problem. Nonsuch 03:17, 11 May 2006 (UTC)
Not true. My many definitions are now at the heart of many articles of Wikipedia - which before me were incoherent mumbo-jumbo without clear definitions. I am proud that my expertise can improve the quality of Wikipedia, which is quite poor (especially compared to real encyclopedias - like the old but good Soviet encyclopedia, or Britannica, or the encyclopedia of physics I have).
By the way, many of my definitions (later removed by Nonsuch and others) are exactly the same as in these encyclopedias - I recently checked. These encyclopedias were written not by amateurs, as Wikipedia is, but by experts in their fields - that is why they (printed encyclopedias) are so good and concise. Especially when I compare my definitions with good textbooks or the US encyclopedia of physics. I intentionally do not quote these credible sources till enough ill-justified reverts accumulate and long discussions go nowhere, exactly due to the removal of my clear definitions and their replacement with nonsense mumbo-jumbo.
If I make spelling and sometimes grammatical errors - yes, of course I do - it is because English is not my native language. Despite that, I have published (and continue to publish) plenty of scientific papers in English. Scientific journals accept them just fine. What surprises me is that even you and many others who claim English as their native language also make a lot of spelling and grammar mistakes(!). Do you want me to quote yours? Just let me know (in general, I do not like to get involved in personal fights and consider edit wars a waste of time, but if you provoke me I may be forced to). So, feel free to correct my grammar and spelling instead of reverting back to undefined nonsense. That is the exact spirit of Wikipedia - to jointly increase knowledge rather than blindly delete it. Sincerely, Enormousdude 16:16, 11 May 2006 (UTC)
Guys, please first clearly DEFINE your terms to avoid endless (and sometimes incoherent) mumbling. Start with the definition of order/disorder (you are welcome to read my correct but again-deleted statement about the relationship between entropy and disorder). Sincerely,
Enormousdude
16:16, 11 May 2006 (UTC)
Finally, a sort of consensus is on the horizon. That is exactly what I meant by adding the edit which says that entropy and disorder are different things. Indeed, you see - there is no accurate definition of disorder. Obviously, without such a definition one cannot say that an increase of entropy results in an increase of disorder (or vice versa). My example about collapsing gas shows that if one uses ill-defined "intuitive order", then such "order" may appear to decrease sometimes and increase sometimes as entropy increases. And all kinds of misconceptions about the second law, the arrow of time, etc. in relation to disorder arise - simply as a consequence of the lack of an accurate definition of disorder. Indeed, suppose a gas which was filling space (say, a container a few light years across) gravitationally collapses into a very small volume (say, a star a few light seconds across). Which system is more "ordered" then - the first, where the gas was evenly distributed over the entire container, or the second, in which practically all the gas occupies only 1x10^-22 of the same container? What does your "intuitive disorder definition" tell you?
Sincerely, Enormousdude 22:06, 11 May 2006 (UTC)
Ice melting - a classic example of entropy increasing. Is it really? Can't one contrive reversible melting simply by putting ice in water at 0 deg C? Surely gas expanding into a vacuum is a much better example? LeBofSportif 19:48, 12 May 2006 (UTC)
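The free-expansion example is also easy to quantify: for an ideal gas expanding into a vacuum, the entropy change is ΔS = nR ln(V2/V1) regardless of path, and it is strictly positive even though no heat flows. A minimal sketch (the mole count and volume ratio are illustrative values, not taken from the discussion above):

```python
import math

R = 8.314  # gas constant, J/(K*mol)

def free_expansion_entropy(n_moles, v_ratio):
    """Entropy change for ideal-gas expansion into vacuum: dS = nR ln(V2/V1)."""
    return n_moles * R * math.log(v_ratio)

# 1 mol doubling its volume: dS = R ln 2 ≈ 5.76 J/K, always positive even
# though q = 0 -- which is why it is a cleaner illustration of entropy
# increase than melting ice at exactly 0 deg C (a reversible limit).
dS = free_expansion_entropy(1.0, 2.0)
print(round(dS, 2))  # → 5.76
```

Ice in water at 0 deg C sits right at the reversible boundary, so its ΔS can be made as close to a reversible transfer as one likes, which is the point of the objection above.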
"Randomness and disorder" has once again been added to the definition. I must stress that we cannot leave "disorder" and "randomness" undefined in this context, simply because the informal use of such words does not in any way, shape, or form describe anything about entropy. Although I deeply despise the attachment of the three words "disorder", "randomness" and "entropy", I will simply add a formal definition of randomness and disorder into the intro. Fresheneesz 07:16, 31 May 2006 (UTC)
This page says that the book The Time Machine is based on entropy, but there's no mention of that on the page for that book. Fresheneesz 09:02, 1 June 2006 (UTC)
According to this site http://www.entropysimple.com/content.htm Entropy is not related to information entropy and is not disorder. Since the first paragraph of the article directly contradicts this site I stopped reading in utter disgust. I almost threw up. It was worse than goatse man. —Preceding unsigned comment added by 64.40.60.1 ( talk • contribs) 18:29, 16 June 2006
According to the IEEE, the use of Negentropy and similar terms is now strongly deprecated.
Such words compound the conceptual difficulty people have with the meanings of entropy, by introducing what is effectively a conceptual double negative -- starting with entropy, i.e. the information you don't have, going to negentropy is then the information you don't not have.
Almost always this is not a good way to go, and such terms just get you into larger and larger knots. Schroedinger used negentropy in quite a specific way, for the order that could be built up inside a living system, possible because of the entropy increase being dumped outside the cell. But Brillouin's use was a mess, very confusing.
These coinages have never taken off, and deserve to wither. Negentropy perhaps rates a mention in "see also". The rest really just deserve the ash heap of history -- *most* definitely they do not deserve high-profile special highlighting up at the top of what is meant to be a very simple very mainstream overview article.
-- Jheald 05:09, 20 June 2006 (UTC).
Although the concept of entropy is primarily a thermodynamic construct, it has since expanded into many different fields of study, such as: statistical mechanics, statistical thermodynamics, thermal physics, information theory, psychodynamics, economics, business management, and evolution, to name a few.
Clausius coined the term based on the Greek entrepein, meaning "to turn" or "to transform".
You could be right, but I am quite certain that there are more chemistry students who want to know about entropy than any other type of student. Also, why don't you merge these two articles into one, i.e. [1]+[2]="New name"? -- Sadi Carnot 17:06, 20 June 2006 (UTC)
It is a pleasure to read since it is pure physics taken to an easy reader's understanding. I was bad in physics and would have hoped my teachers would read that so they would actually teach this matter well. Thanks ;) Lincher 01:10, 22 June 2006 (UTC)
From the Nats article:
"When the Shannon entropy is written using a natural logarithm,
it is implicitly giving a number measured in nats."
It should be noted that units so defined, that are 'implicit' (and this is not uncommon - there are many in the natural sciences), are artificial units. For the Shannon entropy above, H, to have units as fundamental quanta of a physical characteristic of some system, the variable inside the logarithm must itself have units. But that variable is a probability, which is always unitless. A more obvious problem with the nat is that the probability is the argument of a logarithm, and the arguments of operators cannot have physical units. Another example of this is the Boltzmann factor: its exponent would not be physically meaningful unless it were unitless, and indeed the product of the Boltzmann constant kB and the temperature T gives units of energy, joules. You cannot take the exponent of something with units, just as you cannot take the log of something with units. You could artificially define the units of such a quantity, but they would be physically meaningless. Instead, what you see in physics is a quantity such as exp(-hv/kB T), where h is Planck's constant and v the frequency of photon emission of some physical system, the product of which is energy, with units of joules again. Now you have the exponent of an energy over an energy, which is a unitless argument. The quantity now has a physical significance. The same situation is true for the Shannon entropy. The nat is thus an artificial unit, and is therefore unphysical from the standpoint of fundamental physics. This is not to say it is not useful in some disciplines (in particular signal processing).
Those who confuse the two should not feel bad. Confusing artificial, or constructed, units with physical units is common among non-scientists, and even in first-year physics classes I have to constantly encourage my students to watch their units so they do not carry quantities with units into the arguments of operators such as ln, log, exp, and so on. I remind them that only unitless quantities can pass through these operators, and that in thermodynamics it is the Boltzmann constant that properly scales statistical quantities. (In other words, the nat doesn't refer to anything physical, i.e. "real", and you must therefore multiply it by the Boltzmann constant to get a physical quantity.) It should also be noted that it is quite common for idiosyncratic scientists and mathematicians to name often-used quantities for the ease of talking about them - and this happens with both physical and non-physical (i.e. probabilistic) quantities. I hope this helps give some perspective here. There are a lot of good discussions that can come from this Entropy article. As an instructor of physics for years, I have seen many of these arguments and I would enjoy discussing them all. Here is a great resource if you want to learn more! :) Cheers, Astrobayes 21:44, 23 June 2006 (UTC)
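The unit bookkeeping described above can be made concrete with a short sketch: the Shannon entropy of a distribution is a pure number (nats with ln, bits with log2), and only multiplication by the Boltzmann constant turns the nat value into a thermodynamic entropy in J/K. The two-outcome distribution is a made-up example:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs, base=math.e):
    """Shannon entropy of a discrete distribution.
    Unitless: nats for base e, bits for base 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.5]                       # fair coin, a made-up example
h_nats = shannon_entropy(p)          # ln 2 ≈ 0.693 nats (pure number)
h_bits = shannon_entropy(p, base=2)  # exactly 1 bit (pure number)
s_physical = K_B * h_nats            # J/K -- only now does it carry units
print(h_nats, h_bits, s_physical)
```

The same scaling, S = kB ln W, is what connects statistical counts to thermodynamic entropy elsewhere on this page.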
Paul, you made several semantic edits which change the context and tone of the article, but you added no citations for any of them. Your edits, if properly cited, could provide a valuable educational addition to the article. If not cited however, they seem an awful lot like POV. Astrobayes 20:24, 22 June 2006 (UTC)
Dear Astrobayes, if I cited Einstein, would that be his POV? I think a good article should at the very least consider feasible alternatives and not tout one POV. To consider what happens in an infinite universe, would be a sensible hedging of one's bets. Paul venter 22:12, 29 June 2006 (UTC)
At the bottom of the Overview section, it reads that the second law does not apply to the infinite Universe, yet further down the article, in 'The Second Law' section, it reads that the entropy of the Universe does tend to increase. Maybe I'm missing something, but this may need clarifying in order to properly state whether or not the Second Law applies to the universe.
The jury is still out on whether the universe we live in is finite in size or infinite. If it is infinite, the Second Law does not apply; if finite, it would. Any statement about the entropy of "the Universe" increasing is premature and speculative. My bets are with the infinite - you choose. Paul venter 21:28, 29 June 2006 (UTC)
This article and Entropysimple.com are up for deletion. They merely contain 4 external links to web pages about entropy. I have added them to this article. If the regulars here like them, they can stay. If not they will go. The two articles concerned are most likely to be deleted and rightly so. -- Bduke 01:07, 27 June 2006 (UTC)
I removed this text: "(Note that this is a simplification: entropy can relate to all kinds of multiplicity of possible microstates. However, with a bit of a stretch, almost all of that multiplicity can be considered as due to different possible energy distributions.)" It was not only confusing, as it was not attached to a particular reference or section, but the statements it made both before and after its most recent edits were given no verifiable source to support them. It sounded an awful lot like POV. If it were to be put back into the article, a verifiable source (by Wiki's standards) where that quote can be found should be included... and it should be placed in an appropriate section. A comment such as that which appears at the end of an article is quite odd. Astrobayes 22:43, 29 June 2006 (UTC)
Coming…The Full Monty on Disorder
Learning about Wikipedia and the power of the ‘users’ in the past few days is a Kafka-esque experience! In the real world of science – as with non-anonymous ‘users’ here – an author knows that his work will be read very critically by competent individuals before publication. (But I find that, not mentioned by any of you, the two ‘articles’ referring to my sites were submitted by a high school youngster! This is an amusing indication that I reach an audience of beginners over whose heads you shoot, but also a reason for its deletion. I would not have defended it for the five minutes I did – if I had known how to trace ‘users’ as page originators.)
A far more important shock is that an anonymous 'user' not only can misread but can then misrepresent my serious writing without any check with me prior to a public pronouncement. I'm referring specifically to Jheald's off-base interpretation of four textbooks' error as a challenge to my entropy approach. (As I glance back, this is just the last of a host of other outworn pieces of nonsense, unacceptable to chemists, that now really must be challenged - although all his points have been rejected by current chemistry texts or recent/coming journal articles.)
Click on http://www.entropysite.com/#whatsnew , scroll to December 2005 and down to the texts listed as 11 through 13, and their common serious misstatement. Those texts tell students literally that “Matter tends to disperse”. Period. How does matter, how do molecules or atoms disperse? No association, no involvement with or dependence on molecular kinetic movement or consequently their ‘internal’ energy? That is prima facie ridiculous and a return to the era of phlogiston and alchemy. That is why I downgraded them as inadequate and not recommended. In contrast and in error, Jheald says [they, the texts/authors] “go out of their way to specifically flag up that distribution of energy isn’t ‘quite’ the whole story.” The ‘whole story’, Jheald and other ‘users’, is that those authors have screwed up! Matter can only disperse if molecular energy is involved. There's no 'quite' other whiffle dust causing it!
I’m surprised Jheald is surprised by mixtures (alloys as one type) having residual entropy at 0 K. Only a perfect crystal is defined as having 0 entropy there. I'd enjoy discussing private email-wise with Jheald (to save others from trivia) the solution to the classic residual entropy 'problem' substances, such as CO, FClO3 and water -- it is the existence of multiple Boltzmann-energy distributions that are frozen in at the equilibrium fusion point. (Journal of Chemical Education, by Evguenii Kozliak, publication in press, due out in 3-5 months.)
Ah me. I have now finally read over the whole Wikipedia entropy chapter -- evidently strongly influenced by Jheald. I'll have to write a "Full Monty" to demonstrate to the rest of you, if not convince you to join, the trend in all of chemistry – including the latest text to delete that sad old "entropy is disorder", Atkins' worldwide best-selling physical chemistry. Why should Wikipedia 'Entropy' be a relic of the 19th century? FrankLambert 19:25, 30 June 2006 (UTC)
I hasten to thank you for discovering Styer (2000), and mentioning his conclusions to your W colleagues! Please glance at the start of my second paragraph in "Disorder…" and the first citation there: http://www.entropysite.com/cracked_crutch.html (He provided the physics support to me, and I used four of his illustrations. He later contacted me to back him when Ohio was changing its HS exam requirements to include 'disorder' in its question re entropy!) I disagree only with his suggestion of an additional 'parameter' for entropy: freedom. No way – students and we have suffered enough due to this kind of non-scientific obstacle ('disorder') to understanding entropy!
I would also invite you to look at the seminal article of Harvey Leff (physicist and international authority on Maxwell's demon) in Am. J. Phys. 1996, 64, 1261-1271. I discovered his paper only two days before I submitted the ms. for "Entropy Is Simple, Qualitatively" (which introduced the view of entropy change as a measure of energy dispersal rather than 'disorder': http://www.entropysite.com/entropy_is_simple/index.html ). Because of Leff's parallel views, I could smoothly and exactly cite points in his physics approach of "sharing and spreading of energy" to support the qualitative ideas of entropy change that I proposed and that have become accepted in chemistry textbooks. Finding that Leff and I live only ten minutes from each other, we have become scientific friends and associates. We are collaborating on a brief article for Nature. Last month he addressed a meeting of research physicists in San Diego on his spreading/sharing concept of entropy change, and a lengthy paper will be published in the report of that meeting in a few months.
You all would be well advised to read even Leff's 1996 article; it's an early statement of the physicist's counterpart to my low-level qualitative "entropy is a measure of the dispersal of motional molecular energy" for chemistry.
Now to what I intended to use as an Introduction. Thanks very much for your ref to reading that I had not known about, Wikipedia:Neutral_point_of_view. It certainly poses a double problem for me. You see, first, I am a Recovering Disordic, an RD. Just as an RA's nerves are over-stimulated by only a whiff from the bar adjacent to a hotel lobby, the lengthy discussion in this Talk and the embedded disorder in the Entropy page unduly excite me. Second, I have an equally serious difficulty – but the opposite, because it swamps me with serotonin: the POV that I began developing some eight years ago has become accepted dogma in chemistry. You in the entropy group seem not to understand that this has occurred. I am writing you about what WAS my POV and is now peer-accepted and in print. Here's a brief version of my sad story:
Never liking thermo, except for the power of the Gibbs free energy relationship, and never having to teach it, I was asked to develop a chem course for non-science majors that would have more meaningful content and interest than any watered-down general chem. I came up with "Enfolding Entropy", essentially a baby thermo course with minimal math. Marvelous fun for years. Good kids. It was a very thorough 'entropy via order-disorder' treatment that led to its (mis)application to human affairs in the last week or two, and I felt a bit uncomfortable. But not much. I put that aside because the students really could grasp why it required energy to produce hydrogen from water, the problems in ammonia synthesis, etc., etc.
After my second retirement, from a fascinating non-academic career, I had time to think. I began to wonder why Clausius' q/T had begun to be applied to shuffled cards and, to start, I checked some history: Clausius, Boltzmann, Planck, and Gibbs, then Shannon and Jaynes – and things began to clear. I strongly recommend it to each of you.
My first paper, on shuffled cards as unrelated to entropy change, caused the deletion of this silly example – my POV of 1999 is now the shared wisdom of all but one chemistry textbook published since 2004, whereas in 1999 it was the principal illustration of entropy increase in most general chemistry textbooks. In 2002, my POV discovery was that 'disorder' was a valiant try by Boltzmann but an untenable view because of his limitation to 1898 knowledge (prior to the Third Law, quantum mechanics, and full understanding of the energetic behavior of molecules).
As I have told you thrice, that seemingly personal POV of 2002 has, by 2006, resulted in the deletion of "entropy is related to disorder" from the great majority of new chemistry texts – including those in physical chemistry (although I can claim direct influence on only two phys chems). This is perhaps an unprecedented rate of change in an 'accepted' concept in texts. It is not a famous name or power in the establishment that caused a total of 25 or so authors and hundreds of text reviewers to change their views! I am just the lucky guy who belled a giant cat that most chemist-authors had known was dead but wanted somebody else to say so.
My POV of 2002, that entropy is a measure of the dispersal of molecular motional energy, has been equally well received and published in the texts mentioned at www.entropysite.com. It is no longer just MY "POV"!
The newest developments are in the area of the coincidence of configurational entropy from classic statistical mechanics with energy dispersal: an article accepted for publication dealing with the residual entropy of CO, etc.; one (in the second round of review) urging the deletion of 'positional entropy' from general chemistry, allowing emphasis only on 'thermal' entropy; and an article (Hanson, J. Chem. Educ., 83, 2006, 581, April issue; check eq. 2 and 5, and read the quietly explosive conclusions beneath eq. 5 - and Hanson had opposed me from the start in 2002!). My colleague, E. Kozliak at the University of N. Dakota, has a fundamental theoretical paper in preparation showing why and when configurational entropy coincides with results from energy-dispersal views.
Boltzmann was brilliant, probably a genius, far ahead of his time in theory but not flawless and, most important for us moderns to realize, still very limited by the science of his era. (Some interesting but minor details that are not too widely known: even though he died in 1906, there is no evidence that he ever saw, and thus certainly never calculated entropy values via, Planck's 1900 equation S = R/N ln W [published in an article, but not printed in a book until 1906 as S = kB ln W], which is carved on Boltzmann's tomb. Planck's nobility in allowing R/N to be called 'Boltzmann's constant', kB, was uncharacteristic of most scientists of that day, as well as now!)
The important question is: what are the bases for Boltzmann's introduction of order-to-disorder as a key to understanding spontaneous entropy change? That 1898 idea came from two to three pages of conceptual description, a common-language summary, that follow over 400 pages of detailed theory in Brush's translation of Boltzmann's 1896-1898 "Lectures on Gas Theory" (U. of California Press, 1964). The key paragraph should be quoted in full – the preceding and following phrases and sentences primarily repeat or support it (disappointingly) without important technical or essential additional details:
“In order to explain the fact that the calculations based on this assumption correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered – and therefore very improbable – state. When this is the case, then whenever two or more small parts of it come into interaction with each other the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)
That slight, innocent paragraph of a sincere man -- written before modern understanding of q(rev)/T via knowledge of the details of molecular behavior, quantum mechanics, or the Third Law – that paragraph and its similar nearby words are the foundation of all dependence on "entropy is a measure of disorder". To it, you and uncounted thousands of others owe the endless hours you have spent in thought and argument. Never having read it, you believed that somewhere there was some profound basis. Somewhere. There isn't. Boltzmann was the source and no one challenged him. His words have just been accepted for a century. There is NO scientific basis for seeing entropy as involving order and disorder. Boltzmann's conclusion was fallacious due to his primitive concept of "increased randomness" in a system at equilibrium (what we now recognize as a maximal number of accessible quantum states under the system's constraints), naively coupled with an even more primitive idea: there must be an opposite – if the final state is disordered, the initial state must be ordered.
To the contrary, scientifically: if the final state has additional accessible quantum microstates – not to be confused with the stat-mech locations in phase space that some of you flip back and forth to calling microstates – then the initial state merely has fewer accessible quantum microstates. The most obvious example, so often mistakenly used in elementary chemistry texts prior to my corrections, is ice melting to water. Sparkling, orderly, crystalline ice to disorderly, mobile, liquid water. That's the visual and that's the crystallographic conclusion, but it is not the conclusion from entropy calculations.
The S° for liquid water at 273 K is 63.34 J/K mol. Thus, via S = kB ln W, W = 10^1,991,000,000,000,000,000,000,000 (quantum microstates). But for crystalline ice at 273 K, with an S° of 41.34 J/K mol, W = 10^1,299,000,000,000,000,000,000,000. This shows that ice is "orderly" compared to liquid water? Of course not. Ice simply has fewer accessible quantum microstates. And that is the universal modern criterion.
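Those exponents can be checked by inverting S = kB ln W for one mole's worth of entropy; a quick numerical sketch (kB value from CODATA):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def log10_microstates(s_molar):
    """Base-10 exponent of W in S = kB ln W, for one mole's entropy S in J/K:
    log10 W = S / (kB * ln 10)."""
    return s_molar / (K_B * math.log(10))

water = log10_microstates(63.34)  # liquid water at 273 K
ice = log10_microstates(41.34)    # crystalline ice at 273 K
# ≈ 1.99e24 for water and ≈ 1.30e24 for ice: both W values are
# 10 raised to a ~10^24-digit exponent, water's merely the larger.
print(f"{water:.3e}  {ice:.3e}")
```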
Gibbs' "mixed-upness" is totally irrelevant to 'order-disorder' or any other discussion. It comes from a posthumous fragment of writing, unconnected with any detailed argument and not supporting any concept proposed by Gibbs.
Aware of the thinking of Lazare Carnot and Kelvin, among others, Clausius enormously increased the power and applicability of their general idea of heat becoming dissipated in a process by including the essential factor of the temperature at which the energy dissipation/dispersal or spreading out occurred, via dS = dq(rev)/T. Thermodynamic entropy can never be separated from energy being 'transformed' (as Clausius viewed it) or somehow dispersed (in modern words), divided by T.
Even though Joule, Kelvin, Clausius and Maxwell thought about thermodynamics in terms of molecular motion, lacking knowledge of molecules other than bits such as velocities and mean free paths, Clausius' conclusions could only be framed in macro parameters. The detailed nature of what was occurring in dq, as a general symbol for a change in the internal energy of a system, was ignored for a century because of the remarkable success of classical statistical mechanics in predicting the magnitude of entropy change in many processes.
For example, such spontaneous events as gas expansion into a vacuum or the mixing of gases or liquids could only be treated by Clausian entropy by finding the quantity of work required to restore the system to its initial state (equaling –q for the forward process). In contrast, by allocating 'cells' or 'microstates of location in phase space' for molecules or all possible combinations of several molecules, statistical-mechanical combinatorial methods could directly count and predict the entropy change in a forward process of expansion or mixing.
It appeared that, because only the number of locations of cells was being counted (and the temperature was unchanged), this 'configurational or positional' entropy was unrelated to Clausian 'thermal entropy' and apparently did not involve molecular energy. In any equation, the initial energy was unchanged and thus dropped out of the Sackur-Tetrode or ln(W_Final/W_Initial) equations. But conceptually, there is an obvious error – the initial motional energy of the molecules in an expansion, or in forming a mixture with another component's molecules, is becoming more spread out in a three-dimensional volume and more dispersed in changing to a new equilibrium. (This is precisely what happens to the energy of a hotter surroundings plus a cooler system. The initial total energy of system plus surroundings is not changed, but some motional energy of the hotter surroundings' molecules has become spread out to those in the cooler system, raising it to a temperature that is an equilibrium of the universe of surroundings and system.)
The cause of the error also becomes obvious, once stated quantitatively: in phase space a cell's 'location in space' is also its 'energy in that space', and thus any combinatorial measure of the increased number of locations for a gas expanding to fill a greater volume is exactly the increased number for the dispersal of the initial motional energy of that gas. Both configurational entropy and thermal entropy measure what Clausius and Kelvin and Carnot tried to express with macro parameters for spontaneous processes – defined in modern terms as "changes occur in systems to yield an equilibrium wherein the motional energy of the molecules in a system is maximally dispersed at the final temperature, in the sense of having the maximal number of accessible quantum microstates for the system's energy at that temperature". (bad long sentence. good concept.)
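The claimed coincidence of configurational and thermal entropy can be illustrated numerically in the simplest case, one mole of ideal gas doubling its volume: the combinatorial count (each molecule gains twice as many cells, so W2/W1 = 2^N) and the Clausius-style nR ln(V2/V1) give identical ΔS. A minimal sketch, not a full Sackur-Tetrode treatment:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R = K_B * N_A        # gas constant, J/(K*mol)

n = 1.0          # moles (illustrative value)
v_ratio = 2.0    # V2/V1 (illustrative value)

# Combinatorial ("configurational") route: each molecule has v_ratio times
# as many cells available, so W2/W1 = v_ratio**N and dS = kB ln(W2/W1).
dS_config = K_B * (n * N_A) * math.log(v_ratio)

# Thermodynamic ("thermal") route: dS = nR ln(V2/V1).
dS_thermal = n * R * math.log(v_ratio)

print(abs(dS_config - dS_thermal) < 1e-12)  # → True: the two routes coincide
```

The agreement is exact by construction (R = kB NA), which is the quantitative content of the "location in space is also energy in that space" point above.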
I once wrote "…violently energetic, speeding gas molecules will spontaneously move into an evacuated chamber or mix with another gas…summarized as increases in entropy". The Oberlin thermodynamicist Norman Craig corrected me: "…in saying that, you are 'smuggling in' entropy. Movement of molecules does not cause an entropy increase even though [it] enables an entropy increase."
He is right. Two factors are necessary for thermodynamic entropy change. An increase in thermodynamic entropy is enabled in chemistry by the motional energy of molecules (that, in chemical reactions, can arise from bond energy change). However, entropy increase is only actualized if the process makes available a larger number of arrangements for the system’s motional energy, a greater probability of an increased distribution for the system’s energy.
The two factors, energy and probability, are both necessary for thermodynamic entropy change but neither is sufficient alone.
From this it can be seen that 'configurational or positional' entropy – when thought of only as 'a measure of the probability of locations' – is a smuggling-in of thermodynamic entropy: the essential enabling factor that makes actual physical/chemical change in matter possible, rather than just an exercise on paper or a computer screen, is missing. A student -- or an experienced person, as you all demonstrate -- can be misled into thinking of it as 'just due to arrangements in space'.
The misnomer of “entropy” should always be in quotation marks if the subject of thermodynamic entropy is present in the same article. The reason is clear (as stated above and repeated here):
Two factors are necessary for thermodynamic entropy change. An increase in thermodynamic entropy is enabled in chemistry by the motional energy of molecules (that, in chemical reactions, can arise from bond energy change). However, entropy increase is only actualized if the process makes available a larger number of arrangements for the system’s motional energy, a greater probability of an increased distribution for the system’s energy.
Landauer? Check his "Minimal Energy Requirements IN COMMUNICATION", Science, 272, 1996, 1914-1918. I'll ask Leff about this next week, but I am only concerned with thermodynamic entropy and, as stated above, the gulf between thermo entropy and info "entropy" is absolute and unbridgeable. von Neumann has robbed you all of weeks of your lives or even more. Some cruel joke, that. FrankLambert 21:32, 1 July 2006 (UTC)
I have challenged Nonsuch, and urge others, to tell me what modern (2004 and after) university chemistry textbooks, including physical chemistry (the text that most emphasizes physics) but excluding biochem (they have to be sloppy), still use 'disorder' in introducing entropy to students. The question is important because (1) it is not my POV; it's objective data re the deletion of 'disorder' as related to entropy in chemistry instruction; and (2) entropy and the second law are far more central in beginning chemistry than in beginning physics. (Compare the pages devoted in each to the second law, entropy concept(s) and calculations, and use of the Gibbs free energy relationship with calculations.)
Although you and I were thoroughly taught that "entropy increase is 'disorder' increase", after learning of its origin in Boltzmann's innocent but unscientific statement from 1898, we can be free. The revolution in chemistry texts is real, objective, here now. A modern encyclopedia should certainly be as modern as elementary instruction about a topic! What other goal? Obsolete stuff? "Energy dispersal"? Kats (and others) perhaps have not read my description of entropy change as simply what lies behind the basic dq/T -- it represents the dispersal/spreading out of the motional energy of molecules (heat, to Clausius) to a greater volume of space, over T (whether heating [surroundings + system, always], gas expansion, or fluid mixing, i.e. all types presented to beginners). That is initial and superficial, BUT it can be readily communicated even to 'concrete-minded' students [or Wikipedia readers] almost instantly because, in chem, they have been told about fast-moving molecules (hey, they're going a thousand miles an hour on average at ordinary temps, five times faster than NASCARs, and continually colliding with each other, etc., etc.). But then, the advantage of this low-level approach is that it can seamlessly lead to the next level, scientifically.
The introduction of quantization of all energies, molecular energies on quant. levels specifically (simple QM deductions: more energies spread out on higher B-distribution levels in hotter systems, denser energy levels in systems with increased volume, etc.), the idea of QM microstates (NOT 'cell-model' stat mech 'microstates'!), and that an increase in entropy is related to a greater number of accessible QM microstates -- what Leff calls "temporal spreading of energy"; more choices of a different microstate at the next instant in time.
Compare that to the crap of 'disorder'. If you stick with 'disorder', you deserve the reward of being mired, don't you? :-) FrankLambert 06:19, 3 July 2006 (UTC)
Since the debate on Entropy and Disorder isn't really going anywhere, I thought I would try a different tack, and make abject appeals to authority. (Not my favorite approach.) Frank's appeal to authority is that recent physical chemistry textbooks use the term "disorder" far less than they used to. I find this unconvincing since introductory physical chemistry textbooks really suck at explaining the deeper principles. (I curse Atkins to this day.) In response, let me quote Callen, arguably one of the finest books on thermodynamics yet written:
Or how about Feynman's Lectures on Physics, another very good book:
Or "Thermal Physics" by Finn, another good, modern thermodynamics textbook.
Whatever. Nonsuch 00:51, 6 July 2006 (UTC)
Nonsuch, first put in your contacts so you can read what you wrote, a powerful support for one principle that I have been pleading for: 'disorder' is a dead obstacle between entropy and a measure of the number of microstates in a system. Look:
Old Callen (1985) wisely said (you tell me): "...entropy is the quant. measure of...the relevant distribution of the sys. over its permissible microstates".
And even Feynman gets it right sometimes (!!) in his famous quotation: "The logarithm of that number of ways [the number of ways that the insides can be arranged, so that from the outside it looks the same] is the entropy."
How many times have I told that to Jheald, the information "entropy" maven who is working the wrong side of the street from his day job at the Information "entropy" Wiki page! I'm not arguing from authority, I'm arguing from history -- you guys never read that lead paragraph of Boltzmann's that was the tiny, non-scientific cause of the whole mess for a hundred years. You're bound up in your obeisance to unsubstantiated dogma that was given you by people you respect, but who were as uncritical of what THEY were taught as you. I'm arguing from success: with 'disorder' dead -- and you're the hapless rear guard in its defense, just as all 19th century leaders had to die before the ideas of younger people were accepted (tetrahedral carbon, Boltzmann's theory, etc., etc., etc.) -- with 'disorder' gone, what is the replacement?
You forget that you are writing for a wide range of backgrounds and abilities in Wikipedia about thermodynamic entropy. FIRST, should come the simple presentation, from which you build to the more sophisticated view of quantized energy states and then microstates. Only fourth, fifth or tenth should you hint about other interpretations of entropy and refer to your details in info "entropy".
As I've told you, and repeat (because you continue to lose your contacts), first should come the simple examples of the second law, spontaneous events, THEN go to what's happening on the molecular level (Tell 'em about those incredibly speeding banging molecules. Hey, this is chemistry in thermodynamics, not exclusively physics!) Then apply that to gas expansion into a vac, the old two bulb deal, then perhaps food dye in water or cream in coffee. Molec. motional energy disperses!
Only then begin to develop energetics, microstates, etc....for the boffo climax that we all know about -- THIS is what entropy IS. FrankLambert 14:59, 6 July 2006 (UTC)
I am not knowledgeable re the bit size of words, sentences or other forms of information and communication. From too long ago, I recall some 'baby calculations' of 6 letters to form a word but couldn't find details in my files. Yesterday, glancing at Pitzer ('Thermodynamics', 3rd edition, 1995 [McGraw-Hill]), I was reminded of an important question that I have wanted to ask of a person competent in such matters, as some of you are. Here's the background:
Pitzer was interested in the number of QM microstates as close to 0 K as experimentally feasible, i.e., the W, where S(lowest possible) = k ln W. Essentially, he assumed that the smallest change in S would be 0.001 J/K mol (his S/R equalling 0.0001). I took from his development a different thought: the least change in entropy that can practically be considered/calculated would be S = 0.001 J/K mol, even though one never sees practical values of entropy tabulated beyond 0.01 J/K mol. The math-trivial solution leads to a W = 10 ^ 3,000,000,000,000,000,000, a rather modest number for W, compared to standard state or reaction entropies for a mole of a substance.
My information-bits question: "Is this figure of ca. 10^10^18 in the range of bits in information content or information communication?" The usual entropy change in chemical reactions is, of course, in the 10^10^24 range. Again, "Are there numbers in info content/communication calculations that are in this range?"
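For anyone who wants to check the arithmetic behind Pitzer's figure and the bits question, here is a short Python sketch (not from Pitzer; the constants are modern CODATA values) converting S = 0.001 J/K into a microstate count via S = k ln W, and into bits by taking one bit as k ln 2 of thermodynamic entropy. Note the exponent comes out near 3×10^19 with these inputs:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

S = 0.001            # smallest "practical" entropy change, J/K (per mole)

# Boltzmann: S = k_B * ln(W)  =>  log10(W) = S / (k_B * ln 10)
log10_W = S / (k_B * math.log(10))
print(f"W = 10^{log10_W:.3g}")             # exponent ~3.1e19

# The same entropy expressed as information: S = (k_B * ln 2) * n_bits
n_bits = S / (k_B * math.log(2))
print(f"equivalent to {n_bits:.3g} bits")  # ~1.0e20 bits
```

So even this tiny entropy change corresponds to roughly 10^20 bits, which already speaks to the question of scale: typical chemical entropy changes dwarf the bit counts of ordinary information content or communication.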
My question is important in evaluating a phrase in this and other discussions of entropy increase as involving an increased "loss of information". Glancing at the example that I gave you -- a few feet ago in this Talk Entropy discussion of ice and water! -- do you see how it is beyond human comprehension -- except by numbers of microstates, W -- to evaluate the 'amount of information' that entropy reveals: for ice, 10^1,299,000,000,000,000,000,000,000 microstates. Now, do you instantly understand the 'information that is lost' for the water at the same temperature, at 10^1,991,000,000,000,000,000,000,000 microstates? It isn't 'appearances' or 'disorder' or 'information' that is vital in evaluating thermod entropy change in science -- it's numbers.
Do you now see how inadequate (a nice word) are descriptions such as 'information' and 'lost information' if you want to discuss thermodynamic entropy change scientifically? Numbers, quantitation via numbers of accessible microstates -- coming readily via the approach to understanding entropy that I am bringing to you, and a major reason for its rapid and wide acceptance in chemistry -- is the only way to cope with 'information' in thermodynamic entropy change. FrankLambert 16:48, 3 July 2006 (UTC)
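As a consistency check on the two microstate exponents quoted above for ice and water: their difference reproduces the familiar molar entropy of fusion of ice, about 22 J/(K·mol). A Python sketch (the exponents are taken from the comment above, per mole at 273 K):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Exponents quoted above: W_ice = 10^1.299e24, W_water = 10^1.991e24
exp_ice = 1.299e24
exp_water = 1.991e24

# S = k_B * ln(W), so the entropy difference is
# delta_S = k_B * ln(10) * (exp_water - exp_ice)
delta_S = k_B * math.log(10) * (exp_water - exp_ice)
print(f"S(fusion) = {delta_S:.1f} J/K per mole")  # ~22 J/K per mole
```

The result, ~22 J/(K·mol), matches the measured entropy of melting of ice (6007 J/mol divided by 273 K), so the two enormous exponents are mutually consistent.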
FrankLambert said "the least change in entropy that can practically be considered/calculated would be S = 0.001 J/K mol". This is patently wrong. Thermodynamic entropy is only macroscopically large when you consider macroscopic numbers of molecules. But the biochemists (and molecular biologists and biophysicists) whose understanding of thermodynamics you deride are obliged to consider the thermodynamics of single molecules. Experimentally accessible entropies are on the order of a few bits per molecule.
Nonsuch 21:56, 4 July 2006 (UTC)
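The point about scale here is easy to quantify: one bit is k ln 2, about 10^-23 J/K, invisible macroscopically, yet "a few bits per molecule" becomes an ordinary chemical entropy once multiplied by Avogadro's number. A quick sketch (the 3 bits per molecule is an arbitrary illustrative value, not a measured one):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# Thermodynamic entropy of one bit
bit = k_B * math.log(2)
print(f"1 bit = {bit:.3g} J/K")            # ~9.57e-24 J/K

# A hypothetical 3 bits per molecule, scaled up to a mole
per_mole = 3 * bit * N_A
print(f"3 bits/molecule = {per_mole:.1f} J/K per mole")  # ~17.3 J/(K mol)
```

So single-molecule entropies really are far below Frank's 0.001 J/K threshold, while the same quantity per mole sits squarely in the range of tabulated reaction entropies.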
This article size is out of hand and, sadly, we have not arrived at a place where the fundamentals of science speak louder than misleading concepts such as "informational entropy." As I stated far above, if you read the peer-reviewed literature on informational Entropy you'll find invented and renamed definitions whose creators have molded into something appearing like thermodynamic Entropy (I provided a link to a peer-reviewed source above describing this). Mathematical integrals are invented to embody statistical distributions and Entropy is redefined to conform with them. Entropy however is heat that moves from one system to another, at a given temperature. Period. Anything else you're calling Entropy is not Entropy - it's something else. Now we definitely have more experts in physics and chemistry (i.e. individuals who are most qualified to improve the quality of this article) that are watching this article so it is my hope to see a future for this article that does not include POV edits by individuals not considering the whole of scientific research on this subject. The readers of Wiki deserve a fair and honest explanation of this important concept and calling Entropy "disorder" is akin to saying the Bohr model of the atom is correct in light of modern physics. It's really naive of us to continue equating them. This article deserves better, and I am still planning on making a significant contribution to it by providing an overwhelming wealth of educational resources showing that in the physical world, equating Entropy with disorder is not only unwise, it is a concept that fails actual experiment. And we must bear in mind that if in science a single experiment negates a thesis, the thesis must be augmented. Cheers, Astrobayes 09:53, 6 July 2006 (UTC)
Frank has made quite an impassioned appeal above (see eg under Authority, Shmathority) that the article should be made much more chatty, much more introductory, and much more focussed on getting over the idea of entropy to beginners, through a much more extensive exploration of the idea of energy naturally dispersing.
It may well be the case that such an article would be a useful addition to Wikipedia. But can I suggest that if someone does want to create it, they create it as a new breakout article from the current page. Don't scrap the current page to do it.
Because I think SadiCarnot did a very useful thing back in April, when he took what was then a very long, involved and discursive article, split most of it out onto separate pages, and put the article into essentially its current shape.
What the article represents now is a quite terse, quite tight, "helicopter view" pull-together in a nutshell; giving a quite short map of some of the different ways (all valid, all related) in which entropy can be analysed, pictured, discussed and understood.
These include, for example
Now I'm sure that there are all sorts of improvements that could be made. But at the moment, if that is the article's aim, I think it actually delivers it quite well. IMO, the many different mutually supporting ways that entropy can be thought about are presented so that they actually do cohere quite well; and the interested reader can rapidly navigate to more specific articles which explore particular facets in more detail.
I'm sure the article will continue to evolve and improve. But for myself, I wouldn't like to see the range of it being reduced; nor would I like to see the wordcount getting much longer. IMO, it has a particular purpose, and the shape and structure as it is at the moment helps to achieve it quite well.
I don't have a problem with somebody wanting to write a more gentle, discursive introduction for beginning chemistry students, either as a "breakout" article from the central page, or as an external link. But I do have a problem with articles that claim theirs is the only way to think about entropy. I think that is a problem with some of Frank's articles; and I think that some of the conflict that poor Astrobayes is in stands as quite a warning of what it can lead to.
The truth is that it is useful to be able to think about entropy in different ways; to know how (and why) they are equivalent; and to be able to move seamlessly between them. That's my 2 cents, anyway. Jheald 09:58, 7 July 2006 (UTC).
Fresheneesz, I don't think that giving examples of how brilliant scientists described entropy in their brilliant prose, in support of using similar phrasing here, counts as original research. There are two related points at issue, one of terminology (Roughly, is "disorder" a useful word to use when trying to explain what entropy is) and a deeper, but related dispute over what exactly is entropy anyways.
Frank, thank you for your reply to the card shuffling example. I think I may more or less understand your POV. To summarize, you think that (thermodynamic) entropy is the dispersal of energy, whereas I think entropy is a 'measure' of the dispersal of energy, or more generally a measure of information (or roughly mix-up-ness, disorder, or randomness), there being examples in thermodynamics of entropy being carried by non-energetic degrees of freedom. These different viewpoints lead to the disagreement over describing entropy as "disorder" since that description, omitting any reference to energy, isn't compatible with thinking about entropy as "dispersal of energy"? [User; Nonsuch...]
You correctly identified the limitation of the card unshuffling example, namely aren't all orderings of the cards equivalent? So let me modify the example in order to remove that deficiency. Instead of one pack, I hand you 2 randomly shuffled packs and ask you to put the second pack in the same order as the first pack. This procedure clearly reduces the information entropy of the cards. [User: Nonsuch...]
To compensate I have to increase the entropy of some other part of the universe, perhaps adding energy to a heat bath to increase the thermodynamic entropy. The only fundamental difference between molecules and cards in this example is one of scale. The thermodynamic change due to unshuffling is tiny compared to any macroscopic scale. Nonsuch 16:41, 7 July 2006 (UTC)
First, I owe an apology to Nonsuch and to Jheald for speaking harshly to them when I felt that they had not taken the trouble to consider what they or I said. That's not my prerogative nor an objective judgment of their diligence. (I greatly appreciate their courtesy in not responding in kind.) I get overly excited by any discussion involving entropy. The correct presentation of a basic understanding of thermodynamic entropy to undergraduates in chemistry has been my primary focus for about 8 years. My good luck in discerning what is now accepted by most chemistry text authors has been as amazing to me as it is unknown to those not in chemistry instruction.
Second, I have written too lengthy pieces in this Talk page. That was certainly not from a desire to write [I have papers to do before I pass on!!], but to explain clearly a modern view of thermodynamic entropy in chemistry as measuring the spreading out of energy from being more localized to becoming more dispersed, spatially and temporally. [The latter 'temporally', ties the idea to Leff's interpretation of a system's constantly changing per instant from one accessible microstate to another as an 'entropy dance' over time.] One of 'the modern view's' many advantages, I believe, is that even concrete-thinking students can readily 'get' the idea of active molecules spreading out in space -- both in 'thermal' or 'configurational' entropy, while that 'half of the whole picture' seamlessly fits into the next level of understanding: quantization of energy levels, microstates, etc.
The profound newest view -- that entropy is a two-factor function (please click on the CalPoly seminar [3]) -- explicitly clarifies the difference between thermodynamic entropy and information 'entropy' in my opinion. (It has passed the first barrier of article peer-review with only one dissenter now being countered.)
With Jheald, I agree that it is time for a focus on the actual Entropy page. Because I believe that naive readers will be the most frequent visitors, but NOT the only visitors of course, the initial introduction should be for them -- but NOT in any chatty style. (Sorry; the suggestion made in the Talk above was fast-typed chatty, but no way should it be such in the Entropy page.) I'll work on a 'possible' late today, of course subject to review. FrankLambert 18:12, 7 July 2006 (UTC)
The problem with this whole discussion is that (some of) the respondents are not fully reading what other people write. To Nonsuch, I did not cite any of Frank's work as a peer-reviewed source. Look above - far above - in this page and you'll find a link I left in a discussion (which everyone ignored because they were all too busy making their individual points to consider others) that highlights an example of physical systems whose Entropy increases as disorder decreases, and it was written by the chemist Brian B. Laird, not Frank Lambert, and was published in the peer-reviewed Journal of Chemical Education. How did you miss this? You were a bit busy making your point I guess to read what I wrote. I understand though, after reading all of your comments I can see you feel strongly about your conception of Entropy and you feel what you say has important authority above the rest of us. But please, read what others write before you say something that illuminates that you are neither reading nor seriously considering what everyone else is saying here. That you missed this link to the citation above shows this clearly. However, there are many of us who are specialists in this field and this article unevenly presents informational statistical disorder over Entropy (by which I mean energy transfer at a specified temperature) - and those Wikians who have expertise on this subject should not have their expertise denigrated because a few individuals have adopted this article and refuse to let the edits of others remain. There are those of us who have gone to school for years studying these subjects so please consider that there may be individuals in this world who, in addition to you, also have important contributions to add to an encyclopedic article such as this. I am curious, what is your educational background that gives you such authority on this subject? 
Knowing your educational background would really be illuminating for this discussion and would perhaps offer us all a perspective from which we could approach discussions with you (e.g. someone with a mathematics degree approaches topics differently than someone with a degree in computer science or history). I would love to be here every day but I work nearly 50 hours a week at a DoE research facility and I don't have time to get online to repeat what I've said before and to spend my time arguing about something which is common scientific knowledge. I think it is time for a peer-review of this article itself. And we're wasting time in these discussions rather than making bold edits on the article. Cheers, Astrobayes 20:41, 7 July 2006 (UTC)