This page is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
I plan to remove all references to entropy increase as a dispersal of energy unless someone can explain how it relates to entropy of mixing. As far as "dispersion of energy" is concerned, in the case of entropy of mixing there is nothing that can be identified as a dispersion of energy, which goes to show that thinking of entropy as a dispersion of energy is wrong.
If you have a box divided in two by a partition, with type A gas on one side, type B on the other, same mass per particle, same temperature and pressure, then there will be no change in the energy density when the partition is removed and the gases mix, but there will be an increase in entropy. You could say that the energy held by gas A gets spread out, as does the energy held by gas B, but I don't think it is productive to think of energy as being "owned" by a particular molecule type. Energy is being exchanged at every collision; it's not "owned" by any gas. It's the particles that are being spread out, not their energy. PAR 03:59, 28 September 2006 (UTC)
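A minimal numerical sketch of the box-with-partition case above, assuming ideal gases and one mole each of A and B (illustrative amounts, not part of the original post): the internal energy is unchanged while the entropy of mixing is positive.

```python
import math

R = 8.314  # gas constant, J/(K*mol)

# One mole of A and one mole of B, same T and P, partition removed.
n_A, n_B = 1.0, 1.0
n = n_A + n_B
x_A, x_B = n_A / n, n_B / n

# Ideal entropy of mixing: delta_S = -R * sum(n_i * ln(x_i))
delta_S = -R * (n_A * math.log(x_A) + n_B * math.log(x_B))

# Internal energy of an ideal gas depends only on T, which is unchanged,
# so delta_U = 0 while delta_S is positive.
print(f"delta_U = 0 J, delta_S = {delta_S:.2f} J/K")  # ~11.53 J/K (= 2R ln 2)
```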
Intriguing. The thought experiment envisages a situation where there's no change in available energy, hence no change in the amount of energy that cannot be used to do thermodynamic work, the factor that entropy provides a measure of in relation to a reference temperature. Yet because restoring the previous order is assumed to require energy, this is interpreted as increasing entropy. So there's an increase in entropy with no thermodynamic effect. An apparent paradox, but the lead to Entropy of mixing indicates an answer where it states that "This entropy change must be positive since there is more uncertainty about the spatial locations of the different kinds of molecules when they are interspersed." Note that "uncertainty" links to Information entropy#Formal definitions. In other words this conflates Shannon "information entropy" with thermodynamic uncertainty. There's a loss in "information" without a thermodynamic change. ... dave souza, talk 18:50, 28 September 2006 (UTC)
Thanks, Jheald, for a really useful explanation of Entropy of mixing – would it be possible to add it to that article? Now it seems to me that you've shown a dispersal of energy as the particles diffuse and the pressure potential is lost. PAR stated at the outset that he didn't think it is productive to think of energy as being "owned" by a particular molecule type, but this illustration does just that. Looking now at Frank Lambert's pages, here he describes letting the motional energy of each gas spread out more widely into the larger volume, and here he describes the mixing of different ideal gases and of liquids as fundamentally involving an expansion of each component in the phase involved, and entropy increases because the energy [of each] is now more spread out or dispersed in the microstates of the considerably greater volume than its original state. He comments that the "Gibbs Paradox"... is no paradox at all in quantum mechanics where the numbers of microstates in a macrostate are enumerated, but doesn't treat that further. His entropysite lists over 15 textbooks that have deleted “entropy is disorder” and now describe the meaning of entropy in various terms of the spreading or dispersing of energy. I don't have access to these books, but it would be interesting to know how they tackle such issues. In terms of this article it's pretty clear that such description is useful, particularly for non experts or "beginning students". If there are significant problems with this approach, it should be possible to find critical appraisal of these textbooks. ... dave souza, talk 09:08, 29 September 2006 (UTC)
I do appreciate that you're finding it difficult to think in terms of energy, but as you state, the displacement of the particles is measurable, so we have measures of mass, distance and time, which are the dimensions required to measure energy. On a macro level, the example has the work, and hence energy, of the A particles moving the semi-permeable membrane by diffusing through that membrane. As is being stated, this work/energy is quantified as T times the entropy of mixing, which statistical mechanics can predict. This approach to a basic understanding of entropy for students exists, and should be explained in the article: saying "let's forget about it" is not what an encyclopedia is for. ... dave souza, talk 14:52, 29 September 2006 (UTC)
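A back-of-envelope check of the "T times the entropy of mixing" figure, assuming the same hypothetical one mole of each gas at 298 K (a sketch of the arithmetic only, not of the membrane arrangement itself):

```python
import math

R, T = 8.314, 298.0                  # J/(K*mol), K
delta_S_mix = 2 * R * math.log(2)    # 1 mol A + 1 mol B, equal volumes

# For ideal gases delta_H_mix = 0, so the maximum work recoverable from
# reversible (un)mixing at constant T is T * delta_S_mix.
w_max = T * delta_S_mix
print(f"T * delta_S = {w_max / 1000:.2f} kJ")   # ~3.43 kJ
```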
Do we want to include in this article a description of entropy increase which all of us agree is lacking in some sense? Do we want to include a description which none of us can explain clearly, nor defend? I'm sure we will all answer NO to this question. The question then remains whether entropy increase as energy dispersal fits the above description. If it does, then we get rid of it. If it does not, then a definition of energy dispersal will be forthcoming, as well as a defense of the description. Please, I really would like to hear it. What is the definition of energy dispersal? If we cannot answer this question, we must agree to eliminate energy dispersal as a description of entropy increase. PAR 17:41, 30 September 2006 (UTC)
Do we want to include in this article a description of entropy increase which all of us agree is lacking in some sense? Do we want to include a description which none of us can explain clearly, nor defend?
Thermal energy spreads rapidly and randomly throughout the various energetically accessible microstates of the system.
Ok - we've stated our positions for the record. PAR 13:33, 1 October 2006 (UTC)
Regarding the request to provide references which prove the above references wrong - this is a classic Wikipedia problem: asking for a negative reference. See this link for example. PAR 13:46, 1 October 2006 (UTC)
Dave - I looked at the reference you gave. It says:
"some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.
I still don't agree. To "spread out" means to occupy a larger volume in some space. It is simply a nonsense statement until that space can be specified. I really believe that because energy dispersal in physical space is so often the signature of entropy increase, the proponents of this point of view are trying to use this intuitively accessible description to shoe-horn in all forms of entropy increase, with the idea that it is better to have a wrong idea than no idea at all. Sometimes that is true, but I think in this case we can arrive at an article of the same value without the warm fuzzy nonsensical statements.
I am in no way a self-proclaimed expert. I have had my head handed to me on this very page (or perhaps it was the second law page) by people who know this subject better than I and whom I expect are listening to us now. This was because of my insistence that I was right until it was shown to me with a logical argument that I was wrong. I will not ever resort to legal loopholes to insert a bunch of words that I don't understand and cannot defend into an article. Ever. And I expect the same of everyone else.
I am carrying on this discussion so that others can see exactly what our points are and make their decisions accordingly. If someone reverts my edit, as you have done, I will not revert it back. I leave that to someone who agrees with me, and if there is no such person, then so be it.
Meanwhile I will search for those negative references. PAR 16:51, 1 October 2006 (UTC)
I have modified the sentence in the introduction to avoid the misconception that energy dispersal is exclusively a spatial dispersal. I have also removed the nonsense about energy dispersing into microstates. Dave, you have developed it to a point that we cannot all agree on. We have to be very clear on two points:
1. No one can dispute the fact that Lambert's "energy dispersal" as a description of entropy increase does not always mean spatial energy dispersal. This follows from the sentence I quoted above from the Entropy is simple page:
some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.
2. The point that IS in contention is the statement that energy disperses into a greater number of microstates, or to a state with a greater number of microstates, or some variation thereof.
As long as we state point 1 I'm ok. When point 2 is stated as fact I am not ok. My revision avoided discussion of point 2 and I object to any revision stating point 2 as fact, unless a clear, more or less quantitative definition of non-spatial "energy dispersal" can be given. PAR 02:41, 3 October 2006 (UTC)
Just yesterday, I looked at Talk:Entropy for the first time since late July and today I am terribly embarrassed to see this firestorm, and even more so because Leff (for an old ref, see end) in recent months has convinced me that I was absolutely wrong in limiting energy dispersal (in re earthly entropy) in 3-D space in any way. The following is a full explication that I hope will be helpful, but I apologize for not being right in 2002 and for not correcting it before now on www.entropysite.com. FrankLambert 00:22, 4 October 2006 (UTC)
“Energy of all types spontaneously changes from being localized to becoming more dispersed or spread out in space if it is not constrained.” This is a modern view of 2LOT.
“Entropy change is the quantitative measure of spontaneous increase in the dispersal of energy/T in two seemingly disparate kinds of process: how much energy becomes dispersed (as in thermal energy transfer between surroundings and system or the converse, or between two contiguous systems insulated from their surroundings), “thermal”; or by how widely spread out the initial energy within a system becomes (as in gas expansion into a vacuum, or by two or more fluids mixing), “positional”.”
Both kinds of process, “thermal” and “positional”, involve a change in the dispersal of energy from a smaller to a larger volume. Thus, fundamentally, the difference between them need not be mentioned to beginners. More important is their commonality: Spontaneous processes always involve energy dispersal in space. This is a unifying concept in presenting varieties of entropy change to beginners or non-scientists.
The unusual value of viewing entropy change as a dispersal of molecular energy is its seamless application from a beginner’s view of molecules’ behavior to a qualitative or a quantitative Clausius calculation as well as to more sophisticated qualitative views of quantized energy distributions and quantitative Boltzmann calculations of the number of accessible microstates.
In the traditionally named “positional” kind of entropy change, molecules that are constrained in a given volume are then allowed to move into an adjacent evacuated chamber. Qualitatively, their initial motional energy has become more spread out in the new total volume than it was in the smaller original chamber. That greater dispersal of the original energy of the gas in space should result in an increase in its entropy. The classical reversible reversal of the process proves that the work involved is indeed dependent on the volume change and related to a q(rev)/T for the original spontaneous expansion.
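A small sketch of this "positional" case, assuming one mole of ideal gas whose volume doubles (illustrative numbers): the entropy change of the free expansion equals q(rev)/T for the reversible isothermal path between the same two states.

```python
import math

R, T, n = 8.314, 298.0, 1.0
V1, V2 = 1.0, 2.0            # arbitrary units; only the ratio matters

# Free expansion into vacuum: no heat, no work, but the entropy change is
# that of the reversible isothermal path between the same two states.
delta_S = n * R * math.log(V2 / V1)
q_rev = n * R * T * math.log(V2 / V1)   # heat along the reversible path

print(f"delta_S = {delta_S:.2f} J/K, q_rev/T = {q_rev / T:.2f} J/K")  # both ~5.76
```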
In no way does “a dispersal of energy” mean that the motional energy of a system is in any more than one microstate at one instant! Energy is quantized in one specific distribution on an enormous number of accessible energy levels in one microstate at one instant. “Dispersal” or “greater dispersal” of energy as a result of a process simply means that there are many more choices of different microstates (different distributions of the system’s energy) for the system at the next instant than there were for the system prior to the process that had fewer accessible microstates.
The same phenomenon of increased energy dispersal applies to allowing gas A with its initial motional energy in a given volume to mix with gas B with its initial energy in an equal or different volume. The total initial motional energy of each gas is unchanged but that initial energy of each is now more widely spread throughout the greater volume of the two; it has become more dispersed and thus, from our view of entropy as a quantitative measure of the spreading out of energy, there should be a distinct increase in entropy. Either the classical Gibbs calculation, or a combinatorial count of the possible cells containing A and B inserted into the Boltzmann entropy equation, leads to the same conclusion: ΔS = −R (nA ln xA + nB ln xB), where nA and nB are the moles of the components and xA and xB are their mole fractions of the total n moles.
[This equivalence of combinatorial locations of particles to the probable distribution of the final energy of a system is supported by Hanson, R. M., J. Chem. Educ. 2006, 83, 581-588 (pp. 586-587), and Kozliak, E. I., J. Chem. Educ. 2006, 83, in press. The spreading of energy in space as a key to understanding entropy change is from Leff, H. S., Am. J. Phys. 1996, 64, 1261-1271.] FrankLambert 00:22, 4 October 2006 (UTC)
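A sketch of the two routes just mentioned, assuming ideal mixing of one mole each of A and B (hypothetical amounts): a direct combinatorial count of cell arrangements, evaluated through log-gamma because the factorials are astronomically large, compared against the mole-fraction formula.

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
N_av = 6.02214076e23    # Avogadro constant, 1/mol
R = k_B * N_av          # gas constant, J/(K*mol)

n_A, n_B = 1.0, 1.0                     # moles
N_A, N_B = n_A * N_av, n_B * N_av       # molecules
N = N_A + N_B

# Combinatorial route: W = N! / (N_A! * N_B!), evaluated as ln W via lgamma.
ln_W = math.lgamma(N + 1) - math.lgamma(N_A + 1) - math.lgamma(N_B + 1)
dS_boltzmann = k_B * ln_W               # delta S = k ln W

# Mole-fraction route: delta S = -R (n_A ln x_A + n_B ln x_B)
x_A, x_B = n_A / (n_A + n_B), n_B / (n_A + n_B)
dS_formula = -R * (n_A * math.log(x_A) + n_B * math.log(x_B))

print(f"{dS_boltzmann:.3f} J/K vs {dS_formula:.3f} J/K")  # both ~11.526 J/K (= 2R ln 2)
```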
This is what is meant by a “dispersal of molecular motional energy for a given system”: an increase in the number of accessible energy distributions for the system.
“Dispersal” or “greater dispersal” of energy as a result of a process simply means that there are many more choices of different microstates (different distributions of the system’s energy) for the system at the next instant than there were for the system prior to the process that had fewer accessible microstates.
to distribute (as fine particles) more or less evenly throughout a medium.
Don't have time before tomorrow but let me respond to your first paragraph and then make a quick connection that may be behind all/most of your questions.
First, all entropy increase in chemical molecular motion is DUE to real physical molecules moving in 3-D space, of course -- thus all that I, and beginning chem, focus on is entropy increase in physical space, and the most obvious example is entropy of mixing.
Second, the beauty of the concept of energy dispersal (to me!) is that we focus first on real molecular movement and then go directly to its abstract meaning, like this: Consider the fast colliding behavior of a mole of real molecules in real space (the medium) -- yes, of course, 'evenly distributed' at one instant -- one instant, then, is the time to take one FREEZE FRAME -- which can be translated to an ABSTRACTION: the plot of ONE Boltzmann distribution of the energy of each molecule (merely assuming that you and I could see the energy of each molecule in that freeze-frame instant!). Then, in the next instant, in which perhaps only 1 or 1 trillion molecular collisions occurred, we could snap another freeze frame, rapidly check each molecule and plot another.
Oh, if you believe in ergodicity, it would take many millennia times millennia times millennia to do a complete job -- but does that start to answer your question of what the hell energy dispersal means? It means literal spreading out of literal molecules in greater space than before (in mixing), initially, and then finally taking a myriad of freeze frames of all the possible Boltzmann distributions that they sequentially demonstrate in a few zillion eons or million seconds or seconds -- but our count, even though we're quick-eyed, won't be a bejeebers-worth compared to your ability as a skilled stat mech to count the number of microstates via your cell model of A and B and their relative moles -- which you now know is NOT a count of location "microstates" or "configurations", but honest-to-God microstates of the different ENERGY distributions for that particular mixture system!
It's all based on simple molecules hitting each other and our checking their energies and counting the different possibilities....That's a start. I'm sure you'll tell me what's wrong with it and then we can go from there :-) FrankLambert 05:37, 4 October 2006 (UTC)
some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.
OK, we seem to be getting somewhere! I have two questions, answers to which would help me to understand your above statements better. One, of two parts, is from my prior post:
Internal energy density? By that, do you mean the density in space of the motional energy of gas molecules of A, prior to their being admitted to a space occupied by gas molecules of B? Then (1), on the concrete/beginners level, the change from initial pure A to its mixing with B (and thus, in a larger final volume) is a radical change in its 'energy density' in space: fewer energetic A molecules per cubic cm. in that final state. Now (2): on a more sophisticated level of abstraction -- of energy levels (!) -- the 'energy density' (however you define it) is also radically changed because of the decreased spacing between energy levels common to any increase in volume. (Despite a possible confusion in terminology of the energy levels being considered "more dense" in some writing.)
Two: When you consider a mixture of two completely different gas molecules, A and B, by your usual statmech routines of combinatorially counting them and inserting the result in a Boltz. entropy equation, followed by conversion to moles/mole fractions, what would you tell students or me that you have been counting?
Thx! (Maybe we should break this up to "2.Questions" or such, for readier editing?) FrankLambert 21:21, 4 October 2006 (UTC)
PAR, we now have a well-coupled dialogue:
You said (about mixing A and B), “If energy dispersal exists in this case, how would you quantitatively define it and/or measure it?” (Previously, you said, “No problem with what we both know is actually happening” in re my 05:37 4 Oct. comments about ‘real molec movement'.)
Let’s take a more obvious example than A and B 'anonymous gases' – say, molecules that are responsible for the color in a liquid food dye. In their bottle (that is mainly water) they really are speeding, colliding and this kinetic activity (at an unchanged T) is their ‘internal energy’. The molecules of the dye in one drop placed in a beaker of water at T, even with no stirring or convective currents, will very slowly spontaneously move so that the whole beaker has a faint color. This, to me or any chemist, is only understandable as a dispersal of the internal energy (the kinetic movement but also including any potential energy) of those dye molecules. Mainly and ceaselessly colliding with water molecules, they reach an equilibrium state of greatly spread out energy distribution (compared to their state in the bottle) in the space of the enormously greater volume of the beaker of water. The total of the internal energies of all those dye molecules has not changed a bit, but their distribution in space has become spread out perhaps a thousand fold. This is an example of spontaneous energy dispersal of the molecules, an example of a spontaneous entropy increase (but I will discuss examples whose entropy change is easier to calculate).
For over a century, this has been a quantitative technique for determining the amount of a colored substance in a solution, fundamentally based on how spread out the energetic molecules are in a particular sample. (It can be used equally successfully to determine the concentration of red-orange bromine vapors in air et sim., of course.)
On the educationally lowest level, then, this is a quantitative definition and measure of molecular motional energy dispersal. Let’s move to a more fundamental level. My definition at the start of our discussion (with mixing as a principal case) was, “Entropy change is the quantitative measure of spontaneous increase in the dispersal of energy/T …”
Consider a mole of argon in a 24.5 L chamber, connected by a stopcock to an identical chamber with a mole of helium, the two mounted in a thermostat at 298 K. The stopcock is opened and the two gases spontaneously mix.
Obviously, the internal energy of the argon – the motional energy of the total of all the molecules of argon -- that was concentrated in its 24.5 L is more spread out in the final 49 L. But the next, more fundamental level is the entropy increase in the mixing: ΔS = −R (nA ln xA + nB ln xB), where nA and nB are the moles of the components and xA and xB are their mole fractions. With the amounts in our example, the ΔS for either argon or helium comes out to be 5.76 J/K. This is the quantitation that you demand for the spreading out or dispersal of the motional energy of argon in this mixture.
But I have emphasized again and again, and assumed that readers would do the calculations, that the ultimate measure of dispersal of molecular motional energy in a process is the change in the number of accessible microstates as a result of that process. It is a trivial calculation thanks to Boltzmann's (and Planck's) genius: the WFinal from ΔS = k ln(WFinal/WInitial).
Given that the standard entropy of argon is ~155 J/K·mol (and, of course, its WInitial at 0 K is 1), there are two questions. 1. How many microstates are accessible for a mole of argon at 298 K in its original chamber? 2. How many are accessible for the argon after the two gases have been mixed? (This question simply involves adding 5.76 J/K to argon's standard state and recalculating W.)
The answers are: 1. 10 ^ 48,600,000,000,000,000,000,000,000. [This exponent is 48.6 x 10 ^ 24.] 2. The W for the argon in the mixture is 10 ^ 48,800,000,000,000,000,000,000,000.
The numbers of these accessible microstates are enormous to the level of incomprehensibility. They should be interpreted simply as the relative number of choices that a system has in one instant of being in a different microstate the next: the larger the number, the less the chance of the system remaining in its present microstate – i.e. the less the chance of being localized. Thus, in contrast to localization, the greater the number of accessible microstates, the greater the dispersal of the energy of the system. This is the definition of entropy change that I repeated in my 4 October summary here.
(A system never could explore ["in a temporal dance", as Harvey Leff says] even a tiny fraction of the calculated number of microstates in 'near-infinite' time, and in fact computer programs show a rapid narrowing down from the 'truly incredible 10^10^25' to the merely 'gigantic'! Nevertheless, the greater the entropy change, the greater the number of calculated accessible microstates.)
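A sketch of the microstate-ratio arithmetic behind ΔS = k ln(WFinal/WInitial), using the 5.76 J/K quoted above (the absolute W values are not recomputed here; the ratio is far too large to exponentiate directly, so its base-10 logarithm is computed instead):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
delta_S = 5.76       # J/K, entropy increase quoted above for the argon

# delta_S = k ln(W_final / W_initial)  =>  log10(W_f / W_i) = delta_S / (k ln 10)
log10_ratio = delta_S / (k_B * math.log(10))
print(f"W_final / W_initial ~ 10^{log10_ratio:.2e}")   # ~10^(1.8e23)
```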
The total of the internal energies of all those dye molecules has not changed a bit, but their distribution in space has become spread out perhaps a thousand fold. This is an example of spontaneous energy dispersal of the molecules, an example of a spontaneous entropy increase.
With the amounts in our example, the Δ S for either argon or helium comes out to be 5.76 J/K. This is the quantitation that you demand for the spreading out or dispersal of the motional energy of argon in this mixture.
Some observations (made already, but I thought worth gathering together again) on "dispersedness of energy" as the be-all-and-end-all way of thinking about entropy:
1. The most straightforward interpretation of energy dispersal fails for entropy of mixing, because at the end of the day both originally separated parts of the system will have contents with exactly the same energy (and the same energy density) that they started with.
2. What happens if a structure with configurational entropy is cooled towards a limiting temperature of absolute zero? In the limit there's still a residual entropy, even though there's no energy.
3. The most dispersed arrangement of energy of all, a completely even distribution, actually has very low entropy, because there is only one way to achieve it. In reality in equilibrium (maximum entropy) there are significant deviations from perfect dispersal.
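A toy illustration of point 3, assuming 4 distinguishable oscillators sharing 4 quanta of energy (numbers chosen only to keep the enumeration small): the perfectly even distribution is realized by exactly one microstate, while uneven distributions are realized by many.

```python
from itertools import product
from collections import Counter

N, q = 4, 4   # 4 distinguishable oscillators sharing 4 energy quanta

# Each microstate is an assignment (n1, n2, n3, n4) with n1 + ... + n4 = q.
microstates = [s for s in product(range(q + 1), repeat=N) if sum(s) == q]

# Group microstates by their "shape" (occupation numbers, ignoring which
# oscillator holds which) and count how many microstates realize each shape.
multiplicity = Counter(tuple(sorted(s, reverse=True)) for s in microstates)

for shape, count in sorted(multiplicity.items(), key=lambda kv: -kv[1]):
    print(shape, count)
# The perfectly even shape (1, 1, 1, 1) is realized by exactly 1 microstate;
# shapes such as (2, 1, 1, 0) are realized by 12.
```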
To even start to make the "dispersal of energy" story work, it seems to me you have to start making non-obvious special definitions that are at least as radical as defining "disorderedness" in terms of multiplicity.
Fundamentally, the problem is that it's not energy being dispersed between different microstates - it's probability. Whichever microstate the universe ends up in, each of those microstates must contain all the energy of the original microstate. It's not energy being dispersed, it's certainty being spread out into more and more broadly spread distributions of probability.
That's why ultimately, I fear tying entropy uniquely to the idea of energy dispersal can become a much bigger stumbling block to a good understanding of entropy even than the confusion which can be caused by the word "disorder". Jheald 23:44, 5 October 2006 (UTC).
Incidentally, I think we should also be careful with the phrase "accessible microstates". It can get quite confusing as to whether we're talking about the totality of ergodically accessible microstates in infinite time, or the "number of choices that a system has in one instant of being in a different microstate the next" (where I assume we're thinking about a Heisenberg-picture notion of states of the system, subject to "quantum jumps" from external perturbations). Better usually, I think, to talk about the number of microstates compatible with the particular macroscopic and dynamical description we have in mind. Jheald 00:05, 6 October 2006 (UTC)
Thanks to Jheald for relegating older Talk:Entropy sections to an Archive. Skilled in information theory, Jheald has every right to express his understanding of that topic (e.g.,as in Archive 3 at 21:41 4 July). However, I chose to emphasize that word "Thermodynamic" in the heading of this section because his preceding comments may confuse readers who do not realize his divergent views about thermodynamic entropy.
Taking them in order and number: 1. Here, Jheald implies that "energy of A / Volume" exactly equals "energy of A / 2 Volumes". I'm sure it is an innocent error. The second paragraph in my "Conclusion" above corrects this kind of misunderstanding. I would urge readers to check it and its explication.
2. I told JHeald (19:25, 30 June) about a ms. from a colleague concerning the "problem" of residual entropy and, to save the list complex trivia, offered to give full details via email. He didn't respond. The ms. has since been accepted for publication (Kozliak, E. I., J. Chem. Educ., in press; will be published within 6 months; email me and he might be willing to release a copy of the ms.) It is perfectly covered by my emphasis on energy as the proper focus in evaluating entropy change: multiple Boltzmann distributions (of ENERGY/energetic molecules!) being frozen in within solid CO, H2O, FClO3, etc.
3. This statement shows the confusion that arises when information 'entropy' ideas are misapplied to thermodynamic systems. All thermodynamic entropy numbers/calculations/evaluations are actually entropy changes -- from 0 K, or from an initial state to a final state [each of them ultimately relatable to the 0 K state]. There is only one situation in thermodynamics where any system can have "one way to achieve it", and that is a perfect crystal of a substance at 0 K. Thermodynamic equilibrium for any system is indeed a max distribution of energies, i.e., max dispersal, MaxEnt for that system under its particular equilibrium constraints of P, V, T, etc.
The dispersal of energy evaluated by the Boltzmann entropy equation is coincident with the multiplicity of thermal physics. That's a major part of its value -- direct relationships that unite simple standard views of molecular behavior that then lead to stimulating new insights in stat mech and are consonant with QM.
JHeald: "the problem is that it's not energy being dispersed between different microstates - it's probability. Here we go again! (First, _I_ am not talkng about energy "being dispersed BETWEEN different microstates"!!! The energy of a system is ALWAYS and ONLY present in one microstate at one instant. If it stayed in that same microstate or had only a dozen or a million microstates TO WHICH IT COULD change in the next instant, that would be a more localized distribution of that energy than if there were the choice of any one of quadrillions or godzillions of microstates to which it could change. That is what 'energy dispersal' in a system means in terms of microstates.)
Then, JHeald's viewpoint is skewed toward info 'entropy' and diverts emphasis from thermodynamic entropy, the focus of this section of Wikipedia. Information theory, indeed, is half of how thermodynamic entropy results; the probability measurement of energy's dispersal is the 'actualizing' half. It is important. (See 21:32 of 1 July for my presentation.) But the function of probability is in describing the max number of ENERGY distributions that are accessible under the constraints of the process being examined. I don't think that a roaring gas flame heating a huge boiler, or even a hot gas stove burner boiling a pan of water, is well described by "certainty being spread out into distributions of probability". Thermodynamic entropy deserves a better press than that :-).
Energy dispersal a stumbling block? I don't know evidence of that. Wish I could quiz the 50,000 or more students who were exposed to it instead of 'disorder' last year.
"Accessible microstates"? We are indeed talking about the totality of ergodically accessible microstates in infinite time which is better described to beginners in the terms of 'choices..etc.' NO WAY are those Heisenberg QM susceptible states! The 'jumps' occur because each microstate describes a particular set of molecular energies of the system that certainly ARE compatible with what JHeald urges. FrankLambert 04:51, 6 October 2006 (UTC)