Hi PAR. (1) Why use E for internal energy when the article on ideal gas uses U? (2) Why not define u=U/kN and v=V/kN for the specific energy and volume? (3) The argument to the logarithm seems not to be dimensionless! Bo Jacoby 13:06, 9 November 2005 (UTC)
Which brings up a point that worries me, and it is not a point for someone who is book-bound, so I know I'm talking to the right person. Even though it's not generally done, you should be able to assign the additional dimension of "particle" to certain quantities. N has the dimension of "particles", so that N/V has dimensions of "particles per unit volume". Boltzmann's constant k has units of "entropy per particle", so that kT is "energy per particle". Planck's constant has units of "action per particle". I cannot get the argument of the logarithm in the Sackur-Tetrode equation to be dimensionless doing this. The idea that particle = dimensionless is just a mindless hand-me-down rule that I cannot figure out how to disprove or justify. I have a strong hunch that it's OK to assign the dimension "particle", and that my inability to render the argument dimensionless points out that I am missing some subtle point in the physics. PAR 16:55, 9 November 2005 (UTC)
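For reference, with the conventional bookkeeping (N a pure number) the argument of the logarithm in the Sackur-Tetrode equation is dimensionless, since $[mU] = \mathrm{kg \cdot J} = \mathrm{J^2\,s^2\,m^{-2}}$ and $[h^2] = \mathrm{J^2\,s^2}$:

$$\left[\frac{V}{N}\left(\frac{4\pi m U}{3Nh^2}\right)^{3/2}\right] = \mathrm{m^3}\cdot\left(\mathrm{m^{-2}}\right)^{3/2} = 1.$$

If "particle" is promoted to a dimension, the same argument instead carries a leftover factor of $\mathrm{particle}^{-5/2}$, which is exactly the puzzle described above.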
The ideal monatomic gas entropy is

$$S = kN\left[\ln\!\left(\frac{V}{N}\left(\frac{U}{N}\right)^{3/2}\right) + \Phi\right],$$

where Φ is some undetermined constant. The Sackur-Tetrode equation specifies that constant, so the two are completely compatible.
I've never needed to use it practically, so I'm sort of winging it here. If you are dealing with entropy differences, you don't need to know the constant. If you are dealing with enormous entropies (S/Nk huge), then again, no need. If you are dealing with absolute entropy at or near the critical point (S/Nk of order unity), then still no need, because the equation breaks down there. But for S/Nk in an intermediate range, the question of what the constant is becomes important. Check this out.
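To put numbers on that intermediate range, here is a minimal sketch (not from the original discussion; it assumes CODATA SI constants and treats helium as an ideal monatomic gas) showing that the Sackur-Tetrode constant reproduces the tabulated standard molar entropy of helium, about 126.15 J/(mol·K):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
h   = 6.62607015e-34    # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro constant, 1/mol

def sackur_tetrode_molar_entropy(m, T, P):
    """Molar entropy of an ideal monatomic gas via Sackur-Tetrode:
    S/(kN) = ln[(V/N) / lambda^3] + 5/2, lambda = h / sqrt(2*pi*m*k*T)."""
    v = k_B * T / P  # volume per particle, V/N, from the ideal gas law
    lam = h / math.sqrt(2.0 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
    return N_A * k_B * (math.log(v / lam**3) + 2.5)

m_He = 4.0026 * 1.66053907e-27  # helium atomic mass, kg
print(sackur_tetrode_molar_entropy(m_He, T=298.15, P=1e5))
# ~126.15 J/(mol*K), matching the tabulated standard molar entropy of helium
```

Drop the 5/2 constant and the answer is off by about 20 J/(mol·K), which is why the constant matters in this regime.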
I think there should be a strong distinction between the ideas of dimensions and units. The speed of light has the dimensions of velocity, or distance/time, and has units of m/sec in the SI system, cm/sec in the cgs system, and feet/sec in Imperial units. It can also be measured in furlongs per fortnight. From a theoretical point of view, who cares about the units? The dimensions are of fundamental theoretical importance; the units are not (except that they tell you the dimensions). Worrying about units is like worrying about the language a scientific paper is written in. Who cares, as long as it's translated into a language you understand? Worrying about dimensions is like worrying about what the paper is saying, and theoretically, this is worth worrying about. Worrying about units is of little theoretical significance (but of huge practical significance, of course).
The bottom line is that units are vitally important to communication, just as language is. I don't have a favorite language, but I do have one that I am confined to speak in because of my upbringing and mental limitations. I don't wish to be similarly confined by having a "favorite" set of units. Units are just some language I have to learn in order to communicate with others. Dimensions are much more interesting. PAR 16:30, 11 November 2005 (UTC)
Yes, the distinction will never be a problem if you live on a desert island. In reality, you have to negotiate the difference between concept and word with other people, and to do so effectively you need to understand the difference between the two. In my mind, I try to deal with concepts; the process of translating them into words is extremely negotiable, whatever it takes to communicate. Which means I have little respect for "proper English", even as I strive to be adept at it. I have little respect for units either, yet I always try to get them right.
That's why I like the topic of dimensional analysis, especially the Buckingham Pi theorem. One of its basic tenets is that all physical theories must be expressible in dimensionless terms in order to have any validity. That says it all! PAR 17:04, 14 November 2005 (UTC)
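A textbook illustration of that tenet: for a simple pendulum, the period T, length ℓ, and gravitational acceleration g admit only one independent dimensionless group, so the entire theory collapses to a statement about that group:

$$\Pi = T\sqrt{\frac{g}{\ell}} = \text{const} \quad (= 2\pi \text{ in the small-angle limit}).$$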
If you go back to the pre-Sackur-Tetrode equation (leave out the N in V/N), then you get a dimensionless argument for the logarithm. The reason: the argument is the ratio of two phase-space hypervolumes, which yields the number of microstates consistent with the macrostate description. Dividing by N! spoils this. I don't believe the problem arises in quantum statistics, because there you deal with the number of states from the outset. -- GRaetz 17:25, 21 January 2006 (UTC)
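For reference, the counting being described here: the semiclassical microstate count is

$$\Omega = \frac{1}{N!\,h^{3N}}\int d^{3N}x\,\,d^{3N}p,$$

where each $dx\,dp$ pair carries dimensions of action, $\mathrm{J\cdot s} = [h]$, so the phase-space integral divided by $h^{3N}$ is a pure number before the $1/N!$ indistinguishability correction is applied.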
Do other users think that we should include the derivation from the Schrödinger equation? It is relatively simple. Tiberius Curtainsmith —Preceding undated comment added 18:55, 30 July 2009 (UTC).
The phrase, "Making sense of something using X" might imply to a neophyte that X is needed to make sense of something. Many can "make sense" of the equation through the thermodynamic or the information theory perspectives. The article does not signal that entropy has these different perspectives (even though it is obvious to experts that the equation was originally derived in the ... perspective). I will change the title and first sentence of the section on information theory to signal the two interpretations. Each perspective has its champions, but the enthusiasm of their POV should not be reflected in titles. Laburke ( talk) 13:33, 18 September 2011 (UTC)
Volume and energy are not the direct source of the states that generate entropy, so I wanted to express it in terms of x·p/h states for each of the N particles. Someone above asked for a derivation from the uncertainty principle ("the U.P.") and said it's pretty easy. The S-T equation pre-dates the U.P., so it may be only for historical reasons that a more efficient derivation is not often seen.
The U.P. says x'p'>h/4pi where x'p' are the standard deviations of x and p. The x and p below are the full range, not the 34.1% of the standard deviation so I multiplied x and p each by 0.341. It is 2p because p could be + or - in x,y, and z. By plugging the variables in and solving, it comes very close to the S-T equation. For Ω/N=1000 this was even more accurate than S-T, 0.4% lower.
Sackur-Tetrode equation:

$$\frac{S}{kN} = \ln\!\left[\frac{V}{N}\left(\frac{4\pi m U}{3Nh^2}\right)^{3/2}\right] + \frac{5}{2}$$

where U = (3/2)NkT is the (kinetic) internal energy of the monatomic gas, m is the particle mass, and h is Planck's constant.
Stirling's approximation, N! ≈ (N/e)^N, is used in two places, producing a 1/N^(5/2) and an e^(5/2), which is where the 5/2 factor comes from. The molecules' internal energy U is all kinetic energy in the monatomic case, which is the case the S-T equation applies to. b = 1 for a monatomic gas, and it may simply be changed for non-monatomic gases that have a different K.E./U ratio. The equation for p is the only difficult part of getting from the U.P. to the S-T equation, and it is difficult only because the thermodynamic measurements T (kinetic energy per atom) and V are an energy and a distance, where the U.P. needs x·p or t·E. This mismatch is where the 3/2 and 5/2 factors come from. The 2m is to get 2m·(1/2·m·v²) = p². Boltzmann's entropy assumes the entropy is a maximum for the given T, V, and P, which I believe means the N atoms are evenly distributed in x³ and all carry momenta of the same magnitude p.
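To spell out the two places in the standard route (a sketch, using the crude Stirling form cited above): the microcanonical count involves both N! and, via the volume of the 3N-dimensional momentum ball, (3N/2)!. Applying N! ≈ (N/e)^N to each gives

$$\ln N! \approx N\ln N - N, \qquad \ln\!\left(\frac{3N}{2}\right)! \approx \frac{3N}{2}\ln\frac{3N}{2} - \frac{3N}{2},$$

and the two dropped linear terms re-enter the logarithm as a factor of $e^{1}\cdot e^{3/2} = e^{5/2}$, i.e. the additive 5/2 in S/kN.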
[edit: see information theory entropy talk page for a better way]
To show how this can come directly from information theory, first remember that Shannon's H function is valid only for a random variable. In this physical case, each phase-space cell can take only 2 possible values: it either contains an atom carrying energy or it does not, so it is like a binary file. But unlike normal information entropy, some or many of the atoms may have zero momentum. The only requirement is that the total energy be the same, so the physical system has more freedom of choice than you would expect: physical entropy can use anywhere from N down to 1 symbols (atoms) to carry the same message (the energy), whereas information entropy is typically stuck with N. The modified Shannon H shown below is the sum of the surprisals and is equal to the information (entropy) content in bits. The left sum is the information contribution from the empty phase-space cells and the right sum from the cells where an atom occurs. The left sum is about 0.7 without regard to N (1 if ln() had been used), and the right sum is about 17·N for a gas at room temperature (for Ω/N ~ 100,000 states/atom):

$$H = \sum_{\text{empty}} \log_2\!\frac{\Omega}{\Omega - N} \;+\; \sum_{\text{occupied}} \log_2\!\frac{\Omega}{N} \;=\; (\Omega - N)\log_2\!\frac{\Omega}{\Omega-N} \;+\; N\log_2\!\frac{\Omega}{N}$$
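A quick numeric check of the per-atom figures (my own sketch of the binary-occupancy model described above, with Ω = 100,000·N assumed):

```python
import math

def occupancy_sums_bits(n_atoms, states_per_atom):
    """Surprisal sums for N atoms spread over Omega = states_per_atom * N
    phase-space cells, each cell treated as a binary occupied/empty symbol."""
    omega = states_per_atom * n_atoms
    q = n_atoms / omega                                 # P(cell occupied)
    occupied = n_atoms * math.log2(1 / q)               # right-hand sum
    empty = (omega - n_atoms) * math.log2(1 / (1 - q))  # left-hand sum
    return occupied, empty

occ, emp = occupancy_sums_bits(n_atoms=1, states_per_atom=100_000)
print(occ)  # ~16.6 bits per atom -- the "about 17*N" figure
print(emp)  # ~1.44 bits per atom, i.e. ~1 nat per atom ("1 if ln() had been used")
```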
Physical entropy S then comes directly from this Shannon entropy, converting bits to conventional units:

$$S = k_B \ln(2)\, H$$
A similar procedure can be applied to phonons in solids to go from information theory to physical entropy. For reference, here is the entropy of a 1D oscillator in solids:

$$\frac{S}{k_B} = (1+\bar n)\ln(1+\bar n) - \bar n \ln \bar n, \qquad \bar n = \frac{1}{e^{\hbar\omega/k_B T}-1}$$
Ywaz ( talk) 02:37, 15 January 2016 (UTC)
Would be nice to see the equivalent eqn in arbitrary dimension d, instead of just d=3. 67.198.37.16 ( talk) 21:06, 19 May 2024 (UTC)
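For reference, repeating the standard microcanonical count in d spatial dimensions (volume of a dN-dimensional momentum ball, then Stirling, as sketched above) appears to give

$$\frac{S}{kN} = \ln\!\left[\frac{V}{N}\left(\frac{4\pi m U}{d\,N h^2}\right)^{d/2}\right] + \frac{d+2}{2},$$

which reduces to the usual 5/2 form at d = 3.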