This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 5 | ← | Archive 10 | Archive 11 | Archive 12 | Archive 13 | Archive 14
The article so far fails to distinguish clearly between entropy changes due to a reversible process and an irreversible process. The two are usually conceptually distinct and the article should strive to provide an example of each so that the distinction is clear to the reader.-- 212.3.132.195 ( talk) 13:35, 27 January 2013 (UTC)
There is a serious structural problem with this article. Most of the material in this article should be moved to Entropy (classical thermodynamics) and this article should then be redirected to Entropy (disambiguation). That is the only way to make plain the other subject areas of entropy from information theory, quantum mechanics, etc. No wonder this article has been so confused and has failed to find focus.-- 190.204.70.243 ( talk) 07:18, 9 January 2013 (UTC)
(Undent)Well, I was trying to avoid that confusion. I think you're right, though, that it did introduce some confusion. Perhaps describing it as an interval variable rather than a ratio variable is better. (I was thinking that, because both joules and kelvin are ratio variables, entropy must be a ratio as well.) However, as the new addition is written, it doesn't make much sense, especially to a newcomer. As PAR mentioned above, saying that entropy is an "abstract function of state" doesn't really say anything, and I believe this only adds to the confusion. The first sentence there should concisely state exactly what the math says entropy is. The difficulty lies in making the correct translation.
Entropy is definitely not an abstract thing, but a measurable property of heat, which I was trying to define from the macroscopic, thermodynamic standpoint first, before getting into other forms. To see this, perhaps it would be helpful to point out the difference between entropy and heat capacity. Heat capacity is the amount of energy that needs to be added to a certain amount of something to change its entire temperature a single degree. For instance, it takes a certain amount of energy to raise the temperature of a gallon of water a single degree.
On the other hand, entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only. Entropy does not deal with the heat capacity of the entire substance, but only with the energy needed to change (or "maintain" perhaps would be a better word) the temperature at the boundary where energy is being transferred.
In other words, as energy is added to the gallon of water, the temperature of the boundary does not change instantly. If it did, the energy and temperature would be equal, and the entropy would be nothing. Instead, if adding 1000 joules only increases the boundary temperature to 800 kelvin, then logic dictates that some of that energy is being used for something else. By dividing 1000 by 800, we get 1.25. If providing 800 degrees at the boundary is 100% of the energy needed to perform work (in this example, performing work is simply heating the entire gallon one degree), then you will actually need to add 125% of the needed amount. The rest of that energy will not be used for work (temperature change), and will only be released as waste once the gallon of water cools. I think the main thing to understand is that entropy is not just something that occurs in machinery, but it occurs any time heat is transferred. Zaereth ( talk) 01:01, 29 January 2013 (UTC)
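For reference, the arithmetic in the example above is just the Clausius ratio applied at the stated boundary temperature; a minimal worked form (the 1000 J and 800 K figures come from the example itself, not from any source) is:

\[
\Delta S \;=\; \frac{Q_{\mathrm{rev}}}{T} \;=\; \frac{1000\ \mathrm{J}}{800\ \mathrm{K}} \;=\; 1.25\ \mathrm{J/K},
\]

i.e. dividing the transferred energy by the absolute temperature at which it crosses the boundary is what produces the 1.25 figure quoted.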
(undent) Getting back to the topic of restructuring. It might be helpful to see what some of the other languages do on this topic.
But English currently has:
Perhaps we should restructure along the lines of the French and German.-- 190.73.248.92 ( talk) 14:15, 2 February 2013 (UTC)
I have removed the anchor for "specific entropy" and put a sentence defining it in the lead. I have also updated the redirect specific entropy to just point to this article with no anchor. The anchor I removed did not make sense and seems to have been added by someone just searching for the first occurrence of the phrase, which turned out to be inappropriate.-- 178.210.45.18 ( talk) 18:11, 6 February 2013 (UTC)
So what else does this article need before it is ready for peer review? One thing that would be nice is if we could have just one section that deals with the classical and statistical descriptions.-- 200.165.161.254 ( talk) 00:01, 8 February 2013 (UTC)
(undent) The person who created the two sub-articles is this person:
The eoht.info web site (of which he is the owner and primary editor) is full of original research about trying to apply the math of thermodynamics to the human relationships between men and women and making babies, etc. There was technically nothing wrong with the sub-articles when he created them in 2006, but the whole idea of having both sub-articles was, I think, not well thought-out. I think that entropy (classical thermodynamics) should be merged back into this article because otherwise I do not think that we can provide any satisfactory explanation to the reader about what distinction we are trying to make in having both articles around.-- 200.109.197.133 ( talk) 06:25, 12 February 2013 (UTC)
The Entropy (classical thermodynamics) page is completely encompassed by this article. There is no sensible explanation about why we should have both pages.-- 200.109.197.133 ( talk) 07:30, 12 February 2013 (UTC)
*Support The reason seems apt. There may be other sections that can be factored out if the article gets too large. -- Probeb217 (talk) 04:49, 13 February 2013 (UTC)
Also oppose. The thermodynamic entropy article is, within its limitations, quite coherent, careful, and clear. The general entropy article has some nice attempts to include a more modern, general definition. However, the organization is a mess and various idiosyncratic and sloppy ideas dangle off various parts of it. Today I removed one section that was entirely wrong. Until this article is in better shape, I think it would be a shame to mess with the well-constructed thermodynamic article. Mbweissman ( talk) 03:27, 8 March 2013 (UTC)
It seems to me the article might be better off without the first paragraph, which reads:
"Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy."
The article concerns thermodynamic, not specifically statistical mechanical, entropy, but the first sentence is more applicable to statistical mechanical interpretation of thermodynamic entropy than to the thermodynamic concept itself, which is worth understanding independent of the statistical mechanical accounts that may be given of it. It is also a specific, Boltzmannian attempt to give statistical mechanical interpretation to entropy, and may be at odds with more Gibbsian versions of entropy, so again, is probably best not to lead with. The validity of the second sentence is highly dependent on your definition of entropy, and again, it is probably best not to lead with it, but to discuss it later in the article. It is a reasonable point of view that isolated systems do not evolve toward thermodynamic equilibrium (and many attempts to prove that this is always so have failed), but rather that thermodynamic equilibrium tends to be reached through interaction with an environment.
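(As a reminder, the Boltzmann and Gibbs forms being contrasted here are the standard textbook expressions

\[
S = k_{\mathrm B}\ln\Omega
\qquad\text{and}\qquad
S = -k_{\mathrm B}\sum_i p_i \ln p_i ,
\]

where \(\Omega\) is the number of microstates compatible with the macrostate and \(p_i\) is the probability of microstate \(i\); they are quoted here only for orientation, not as proposed article text.)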
If there's no strong objection within the next couple of weeks, I may give this a try, also checking to make sure that the points made in these sentences are addressed, with more nuance, later in the article.
The next paragraph introduces a thermodynamic definition of entropy, which seems a better starting point.
MorphismOfDoom ( talk) 12:30, 5 June 2013 (UTC)
Today Koen Van de moortel changed one P to a p with the edit summary "P=Power, p=pressure". So I skimmed the whole article with the thought of making the notation uniform and found that
So to be consistent we can choose Option 1 P = probability, p = pressure everywhere, or Option 2 P = pressure, p = probability everywhere. Opinions? Dirac66 ( talk) 19:00, 21 May 2013 (UTC)
Yes, P for pressure and p for probability (Option 2) seems best to me but either seems reasonable. Go for it! MorphismOfDoom ( talk) 12:32, 5 June 2013 (UTC)
The entire section on energy dispersal should be deleted. All but one of the sources it references are inappropriate for a Wikipedia article since they are not authoritative or representative of a widely accepted approach. A retired professor's personal website should not be relied on. One stray article he wrote for an education journal, even though a respectable peer-reviewed publication, also is insignificant. An unsupported, unsubstantiated and, I suspect from perusing Google Books, false statement about shifting trends in chemistry textbooks is just too remote from the purposes of this article. A direct quote from Atkins might be useful. This section is way too big to be in proportion to the importance of this hobby-horse point of view. 98.109.238.95 ( talk) 22:05, 28 June 2013 (UTC)
Hi,
I am mystified by this formulation (which appears to be often used in most unpredictable ways): "It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it." Are not order and disorder purely psychological (not physical) phenomena? - 92.100.165.149 ( talk) 16:48, 13 December 2013 (UTC)
The edit summary, and references given, state the well-established reasons for the sentence that said "In the model of this present account, as shown in the diagram, it is important that the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.< Born, M. (1949). Natural Philosophy of Cause and Chance, Oxford University Press, London, pp. 44, 146–147.><Haase, R. (1971). Survey of Fundamental Laws, chapter 1 of Thermodynamics, pages 1–97 of volume 1, ed. W. Jost, of Physical Chemistry. An Advanced Treatise, ed. H. Eyring, D. Henderson, W. Jost, Academic Press, New York, p. 35.>" Chjoaygame ( talk) 23:28, 24 May 2014 (UTC)
I agree. But the assumption of maintaining equilibrium should have started on the first equation, not the second. Otherwise it looks to the casual reader like you missed the chain rule in calculus. — Preceding unsigned comment added by 2601:C:8D80:249:250:8DFF:FEB5:7FA4 ( talk) 22:24, 31 May 2014 (UTC)
In the opening paragraphs the article refers to isolated thermodynamic systems "evolving toward equilibrium", and further down "progressing toward equilibrium". These statements are misleading in that they convey some idea of improvement. The correct term is decay (ref Peter Atkins, Creation Revisited). So I would like to suggest the revised wording "decay toward equilibrium" in all cases where this is the intended meaning. Vh mby ( talk) 12:44, 18 May 2014 (UTC)
Well now we know why the article got de-listed from "good": it has suffered some decay [1]. If budding contributors don't understand that it is their responsibility to find appropriate references to support their opinions, and at the very least to read and understand the validity, in the scientific community, of those given, I would suggest they don't belong here. So I shall include the reference. (Done) Vh mby ( talk) 01:35, 22 May 2014 (UTC)
Reference
It seems Waleswatcher considers himself above Wiki protocols and above published authorities on the subject. My ref: Atkins, Peter W. (1992). Creation Revisited: The Origin of Space, Time and the Universe. Penguin Books. ISBN 0140174257. So I would request Waleswatcher to state here on this discussion page an answer to the question:
To what (state) do isolated thermodynamic systems evolve? Please include your reference. Mike 01:47, 31 May 2014 (UTC)
Regarding this choice of terms - the problem with "decay" is that it implies that something changes form, disappears, or otherwise gets reduced or eliminated. But in many circumstances that is not what happens. For example, when two systems are put into thermal contact, heat flows between them until their temperatures are equal. Nothing is decaying (well, I suppose you could say the temperature difference is decaying, but that's convoluted and unclear). Similarly, drop some ink into a glass of water and the ink will mix into the water, increasing the entropy, but again nothing is "decaying". So decay is not necessary; "evolve" is more neutral and more accurate. That point of view is shared both by references (for instance, Kittel and Kroemer never use the term decay in the many places they discuss the second law - their statement of it is simply that the entropy will most probably increase if the system is not in equilibrium) and other editors (W. P. Uzer above, and Chjoaygame here /info/en/?search=Talk:Second_law_of_thermodynamics ). Waleswatcher (talk) 16:06, 30 May 2014 (UTC)
Here's another reference that uses evolve in exactly the same way as in the article, and never decay (this is just one I can easily link to - there are many more). P. 122: "The second law, as expressed in (4.13), is responsible for the observed arrow of time in the macroscopic world. Isolated systems can only evolve to states of equal or higher entropy." http://www.damtp.cam.ac.uk/user/tong/statphys/sp.pdf Waleswatcher (talk) 16:13, 30 May 2014 (UTC)
More references, just for fun.
From Kardar, Statistical Physics of Particles: "The time evolution of systems toward equilibrium is governed by the second law of thermodynamics.... Do all systems naturally evolve towards an equilibrium state?...What is the time evolution of a system that is not quite in equilibrium?... In contrast to kinetic theory, equilibrium statistical mechanics leaves out the question of how various systems evolve to a state of equilibrium." The term "decay" is never used in the context of the second law or entropy anywhere in the book; "evolve" is used throughout.
From Landau and Lifschitz, Statistical Physics: "...its macroscopic state will vary with time...the system continually passes from states of lower to those of higher entropy until finally the system reaches...". Again, no "decay".
From Huang, Introduction to Statistical Physics, explaining the second law in some examples of distributions of gas molecules: "...almost any state that looks like (a) will evolve into a uniform state like (b). But..."
Need I go on? Waleswatcher (talk) 17:03, 30 May 2014 (UTC)
Ok done. Gyroman (talk) 12:11, 18 June 2014 (UTC)
Would Mr W P Uzer care to define "evolve" in a manner clearly synonymous with what is known and accepted concerning any isolated system, i.e. that it will irreversibly undergo an "increase in disorder"! Gyroman (talk) 01:10, 19 June 2014 (UTC)
Note "system evolves" "system decays" The same holds on Google Web and Books, and also when adding any of the prepositions "to"/"into"/"towards". Paradoctor ( talk) 01:38, 23 July 2014 (UTC)
I have asked for reliable sourcing for a statement in the article: "It has more recently been extended in the area of non-equilibrium thermodynamics."
An opinion is expressed by Lieb, E.H., Yngvason, J. (2003), The Entropy of Classical Thermodynamics, chapter 8, pp. 147–195, of Entropy, edited by Greven, A, Keller, G. Warnecke, G. (2003), Princeton University Press, Princeton NJ, ISBN 0-691-11338-6. They write on page 190:
An opinion is expressed by Grandy, W.T., Jr, (2008), Entropy and the Time Evolution of Macroscopic Systems, Oxford University Press, Oxford UK, ISBN 978-0-19-954617-6, on page 153. He writes:
The above discussion is confusing to non-experts, because the second law was originally classical and is in fact frequently used to study non-equilibrium states. If Clausius and Kelvin said that the entropy of the (system plus surroundings) always increases for non-equilibrium processes, then they must have provided some definition of the quantity which increases. The two sentences questioned were "Historically, the classical thermodynamics definition developed first. It has more recently been extended in the area of non-equilibrium thermodynamics." The second sentence has now been deleted by Chjoaygame, and perhaps it was not quite accurate or reliably sourced. But I think it would be useful to readers to provide a more accurate explanation of why entropy can be used to describe non-equilibrium states and processes. I think it has to do with considering reversible processes which are infinitesimally removed from true equilibrium states, but I won't try to write a complete (and sourced) statement because I am certain that Chjoaygame can write a more accurate explanation of this point than I can. Please. Dirac66 ( talk) 20:42, 23 July 2014 (UTC)
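A sketch of the classical relation usually invoked for this point, stated here from standard textbook usage rather than from any of the sources discussed above: for a process taking the system between equilibrium states A and B,

\[
\Delta S \;=\; S_B - S_A \;=\; \int_A^B \frac{\delta Q_{\mathrm{rev}}}{T}
\qquad\text{and}\qquad
\Delta S \;\ge\; \int_A^B \frac{\delta Q}{T_{\mathrm{surr}}},
\]

with equality only for a reversible process (the Clausius inequality); here \(T_{\mathrm{surr}}\) denotes the temperature at which the heat is supplied from the surroundings. The entropy change is evaluated along any reversible path between the same end states, which is how the classical definition gets applied to irreversible processes whose end states are equilibrium states.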
"disorder" Gyroman ( talk) 01:33, 6 August 2014 (UTC)
The entropy of classical thermodynamics is a state variable for the energy picture U = U(S,V,{Nj}), and a state function for the entropy picture S = S(U,V,{Nj}), of a thermodynamic system in its own state of internal thermodynamic equilibrium. For physical systems in general, the classical entropy is not defined.
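(For orientation, the differential forms behind the two "pictures" named above are the standard Gibbs relations, quoted from common textbook usage rather than from a particular source:

\[
\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V + \sum_j \mu_j\,\mathrm{d}N_j,
\qquad
\mathrm{d}S = \frac{1}{T}\,\mathrm{d}U + \frac{p}{T}\,\mathrm{d}V - \sum_j \frac{\mu_j}{T}\,\mathrm{d}N_j ,
\]

both holding only for a system in its own state of internal thermodynamic equilibrium, as the paragraph above stresses.)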
A straightforward but perhaps long-winded, though very safe, statement of the second law might go as follows:
Every natural thermodynamic process is irreversible. 'Reversible processes' in thermodynamics are virtual or fictional mathematical artifices, valuable, indeed practically indispensable, devices for equilibrium thermodynamic studies. Mathematical artifices nevertheless.
Thermodynamic operations have been implicitly recognized, though not so named, since the early days. Kelvin spoke of "inanimate agency": "It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects." Logically this implies that he contemplated "animate agency"; that means, in modern language, a 'thermodynamic operation'. Thermodynamic operations are essential for thermodynamic reasoning.
A step to dealing with non-equilibrium problems is to consider physical systems near enough to being in their own states of internal thermodynamic equilibrium that one can take the entropy to be the same function of the same state variables as for equilibrium. This is an approximation that works very well for many problems, and much valuable work has been done with it. But for an article on entropy to allow it without specific notice that it is an approximation is, I think, loose.
A further step towards non-equilibrium thermodynamics is to try to work with a scalar entropy that is a function of an extended set of state variables, that for example may include fluxes and time rates of change or spatial gradients of classical state variables. This works for a wider range of problems. Again I think an article on entropy that intends to include this should say so explicitly, not just loosely imply it.
A further step towards non-equilibrium thermodynamics is to use a thoroughly non-classical extension of the concept of entropy, the multiple-time hierarchy of entropies. The two-time entropy is a function of a time interval, between the initial state and the final state. It provides a criterion for the expected direction of non-equilibrium processes, when the one-time entropy is not an adequate guide. But I would say that for Wikipedia it is to be regarded as research matter, and reliable sources for it are not too many. I think in this article it calls for explicit, rather than just vaguely or loosely implied, mention.
Loose statements that refer to changes in entropy in an isolated system I think are indeed loose statements, in an article on entropy. The classical entropy of a physical system not in its own state of internal thermodynamic equilibrium is not defined. The entropy of an isolated thermodynamic system in its own state of internal thermodynamic equilibrium does not change. Statements that refer to changes have many implicit but tacit presuppositions that are not likely to be apparent to readers not familiar with the subject.
Loose statements in this article can easily be used to support wild speculation, such as about the entropy of the universe, something that is hardly definable, and certainly not classically defined. How far do we want this article to supply such support?
Loose statements may be argued for because they 'help' readers who want a 'quick and efficient' glimpse of entropy. Perhaps. But are they really well served by inviting them to accept loose statements? And what about those who want to learn something factual and reliable?
It would not be easy to change the article to say this kind of thing. Chjoaygame ( talk) 10:31, 24 July 2014 (UTC) Chjoaygame ( talk) 10:42, 24 July 2014 (UTC)
Why exactly is there a need for a citation for the statement - in the definitions section - that quantum statistical thermodynamics came after classical statistical thermodynamics? This is a well known historical fact. — Preceding unsigned comment added by 80.45.182.9 ( talk) 14:02, 11 August 2014 (UTC)
An unsigned edit has adverted to the notation for differentials here.
Sometimes a notation đQ has been used to denote an incomplete differential (e.g. Kirkwood & Oppenheim 1961, Tisza 1966, Callen 1960/1985, Adkins 1968/1983). Some authors use the notation q (Pippard 1957/1966, Guggenheim 1949/1967). Landsberg uses the symbol d'Q, perhaps because of the fonts available to him. I do not know how to put đQ into LaTeX; perhaps someone will kindly enlighten me.
Some authors do not mark the incompleteness by a special symbol (Born 1949, Buchdahl 1966, ter Haar & Wergeland 1966, Münster 1970, Kondepudi & Prigogine 1998, Bailyn 1994, Tschoegl 2000).
Often in Wikipedia the same object seems, I think (subject to correction), to be denoted δQ; in that notation it is not a finite difference but an infinitesimal. Perhaps someone will check this out.
The quantity on the left-hand side of the equation in question is, on the other hand, a finite difference, not a differential. It is customarily, as in the lead of the article, denoted by a non-italic capital delta, ΔS, not an italic lower-case letter δS.
The word infinitesimal is so spelt, not as "infinitessimal". The word an is so spelt, not as "and". Chjoaygame ( talk) 05:09, 12 August 2014 (UTC)
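On the đQ question above: I do not think MediaWiki's <math> subset accepts it directly, but in full (offline) LaTeX one common construction, borrowed from the plain-TeX definition of \hbar, is the following; the kerning value is a matter of taste, and this is offered as a workaround rather than a standard command:

% a "d-bar" for inexact differentials in math mode (full LaTeX; may not work in the wiki renderer)
\newcommand{\dbar}{{\mathchar'26\mkern-12mu d}}
% usage: \dbar Q, so that e.g.  \mathrm{d}S = \dbar Q_{\mathrm{rev}}/T  and  \Delta S = \int \dbar Q_{\mathrm{rev}}/T

Within the article itself the simplest course is probably the δQ notation already noted above, which keeps ΔS for the finite difference and δQ for the infinitesimal inexact quantity.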