This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 | → | Archive 5
Surely the thermodynamic quantity described as entropy on this page is in fact the same as the older and more understandable quantity, heat capacity at constant volume? I would also like to split this page into two separate pages: one giving a view of entropy as a measure of 'disorder', i.e. the number of ways of arranging a system (the statistical treatment, most of the second half of the article), and a second giving a view of entropy as a thermodynamic quantity (i.e. heat capacity) as used in relation to enthalpies, free energies, etc. Maybe this would help those (see below) who found the page confusing. Any comments or ideas? Yes or no to this idea? I also can't help noticing that much of the mid-section of this work is wrong, a possible source of confusion for readers (specifically the sections entitled 'measuring entropy' and 'thermodynamic definition of entropy', and the sections following directly on from them). I think a lot of confusion stems from the unnecessary introduction of a new word, entropy, when the concept of heat capacity is more intuitive. Do other people share this opinion? HappyVR 19:04, 11 February 2006 (UTC)
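For reference, the standard textbook relation between the two quantities (for a closed system at constant volume, with no phase change) shows that they are connected but not identical: heat capacity is the temperature derivative of entropy weighted by T, not entropy itself.
<math>\Delta S = \int_{T_1}^{T_2} \frac{C_V(T)}{T}\,dT, \qquad C_V = T\left(\frac{\partial S}{\partial T}\right)_V</math>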
Maybe you could find a better place for this missing information. HappyVR 22:30, 12 February 2006 (UTC)
This talk page needs to have stuff dumped into an archive. Any dead arguments, or old and obsolete comments, should be archived. Please move all the comments under one header, not only one of them. Perhaps we can have one header that lists the headers in the archive. I've made a link to the nonexistent archive above. Fresheneesz 00:50, 9 December 2005 (UTC)
Sometimes the article is frustratingly vague. In particular:
It doesn't always seem to distinguish between definitions (p=mv, by definition) and laws (momentum is conserved, by observation).
A thermodynamic transformation is defined as "a change in a system's thermodynamic properties, such as its temperature and volume" (emphasis added). What other properties are relevant? Is it a long list?
What does "equilibrium state" mean? Complete thermodynamic equilibrium--the entire system must be at a single temperature and pressure throughout?
I'd work on it myself, but I'm too clueless. Jorend 05:18, 24 Apr 2005 (UTC)
I read this article hoping to obtain an intuitive understanding of entropy. Didn't get it. And I don't think the problem is that the article is too "hard"; I don't think it would provide an intuitive grasp of the subject to any reader, however sophisticated. Maybe the topic is too hard. If nothing else, more examples (and examples less wedded to chemical engineering) would be welcome.
Perhaps it is unclear what I mean by "intuitive understanding". I mean the ability to answer questions like these:
Consider my coffee cup as a thermodynamic system. I drop an ice cube into it. How does this affect the entropy? I have introduced a big difference in temperature between two parts of the system, so the entropy decreases, right? But coming at it from a different direction, it seems as though I have added heat to the system, by adding matter to it: even though the ice is frozen, the ice molecules have some thermal energy. So the entropy increases, right? (A rough numeric sketch of this one follows after these questions.)
Is entropy meaningful outside the context of thermodynamic machines? For example, is entropy defined at non-equilibrium states? (The article sort of gives the impression that it isn't. But lots of real-world systems are essentially never at equilibrium, so this would be a serious limitation.)
I'm not clear on how entropy, energy, and work are tied together. How exactly is entropy related to the ability to do work? The article states that when you reach the maximum entropy state, no more work can be done--states it, that is, but doesn't explain. What is meant by "can't do work"--can't do significant, macroscopic work?
Can a supply of energy be referred to as "ordered" (i.e., capable of doing work)? It seems by definition that heat is the only form of energy that increases entropy. Are all other forms of energy equally "ordered" as far as entropy is concerned? After some thought, my guess is that all other forms of energy can be used to do work; only thermal energy can become dispersed and unable to do work, and entropy is a function of the distribution of thermal energy in a system. Am I getting warm?
The article talks about entropy of a system in relation to work done by the system. What about work done within the system (as, by one part of the system on another part)? Perhaps an equivalent way to put this question would be: How does entropy relate to non-thermodynamic properties (like potential energy) of parts of a system? If my previous guess is right, the only effect on entropy is when activity within the system converts other sorts of energy to "waste heat".
Entropy is defined as a function of the state of a system. Does it make any sense to compare the entropy of one system to the entropy of some other system that contains entirely different matter? For example, which has greater entropy: your average rock; or an ice crystal of the same mass at the same temperature? Is the question meaningful?
Kindly don't answer the questions here, unless you just want to talk them through. Improve the article instead. Thanks! Jorend 05:18, 24 Apr 2005 (UTC)
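To talk the ice-cube question through rather than answer it in the article: here is a rough numerical sketch, with all the numbers assumed purely for illustration (20 g of ice, coffee near 77 °C). It shows the two competing effects at once: the coffee's entropy drops, the melting ice's entropy rises by more, and the total increases.
<pre>
# Hedged sketch: heat Q flows from warm coffee (T_hot) into melting ice (T_melt).
# Each body is treated as roughly isothermal during the transfer -- an idealization.
Q = 334.0 * 20.0   # J, assumed: latent heat of fusion 334 J/g times 20 g of ice
T_hot = 350.0      # K, assumed coffee temperature (~77 C)
T_melt = 273.15    # K, melting point of ice at 1 atm

dS_coffee = -Q / T_hot    # entropy lost by the coffee
dS_ice    = +Q / T_melt   # entropy gained by the melting ice

print(dS_coffee, dS_ice, dS_coffee + dS_ice)
# about -19.1 J/K, +24.5 J/K, net +5.4 J/K: the total entropy increases
</pre>
The same heat divided by a colder temperature gives the larger entropy term, which is why the combined entropy rises even though one part of the system becomes more ordered.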
The entropy change for an irreversible (free) expansion of an ideal gas is:
<math>\Delta S = nR\ln\frac{V_2}{V_1}</math>
because <math>\Delta U = 0</math> and <math>Q = 0</math>, so the state function S must be evaluated along a reversible isothermal path between the same two states.
Therefore, <math>\Delta S > 0</math> even though no heat enters the gas. -- Tygar
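As a quick numeric check of the free-expansion formula above (one mole and a doubling of volume are arbitrary assumptions):
<pre>
import math

R = 8.314            # J/(mol K), gas constant
n = 1.0              # mol, assumed amount of ideal gas
V2_over_V1 = 2.0     # assumed volume ratio for the free expansion

dS = n * R * math.log(V2_over_V1)   # entropy change of the gas
print(dS)   # about 5.76 J/K, positive even though Q = 0
</pre>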
Hmm, math seems to be borked right now, except for cached equations and text translations. Just in time for my big landing. Oh well. -- CYD
I think you should mention that in classical statistical mechanics, infinite states/coarse graining isn't a problem when you are talking about the change in entropy, because a difference of logs is the log of a ratio, which can be looked at as a ratio of volumes in phase space, or the limit of ratios using coarse graining (see the sketch after this comment). Currently the statistical formula is introduced here but immediately implied to be useless without jumping to quantum considerations.
I can't touch it because I'm a total wikinewbie and don't know anything about thermo either
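The cancellation mentioned above can be written in one line. If classical phase-space volumes <math>\Omega</math> are counted in units of an arbitrary coarse-graining cell <math>h_0^{3N}</math>, the cell size drops out of any entropy difference:
<math>\Delta S = k\ln\frac{\Omega_B/h_0^{3N}}{\Omega_A/h_0^{3N}} = k\ln\frac{\Omega_B}{\Omega_A}</math>
so only absolute entropies, not entropy changes, depend on the choice of cell.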
This article is not written very clearly at all. The definition of absolute temperature is especially sketchy, and I don't like how it comes a good way after the concept is first mentioned. And according to the current definition of "microstate", it would seem that there's an infinite number of them in any case. A good physics article should be accessible to anyone with competent knowledge of math.
sitearm 16:38, 26 July 2005 (UTC)
I'm not entirely sure what's happened here. The article is at Entropy, but Talk:Entropy redirects to Talk:Thermodynamic entropy. As far as I can see, the article used to be at Thermodynamic entropy; after Entropy was moved to Entropy (disambiguation), Thermodynamic entropy was moved to Entropy. Looks like the talk page was left behind. I'm going to delete the current Talk:Entropy (which is just, and has always been, a redirect) and move Talk:Thermodynamic entropy there to match the article. — Knowledge Seeker দ 06:13, 12 May 2005 (UTC)
moved from User talk:CYD:
Dear CYD: I notice you removed several of my edits to the article on Entropy. I found this disconcerting. When I read your change history comments I find I dispute them. The entropy/cosmology connection including gravity and black holes is clearly presented in Penrose and I made a point of adding him to the references. Also "entropy as a measure of disorder" does NOT suffice when you also state it as a measure of work that cannot be done. The added definitions established the work connection. I am new to Wikipedia. You are the original author of this article. Does that mean that you may freely delete anything you disagree with, even if sources for the material are cited? sitearm 03:44, 27 July 2005 (UTC)
The current version of this article is MUCH better than what had been here before, thanks to CYD. The article still needs minor work, but CYD has taken care of the major revisions it desperately needed.
I would suggest that this Talk page be archived, with only comments made after CYD's revision being kept. Ravenswood 20:29, July 31, 2005 (UTC)
Below are questions that I think are important to better understand entropy. These need to stay on Talk until addressed in the article. I am OK with the rest of the Talk material to be archived as proposed. Thank you for your patience. Sitearm 15:46, 3 August 2005 (UTC)
I will look to add again a more narrative section on the relationships between entropy, energy, and work, in a separate section from yours. Please feel free to clarify it but not to outright delete it. I request that you be accommodating to additional ways of describing mechanical entropy. What is clear and obvious to you in your exposition is not so to me. Let's at least get the narrative worked out before trying to collapse things. I have spent a lot of time reading the policy discussions in the Village Pump, and I am aware there are major edit revert wars going on in some of these pages. I am not interested in that. I am interested in a great article on Entropy. I am interested in following Wikiquette rules. A concerned newcomer, Sitearm 07:48, 4 August 2005 (UTC)
The ΔQ notation is confusing. How is the heat changing? The heat article also uses this notation in places.
I changed this to simply be Q.
I removed the following,
because a heat engine rarely uses a continuum of temperatures, and because it is pointless. Bo Jacoby 11:28, 16 September 2005 (UTC)
I changed Q to dE everywhere, because 'heat' may mean either energy or entropy and so should not be used. Here Q was energy. I finished the definition of temperature, noting that the definition leaves the unit undefined. I hope you like it. Bo Jacoby 13:44, 16 September 2005 (UTC)
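For readers following this notation debate, the usual textbook forms being juggled here are:
<math>dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}</math>
where the <math>\delta</math> marks an inexact differential, since heat is not a state function; that distinction seems to be the root of the dispute over writing Q, ΔQ, or dE.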
Either show it or don't show it, but don't tell me that it is easy to show. I don't think it is easy to show. I don't even think it is correct. Throw a hot piece of iron into a bucket of water, and the entropy of the iron is diminished. . . Bo Jacoby 14:00, 16 September 2005 (UTC)
I've reverted the introduction to a previous version, because it's difficult to read and riddled with inaccuracies. The problems are too many to point out individually, but take for example
This is inaccurate because it assumes no change in volume or other extensive thermodynamic variables, an important assumption that's simply left unstated. Also, "entropy transfer" is not a meaningful concept, only "entropy change" -- entropy is a property of a thermodynamic system.
That whole discussion of "quantities of electricity" is almost unreadable, thanks to liberal use of inline TeX; luckily, it is also almost completely irrelevant, and can be safely deleted.
Incidentally, the article itself serves as a demonstration of the second law of thermodynamics. When I last looked at it a month ago, it was in pretty good shape. Since then, the amount of disorder has increased, and one has to do work to get it back to its original level of quality. -- CYD
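The unstated assumption CYD points to can be made explicit with the fundamental thermodynamic relation:
<math>dE = T\,dS - P\,dV + \sum_i \mu_i\,dN_i</math>
so equating <math>T\,dS</math> with <math>dE</math> (as in the edit above that replaced Q by dE) holds only at constant volume and composition.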
I think the entropy and disorder section was getting a little wordy. The new addition was good in that it introduced the concept of Shannon entropy to the messy bedroom analogy, and I have tried to pare the argument in that section down to a simpler statement. PAR 18:39, 8 January 2006 (UTC)
About the reference to the book "Entropy: A New World View": the article states it is "a notorious misinterpretation of entropy" and links to Foresight to prove it. But Foresight seems to be full of crap; they state that "Therefore, entropy can decline in a closed system", clearly not understanding what a closed system even is. How can you link to such an unreliable source? The book may overinterpret some aspects of the law, but it still has a lot of good points and should not be categorically put down in that way. -- A human 14:21, 16 January 2006 (UTC)
This is a great article. I'm sure I would only make it worse if I were to try to add my two cents to it. But it would be nice if certain things were, umm, mentioned... maybe dealt with... maybe added to external links?
I have in mind data stuff somewhat like:
Context: On evolution, "entropy is limited to heat" was changed to "entropy is about usable energy", which was changed to "entropy is about the spatial distribution of energy"... or something like that. WAS 4.250 03:46, 17 January 2006 (UTC)
I'm a non-scientist, and this may as well have been in Urdu for all I could understand of it. Aren't these articles supposed to be understandable by the average joe like me? ElectricRay 23:29, 22 January 2006 (UTC)
This article is explained slightly better than the two courses I've taken in two different colleges, so it's a pretty good explanation. Some people point out ambiguities, and that's all good, but the point is driven home better than in most books and college courses. This is NOT an easy topic, and it is, for most people, impossible to grasp by skimming through it. As for the current complaint that "this article doesn't say what the hell the topic (entropy) actually is"... I point to the line that reads: "Entropy is a measure of how far along this process of equalization has come." This, plus the equations, is all. The rest is only so you can grasp this phrase, which is found right in the introduction. Congratulations to whoever wrote this. As for the ice cube and entropy effect, I point to the article again: "the entropy of an isolated system can only increase or remain the same; it cannot decrease". And as for the delta-Q notation, removing it is a bad idea. Q usually means the heat contained in the system, and using this to notate heat transfers is confusing to those who use formal notation. Please, do not reinvent standard notation. Also, if you didn't take a formal class in the subject, please refrain from editing, as you probably don't grasp the concept fully. -Annoyed Wikipedia reader
I realize there's a long history around the word disorder, but I wonder if it would be better to rephrase it as "statistical disorder" or even "statistical randomness." Some sources have even abandoned this and used "dispersion of energy." Olin
By the conservation of energy, the net energy lost by the environment is equal to the work done by the engine.
looks somewhat vague. What say? -- Sahodaran 05:25, 8 February 2006 (UTC)
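For what it's worth, the quoted sentence is a compressed statement of the per-cycle energy balance of an idealized heat engine; written out in the standard form:
<math>W = Q_h - Q_c, \qquad \eta = \frac{W}{Q_h} \le 1 - \frac{T_c}{T_h}</math>
with <math>Q_h</math> the heat drawn from the hot reservoir and <math>Q_c</math> the heat rejected to the cold one; the entropy of the working substance is unchanged over a complete cycle.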
How does entropy "pick" a direction in time? By my reasoning, statistically it should increase both into the future and into the past. A partially melted ice cube in a glass of water is far more likely to have been an unlikely perturbation in the water molecules (a random jump to slightly lower entropy) than it is to have ever been a fully formed ice cube (a state of much lower entropy). As I understand it, the illusion of an arrow of time is based on an extremely low-entropy early universe. Please discuss this. Adodge 23:04, 11 February 2006 (UTC)
IMO, Adodge, you are exactly right. (Though I don't think I'd call the arrow of time an "illusion".) You might also like to look at the article MaxEnt thermodynamics, which comes to the same conclusion: that there is important "prior" information missing from the formalism as it stands if one is to address the retrodiction question (sometimes called "Boltzmann's 2nd Problem").
That article probably goes the furthest. The articles H-theorem and Loschmidt's paradox don't go quite as far, but both do address problems with the claim that "statistical mechanics proves that statistical entropy increases" — as the article on information theory currently puts it: "The theorem ((that statistical entropy should be always increasing)) relies on a hidden assumption, that useful information is destroyed by the collisions, which can be questioned; also, it relies on a non-equilibrium state being singled out as the initial state (not the final state), which breaks time symmetry; also, strictly it applies only in a statistical sense, namely that an average H-function would be non-decreasing)".
So the supposed "proof" from within statistical mechanics begs the question. If entropy does increase, that is an observational fact which cannot be deduced from the bare formalism alone; the prediction requires the injection of additional a priori beliefs about the initial state. As you say, it presumably reflects an invariable experimental fact that thermodynamic systems, for some unspecified external reason, always start out in an unusually low-entropy initial state, about which the statistical mechanical formalism itself is entirely agnostic. (This may reflect a particularly low-entropy initial state at the start of the universe. A slightly different take is that the number of available histories/non-identical microstates in the universe may be increasing, e.g. for some quantum or gravitational/cosmological reason.)
Finally, it's worth noting that it's often argued that the psychological direction of time may be a completely dependent consequence of the entropic direction of time; i.e. we would always define 'forward' time as the direction of increasing entropy, and then fix the labels on any graphs to match. All of this is (or should be!) discussed at greater length in the article Arrow of time. Hope this helps. -- Jheald 00:50, 13 February 2006 (UTC)
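For reference, the H-function discussed above is, in Boltzmann's kinetic-theory form,
<math>H(t) = \int f(\vec{v},t)\,\ln f(\vec{v},t)\;d^3v, \qquad \frac{dH}{dt} \le 0</math>
with entropy proportional to <math>-H</math> (up to constants, for a dilute gas), which is why a non-increasing H is read as a non-decreasing entropy; the caveats quoted above attack the molecular-chaos assumption behind the inequality.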
Excellent. Very illuminating. Thank you. Alex Dodge 08:45, 13 February 2006 (UTC)
Anybody interested in joining to do a stub on the term "corporate entropy"? I keep coming across the term in many places. For example, the 2nd edition (1999) of the book Peopleware has a section on how to fight "corporate entropy", Google shows all sorts of hits for the term, and you can find it used on many blogs. Does anyone know the origin of this term, i.e. who coined it or first used it in application? -- Sadi Carnot 19:00, 20 February 2006 (UTC)
From the article:
If the universe can be considered to have increasing entropy, then, as Roger Penrose has pointed out, an important role in the disordering process is played by gravity, which causes dispersed matter to accumulate into stars, which collapse eventually into black holes. Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes.
What effect does Hawking radiation have on this? I'm specifically thinking of the fact that black holes are unstable and have a limited lifetime, which either contradicts the second law of thermodynamics or the statement that black holes have the maximum possible entropy. Mike Peel 13:18, 17 March 2006 (UTC)
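The two standard formulas at play in this question are the Bekenstein-Hawking entropy (for horizon area A) and the Hawking temperature (for hole mass M):
<math>S_{BH} = \frac{k c^3 A}{4 G \hbar}, \qquad T_H = \frac{\hbar c^3}{8\pi G M k}</math>
The usual resolution is that the radiation itself carries entropy, so the total entropy of hole plus radiation still increases as the hole shrinks, and no conflict with the second law arises.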