This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 1 | Archive 2 | Archive 3 | Archive 4 | Archive 5 | → | Archive 10 |
I have a book which says Clausius derived the term entropy not from "entrepein" but from "en tropei" (both Greek) meaning change.
(Copying this here out of archive2 so it doesn't get overlooked, because I feel this was never properly resolved, and I for one still feel quite uncomfortable about these edits of Paul's Jheald 09:21, 8 July 2006 (UTC))
Paul, you made several semantic edits which change the context and tone of the article, but you added no citations for any of them. Your edits, if properly cited, could provide a valuable educational addition to the article. If not cited however, they seem an awful lot like POV.
Astrobayes
20:24, 22 June 2006 (UTC)
At the bottom of the Overview section, it reads that the second law does not apply to the infinite Universe, yet further down the article, in 'The Second Law' section, it reads that the entropy of the Universe does tend to increase. Maybe I'm missing something, but this may need clarifying in order to state properly whether or not the Second Law applies to the universe. —unsigned
The jury is still out on whether the universe we live in is finite in size or infinite. If it is infinite, the Second Law does not apply; if finite, it would. Any statement about the entropy of "the Universe" increasing is premature and speculative. My bets are with the infinite - you choose. Paul venter 21:28, 29 June 2006 (UTC)
Let me first say that I think the 8 lines or so that we have in the section "Entropy and Information Theory" currently in the article are (IMO) about right, and pitched at about the right level. I don't think there's any great need for change from this, either to emphasise or to de-emphasise the intuitive connection.
But I would like to offer a couple of comments (constructive, I hope) on some of the wordings that Frank was most recently discussing on the talk page now archived:
In his CalPoly discussion, Frank writes:
I think there's a danger that this can be taken the wrong way. As I tried to persuade Astrobayes earlier, there's no great significance to the fact that in Chemistry entropy is usually measured in Joules per Kelvin, and temperature in Kelvins, while Σ[Pi log Pi] is dimensionless. One can entirely equivalently work throughout with the dimensionless entropy σ = S / k, and the energy-scale temperature τ = kT.
Indeed, as I suggested to Astrobayes, σ and τ are in many ways the more natural quantities, as Frank's friend Harvey Leff also interestingly explored in a recent paper, "What if Entropy were Dimensionless?", Am. J. Phys. 67, 1114-1122 (1999). Not only does it more naturally reflect that the multiplicity, which is what fundamentally underlies entropy, is a dimensionless variable. It also throws into relief that the crucial thing about temperature is the corresponding energy-scale quantity τ = kT. This is the amount of energy you have to put into a system in thermodynamic equilibrium to get a unit increase in its dimensionless entropy σ. That energy is greater for hot systems than for cold systems: energy flows from hot to cold, so for the 2nd law to be obeyed, the energy you need to add to a cold system to get a unit increase in dimensionless entropy must cause less than a unit decrease in the dimensionless entropy of the hot system. So working with σ and τ makes it a little more transparent why the energy τ for an ensemble, defined by ∂E/∂σ, is the fundamental measure of its temperature.
To his credit, in what he wrote on the (archived) talk page, Frank adds a sentence that underlines this:
This is surely the point on no account to be obscured: one can freely move between S and σ, and back again, without losing (or gaining) any physical meaning. There's nothing particularly special either way about choosing whether or not to include the extra factor kB, and therefore whether or not to work with S rather than σ.
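The free movement between S and σ, and the 2nd-law bookkeeping for heat flowing from hot to cold, can be checked with a short calculation. A minimal sketch; the reservoir temperatures and the 1 J of transferred heat are illustrative values I have chosen, not figures from the discussion above:

```python
# Heat dq flowing from a hot to a cold reservoir raises total entropy,
# whether expressed conventionally (S, in J/K) or dimensionlessly
# (sigma = S/k, with tau = kT as the energy-scale temperature).
k_B = 1.380649e-23              # Boltzmann constant, J/K

T_hot, T_cold = 400.0, 300.0    # K (illustrative values)
dq = 1.0                        # J of heat moved from hot to cold

# Conventional entropy changes (J/K)
dS_total = -dq / T_hot + dq / T_cold

# Dimensionless form: d(sigma) = dq / tau, with tau = k_B * T
tau_hot, tau_cold = k_B * T_hot, k_B * T_cold
dsigma_total = -dq / tau_hot + dq / tau_cold

# The two descriptions differ only by the factor k_B: no physical
# meaning is lost or gained by moving between S and sigma.
assert abs(dS_total - k_B * dsigma_total) < 1e-12
assert dS_total > 0   # 2nd law satisfied for hot -> cold flow
```

The cold reservoir gains more dimensionless entropy per joule than the hot one loses, which is exactly the point about τ = ∂E/∂σ being the fundamental measure of temperature.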
Frank writes:
I have a problem with the word "factors", because of its mathematical association with multiplying two things together. Can I ask Frank for his reaction if I was to suggest that a clearer word might be "aspects"? Would the following equally capture the message that you're trying to convey:
Am I right in interpreting that as your key message?
There are a couple of points I might make here, but I've probably written enough for the moment to be going on with. Jheald 14:47, 8 July 2006 (UTC).
One other point with the CalPoly paper, with its extensive discussion of "configurations", is perhaps to remember that, properly applied to classical mechanics, the states that statistical mechanics works with are configurations of the whole system in 6N-dimensional phase space, including momentum co-ordinates for each particle, not just the position co-ordinates. So, properly considered, the different possible molecular motions are always a consideration, even in the classical picture. (Even though we can sometimes usefully separate that entropy out into different additive contributions, one from e.g. the permutations of impurities in a crystal, the other from other degrees of freedom, if the probability distributions are near enough independent.)
Jheald
14:47, 8 July 2006 (UTC).
I think it's best to work out differences on this Talk forum rather than insert and delete, etc. on the actual article page. So here's 1.3 pages of the Article that introduces the topic a bit more slowly, more pointed toward the young or mature reader not sharp in physics (and certainly not knowing calculus in the first line!)
So, following the present italics links to the disambiguation page, information entropy, and entropy in thermodynamics/information theory:
In chemistry and physics thermodynamic entropy is an important ratio, a measure of the heat, q, that is transferred to or from a system when it is warmed or cooled by its surroundings. Only a small amount of q should be transferred (dq) at a specific temperature so that the temperature stays essentially unchanged. Then the process is reversible and the ratio is dq(rev)/T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1860s to account for the dissipation of energy from thermodynamic systems that produce work. He coined the term 'entropy', which to him meant 'transformation', from the Greek word trope, and added the prefix en- to emphasize the close relationship of dq(rev)/T to energy. The concept of entropy is a thermodynamic construct, but the word entropy has been widely misused, e.g., in economics and fiction, because its thermodynamic relation to physico-chemical energy is blurred or even omitted. Legitimate linkages to fields of study are mentioned in sections that follow.
The essence of Clausius' original words is in the above paragraph. It differs somewhat from the version in the present Wikipedia 'Entropy' article, because (as shown below) my statements use the German of his 1865 paper, cited in an authoritative history, "The World of Physical Chemistry" by Keith Laidler (Oxford, 1993, pp. 104-105):
“…schlage ich vor, die Grösse S nach dem griechischen Worte ή τροπη [trope], die Verwandlung, die Entropie des Körpers zu nennen. Das Wort Entropie habe ich absichtlich dem Worte Energie möglichst ähnlich gebildet, denn die beiden Grössen, welche durch diese Worte benannt werden sollen, sind ihren physikalischen Bedeutungen nach einander so nahe verwandt, dass eine gewisse Gleichartigkeit in der Benennung mir zweckmässig zu sein scheint.”
“I propose to name the quantity S the entropy of the system, after the Greek word [trope], the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.”
Good work Frank, I've been looking for this quote for a long time. I'll add it to the article. Thanks:-- Sadi Carnot 00:46, 10 July 2006 (UTC)
To The Group, explanation of advantage to describe illustration better: The illustration in the present Wikipedia “Entropy” is a neat idea, especially with the terrific enlargement possible! The ice and water phases can be very clearly seen. However…! Its present title indeed applies to the photo but it can hardly be called an aid for the naïve reader! Below is a complete description of what lies behind the photo and the phenomenon, perhaps a bit high level for the reader with no background other than the first paragraph above, but about on the level of a reader after the Overview.
Ice melting in a 77 F (298 K) room: a classic example of heat energy, q, from the warmer room surroundings being 'transferred' to the cooler system of ice and water at its constant T of 32 F (273 K), the melting temperature of ice. The entropy of the system increases by q/273 K. [q is actually the ΔH for ice, the enthalpy of fusion for 18 grams]. The entropy of the surrounding room decreases less (because the room temperature T is higher, as q/298 K versus the ice-water's q/273 K shows). Then, when the ice has melted and the temperature of the cool water rises to that of the room (as the room further cools essentially imperceptibly), the sum of the dq/T over the continuous range ("at many increments") in the initially cool to finally warm (room temperature) water can be found by calculus. The entire miniature 'universe' of room surroundings and ice-water-glass system has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' in it. FrankLambert 20:25, 9 July 2006 (UTC)
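Frank's ice-water numbers can be followed through with a short calculation. This is a sketch using standard textbook values that I am supplying (ΔH of fusion ≈ 6010 J for 18 g of water, specific heat of liquid water ≈ 4.18 J/(g K)); the variable names are mine, not Frank's:

```python
from math import log

m = 18.0            # g of ice (1 mol of water)
dH_fus = 6010.0     # J per 18 g, enthalpy of fusion (assumed textbook value)
c_water = 4.18      # J/(g K), specific heat of water (assumed textbook value)
T_melt, T_room = 273.0, 298.0  # K

# Step 1: melting at a constant 273 K
dS_system_melt = dH_fus / T_melt     # entropy gain of the ice-water system
dS_room_melt = -dH_fus / T_room      # smaller entropy loss of the warmer room

# Step 2: warming the melt water from 273 K to 298 K; summing dq/T over
# the continuous range ("at many increments") gives m*c*ln(T2/T1)
q_warm = m * c_water * (T_room - T_melt)
dS_water_warm = m * c_water * log(T_room / T_melt)
dS_room_warm = -q_warm / T_room      # the large room stays essentially at 298 K

dS_universe = (dS_system_melt + dS_room_melt
               + dS_water_warm + dS_room_warm)
assert dS_universe > 0   # the miniature 'universe' gained entropy overall
```

With these values the system gains about 22.0 J/K on melting while the room loses only about 20.2 J/K, so each step, and the total, comes out positive.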
FL suggestion: Paragraph 1
FL suggestion: Paragraph 2
Frank, I see that you don’t like the order/disorder connection to entropy; and from your talk-page comments, I see that you would like to ban it from the world. The problem here is that this is a widely used description; this morning, for example, I am reading Joon Chang Lee’s 2001 Thermal Physics – Entropy and Free Energies book, and it is filled with order/disorder stuff. Thus, knowing that WP is a neutral point of view information presentation forum, we simply have to present concepts as they are, from all perspectives. In summary, my point is that WP is not a place to debate whether something is right or wrong. Thanks: -- Sadi Carnot 13:13, 11 July 2006 (UTC)
I can’t believe you’re still trying to talk junk to me? It occurs above twice and on separate talk pages as well: here and here. In Wikipedia terminology, these are called personal attacks. Before you go around making any more of these, you might want to read the section "no personal attacks", which states: Do not make personal attacks anywhere in Wikipedia. Comment on content, not on the contributor. In any event, in addition to what I’ve just posted below, here and here, in regards to you thinking that you know about entropy even though, as you state, you have never read any of Clausius’ work (which in itself is very humorous); from Anderson’s 2005 Thermodynamics of Natural Systems (textbook), we find:
“Virtually since the beginning, a popular viewpoint has been to see entropy as a measure of disorder. Helmholtz used the word “Unordnung” (disorder) in 1882.”
Lastly, in regards to your presumption: “You had a year of chemistry, didn't you?”; for the record, in regards to knowledge of entropy, one chemical engineer (me) is more knowledgeable than any three chemists (you) put together. Thank-you for your attempt at trying to belittle me. Adios: -- Sadi Carnot 05:17, 13 July 2006 (UTC)
FL proposal:
FL proposal:
FL proposal:
Wikipedia:WikiProject_Chemistry currently rates this article as "needs more chemistry to go with the physics" [1] (note: rating is not necessarily that recent).
In progress, I think we're already discussing issues about the appropriate tone to make a friendlier article for P Chem students. (And, what specifically in the article might make it seem less friendly, or even alien to P Chem students).
But what *topics* should Physical Chemistry students expect to find presented here, that aren't already (and/or in breakout articles from here) ?
eg:
...
Actually, I suspect very little of that on this page, which is mostly signposts to more detailed articles; but perhaps more detail in an Entropy in Chemistry page, or handed off to Chemical Thermodynamics; or in individual relevant pages.
Part of the problem is that (IMO) Wikipedia needs a big push in the whole subject area of Physical Chemistry. The whole P Chem subject area seems a lot less developed, so (as a comparative outsider) I find it hard to work out what the Entropy article needs to supply.
Hence this talk subject to try to get discussion going. (And a plea at WP:Project Chemistry for an actively managed initiative to try to push quality in the whole field). Jheald 19:35, 10 July 2006 (UTC).
Entropy is perhaps the most difficult topic that chemistry students ever meet. In fact there are places, regrettably, that now no longer teach it, at least to all students. I am retired now, so I am no longer teaching Physical Chemistry and the publishers are no longer sending me copies of the latest text books. However, before I retired I saw the change that Frank talks about, although I did not realise it came from him. I had started to concentrate on energy dispersal rather than disorder. For those chemistry students who are still taught about entropy, this article is not much help. It is way beyond what is in their text books and it really places more weight on how physicists see the subject. There is a general problem here with all articles that cover topics in both physics and chemistry. In some areas it is like entropy - one article, but not clearly satisfying both chemists and physicists. In other areas, particularly molecular physics, chemical physics, quantum chemistry, computational chemistry etc., I keep coming across articles that have a major overlap with other articles - one written by chemists and linking to other chemistry articles, and one written by physicists and linking to physics articles. We need to sort this out. I am not sure how. Maybe we do need different articles. Maybe we need an article on chemical entropy or entropy in chemistry. Better, I think, would be to make this article fairly simple and link to separate, more complex articles on entropy in chemistry and entropy in physics. -- Bduke 23:57, 11 July 2006 (UTC)
One possibility might be to have a separate article Thermodynamic entropy (which currently redirects here) that is a gentle introduction to entropy as physical chemists use it. This article can then be the broad overview article that provides short introductions to all of the other entropy articles. We currently have Entropy (thermodynamic views), Entropy (statistical views), Entropy in thermodynamics and information theory, Information entropy and History of entropy, along with many other related articles in the entropy category. Nonsuch 00:13, 12 July 2006 (UTC)
Frank is trying to convince us that since some chemistry textbooks now avoid the use of the word "disorder" following his advice, this description is dead and should not be mentioned in Wikipedia. (Instead he suggests to use the vague concept "energy dispersal" that I still cannot understand what its exact definition is). He writes above that I should check what they teach in the Chemistry Department at Harvard. So I took the challenge, and what do I find in the lecture notes of the course "Chemistry 60: Foundations of Physical Chemistry"? It's online: Lecture 8 p. 11: "Entropy S is a quantitative measure of disorder" :) Yevgeny Kats 14:34, 11 July 2006 (UTC)
Frank, I feel that your intentions are well-meaning; but you need to loosen up a bit. When you go around tooting that “disorder is dead” and “energy dispersal is in”, what you are doing is akin to censorship. From my talk page, you state that you have never dug into the history of Clausius (1850-1865), but that your interest was piqued by the papers of Boltzmann (1872) and Kelvin (1875); below are your words:
Well, to contradict you, I do know how he said it. Over the last several days, I have been down at the UIC library reading an original copy of Clausius famous 1865 book Mechanical Theory of Heat, which contains his nine papers between the years 1850-1865, upon which the foundation of entropy is built. In his 1862 paper, he states the following relation:
where:
Clausius then goes on to discuss the underlying nature of the concept of “equivalence-value”, i.e. what he labels as entropy in the following two years. He then discusses how a working body of gas comprised of individual molecules configured into very specific arrangements, i.e. “order”, is changed through “compensations” as it progresses through a complete Carnot cycle; something which Carnot himself did not take into account.
First, he speaks of the change in the arrangements of the molecules of the body not being a reversible phenomenon as the body moves from one state to the next, and in doing so overcomes resistances which are proportional to the absolute temperature.
Clausius then states: “when heat performs work, these processes always admit of being reduced to the alteration in some way or another of the arrangement of the constituent parts of the body. For instance, when bodies are expanded by heat, their molecules being thus separated from each other: in this case the mutual attractions of the molecules on the one hand, and external opposing forces on the other, in so far as any are in operation, have to be overcome.” Hence, we see that in 1862, before entropy was even named, the founder of the concept of entropy is speaking about arrangements of the atoms and molecules of the constituent body and how these arrangements are changed, via uncompensated transformations, as they progress through a cycle. As we see, entropy in its essential nature is built on the concept of the changes in the ordering of atoms and molecules of the working body. The opposite of order is disorder. This is a core perspective in the concept of entropy, and it is not dead as you seem to think. Just a friendly note: -- Sadi Carnot 02:43, 12 July 2006 (UTC)
To elaborate further on the preceding discussions, I particularly do not like individuals who purposely strive to alter historical frameworks of concept, in such a manner as to substitute their own personal contrivances for the original. Owing to this conviction, today I was again down at the sacred copies room at the UIC library typing up, word-for-word, the original formulation of entropy, as it was intended to be stated. The entire framework of entropy is built on some 365 pages worth of logical argument presented in nine memoirs over the course of 15 years, from 1850 to 1865. Below are the main points of the way entropy was presented to the world in 1862-65 (starting from pg. 218 of Clausius' Mechanical Theory of Heat [the third page of the sixth memoir]):
(skipping a few paragraphs; Clausius then states the following law (as he calls it)):
(skipping a paragraph)
(skipping about 20 pages)
(Then starting from page 354 of the ninth memoir is where he assigns the symbol S and gives it the name “entropy”)
(Then after a page of derivations, which includes a symbol assignment of disgregation, we then arrive back at the following equation)
(Then, in the grand finale of this equation-packed, magnificent book, we arrive at his famous last paragraph on the last page [pg. 365] of the ninth memoir, first presented to the public as sourced below*)
So there you have it. I hope this gets us all on the same page so that there is no more talk-page bickering regarding fictitious concepts that were never presented in the words of Clausius. Adios: -- Sadi Carnot 04:23, 13 July 2006 (UTC)
What is (or should be) the focus of this article? Much of the recent discussion has focused on how particular communities construe the word, either presently or in the past. IMHO, this article, which is simply entitled 'entropy', should give a concise definition, a short introduction to the concept, a brief history, and a discussion of various ways that entropy has been used in different contexts, and at least a listing of related concepts, special cases and generalizations, with suitable crosslinking to more specialized articles. Discuss. Nonsuch 07:30, 17 July 2006 (UTC)
As a first draft:
"Entropy has been described in texts, dictionaries, and encyclopedias for over a century as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is". This is unfortunate because "disorder" and "mixed-upness" are not quantifiable scientific terms in themselves. A modern description of entropy that has appeared in most US general chemistry texts as of 2006 is readily understandable and directly related to quantitative measurement. It is: "Entropy is a measure of the dispersal of energy (in chemistry, most often the motional energy of molecules): how much energy is spread out in a process, as in 'thermal entropy', and/or how widely spread out it becomes, as in 'positional entropy', both at a specific temperature."
(Information theory deals with the quantitation of 'disorder' and 'mixedup-ness'. See the links to articles in its section that follows.)" FrankLambert 18:48, 17 July 2006 (UTC)
In reply to the comments of Jheald above, I have reassessed the article for WikiProject Chemistry as "Start". A lot of specific chemical information (eg, measurement techniques) can go into Standard molar entropy or into Gibbs' free energy, as these are the quantities which chemists actually use, but there needs to be enough of a "taster" here so that chemists will know to look elsewhere for the details. Gibbs' free energy is not mentioned by name, although the concept is used in the article! Similarly, there is no mention of the Third law of thermodynamics, which surely has a place in a general discussion of entropy...
I would like to see at least one example of a chemical system in the article: I usually use the reaction between chalk and hydrochloric acid, as the evolution of a gas and the dissolution of a solid always lead to an increase in entropy. As the article stands, I do not see how a reader can get a grasp of what entropy actually is, beyond the fact that its change happens to be equal to dqrev/T! The example of the Ellingham diagram also works well with French students: entropy as an approximation to d(ΔG)/dT in the absence of phase changes. I am mystified by the current choice of example: the article presents entropy as the energy which is not available to perform work, but chooses an example where no work is done at all! I'm teaching "Basic Physics for Environmental Scientists" this summer ("basic" includes such joys as damped and forced oscillators, Reynolds number and Fick's laws, so I'm glad I don't have to teach "advanced"!), and I will almost certainly use the example of a steam pump to illustrate entropy. I will use the example of melting ice to check if they have understood what a phase equilibrium is ("OK, so it's in equilibrium, so what is ΔG? So if ΔG is zero, what is ΔS?"). The use of the entropy unit in American publications should also be mentioned.
On a more general level, care needs to be taken with the mathematics. The "d" of a complete differential is always in upright type; Latin letters denoting physical quantities (q, T) are always in italics. Entropy is a state function, not a differential, and its differential change is equal to dqrev/T, not dQ/T as stated in the article and earlier on this talk page (either lower or upper case "Q" may be used for heat, I'm not complaining about that). As we have the correct Greek spelling of trope, this should be included.
Finally, the definitions seem to be rather too based on a historical presentation, even if I cannot fault the quality of the history. The section "Thermodynamic definition" is six paragraphs long, but stops at 1903! IMHO, the force of Clausius and Boltzmann is that their ideas are still taught a century and a half later in virtually the same terms as they themselves used: I feel that we should honour them by teaching these ideas as physical facts and not as historical facts!
An appropriate response to all this would be {{ sofixit}}, but I'm not going to dive into the article itself without some feedback on my views (feedback can be good, just ask Jimi Hendrix). All the more so as I remark that the article is dissipating quite a lot of energy as heat which, as we all know, can only be to the detriment of useful work :) Physchim62 (talk) 18:16, 18 July 2006 (UTC)
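Physchim62's classroom check ("if ΔG is zero, what is ΔS?") can be written out explicitly for melting ice. A minimal sketch; the enthalpy-of-fusion figure is a standard textbook value that I am assuming, not one given in the comment above:

```python
dH_fus = 6010.0   # J/mol, enthalpy of fusion of ice (assumed textbook value)
T_melt = 273.15   # K, normal melting point of ice

# At a phase equilibrium, Delta_G = Delta_H - T * Delta_S = 0,
# so the entropy of fusion follows directly: Delta_S = Delta_H / T.
dS_fus = dH_fus / T_melt   # J/(mol K)

dG = dH_fus - T_melt * dS_fus
assert abs(dG) < 1e-9      # equilibrium: no free energy available for work
```

This gives ΔS of fusion of roughly 22 J/(mol K), the same figure that appears in the ice-melting discussion earlier on this page.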
Hi,
Although I think the disorder concept is a terrible way to explain entropy (notwithstanding the fact that most of the time it's just plain wrong), I think that this page needs to retain a reference to the historic use of the words "disorder" and "mixedupedness" - because otherwise some ill-educated individual will once again put disorder into this page. Not only that, it will perpetuate the confusion people have as to why disorder is ever used.
So we should have a little paragraph that explains why entropy was explained as "disorder", and why it's wrong. Just a short little thing. Fresheneesz 18:11, 21 July 2006 (UTC)
Given this edit by Sadi Carnot (which is a valid edit, btw) I propose we move this article (i.e., rename it) to "Entropy (thermodynamics)" and remove anything unrelated to thermodynamics, as the other uses are not in keeping with the thermodynamic meaning of entropy, and as they have their own articles. •Jim62sch• 16:27, 6 August 2006 (UTC)
Except that that misses the point, and ultimately misleads the reader due to a functional, widespread misunderstanding of what "entropy" means. And really, the other uses are not truly related to thermodynamics. In those uses, entropy is taken to mean disorder, chaos; in thermodynamics it really means "re-order", it makes no subjective judgment re chaos or disorder. •Jim62sch• 23:39, 6 August 2006 (UTC)
Now I'm confused...if there's an entropy disambig page, why does this article mention non-thermodynamic entropy? •Jim62sch• 23:10, 7 August 2006 (UTC)
Loosely, maybe. •Jim62sch• 00:05, 8 August 2006 (UTC)
Suggested changes to generalise the concept in clearer terms. "State function" doesn't cut it for an intro, even if "relative measurement" isn't quite correct. Criticism welcome. - Ste|vertigo 18:12, 27 August 2006 (UTC)
Entropy is a concept of a relative measurement of the degree, rate, and vector of energy dispersal within an isolated system, as well as between connected systems. The interpreted meaning depends on the context. For example, in thermodynamics, entropy (symbolized by S) is a fundamental part of the second law of thermodynamics, which defines the relationship between time and the energy within an unequilibrated system — hence it is an important factor in determining a system's free energy to do work. It is measured as a state function of a thermodynamic system, defined by the differential quantity dS = dQ/T, where dQ is the amount of heat absorbed in a reversible process in which the system goes from one state to another, and T is the absolute temperature. [1]
Entropy is a concept of a relative measurement of the degree of energy dispersal within an isolated system, as well as between connected systems. The interpreted meaning depends on the context: for example, in truly isolated systems (such as the universe is theorised to be) the entire energy system is entropic, whereas in connected systems (as found in nature) entropy relates directly to heat transfer. It is measured as a state function of a thermodynamic system, defined by the differential quantity dS = dQ/T, where dQ is the amount of heat absorbed in a reversible process in which the system goes from one state to another, and T is the absolute temperature. [2]
Note: This article has a small number of in-line citations for an article of its size and currently would not pass criteria 2b.
Members of the Wikipedia:WikiProject Good articles are in the process of doing a re-review of current Good Article listings to ensure compliance with the standards of the Good Article Criteria. (Discussion of the changes and re-review can be found here.) A significant change to the GA criteria is the mandatory use of some sort of in-line citation (in accordance with WP:CITE) in order for an article to pass the verification and reference criteria. It is recommended that the article's editors take a look at the inclusion of in-line citations as well as how the article stacks up against the rest of the Good Article criteria. GA reviewers will give you at least a week's time from the date of this notice to work on the in-line citations before doing a full re-review and deciding if the article still merits being considered a Good Article or would need to be de-listed. If you have any questions, please don't hesitate to contact us on the Good Article project talk page, or you may contact me personally. On behalf of the Good Articles Project, I want to thank you for all the time and effort that you have put into working on this article and improving the overall quality of the Wikipedia project.
Agne
00:19, 26 September 2006 (UTC)
Two important consequences are that heat cannot of itself pass from a colder to a hotter body
The intro included evolution as another field of study in which the concept is used, but the only reference I've found in the article is to HG Wells' fictional "devolution", and in the evolution article it appears as a common misconception, so I've removed it pending explanation. ... dave souza, talk 08:26, 28 September 2006 (UTC)
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 1 | Archive 2 | Archive 3 | Archive 4 | Archive 5 | → | Archive 10 |
I have a book which says Clausius derived the term entropy not from "entrepein" but from "en tropei" (both Greek) meaning change.
(Copying this here out of archive2 so it doesn't get overlooked, because I feel this was never properly resolved, and I for one still feel quite uncomfortable about these edits of Paul's Jheald 09:21, 8 July 2006 (UTC))
Paul, you made several semantic edits which change the context and tone of the article, but you added no citations for any of them. Your edits, if properly cited, could provide a valuable educational addition to the article. If not cited however, they seem an awful lot like POV.
Astrobayes
20:24, 22 June 2006 (UTC)
At the bottom of the Overview section, it reads that the second law does not apply to the infinite Universe, yet further down the article in 'The Second Law' section, it reads that the entropy of the Universe does tend to increase. Maybe I'm missing something, but this may need clarifying in order to properly state whether or not the Second law applies to the universe or not. —unsigned
The jury is still out on whether the universe we live in, is finite in size or infinite. If it is infinite, the Second Law does not apply, if finite it would. Any statement about the entropy of "the Universe" increasing, is premature and speculative. My bets are with the infinite - you choose. Paul venter 21:28, 29 June 2006 (UTC)
Let me first say that I think the 8 lines or so that we have in the section "Entropy and Information Theory" currently in the article are (IMO) about right, and pitched at about the right level. I don't think there's any great need for change from this, either to emphasise or to de-emphasise the intuitive connection.
But I would like to offer a couple of comments (constructive, I hope) on some of the wordings that Frank was most recently discussing on the talk page now archived:
In his CalPoly discussion, Frank writes:
I think there's a danger that this can be taken the wrong way. As I tried to persuade Astrobayes earlier, there's no great significance to the fact that in Chemistry entropy is usually measured in Joules per Kelvin, and temperature in Kelvins, while Σ[Pi log Pi] is dimensionless. One can entirely equivalently work throughout with the dimensionless entropy σ = S / k, and the energy-scale temperature τ = kT.
Indeed, as I suggested to Astrobayes, σ and τ are in many ways the more natural quantities, as Frank's friend Harvey Leff also explored in a recent paper, "What if Entropy were Dimensionless?", Am. J. Phys. 67, 1114-1122 (1999). Not only does this more naturally reflect that the multiplicity, which is what fundamentally underlies entropy, is a dimensionless quantity; it also throws into relief that the crucial thing about temperature is the corresponding energy-scale quantity τ = kT. This is the amount of energy you have to put into a system in thermodynamic equilibrium to get a unit increase in its dimensionless entropy σ. That energy is greater for hot systems than for cold systems: energy flows from hot to cold, so for the 2nd law to be obeyed, the energy you need to add to a cold system to get a unit increase in dimensionless entropy must cause less than a unit decrease in the dimensionless entropy of the hot system. So working with σ and τ makes it a little more transparent why the energy τ for an ensemble, defined by ∂E/∂σ, is the fundamental measure of its temperature.
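The S ↔ σ and T ↔ τ conversions described above are literally one multiplication by kB each way, which a short sketch can make concrete (the two-state distribution here is an illustrative example, not anything from the article):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def sigma(probs):
    """Dimensionless entropy: -sum(p_i * ln p_i)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def S(probs):
    """Conventional entropy in J/K: just sigma rescaled by k_B."""
    return k_B * sigma(probs)

def tau(T):
    """Energy-scale temperature in J: tau = k_B * T."""
    return k_B * T

# Two equally likely states: sigma = ln 2, S = k_B ln 2
probs = [0.5, 0.5]
print(sigma(probs))    # 0.6931... (dimensionless)
print(S(probs))        # 9.56...e-24 J/K
print(S(probs) / k_B)  # recovers sigma exactly: nothing physical is lost either way
```

Dividing S by k_B round-trips with no loss of physical meaning, which is the point of the paragraph above.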
To his credit, in what he wrote on the (archived) talk page, Frank adds a sentence that underlines this:
This is surely the point on no account to be obscured: one can freely move between S and σ, and back again, without losing (or gaining) any physical meaning. There's nothing particularly special either way about choosing whether or not to include the extra factor kB, and therefore whether or not to work with S rather than σ.
Frank writes:
I have a problem with the word "factors", because of its mathematical association with multiplying two things together. Can I ask Frank for his reaction if I was to suggest that a clearer word might be "aspects"? Would the following equally capture the message that you're trying to convey:
Am I right in interpreting that as your key message?
There are a couple of points I might make here, but I've probably written enough for the moment to be going on with. Jheald 14:47, 8 July 2006 (UTC).
One other point with the CalPoly paper, with its extensive discussion of "configurations", is perhaps to remember that, properly applied to classical mechanics, the states that statistical mechanics works with are configurations of the whole system in 6N-dimensional phase space, including momentum co-ordinates for each particle, not just the position co-ordinates. So, properly considered, the different possible molecular motions are always a consideration, even in the classical picture. (Even though we can sometimes usefully separate that entropy out into different additive contributions, one from e.g. the permutations of impurities in a crystal, the other from other degrees of freedom, if the probability distributions are near enough independent.)
Jheald
14:47, 8 July 2006 (UTC).
I think it's best to work out differences on this Talk forum rather than insert and delete, etc. on the actual article page. So here's 1.3 pages of the Article that introduces the topic a bit more slowly, more pointed toward the young or mature reader not sharp in physics (and certainly not knowing calculus in the first line!)
So, following the present italics links to disambig, info entropy, entropy in therm/info theory:
In chemistry and physics thermodynamic entropy is an important ratio, a measure of the heat, q, that is transferred to or from a system when it is warmed or cooled by its surroundings. Only a small amount of q should be transferred (dq) at a specific temperature so that the temperature stays essentially unchanged. Then the process is reversible and the ratio is dq(rev)/T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1860s to account for the dissipation of energy from thermodynamic systems that produce work. He coined the term 'entropy', which to him meant 'transformation', from the Greek word trope, and added the prefix en- to emphasize the close relationship of dq(rev)/T to energy. The concept of entropy is a thermodynamic construct, but the word entropy has been widely misused, e.g., in economics and fiction, because its thermodynamic relation to physico-chemical energy is blurred or even omitted. Legitimate linkages to fields of study are mentioned in sections that follow.
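The "small increments dq at an essentially unchanged temperature" idea in the paragraph above can be checked numerically: summing C·dT/T over many tiny reversible steps converges to the calculus result C·ln(T2/T1). A sketch under assumed textbook values (18 g of water, heat capacity C of roughly 75.3 J/K):

```python
import math

def delta_S_stepwise(C, T1, T2, steps=100_000):
    """Sum dq/T over many small reversible increments, with dq = C * dT."""
    dT = (T2 - T1) / steps
    total, T = 0.0, T1
    for _ in range(steps):
        total += C * dT / (T + dT / 2)  # evaluate at the midpoint of each step
        T += dT
    return total

def delta_S_exact(C, T1, T2):
    """Closed form from calculus: integral of C dT / T = C ln(T2/T1)."""
    return C * math.log(T2 / T1)

C, T1, T2 = 75.3, 273.15, 298.15  # J/K; from melting point to room temperature
print(delta_S_stepwise(C, T1, T2))  # ≈ 6.59 J/K
print(delta_S_exact(C, T1, T2))     # ≈ 6.59 J/K: the two agree
```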
The essence of Clausius' original words is in the above paragraph. It differs somewhat from the version in the present Wikipedia 'Entropy' article, because (as shown below) my statements use the German of his 1865 paper, cited in an authoritative history, "The World of Physical Chemistry" by Keith Laidler (Oxford, 1993, p. 104-105):
“…schlage ich vor, die Grösse S nach dem griechischen Worte ή τροπη [trope], die Verwandlung, die Entropie des Körpers zu nennen. Das Wort Entropie habe ich absichtlich dem Worte Energie möglichst ähnlich gebildet, den die beiden Grössen, welche durch diese Worte benannt werden sollen, sind ihren physikalischen Bedeutungen nach einander so nahe verwandt, dass eine gewisse Gleichartigkeit in der Benennung mir zweckmässig zu sein scheint.”
“I propose to name the quantity S the entropy of the system, after the Greek word [trope], the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.”
Good work Frank, I've been looking for this quote for a long time. I'll add it to the article. Thanks:-- Sadi Carnot 00:46, 10 July 2006 (UTC)
To The Group, explanation of advantage to describe illustration better: The illustration in the present Wikipedia “Entropy” is a neat idea, especially with the terrific enlargement possible! The ice and water phases can be very clearly seen. However…! Its present title indeed applies to the photo but it can hardly be called an aid for the naïve reader! Below is a complete description of what lies behind the photo and the phenomenon, perhaps a bit high level for the reader with no background other than the first paragraph above, but about on the level of a reader after the Overview.
Ice melting in a 77 F (298 K) room: a classic example of heat energy, q, from the warmer room surroundings being 'transferred' to the cooler system of ice and water at its constant T of 32 F (273 K), the melting temperature of ice. The entropy of the system increases by q/273 K. [q is actually the ΔH for ice, the enthalpy of fusion for 18 grams.] The entropy of the surrounding room decreases less (because the room temperature T is higher, as q/298 K versus the ice-water's q/273 K shows). Then, when the ice has melted and the temperature of the cool water rises to that of the room (as the room further cools essentially imperceptibly), the sum of the dq/T over the continuous range ("at many increments") from the initially cool to the finally warm (room-temperature) water can be found by calculus. The entire miniature 'universe' of room surroundings and ice-water-glass system has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' in it. FrankLambert 20:25, 9 July 2006 (UTC)
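The arithmetic above is easy to verify. Taking q as the enthalpy of fusion for 18 g (one mole) of ice, about 6007 J (a standard textbook value, assumed here rather than quoted from the article):

```python
q = 6007.0       # J: enthalpy of fusion for 18 g (1 mol) of ice, textbook value
T_ice = 273.15   # K: melting point, the constant T of the ice-water system
T_room = 298.15  # K: the 77 F room surroundings

dS_system = q / T_ice  # entropy gained by the melting ice
dS_room = -q / T_room  # entropy lost by the warmer room (smaller in magnitude)
dS_total = dS_system + dS_room

print(dS_system)  # ≈ +21.99 J/K
print(dS_room)    # ≈ -20.15 J/K
print(dS_total)   # ≈ +1.84 J/K: the miniature 'universe' gains entropy overall
```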
FL suggestion: Paragraph 1
FL suggestion: Paragraph 2
Frank, I see that you don’t like the order/disorder connection to entropy; and from your talk-page comments, I see that you would like to ban it from the world. The problem here is that this is a widely used description; this morning, for example, I am reading Joon Chang Lee’s 2001 Thermal Physics – Entropy and Free Energies book, and it is filled with order/disorder stuff. Thus, knowing that WP is a neutral point of view information presentation forum, we simply have to present concepts as they are, from all perspectives. In summary, my point is that WP is not a place to debate whether something is right or wrong. Thanks: -- Sadi Carnot 13:13, 11 July 2006 (UTC)
I can’t believe you’re still trying to talk junk to me? It occurs above twice and on separate talk pages as well: here and here. In Wikipedia terminology, these are called personal attacks. Before you go around making any more of these, you might want to read the section "no personal attacks", which states: Do not make personal attacks anywhere in Wikipedia. Comment on content, not on the contributor. In any event, in addition to what I’ve just posted below, here and here, in regards to you thinking that you know about entropy even though, as you state, you have never read any of Clausius’ work (which in itself is very humorous); from Anderson’s 2005 Thermodynamics of Natural Systems (textbook), we find:
“Virtually since the beginning, a popular viewpoint has been to see entropy as a measure of disorder. Helmholtz used the word “Unordnung” (disorder) in 1882.”
Lastly, in regards to your presumption: “You had a year of chemistry, didn't you?”; for the record, in regards to knowledge of entropy, one chemical engineer (me) is more knowledgeable than any three chemists (you) put together. Thank-you for your attempt at trying to belittle me. Adios: -- Sadi Carnot 05:17, 13 July 2006 (UTC)
FL proposal:
FL proposal:
FL proposal:
Wikipedia:WikiProject_Chemistry currently rates this article as "needs more chemistry to go with the physics" [1] (note: rating is not necessarily that recent).
In progress, I think we're already discussing issues about the appropriate tone to make a friendlier article for P Chem students. (And, what specifically in the article might make it seem less friendly, or even alien to P Chem students).
But what *topics* should Physical Chemistry students expect to find presented here, that aren't already (and/or in breakout articles from here) ?
eg:
...
Actually, I suspect very little of that on this page, which is mostly signposts to more detailed articles; but perhaps more detail in an Entropy in Chemistry page, or handed off to Chemical Thermodynamics; or in individual relevant pages.
Part of the problem is that (IMO) Wikipedia needs a big push in the whole subject area of Physical Chemistry. The whole P Chem subject area seems a lot less developed, so (as a comparative outsider) I find it hard to find out what the Entropy article needs to supply.
Hence this talk subject to try to get discussion going. (And a plea at WP:Project Chemistry for an actively managed initiative to try to push quality in the whole field). Jheald 19:35, 10 July 2006 (UTC).
Entropy is perhaps the most difficult topic that chemistry students ever meet. In fact there are now places, regrettably, that no longer teach it, at least to all students. I am retired now, so I am no longer teaching Physical Chemistry and the publishers are no longer sending me copies of the latest text books. However before I retired I saw the change that Frank talks about, although I did not realise it came from him. I had started to concentrate on energy dispersal rather than disorder. For those chemistry students who are still taught about entropy, this article is not much help. It is way beyond what is in their text books and it really is placing more weight on how physicists see the subject. There is a general problem here with all articles that cover topics in both physics and chemistry. In some areas it is like entropy - one article but not clearly satisfying both chemists and physicists. In other areas, particularly in the areas of molecular physics, chemical physics, quantum chemistry, computational chemistry etc., I keep coming across articles that have a major overlap with other articles - one written by chemists and linking to other chemistry articles, and one written by physicists and linking to physics articles. We need to sort this out. I am not sure how. Maybe we do need different articles. Maybe we need an article on chemical entropy or entropy in chemistry. Better, I think, would be to make this article fairly simple and link to separate more complex articles on entropy in chemistry and entropy in physics. -- Bduke 23:57, 11 July 2006 (UTC)
One possibility might be to have a separate article Thermodynamic entropy (which currently redirects here) that is a gentle introduction to entropy as physical chemists use it. This article can then be the broad overview article that provides short introductions to all of the other entropy articles. We currently have Entropy (thermodynamic views), Entropy (statistical views), Entropy in thermodynamics and information theory, Information entropy and History of entropy, along with many other related articles in the entropy category. Nonsuch 00:13, 12 July 2006 (UTC)
Frank is trying to convince us that since some chemistry textbooks now avoid the use of the word "disorder" following his advice, this description is dead and should not be mentioned in Wikipedia. (Instead he suggests to use the vague concept "energy dispersal" that I still cannot understand what its exact definition is). He writes above that I should check what they teach in the Chemistry Department at Harvard. So I took the challenge, and what do I find in the lecture notes of the course "Chemistry 60: Foundations of Physical Chemistry"? It's online: Lecture 8 p. 11: "Entropy S is a quantitative measure of disorder" :) Yevgeny Kats 14:34, 11 July 2006 (UTC)
Frank, I feel that your intentions are well-minded; but you need to loosen up a bit. When you go around tooting that “disorder is dead” and “energy dispersal is in”, what you are doing is akin to censorship. From my talk page, you state that you have never dug into the history of Clausius (1850-1865), but that your interest was piqued by the papers of Boltzmann (1872) and Kelvin (1875); below are your words:
Well, to contradict you, I do know how he said it. Over the last several days, I have been down at the UIC library reading an original copy of Clausius' famous 1865 book Mechanical Theory of Heat, which contains his nine papers between the years 1850-1865, upon which the foundation of entropy is built. In his 1862 paper, he states the following relation:
where:
Clausius then goes on to discuss the underlying nature of the concept of “equivalence-value”, i.e. what he labels as entropy in the following two years. He then discusses how a working body of gas comprised of individual molecules configured into very specific arrangements, i.e. “order”, is changed through “compensations” as it progresses through a complete Carnot cycle; something which Carnot himself did not take into account.
First, he speaks of the change in the arrangements of the molecules of the body not being a reversible phenomenon as the body moves from one state to the next, and in doing so overcomes resistances which are proportional to the absolute temperature.
Clausius then states: “when heat performs work, these processes always admit of being reduced to the alteration in some way or another of the arrangement of the constituent parts of the body. For instance, when bodies are expanded by heat, their molecules being thus separated from each other: in this case the mutual attractions of the molecules on the one hand, and external opposing forces on the other, in so far as any are in operation, have to be overcome.” Hence, we see that in 1862, before entropy was even named, the founder of the concept of entropy is speaking about arrangements of the atoms and molecules of the constituent body and how these arrangements are changed, via uncompensated transformations, as they progress through a cycle. As we see, entropy in its essential nature, is built on the concept of the changes in the ordering of atoms and molecules of the working body. The opposite of order is disorder. This is a core perspective in the concept of entropy; and it is not dead as you seem to think. Just a friendly note: -- Sadi Carnot 02:43, 12 July 2006 (UTC)
To elaborate further on the preceding discussions, I particularly do not like individuals who purposely strive to alter historical frameworks of concept, in such a manner as to substitute their own personal contrivances for the original. Owing to this conviction, today I was again down at the sacred copies room at the UIC library typing up, word-for-word, the original formulation of entropy, as it was intended to be stated. The entire framework of entropy is built on some 365 pages worth of logical argument presented in nine memoirs over the course of 15 years, from 1850 to 1865. Below are the main points of the way entropy was presented to the world in 1862-65 (starting from pg. 218 of Clausius' Mechanical Theory of Heat [the third page of the sixth memoir]):
(skipping a few paragraphs; Clausius then states the following law (as he calls it)):
(skipping a paragraph)
(skipping about 20 pages)
(Then starting from page 354 of the ninth memoir is where he assigns the symbol S and gives it the name “entropy”)
(Then after a page of derivations, which includes a symbol assignment of disgregation, we then arrive back at the following equation)
(Then, in the grand finale of this equation-packed magnificent book, we arrive at his famous last paragraph on the last page [pg. 365] of the ninth memoir, first presented to the public as sourced below*)
So there you have it. I hope this gets us all on the same page so that there is no more talk-page bickering regarding fictitious concepts that were never presented in the words of Clausius. Adios: -- Sadi Carnot 04:23, 13 July 2006 (UTC)
What is (or should be) the focus of this article? Much of the recent discussion has focused on how particular communities construe the word, either presently, or in the past. IMHO, this article, which is simply entitled 'entropy', should give a concise definition, a short introduction to the concept, a brief history, and a discussion of various ways that entropy has been used in different contexts, and at least a listing of related concepts, special cases and generalizations, with suitable crosslinking to more specialized articles. Discuss. Nonsuch 07:30, 17 July 2006 (UTC)
As a first draft:
"Entropy has been described in texts, dictionaries, and encyclopedias for over a century as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is". This is unfortunate because "disorder" and "mixed-upness" are not quantifiable scientific terms in themselves. A modern description of entropy that has appeared in most US general chemistry texts as of 2006 is readily understandable and directly related to quantitative measurement. It is "Entropy is a measure of the dispersal of energy (in chemistry, most often the motional energy of molecules): how much energy is spread out in a process, as in 'thermal entropy' and/or how widely spread out it becomes, as in 'positional entropy', both at a specific temperature."
(Information theory deals with the quantitation of 'disorder' and 'mixedup-ness'. See the links to articles in its section that follows.)" FrankLambert 18:48, 17 July 2006 (UTC)
In reply to the comments of Jheald above, I have reassessed the article for WikiProject Chemistry as "Start". A lot of specific chemical information (eg, measurement techniques) can go into Standard molar entropy or into Gibbs' free energy, as these are the quantities which chemists actually use, but there needs to be enough of a "taster" here so that chemists will know to look elsewhere for the details. Gibbs' free energy is not mentioned by name, although the concept is used in the article! Similarly, there is no mention of the Third law of thermodynamics, which surely has a place in a general discussion of entropy...
I would like to see at least one example of a chemical system in the article: I usually use the reaction between chalk and hydrochloric acid, as the evolution of a gas and the dissolution of a solid always lead to an increase in entropy. As the article stands, I do not see how a reader can get a grasp as to what entropy actually is, beyond the fact that it happens to be equal to dqrev/T! The example of the Ellingham diagram also works well with French students: entropy as an approximation to d(ΔG)/dT in the absence of phase changes. I am mystified by the current choice of example: the article presents entropy as the energy which is not available to perform work, but chooses an example where no work is done at all! I'm teaching "Basic Physics for Environmental Scientists" this summer ("basic" includes such joys as damped and forced oscillators, Reynolds number and Fick's Laws, so I'm glad I don't have to teach "advanced"!), and I will almost certainly use the example of a steam pump to illustrate entropy. I will use the example of melting ice to check if they have understood what a phase equilibrium is ("OK, so it's in equilibrium, so what is ΔG? So if ΔG is zero, what is ΔS?"). The use of the entropy unit in American publications should also be mentioned.
On a more general level, care needs to be taken with the mathematics. The "d" of a complete differential is always in upright type; latin letters denoting physical quantities (q, T) are always in italics. Entropy is a state function, not a differential, and its differential is equal to dqrev/T, not dQ/T as stated in the article and earlier on this talk page (either lower or upper case "Q" may be used for heat, I'm not complaining about that). As we have the correct Greek spelling of trope, this should be included.
Finally, the definitions seem to be rather too based on a historical presentation, even if I cannot fault the quality of the history. The section "Thermodynamic definition" is six paragraphs long, but stops at 1903! IMHO, the force of Clausius and Boltzmann is that their ideas are still taught a century and a half later in virtually the same terms as they themselves used: I feel that we should honour them by teaching these ideas as physical facts and not as historical facts!
An appropriate response to all this would be {{ sofixit}}, but I'm not going to dive into the article itself without some feedback on my views (feedback can be good, just ask Jimi Hendrix). All the more so as I remark that the article is dissipating quite a lot of energy as heat which, as we all know, can only be to the detriment of useful work :) Physchim62 (talk) 18:16, 18 July 2006 (UTC)
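As a concrete version of the chalk-and-hydrochloric-acid example suggested above, one can estimate the reaction entropy from standard molar entropies. The values below are approximate textbook-style figures assumed for illustration (the aqueous ones in particular vary between tables), so only the sign and rough size of ΔS should be trusted:

```python
# Approximate standard molar entropies at 298 K, in J/(mol*K).
# Illustrative textbook-style values, not authoritative data.
S0 = {
    "CaCO3(s)":   92.9,
    "HCl(aq)":    56.5,
    "CaCl2(aq)":  59.8,
    "H2O(l)":     69.9,
    "CO2(g)":    213.7,  # the evolved gas carries most of the entropy gain
}

# CaCO3(s) + 2 HCl(aq) -> CaCl2(aq) + H2O(l) + CO2(g)
products  = S0["CaCl2(aq)"] + S0["H2O(l)"] + S0["CO2(g)"]
reactants = S0["CaCO3(s)"] + 2 * S0["HCl(aq)"]
delta_S = products - reactants

print(delta_S)  # ≈ +137 J/(mol*K): positive, as gas evolution and dissolution predict
```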
Hi,
Although I think the disorder concept is a terrible way to explain entropy (notwithstanding the fact that most of the time it's just plain wrong), I think that this page needs to retain a reference to the historic use of the words "disorder" and "mixedupedness" - because otherwise some maleducated individual will once again put disorder into this page. Not only that, it will perpetuate the confusion people have as to why disorder is ever used.
So we should have a little paragraph that explains why entropy was explained as "disorder", and why it's wrong. Just a short little thing. Fresheneesz 18:11, 21 July 2006 (UTC)
Given this edit by Sadi Carnot (which is a valid edit, btw) I propose we move this article (i.e., rename it) to "Entropy (thermodynamics)" and remove anything unrelated to thermodynamics, as the other uses are not in keeping with the thermodynamic meaning of entropy, and as they have their own articles. •Jim62sch• 16:27, 6 August 2006 (UTC)
Except that that misses the point, and ultimately misleads the reader due to a functional, widespread misunderstanding of what "entropy" means. And really, the other uses are not truly related to thermodynamics. In those uses, entropy is taken to mean disorder, chaos; in thermodynamics it really means "re-order", it makes no subjective judgment re chaos or disorder. •Jim62sch• 23:39, 6 August 2006 (UTC)
Now I'm confused...if there's an entropy disambig page, why does this article mention non-thermodynamic entropy? •Jim62sch• 23:10, 7 August 2006 (UTC)
Loosely, maybe. •Jim62sch• 00:05, 8 August 2006 (UTC)
Suggested changes to generalise the concept in clearer terms. "State function" doesn't cut it for an intro, even if "relative measurement" isn't quite correct. Criticism welcome. - Ste|vertigo 18:12, 27 August 2006 (UTC)
Entropy is a concept of a relative measurement of the degree, rate, and vector of energy dispersal within an isolated system, as well as between connected systems. The interpreted meaning depends on the context; for example, in thermodynamics, entropy (symbolized by S) is a fundamental part of the second law of thermodynamics, which defines the relationship between time and the energy within an unequilibrated system — hence it is an important factor in determining a system's free energy to do work. It is measured as a state function of a thermodynamic system, defined by the differential quantity dS = dQ/T, where dQ is the amount of heat absorbed in a reversible process in which the system goes from one state to another, and T is the absolute temperature. [1]
Entropy is a concept of a relative measurement of the degree of energy dispersal within an isolated system, as well as between connected systems. The interpreted meaning depends on the context; for example, in truly isolated systems (such as the universe is theorised to be) the entire energy system is entropic, whereas in connected systems (as found in nature) entropy relates directly to heat transfer. It is measured as a state function of a thermodynamic system, defined by the differential quantity dS = dQ/T, where dQ is the amount of heat absorbed in a reversible process in which the system goes from one state to another, and T is the absolute temperature. [2]
Note: This article has a small number of in-line citations for an article of its size and currently would not pass criteria 2b.
Members of the
Wikipedia:WikiProject Good articles are in the process of doing a re-review of current
Good Article listings to ensure compliance with the standards of the
Good Article Criteria. (Discussion of the changes and re-review can be found
here). A significant change to the GA criteria is the mandatory use of some sort of in-line citation (in accordance with
WP:CITE) in order for an article to pass the
verification and reference criteria. It is recommended that the article's editors take a look at the inclusion of in-line citations as well as how the article stacks up against the rest of the Good Article criteria. GA reviewers will give you at least a week's time from the date of this notice to work on the in-line citations before doing a full re-review and deciding if the article still merits being considered a Good Article or would need to be de-listed. If you have any questions, please don't hesitate to contact us on the Good Article project
talk page or you may contact me personally. On behalf of the Good Articles Project, I want to thank you for all the time and effort that you have put into working on this article and improving the overall quality of the Wikipedia project.
Agne
00:19, 26 September 2006 (UTC)
Two important consequences are that heat cannot of itself pass from a colder to a hotter body
The intro included evolution as another field of study in which the concept is used, but the only reference I've found in the article is to HG Wells' fictional "devolution", and in the evolution article it appears as a common misconception, so I've removed it pending explanation. ... dave souza, talk 08:26, 28 September 2006 (UTC)