This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 5 | ← | Archive 9 | Archive 10 | Archive 11 | Archive 12 | Archive 13 | Archive 14 |
Today, after coming across another "dropping an egg on the floor" explanation of entropy, I began making a table (in progress) of the various oft-cited ‘entropy models’ used as teaching heuristics throughout history, listed chronologically:
Feel free to leave comment if you see any that I’ve left out or forgotten. -- Libb Thims ( talk) 17:12, 18 July 2010 (UTC)
The date of coining and the description of the coining of the term entropy are both incorrect (citing the Online Etymology Dictionary over that of the original source):
I would suggest that someone fix this misinformation, which has been in the article for at least a year or two now. -- Libb Thims ( talk) 17:53, 18 July 2010 (UTC)
This article should give a simple, clear definition of entropy in the opening paragraph so that the concept is easily understood by those unfamiliar with it. Rather than doing this, however, the article abruptly jumps into specialist terminology and calculus equations, thereby having the unfortunate effect of alienating audiences unfamiliar with advanced physics. Frankly, the opening paragraph needs a complete rewrite, as much for the sake of clarity and coherence as to make the information accessible to a wider audience. 64.222.110.155 ( talk) 19:25, 22 February 2010 (UTC)
I don't know what is the best way to introduce entropy in this article, but I do know the following:
My question is - how long do we blur the distinction between the thermodynamic definition of entropy and statistical explanation of entropy in the name of simplicity? How long do we pretend that there is a number called "Entropy" that we can assign to a system that has some absolute meaning like temperature or pressure? I agree, this distinction need not be made immediately, but we do a disservice to carry on as if statistical mechanics and thermodynamics are two names for the same theory, and that total entropy is some rigorous physical quantity that, if we only knew enough, we could write down an absolute value for. PAR ( talk) 16:29, 2 March 2010 (UTC)
If thermodynamic systems are described using thermal energy instead of temperature, then entropy is just a number by which the thermal energy in the system is multiplied. The resulting energy is energy for which the information is no longer available that would be required, in technical systems, to convert it from one form (e.g. electrical) into another form (e.g. mechanical).
In technical applications, machines are basically energy conversion devices. Thus, such devices can only be driven by convertible energy. The same applies to biological organisms. The product of thermal energy (or the equivalents of thermal energy) and entropy is "already converted energy".
The quotation at the beginning is taken out of context. It is a statement that the discoverer of the covalent bond found hard to understand. If Lewis found that statement difficult to understand, why is it the opening passage of the page, without at least a warning that even experts find it hard to understand? That whole introductory section contains no useful overview information and should be removed. —Preceding unsigned comment added by 24.16.68.149 ( talk) 21:32, 2 June 2010 (UTC)
I think it is time for another rewrite. To address the old comment by PAR above that you can define entropy without invoking information-theoretic concepts, microstates etc., I would say that this is not really true, because you then have to appeal to vague concepts like "disorder". Also, what is "temperature", what is "heat"? These are concepts that are not rigorously defined in classical thermodynamics and are assumed to be given in an ad hoc way in practical situations (like heat engines).
I started a rewrite here. I rewrote the material up to the grand-canonical ensemble. But note that the explanation in the first paragraphs is a bit lacking; I need to invoke ergodicity, the fact that it is not proven, etc. As things stand there, the explanations are a bit misleading and a bit too simplistic. Count Iblis ( talk) 15:06, 27 August 2010 (UTC)
Clausius coined the term entropy in 1865. This is misstated in the article, so I am correcting it. Nobleness of Mind ( talk) 13:41, 19 July 2010 (UTC)
To me entropy means that matter and energy (like the salt and pepper analogy) can never be completely separated. Is this a postulation upheld by the law of entropy or am I missing the point? —Preceding unsigned comment added by 165.212.189.187 ( talk) 14:29, 17 August 2010 (UTC) By completely I mean all the energy in the universe and all the matter in the universe. —Preceding unsigned comment added by 165.212.189.187 ( talk) 14:32, 17 August 2010 (UTC)
I don't understand what is the meaning of this section. It should be incorporated into other parts of the article or entirely deleted.-- Netheril96 ( talk) 03:40, 5 October 2010 (UTC)
It's become a cliché to say that entropy is a measure of disorder, but that's not generally true. Imagine a large cloud of gas and dust collapsing under gravity into a planet. Imagine a crystal forming by molecular attraction. The fundamental change is that energy available to do work is being lost. You could theoretically harness infalling gas and dust to do work, but once it's formed into a tightly packed planet, that opportunity is gone. The idea of disorder increasing is associated with the theory of the ideal gas; it's not the general rule. DonPMitchell ( talk) 16:41, 5 October 2010 (UTC)
This isn't a "general article on entropy". That might look something like the article on Scholarpedia. What we have here is an article on entropy in thermodynamics. The treatment of entropy in information theory is cursory and minimal, the bare minimum relevant for context as an interpretation of thermodynamic entropy -- and that is as it should be, because we have had long discussions about this in the past.
If people want to know about Shannon entropy, they should be directed to Shannon entropy, where we have a real article on the subject. It does nobody any favours to present this article as an article on Shannon entropy, nor on entropy generally -- it just isn't what this article does.
The lede should present what this article is about, and where its focus is. Shannon entropy is not what this article is about; it certainly isn't where its focus is. It's hard enough to present a coherent summary of thermodynamic entropy in four paragraphs (it's defeated us for seven years, and what is there at the moment is a long way even from the best we've done). Putting in an uncontextualised and unmotivated line about Shannon entropy frankly doesn't help. Jheald ( talk) 20:09, 26 October 2010 (UTC)
I've closed the very old Wikipedia:WikiProject Mathematics/A-class rating/Entropy as no consensus due to age and lack of input in the discussion.-- Salix ( talk): 17:54, 6 November 2010 (UTC)
This article could do with a section discussing entropy changes in chemical reactions -- in particular, the relation between ΔS and ΔG (entropy of the system and entropy of the surroundings); the effect of temperature on the importance of the ΔS term, and therefore whether the reaction can go forward or not; and, especially, what makes a chemical reaction "entropically favourable" -- i.e. what sort of reactions have a positive ΔS.
This is material worth treating here in its own right; but I think the discussion of what makes ΔS positive would also, by introducing a concrete example, make the discussion of the microscopic picture of entropy much more real (i.e. why it is that releasing a mole of gas tends to make a reaction entropically favourable); and, also, usefully give another macroscopic point of contact with entropy, other than just heat engines.
A good section on "entropy changes in chemical reactions" would I think therefore add very valuably to the article.
I think it would be helpful in the lead too. Having introduced entropy via a heat engine view (relating to energy not available to do useful work), then the microscopic view, it seems to me that following those two with a third paragraph, introducing the idea of entropy change in chemical reaction, would round out the lead well, and make clearer the significance and meaning of entropy as an idea. Jheald ( talk) 12:57, 10 December 2010 (UTC)
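For editors drafting such a section, a minimal numerical sketch of the ΔG = ΔH − TΔS bookkeeping may help fix ideas; the values below are illustrative placeholders (roughly the magnitude of an endothermic, entropy-increasing decomposition), not data for any particular reaction:

```python
# Sketch: when does a reaction with positive ΔH and positive ΔS become
# "entropically favourable"?  Values are illustrative placeholders only.
def gibbs_free_energy(delta_H, delta_S, T):
    """Return ΔG = ΔH - T·ΔS (J/mol) at absolute temperature T (K)."""
    return delta_H - T * delta_S

delta_H = 178_000.0   # J/mol, endothermic (hypothetical)
delta_S = 161.0       # J/(mol·K), entropy rises, e.g. a mole of gas is released

for T in (298.0, 800.0, 1200.0):
    dG = gibbs_free_energy(delta_H, delta_S, T)
    print(f"T = {T:6.1f} K   ΔG = {dG/1000:8.1f} kJ/mol   "
          f"{'spontaneous' if dG < 0 else 'non-spontaneous'}")

# Crossover temperature where ΔG changes sign:
print("T_crossover ≈", delta_H / delta_S, "K")
```

The sign change of ΔG near T ≈ ΔH/ΔS is exactly the "effect of temperature on the importance of the ΔS term" mentioned above.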
The Wikipedia article on "Entropy" states in the first line:
"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines."
This statement is interesting because, as we know, entropy is not measured in units of energy, but rather as a change of energy divided by temperature.
I also observed the fundamental thermodynamic relation: T∆S = ∆U + p∆V (equivalently, ∆U = T∆S − p∆V).
I realized that the equation somehow implied, for the same change in internal energy ∆U, that the amount of p∆V was only limited by the value of T∆S. The mainstream idea is that increasing entropy somehow increases the amount of energy not available to the system. However, if this were true, then temperature must decrease faster than entropy increases, for by doing so, if S increased, then the sum of ∆U and p∆V would not have to increase. If ∆U is the change of internal energy, then it is easy to see that p∆V is the change in external energy.
Changing a pressure in a constant volume is like changing a force without changing the displacement—by itself, there is no work associated with such forces. Of course you cannot just change the pressure in a system of constant volume without changing the forces in that volume. Fundamentally, the change in pressure in a "constant" volume is really the result of changes in the proximal distance between particles in that system. The hotter the particles in the volume are, the greater the variation there is in the distances between those particles, and because of the inverse square law and the fact that the root mean square of a set of unequal real values is always greater than the average, the higher the average forces between the particles, even if the average proximal distance between particles does not change. Therefore, in a sense, V∆p at one scale is actually p∆V at a smaller scale. V∆p is otherwise, implicitly, a part of the internal energy ∆U.
Thus, it is obvious that T∆S = ∆U + p∆V is the change of total energy.
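The identity T∆S = ∆U + p∆V can be checked numerically for a small reversible step of a monatomic ideal gas; the state values and step sizes below are arbitrary assumptions used only for the check:

```python
# Sketch: verify T·dS ≈ dU + p·dV for a small reversible step of a
# monatomic ideal gas (illustrative state values, SI units).
R = 8.314          # J/(mol·K)
n = 1.0            # mol
T, V = 300.0, 0.025          # K, m^3  (arbitrary starting state)
dT, dV = 0.01, 1e-6          # small step (arbitrary)

p  = n * R * T / V
dU = 1.5 * n * R * dT                    # U = (3/2) n R T
dS = n * R * (1.5 * dT / T + dV / V)     # S = nR(3/2 ln T + ln V) + const

print("T dS      =", T * dS)
print("dU + p dV =", dU + p * dV)
```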
If S is conserved between systems, then it is easy to deduce a direct, linear relationship between temperature and energy of a given system, which is exactly what one expects from the kinetic theory of heat: ∆U + p∆V = T∆S, where ∆S is the entropy change of one system at the expense of another (i.e. an amount of entropy "in transit").
Also notice that if T decreases with time, for a given entropy "in transit" (∆S), the total energy "in transit" (∆U + p∆V) literally decreases, which corresponds directly with the requirement that such energy becomes unavailable to do work. Technically, this would make T∆S equivalent to exergy rather than energy. So there is no need to assume that the total entropy of a universe must increase to explain this. All that is required is that entropy is transferred from one domain to another domain of lower temperature.
The first line of the article on fundamental thermodynamic relation states: "In thermodynamics, the fundamental thermodynamic relation expresses an infinitesimal change in internal energy in terms of infinitesimal changes in entropy, and volume for a closed system."
This makes it clear that the interpretation in mainstream science is that this "fundamental thermodynamic relation" is for closed systems, implying that all the entropy change is manifested directly by the system. They have no concern for the possibility that entropy could flow into what they have instead considered as a "closed system" (which reads as "a system that is thermally isolated from the environment, in the sense of not being able to receive energy from the outside"). They've come to think that energy transfer is necessary for a transfer of entropy because they treat energy as the fundamental substance of matter. So to them, the entropy in these said "closed" systems arises solely due to the interactions of the energy already present, and of that, only the part which they know how to plug into the fundamental thermodynamic relation. Thus, they do not include entropy from external sources, nor even from subatomic particles themselves, for as yet the entropy of systems within atoms is not apparent to them. Additionally, they have not accepted the idea that entropy can flow into a system when that system is energetically isolated from the environment. Thus, it is no wonder that they think entropy can be created from nothing. Kmarinas86 (Expert Sectioneer of Wikipedia) 19+9+14 + karma = 19+9+14 + talk = 86 15:04, 27 January 2011 (UTC)
The article contains a number of inaccurate statements. Let me name a few:
"Historically, the concept of entropy evolved in order to explain why some processes are spontaneous and others are not..." Entropy, as defined by Clausius, is a conserved function in the Carnot cycle along with the internal energy. For efficiencies other than the Carnot efficiency the equality in the first formula in the section "Classical Thermodynamics" becomes an inequality.
"Thermodynamics is a non-conserved state function..." contradicts the first formula in the section "Classical Thermodynamics".
"For almost all practical purposes, this [Gibbs's formula for the entropy] can be taken as the fundamental definition of entropy since all other formulas for S can be derived from it, but not vice versa." This is definitely incorrect; just consider the entropy of degenerate gases which is proportional to the number of particles. Bernhlav ( talk) 08:20, 2 April 2011 (UTC)
Hello,
Thank you. I do not know how to make the changes because that will remove references that are cited. Let me give you an example of the first of the above statements.
The concept of entropy arose from Clausius's study of the Carnot cycle [1]. In a Carnot cycle, heat Q_1 is absorbed isothermally from a 'hot' reservoir at the higher temperature T_1, and heat Q_2 is given up isothermally to a 'cold' reservoir at a lower temperature T_2. Now according to Carnot's principle, work can only be done when there is a drop in the temperature, and the work should be some function of the difference in temperature and the heat absorbed. Carnot did not distinguish between Q_1 and Q_2, since he was working under the hypothesis that caloric theory was valid and hence heat was conserved [2]. Through the efforts of Clausius and Kelvin, we know that the maximum work that can be done is the product of the Carnot efficiency and the heat absorbed at the hot reservoir: W = (1 − T_2/T_1) Q_1. In order to derive the Carnot efficiency, 1 − T_2/T_1, Kelvin had to evaluate the ratio of the work done to the heat absorbed in the isothermal expansion with the help of the Carnot-Clapeyron equation, which contained an unknown function known as the Carnot function. The fact that the Carnot function could be the temperature, measured from zero, was suggested by Joule in a letter to Kelvin, and this allowed Kelvin to establish his absolute temperature scale [3]. We also know that the work is the difference between the heat absorbed at the hot reservoir and that rejected at the cold one: W = Q_1 − Q_2. Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy and it became the first law of thermodynamics [4].
Now equating the two expressions gives Q_1/T_1 − Q_2/T_2 = 0. If we allow Q_2 to incorporate the algebraic sign, this becomes a sum, Q_1/T_1 + Q_2/T_2 = 0, and implies that there is a function of state which is conserved over a complete cycle. Clausius called this state function entropy. This is the second law of thermodynamics.
Then Clausius asked what would happen if there should be less work done than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work, and it would now be converted into an inequality, W < (1 − T_2/T_1) Q_1. When the second equation is used to express the work as a difference in heats, we get Q_1 − Q_2 < (1 − T_2/T_1) Q_1, or Q_2 > (T_2/T_1) Q_1. So more heat is given off to the cold reservoir than in the Carnot cycle. If we denote the entropies by S_i = Q_i/T_i for the two states, then the above inequality can be written as a decrease in the entropy, S_1 − S_2 < 0. The wasted heat implies that irreversible processes must have prevented the cycle from carrying out maximum work.
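A short numerical sketch of this bookkeeping, with arbitrary illustrative reservoir temperatures and heat input, shows the reversible cycle conserving Q/T while a less efficient cycle rejects more heat:

```python
# Sketch: Carnot-cycle entropy bookkeeping for the derivation above.
# Reservoir temperatures and heat input are arbitrary illustrative values.
T1, T2 = 500.0, 300.0     # K, hot and cold reservoirs
Q1 = 1000.0               # J absorbed from the hot reservoir

eta_carnot = 1.0 - T2 / T1
W_max = eta_carnot * Q1           # maximum work
Q2 = Q1 - W_max                   # heat rejected in the reversible cycle

print("Reversible cycle:   Q1/T1 =", Q1 / T1, "  Q2/T2 =", Q2 / T2)   # equal

# An irreversible cycle doing less work must reject more heat:
W_actual = 0.5 * W_max
Q2_irrev = Q1 - W_actual
print("Irreversible cycle: Q1/T1 =", Q1 / T1, "  Q2/T2 =", Q2_irrev / T2)   # larger
```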
Of all the thermodynamic functions, entropy has the unique role of having one foot in the macroscopic world and the other in the microscopic world of atoms. According to Boltzmann, the entropy is a logarithmic measure of the number of micro-complexions that correspond to, or are indistinguishable from, a given macroscopic state. Since the latter is a very large number, instead of being a proper fraction, Planck referred to it as a 'thermodynamic' probability [5]. Boltzmann was always of the opinion that large numbers gave better assurance than mere fractions.
As an illustration of how Boltzmann determined his entropy, let us consider the distribution of N particles in k urns [6]. The a priori probability of a particle being in the ith urn is p_i. The occupation numbers n_i are not all independent but must obey the condition n_1 + n_2 + ... + n_k = N, and since probability must be conserved, p_1 + p_2 + ... + p_k = 1. Then the probability that the first urn contains n_1 particles, the second urn contains n_2, and so on, is P = [N!/(n_1! n_2! ... n_k!)] p_1^{n_1} p_2^{n_2} ... p_k^{n_k}, which is known as the multinomial distribution [7].
Now Boltzmann did not know what the a priori probabilities were, so he supposed that they were all equal, p_i = 1/k, for all the urns. According to Boltzmann we should not discriminate among the urns. [But when he attaches different energies to the different urns, he precisely does discriminate among them.] The probability P thus becomes a 'thermodynamic' probability whose logarithm, ln P, he set proportional to the entropy S. The constant of proportionality, k, or the universal gas constant divided by Avogadro's number for one mole of gas, was later determined by Planck in his study of black-body radiation, at the same time he derived his constant [8]. Thus, Boltzmann's assumption of a priori equal probabilities [9] converts the multinomial distribution into a thermodynamic probability, which, instead of being an actual probability, is a very big number.
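A small sketch of the counting just described, assuming equal a priori probabilities and made-up occupation numbers, compares the logarithm of the "thermodynamic probability" with its Stirling form:

```python
# Sketch: Boltzmann's "thermodynamic probability" for N particles in k urns.
# Occupation numbers below are made up for illustration.
from math import lgamma, log

def ln_multiplicity(ns):
    """ln[ N! / (n_1! ... n_k!) ] computed via log-gamma."""
    N = sum(ns)
    return lgamma(N + 1) - sum(lgamma(n + 1) for n in ns)

ns = [300, 250, 250, 200]          # hypothetical occupation numbers
N = sum(ns)

lnW = ln_multiplicity(ns)
stirling = -N * sum((n / N) * log(n / N) for n in ns)   # N·(-Σ p ln p)

print("ln W            =", lnW)
print("N * (-Σ p ln p) =", stirling)
```

For large N the two agree closely, which is the usual route from the multinomial coefficient to the −N Σ p ln p form.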
Entropy is a measure of the error that we commit when we observe a value n different from its true value, which we know to be the average value, \bar{n}. When we specify which average we are considering, the entropy will be determined from Gauss's error law, which equates the average value with the most probable value. This implies that the entropy must be the same function of the observations that it is of the average value. And if we take S(n) = n − n ln n, the Stirling approximation to −ln n!, we obtain the probability distribution f(n; \bar{n}) = \bar{n}^n e^{−\bar{n}}/n!, which is the well-known Poisson distribution under the condition that Stirling's approximation for the factorial is valid.
If we consider the independent probabilities of the occupation of the k different urns, we must consider the product of the probabilities, where we must now attach an index to the average value to show that it is the average value for the ith urn. The product then gives the multinomial distribution, where the a priori probabilities are now seen to be the relative average occupations, \bar{n}_i/N. Thus we see that the exponent is the Gibbs entropy, S = −Σ_i p_i ln p_i, which ensures that the distribution is a true probability, less than one, on account of its strict concavity. The strict concavity of the entropy is its defining property, and corresponds to the thermodynamic criteria of stability. Consequently, the selection of which average is the true value, together with Gauss's error law, determines the form of the entropy. The entropy is thus said to be the potential for the law of error, and this completes Boltzmann's principle [10].
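Since strict concavity is invoked here as the defining property, a quick numerical illustration with two arbitrary three-state distributions:

```python
# Sketch: numerical check of the concavity of the Gibbs-Shannon entropy
# S(p) = -Σ p_i ln p_i, using two arbitrary distributions p and q.
from math import log

def gibbs_entropy(p):
    return -sum(x * log(x) for x in p if x > 0)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]
lam = 0.4
mix = [lam * a + (1 - lam) * b for a, b in zip(p, q)]

lhs = gibbs_entropy(mix)                                     # entropy of the mixture
rhs = lam * gibbs_entropy(p) + (1 - lam) * gibbs_entropy(q)  # mixture of the entropies
print(lhs, ">=", rhs, "->", lhs >= rhs)   # concavity: the mixture has the larger entropy
```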
I think this is clearer than what is found in the text. The same holds true for the other points mentioned.
Bernhlav (
talk)
16:15, 3 April 2011 (UTC)
I don't understand "the complete state of the gas" as the problem concerns the combinatorics, i.e. how to distribute a certain number of particles among given states. Bernhlav ( talk) 16:47, 6 April 2011 (UTC)
If a system can be divided into two independent parts A and B, then S = S_A + S_B.
Thus the entropy is additive. The number of particles in the systems is also additive. If the substance is uniform, consisting of the same type of particles at the same temperature, then any portion of it can be thought of as a sum of identical pieces, each containing the same number of particles and the same entropy. Thus the portions' entropy and particle number should be proportional to each other. So what is the problem? JRSpriggs ( talk) 00:42, 15 April 2011 (UTC)
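A tiny sketch of the additivity argument: for independent parts the microstate counts multiply, so S = k ln Ω adds (the Ω values are made up):

```python
# Sketch: additivity of S = k·ln(Ω) for two independent subsystems A and B,
# whose microstate counts multiply.  The Ω values are made up.
from math import log

k = 1.380649e-23      # J/K, Boltzmann constant

omega_A = 1e20
omega_B = 1e30
S_A = k * log(omega_A)
S_B = k * log(omega_B)
S_AB = k * log(omega_A * omega_B)   # Ω_AB = Ω_A · Ω_B for independent parts

print(S_AB, "≈", S_A + S_B)
```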
For a classical ideal gas the entropy has the form S = N ln(eV/N) − N f′(T), where f is some function of the temperature and the prime means differentiation with respect to it. If I multiply all extensive variables by some constant λ, then it is true that the entropy will be increased λ times, but the entropy has this logarithmic form, so that it is not a linear function of N.
However, the entropy of black body radiation, S = (4/3) a V T^3 [which is eqn (60.13) in the same reference], where a is the radiation constant, is proportional to V T^3. The entropy has lost its logarithmic form. Moreover, the Renyi entropy, S_q = ln(Σ_i p_i^q)/(1 − q), for all q > 0 is also additive (and concave in the interval 0 < q ≤ 1). It becomes the Gibbs-Shannon entropy in the limit q → 1, so that the Gibbs-Shannon entropy is a limiting form of the Renyi entropy. Not to mention the non-additive entropies, to which the Havrda-Charvat entropy, S_q = (1 − Σ_i p_i^q)/(q − 1), belongs, which is also mentioned in the text under the name of the Tsallis entropy.
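A brief sketch comparing these functionals on an arbitrary distribution shows both the Renyi and Havrda-Charvat (Tsallis) forms approaching the Gibbs-Shannon value as q → 1:

```python
# Sketch: Renyi and Havrda-Charvat (Tsallis) entropies approach the
# Gibbs-Shannon entropy as q -> 1.  The distribution p is arbitrary.
from math import log

p = [0.5, 0.3, 0.2]

def shannon(p):
    return -sum(x * log(x) for x in p)

def renyi(p, q):
    return log(sum(x**q for x in p)) / (1.0 - q)

def tsallis(p, q):
    return (1.0 - sum(x**q for x in p)) / (q - 1.0)

print("Shannon:", shannon(p))
for q in (0.5, 0.9, 0.99, 1.01, 1.5):
    print(f"q={q}:  Renyi={renyi(p, q):.4f}  Tsallis={tsallis(p, q):.4f}")
```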
There is still the more fundamental reason why the Gibbs-Shannon entropy cannot be the most general form of the entropy: it corresponds to the Poisson distribution, which is a limiting distribution of the binomial in the limit where N → ∞ and p → 0 such that their product Np is constant. So the Gibbs-Shannon entropy is not the most general form of entropy from which all others can be derived. Bernhlav ( talk) —Preceding unsigned comment added by 95.247.255.15 ( talk) 10:27, 17 April 2011 (UTC)
This can be integrated to give S = N s_0 + N R ln[(U/N)^c (V/N)], where s_0 is a constant having the dimensions of an entropy density. The exponents in the logarithm must be such that the entropy is an extensive quantity. This is explained in H. Callen, Thermodynamics, 2nd ed., pp. 67-68. The N in the denominator is nothing more than the inverse probabilities in the Gibbs-Shannon entropy.
Bernhlav ( talk) 07:49, 18 April 2011 (UTC)
To Bernhlav: When varying N (quantity of substance), extensive quantities like E (energy) and V (volume) should not be held constant. Rather, it is the ratios e = E/N and v = V/N which are constant. Thus your formula becomes S = N [s_0 + R ln(e^c v)], which is linear in N, as required. JRSpriggs ( talk) 10:06, 18 April 2011 (UTC)
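A short sketch of this point, using an ideal-gas-like entropy of the assumed form S = N s_0 + N R ln[(U/N)^c (V/N)]: scaling U, V and N together scales S linearly, while scaling N alone does not:

```python
# Sketch: extensivity of an ideal-gas-like entropy S = N·s0 + N·R·ln[(U/N)^c · (V/N)].
# Constants and state values are illustrative, not taken from the cited sources.
from math import log

R, c, s0 = 8.314, 1.5, 10.0      # J/(mol·K); c = 3/2 for a monatomic gas; s0 arbitrary

def S(U, V, N):
    return N * s0 + N * R * log((U / N)**c * (V / N))

U, V, N = 3000.0, 0.05, 2.0
lam = 3.0

print("S(U,V,N)       =", S(U, V, N))
print("S(λU,λV,λN)/λ  =", S(lam*U, lam*V, lam*N) / lam)   # same value: extensive
print("S(U,V,λN)/λ    =", S(U, V, lam*N) / lam)           # different: not linear in N alone
```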
I think the most important one is the assumption that each microstate is equally probable (equal a priori probability). This is not as simple as it sounds, because for a thermodynamic process, in which things are changing, you will have to deal with the time element, i.e. how collisions produce and maintain this equal a priori probability. PAR ( talk) 17:07, 24 April 2011 (UTC)
I don't think that the traditional foundations of statistical mechanics are still being taken very seriously; see e.g. here. The reason why statistical mechanics works is still being vigorously debated. What I find interesting is the Eigenstate Thermalization Hypothesis (ETH), which basically boils down to assuming that a randomly chosen eigenstate of an isolated system with a large number of degrees of freedom will look like a thermal state. See here for an outline and here for a paper by Mark Srednicki on this topic. Count Iblis ( talk) 15:44, 25 April 2011 (UTC)
"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work entropy accumulates in the system, but has to be removed by dissipation in the form of waste heat"
The above first "Entropy" article(opening) paragraph is completely WRONG! Entropy is NOT “measure of the energy not available for useful work” since entropy is an equilibrium property and work is energy in transfer during a process. Actually, the higher system temperature the higher entropy (everything else the same) and more potential for work with reference to the same surroundings (so work is not property but process variable). The second sentence is confusing and ambiguous, thus not accurate. Maximum efficiency is NOT “when converting energy to work,” but if work is obtained in reversible processes (like ideal Carnot cycle). The last (third) sentence is confusing and plain WRONG: “During this(?) work entropy accumulates in the system” - the entropy does NOT accumulates (what system, where?) since entropy is not associated with work but with thermal energy per unit of absolute temperature. Actually, during ideal Carnot cycle (max work efficiency) the entropy is not generated, but conserved: entropy rate to the cycle is equal to the entropy rate out of the cycle. Not a good idea to start Entropy article with confusing and ambiguous, thus inaccurate statements. See entropy definition at: http://www.kostic.niu.edu/Fundamental_Nature_Laws_n_Definitions.htm#DEFINITION_of_ENTROPY —Preceding unsigned comment added by 24.14.178.97 ( talk) 04:25, 14 February 2011 (UTC)
I agree with this criticism, but a previous rewrite initiative to fix this failed, because most editors here want to edit from the classical thermodynamics perspective, and then you can't give a good definition of entropy. Perhaps we'll need to try again. Count Iblis ( talk) 21:39, 31 May 2011 (UTC)
The terms "in general" and "generally" are often ambiguous. For example, the section on The second law of thermodynamics begins:
The last part of this section indicates (without reference from the above quote) an apparent exception:
However, it is not made clear whether this is the only exception.
Also, any exceptions noted here should be made harmonious with the Second law of thermodynamics article.
The terms "in general" and "generally" appear elsewhere in the article, but this is arguably the one most in need of clarification.
Dorkenergy ( talk) 07:26, 4 July 2011 (UTC)
It is impossible for us to cause a decrease in total entropy. However, very small decreases occur spontaneously, but very rarely and very briefly. If the system is not already at equilibrium, this tendency would be overwhelmed by the increase in entropy due to dissipation of free energy. Any attempt to use the spontaneous decreases to do useful work is doomed to failure because more work would be expended to identify and capture them than could thereby be generated. JRSpriggs ( talk) 09:07, 14 July 2011 (UTC)
"Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to be entropically favored, or to proceed in a particular direction. It determines that thermal energy always flows spontaneously from regions of higher temperature to regions of lower temperature,"
The reality is that energy does not always remain thermal. In fact, a car internal combustion engine converts some amount of thermal energy into vehicle motion. In no sense can a temperature be defined for a vehicle as function of a vehicle speed, tire speed, crankshaft speed, or any other speed of any rotating and/or reciprocating car part. It doesn't take long to realize that energy that leaves the system or any energy that remains in the system in some form other than thermal energy will contribute to TdS. All TdS is somehow accounted for in this way. I italicized the latter to implicate the fact that not all TdS constitutes unusable "waste heat". The quantity TdS also encapsulates thermal energy that was lost due to expansion itself (which is in fact a mechanism capable of expansion work that requires no heat transfer whatsoever from any reservoir to any other reservoir for thermal energy to be lost; look it up at
adiabatic process). Recovering regeneratively any work done (kinetic energy of pistons, vehicles, etc.), such as by running electric motors in reverse (e.g. regenerative braking) can reduce the rate at which entropy increases. This non-thermal energy recovered can indeed spontaneously flow from a lower to a higher temperature, and this is because which direction non-thermal energy flows is not principally determined by temperature, but rather by inertia as well as external forces to the body, which are largely unrelated to temperature, such as the force that pulls a cannonball to the Earth. Entropy generally rises because recovery of kinetic energy that was previously in the form of thermal energy derived from waste heat only supplies a tiny fraction of our energy resources. However, the idea that energy in general cannot flow from cold to hot spontaneously is clearly flawed. A windmill milling grain is a clear example of this fact, wherein only non-thermal energy plays a role as far as work done is concerned. Ask yourself, "What temperature of wind and what temperature of grain is relevant to the mere fact that a windmill operates?" That limitation only exists for thermal energy. And there are some exceptions to the rule concerning thermal energy: (Example: Shine a warm light onto a much brighter light more so than the other way around by simply inserting a wavelength-specific filter in between, and though such an effect is of limited utility at this time, it does constitute an example where thermal energy can flow spontaneously from cold to hot. Again it's unlikely, but it's still spontaneous.)
siNkarma86—Expert Sectioneer of Wikipedia 86 = 19+9+14 + karma = 19+9+14 + talk 23:19, 22 August 2011 (UTC)
If entropy is considered an equilibrium property, as in energy physics, then it conflicts with the conservation of information. But the second law of thermodynamics may simply be an apparent effect of the conservation of information; that is, entropy is really the amount of information it takes to describe a system, each reaction creates new information, and information cannot be destroyed. That means the second law of thermodynamics is not an independent law of physics at all, but just an apparent effect of the fact that information can be created but not destroyed. The arrow of time is thus not about destruction, but about the continuous creation of information. This explains how the same laws of physics can cause self-organization. Organized systems are not any less chaotic than non-organized systems at all, and the spill heat life produces can, in an information-physical sense, be considered beings eliminated by evolution rather than a step towards equilibrium. It is possible that overload of information will cause the arrow of time to extract vacuum energy into usefulness rather than heat death. 217.28.207.226 ( talk) 10:49, 23 August 2011 (UTC)Martin J Sallberg
I actually meant that each state and relative position the particles or waves or strings or whatever have been in is information, so the total amount of information increases over time, since information can never be destroyed. You falsely confused it with "information entropy"; what I mean is that entropy is not an independent property at all, but just an apparent effect of the fact that information is created but never destroyed. 95.209.181.217 ( talk) 18:28, 23 August 2011 (UTC)Martin J Sallberg
psychological entropy is bullshit — Preceding unsigned comment added by 68.51.78.62 ( talk) 01:42, 23 November 2011 (UTC)
This is nonsense. The Wikipedia article on energy defines it as "the ability a physical system has to do work on other physical systems". Therefore by definition, something which is not available for work is not energy.
Gcsnelgar ( talk) 23:39, 22 December 2011 (UTC)
Entropy is necessarily a concave function of the extensive variables. Saying that entropy is proportional to the area is the same as saying that it is a convex function of the (Schwarzschild) radius, which is proportional to the mass. The putative second law leads to incorrect inequalities (c.f. arXiv:1110.5322). As was brought out in the black hole thermodynamics article, black hole 'evaporation' would lead to a violation of the supposed second law which says that the area of a black hole can only increase. In fact, black body radiation cannot be used as a mechanism for black hole evaporation since a heat source is necessary to keep the walls of the cavity at a given temperature which produces thermal radiation. Thermal radiation is an equilibrium process, as Einstein showed back in 1917, and does not lead to irreversible processes as those that would occur in processes related to evaporation.
According to black hole thermodynamics, the temperature would decrease with energy, and this does violate the second law (cf. http://www.youtube.com/watch?v=dpzbDfqcZSw).
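The standard Bekenstein-Hawking expressions, S = k_B c^3 A/(4ħG) and T = ħc^3/(8πGMk_B), can be evaluated directly; the sketch below uses solar-mass examples and only illustrates that the Hawking temperature falls as the mass (and hence the horizon area and entropy) grows:

```python
# Sketch: Bekenstein-Hawking entropy and Hawking temperature for a
# Schwarzschild black hole, showing that T decreases as M increases.
from math import pi

G     = 6.674e-11      # m^3 kg^-1 s^-2
c     = 2.998e8        # m/s
hbar  = 1.055e-34      # J·s
kB    = 1.381e-23      # J/K
M_sun = 1.989e30       # kg

def bh_properties(M):
    r_s = 2 * G * M / c**2                      # Schwarzschild radius
    A   = 4 * pi * r_s**2                       # horizon area
    S   = kB * c**3 * A / (4 * hbar * G)        # Bekenstein-Hawking entropy
    T   = hbar * c**3 / (8 * pi * G * M * kB)   # Hawking temperature
    return S, T

for M in (M_sun, 10 * M_sun):
    S, T = bh_properties(M)
    print(f"M = {M:.3e} kg   S = {S:.3e} J/K   T = {T:.3e} K")
```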
Since when does a personal opinion constitute a scientific publication? And, as noted, it is not even referenced. This section, together with reference 60, should be deleted from an otherwise respectable Wikipedia article. Bernhlav ( talk) 23:01, 28 December 2011 (UTC)bernhlav.
For nearly a century and a half, beginning with Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy, or negentropy, was asserted by physicist Erwin Schrödinger in his 1944 book What is Life?. He posed: “How does the living organism avoid decay?” The obvious answer is: “By eating, drinking, breathing and (in the case of plants) assimilating.” Recent writings have used the concept of Gibbs free energy to elaborate on this issue. [11] While energy from nutrients is necessary to sustain an organism’s order, there is also the Schrödinger prescience: an organism’s “astonishing gift of concentrating a stream of order on itself and thus escaping the decay into atomic chaos – of drinking orderliness from a suitable environment – seems to be connected with the presence of the aperiodic solids…” (What is Life?). We now know that the ‘aperiodic’ crystal is DNA and that the irregular arrangement is a form of information. “The DNA in the cell nucleus contains the master copy of the software, in duplicate.” This software seems to control by “specifying an algorithm, or set of instructions, for creating and maintaining the entire organism containing the cell.” [12] DNA and other macromolecules determine an organism’s life cycle: birth, growth, maturity, decline, and death. Nutrition is necessary but not sufficient to account for growth in size, as genetics is the governing factor. At some point, organisms normally decline and die even while remaining in environments that contain sufficient nutrients to sustain life. The controlling factor must be internal, and not nutrients or sunlight acting as causal exogenous variables. Organisms inherit the ability to create unique and complex biological structures; it is unlikely for those capabilities to be reinvented or be taught each generation. Therefore DNA must be operative as the prime cause in this characteristic as well. Applying Boltzmann’s perspective of the second law, the change of state from a more probable, less ordered and high-entropy arrangement to one of less probability, more order, and lower entropy, as seen in biological ordering, calls for a function like that known of DNA. DNA’s apparent information-processing function provides a resolution of the paradox posed by life and the entropy requirement of the second law. [13] LEBOLTZMANN2 ( talk) 15:51, 9 February 2012 (UTC)
Strenuous efforts are being made to correct the assumption that entropy equates to a degree of order (even in Schrodinger, as quoted above). Yet this article directly perpetuates that error (search 'order' in the main article). As WP is, for better or worse, likely to be the most-read source on entropy for non-physicists/chemists, can we please address this issue and assist the efforts to correct this popular misconception? Rather than being perpetuated in the article, I feel a paragraph directly addressing the misconception would be in order. I don't feel suitably qualified to make the correction myself, but a fundamental observation would be that the reagents in an exothermic chemical reaction are NOT (necessarily) more 'ordered' than the products + the heat energy emitted. Likewise, the heat-emitting crystallisation (increase in 'order') taking place in certain brands of hand-warmer is not likely to be a violation of the 2nd Law of Thermodynamics! Allangmiller ( talk) 09:10, 29 April 2012 (UTC)
I can't understand what entropy is from the article. I suggest that the article begin with a definition. What it begins with now I understand as a statement of what entropy can be used for, and not a definition.
Then there are multiple definitions further down but after reading them I still unfortunately have no idea what entropy is. — Preceding unsigned comment added by 77.58.143.68 ( talk) 14:08, 30 April 2012 (UTC)
An example in nature would be helpful to understanding this concept better: "Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such an event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system." Thanks 165.212.189.187 ( talk) 18:56, 23 July 2012 (UTC)
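One toy illustration of how improbable such a decrease is: the chance that N non-interacting gas molecules are all found, at some instant, in the left half of their container is (1/2)^N:

```python
# Sketch: probability that all N ideal-gas molecules spontaneously gather
# in one half of a container is (1/2)^N -- a toy estimate of how rare a
# macroscopically visible entropy decrease would be.
from math import log10

for N in (10, 100, 6.022e23):          # last value: one mole of molecules
    log10_p = -N * log10(2)
    print(f"N = {N:.3g}:  probability ≈ 10^({log10_p:.3g})")
```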
2.216.69.181 ( talk) 09:43, 21 July 2012 (UTC)
Evolution can be seen as reverse entropy. Patrickwooldridge ( talk) 23:45, 13 September 2012 (UTC) ... Yes, an interesting point and suggestion, although I would be inclined to describe the development of Life (and its evolution) as a type of "compensation" rather than a reversal. -- DLMcN ( talk) 05:39, 14 September 2012 (UTC)
This is a great discussion. I would encourage everyone reading or participating to rigorously distinguish between "Life" (whether individual organisms, species, or life in general) and "Evolution" (the physical force or principle which has led life from simpler forms to more complex, more ordered forms). Patrickwooldridge ( talk) 23:58, 23 September 2012 (UTC)
I don't want to disturb what appears to be a stable article of long standing, so I will make a suggestion for an addition to the lead section here first. The lead should be optimally accessible to a general reader. Entropy can be explained to a general reader, even in elementary school, with examples that require no technical or semi-technical language. I first heard about it with an example of a balloon full of smoke popping, whereby the gas spread out and got less well ordered. "Don't cry over spilled milk" and "you can't take the cream out of the coffee" both mean that there is a direction in time by which you can't put the milk back, because of entropy, or in more extreme perspectives, that the direction of time is entropy (lots of WP:RS on that, Reichenbach, Sklar, etc.). Putting WP:MOS Plain English examples like this in the lead would allow a general reader to quickly get the idea, but would change the technical tone of the existing stable article. I will add something like it unless there is an objection here, and put it up top because of MOS reasoning on the most accessible stuff being up top. 64.134.221.228 ( talk) 12:56, 16 September 2012 (UTC)
Are people opposed to editing the first sentence? It looks like it has been stable for a long time, but, as far as I can tell, it's not even grammatically correct. It doesn't really say anything. Any thoughts? Sirsparksalot ( talk) 14:38, 18 September 2012 (UTC)
- Entropy is the thermodynamic property toward equilibrium/average/homogenization/dissipation: hotter, more dynamic areas of a system lose heat/energy while cooler areas (e.g., space) get warmer / gain energy; molecules of a solvent or gas tend to evenly distribute; material objects wear out; organisms die; the universe is cooling down.
- Entropy is the thermodynamic property toward equilibrium. Entropy is the property that produces average, creates homogenization, and causes dissipation. Entropy arises from the second law of thermodynamics, which says that uneven temperatures will attempt to equalize. Although, according to the first law of thermodynamics, energy cannot be created or destroyed, the hotter, more dynamic areas of a system will lose heat or energy, causing cooler areas to get warmer or gain energy. Examples of entropy include thermal conduction, in which heat transmits from hotter areas to cooler areas, until thermal equilibrium is reached. It also includes diffusion, where molecules of solute in a solvent, or mixtures of gases or liquids, tend to evenly and irreversibly distribute. Material objects wear out, organisms die, and the universe is cooling down.
Entropy is the thermodynamic property toward equilibrium.
Entropy is the property that produces average, creates homogenization, and causes dissipation.
Entropy arises from the second law of thermodynamics,...
... which says that uneven temperatures will attempt to equalize.
Examples of entropy include...
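To ground the thermal-conduction example in the proposed wording above, here is a toy two-block model (arbitrary heat capacities, conductance and time step): the temperatures equalize while the combined entropy only increases:

```python
# Sketch: two finite blocks exchanging heat by conduction.  Temperatures
# equalize while the combined entropy rises.  All parameters (heat
# capacities, conductance, time step) are arbitrary illustrative values.
C1, C2 = 100.0, 100.0        # J/K, heat capacities of the two blocks
T1, T2 = 400.0, 300.0        # K, initial temperatures
k_cond = 0.5                 # W/K, thermal conductance between them
dt = 1.0                     # s, time step

S_generated = 0.0
for _ in range(2000):
    q = k_cond * (T1 - T2) * dt      # heat flowing from the hot block to the cold one
    S_generated += q / T2 - q / T1   # entropy change of the pair (>= 0 while T1 >= T2)
    T1 -= q / C1
    T2 += q / C2

print("Final temperatures:", round(T1, 2), "K and", round(T2, 2), "K")   # both ~350 K
print("Entropy generated:", round(S_generated, 3), "J/K (never negative)")
```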
The only correct definition is this: "The entropy of a system is the amount of information needed to specify the exact physical state it is in." You can then decide in what units to measure this information, which defines the prefactor k in S = k Log(Omega). But it should be clear that if a system can be in Omega number of states, then Log(Omega) is proportional to the number of bits of information needed to point out exactly which of the Omega states the system actually is in.
With this fundamental definition of entropy it is also easy to make contact with thermodynamics, and then you also have a rigorous definition of heat, work and temperature in terms of fundamental concepts. The second law follows naturally from all of this. Count Iblis ( talk) 02:23, 21 September 2012 (UTC)
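A minimal sketch of the definition described above: for Ω equally likely microstates the missing information is log2(Ω) bits, and choosing the prefactor k = k_B merely converts that count into J/K (the Ω value is made up):

```python
# Sketch: S = k·ln(Ω) read as "information needed to specify the microstate".
# For Ω equally likely microstates, that information is log2(Ω) bits;
# choosing k = k_B converts those bits into thermodynamic units (J/K).
from math import log, log2

kB = 1.380649e-23   # J/K

omega = 2**100            # a system with 2^100 equally likely microstates (made up)
bits  = log2(omega)       # information needed to single out one microstate
S     = kB * log(omega)   # the same quantity in thermodynamic units

print(bits, "bits  <->", S, "J/K")
print("1 bit of missing information =", kB * log(2), "J/K")
```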
I've been thinking about this since my last post here. I've read through lots of books and websites alike. I thought it would be easy enough to come up with one sentence which could sum up what entropy means, but, it turns out, there are as many different definitions as there are people giving them. I'm beginning to agree with Swinburne, in that perhaps we still lack the vocabulary, or even the fundamental understanding of the concept, to put it eloquently in words. (In fact, check this out if you want to see what this exact discussion looked like a hundred years ago.)
In all of this, however, there appears to be one definition that all branches of science can agree upon. It almost seems too obvious, yet it may be like the forest that cannot be seen through all of the trees. Perhaps it may be best to start by saying that entropy is a ratio (joules per kelvin) or a rate measurement, similar to miles per hour (velocity), cents per dollar (profit margin), or ounces per gallon (mixing rate). Maybe, if we start with a very simple statement like that, the rest of the lede and, hopefully, the article will make more sense to the newcomer. (Actually, I think the analogy to profit margin is a rather good one for negentropy, because it's easy for anyone to visualize, whereas entropy is more like the expense margin.) Zaereth ( talk) 01:28, 4 January 2013 (UTC)
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 5 | ← | Archive 9 | Archive 10 | Archive 11 | Archive 12 | Archive 13 | Archive 14 |
Today, after coming across another "dropping an egg on the floor" explanation of entropy, I began making a table (in progress) of the various oft-cited ‘entropy models’ used as teaching heuristics throughout history, listed chronologically:
Feel free to leave comment if you see any that I’ve left out or forgotten. -- Libb Thims ( talk) 17:12, 18 July 2010 (UTC)
The date of coining and the description of the coining of the term entropy are both incorrect (citing the Online Etymology Dictionary over that of the original source):
I would suggest that someone fix this misinformation, which has been in the article for at least a year or two now. -- Libb Thims ( talk) 17:53, 18 July 2010 (UTC)
This article should give a simple, clear definition of entropy in the opening paragraph so that the concept is easily understood by those unfamiliar with it. Rather than doing this however the article abruptly jumps into specialist terminology and calculus equations, thereby having the unfortunate effect of alienating audiences unfamiliar with advanced physics. Frankly, the opening paragraph needs a complete rewrite, as much for the sake of clarity and coherence as to make the information accessible to a wider audience 64.222.110.155 ( talk) 19:25, 22 February 2010 (UTC)
I don't know what is the best way to introduce entropy in this article, but I do know the following:
My question is - how long do we blur the distinction between the thermodynamic definition of entropy and statistical explanation of entropy in the name of simplicity? How long do we pretend that there is a number called "Entropy" that we can assign to a system that has some absolute meaning like temperature or pressure? I agree, this distinction need not be made immediately, but we do a disservice to carry on as if statistical mechanics and thermodynamics are two names for the same theory, and that total entropy is some rigorous physical quantity that, if we only knew enough, we could write down an absolute value for. PAR ( talk) 16:29, 2 March 2010 (UTC)
If thermodynamic systems are described using thermal energy instead of temperature, then entropy is just a number by which the thermal energy in the system is multiplied. The resulting energy is an energy for which no information is available which would be required to convert the energy in technical systems from one form (e.g. electrical) into another form (e.g. mechanical).
In technical applications, machines are basically energy conversion devices. Thus, such devices only can be driven by convertible energy. The same applies to biological organisms. The product of thermal energy (or the equivalents of thermal energy) and entropy is "already converted energy".
The quotation at the beginning is taken out of context. It is a statement that the discoverer of the covalent bond found hard to understand. If Lewis found that statement difficult to understand, why is it the opening passage of page without at least a warning that even experts find that hard to understand? That whole introductory section contains no useful overview information and should be removed. —Preceding unsigned comment added by 24.16.68.149 ( talk) 21:32, 2 June 2010 (UTC)
I think it is time for another rewrite. To adress the old comment by PAR above that you can define entropy without invoking information theoretical concepts, microstates etc., I would say that this is not really true, because you then have to appeal to vague concepts like "disorder". Also, what is "temperature", what is "heat"? these cpncepts that are not rigorously defined in classical thermodynamics that are assumed to be given in an ad-hoc way in practical situations (like heat engines).
I started a rewrite here. I rewrote the stuff till the grand-canonical ensemble. But note that the explanation in the first paragraphs is a bit lacking, I need to invoke ergodicity, the fact that this is not proven etc. etc.. As things stand there, the explanations are a bit misleading and a bit too simplistic. Count Iblis ( talk) 15:06, 27 August 2010 (UTC)
Clausius coined the term entropy in 1865. And it is misinformed in this article and so I am correcting it. Nobleness of Mind ( talk) 13:41, 19 July 2010 (UTC)
To me entropy means that matter and energy (like the salt and pepper analogy) can never be completely separated. Is this a postulation upheld by the law of entropy or am I missing the point? —Preceding unsigned comment added by 165.212.189.187 ( talk) 14:29, 17 August 2010 (UTC) By completely I mean all the energy in the universe and all the matter in the universe. —Preceding unsigned comment added by 165.212.189.187 ( talk) 14:32, 17 August 2010 (UTC)
I don't understand what is the meaning of this section. It should be incorporated into other parts of the article or entirely deleted.-- Netheril96 ( talk) 03:40, 5 October 2010 (UTC)
It's become a cliche to say that entropy is a measure of disorder, but that's not generally true. Imagine a large cloud of gas and dust collapsing by gravity into a planet. Imagine a crystal forming by molecular attraction. The fundamental change is that energy availble to do work is being lost. You could theoretically harness infalling gas and dust to do work, but once it's formed into a tightly packed planet, that opportunity is gone. The idea of disorder increasing is associated with the theory of ideal gas, it's not the general rule. DonPMitchell ( talk) 16:41, 5 October 2010 (UTC)
This isn't a "general article on entropy". That might look something like the article on Scholarpedia. What we have here is an article on entropy in thermodynamics. The treatment of entropy in information theory is cursory and minimal, the bare minimum relevant for context as an interpretation of thermodynamic entropy -- and that is as it should be, because we have had long discussions about this in the past.
If people want to know about Shannon entropy, they should be directed to Shannon entropy, where we have a real article on the subject. It does nobody any favours to present this article as an article on Shannon entropy, nor on entropy generally -- it just isn't what this article does.
The lede should present what this article is about, and where its focus is. Shannon entropy is not what this article is about; it certainly isn't where its focus is. It's hard enough to present a coherent summary of thermodynamic entropy in four paragraphs (it's defeated us for seven years, and what is there at the moment is a long way even from the best we've done). Putting in an uncontextualised and unmotivated line about Shannon entropy frankly doesn't help. Jheald ( talk) 20:09, 26 October 2010 (UTC)
I've closed the very old Wikipedia:WikiProject Mathematics/A-class rating/Entropy as no consensus due to age and lack of input in the discussion.-- Salix ( talk): 17:54, 6 November 2010 (UTC)
This article could do with a section discussing entropy changes in chemical reactions -- in particular, the relation between ΔS and ΔG (entropy of the system and entropy of the surroundings); the effect of temperature on the importance of the ΔS term, and therefore whether the reaction can go forward or not; and, especially, what makes a chemical reaction "entropically favourable" -- i.e. what sort of reactions have a positive ΔS.
This is material worth treating here in its own right; but I think the discussion of what makes ΔS positive would also, by introducing a concrete example, make the discussion of the microscopic picture of entropy much more real (i.e. why it is that releasing a mole of gas tends to make a reaction entropically favourable); and, also, usefully give another macroscopic point of contact with entropy, other than just heat engines.
A good section on "entropy changes in chemical reactions" would I think therefore add very valuably to the article.
I think it would be helpful in the lead too. Having introduced entropy via a heat engine view (relating to energy not available to do useful work), then the microscopic view, it seems to me that following those two with a third paragraph, introducing the idea of entropy change in chemical reaction, would round out the lead well, and make clearer the significance and meaning of entropy as an idea. Jheald ( talk) 12:57, 10 December 2010 (UTC)
The Wikipedia article on "Entropy" states in the first line:
"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines."
This statement is interesting because, as we know, Entropy is not measured in units of Energy, but rather, as a Change of Energy divided by Temperature.
I also observed the Fundamental thermodynamic relation:
I realized that the equation somehow implied for the same change in internal energy, ∆U, that the amount of p∆V was only limited by the value of T∆S. The mainstream idea is that increasing entropy somehow increases the amount of energy not available to the system. However, if this were true, then temperature must decrease faster than entropy increases, for by doing so, if S increased, then the sum of ∆U and p∆V would not have to increase. If ∆U is the change of internal energy, then it is easy to see that P∆V is the change in external energy.
Changing a pressure in a constant volume is like changing a force without changing the displacement—by itself, there is no work associated with such forces. Of course you cannot just change the pressure in a system of constant volume without changing the forces in that volume. Fundamentally, the change in pressure in a "constant" volume is really the result of the changes in the proximal distance between particles in that system. The hotter particles in the volume are, then the greater the variation there is in the distances between those particles, and because of the inverse square law and the fact that the root mean square of a set of unequal real values is always greater than the average, the higher the average forces between the particles, even if the average proximal distance between particles does not change. Therefore, in a sense, V∆p at one scale is actually p∆V at a smaller scale. V∆p is otherwise, implicitly, a part of internal energy, ∆U.
Thus, it is obvious that T∆S = ∆U + p∆V is the change of total energy.
If S is conserved between systems, then it is easy to deduce a direct, linear relationship between temperature and energy of a given system, which is exactly what one expects from the kinetic theory of heat:
Where ∆S is the entropy change of one system at the expense of another (i.e. an amount of entropy "in transit").
Also notice that if T decreases with time, for a given entropy "in transit" (∆S), the total energy "in transit" (∆U + p∆V) literally decreases, which corresponds directly with requirement that such energy becomes unavailable to do work. Technically, this would make T∆S equivalent to exergy rather than the energy. So there is no need to assume that the total entropy of a universe must increase to explain this. All that is required is that entropy is transferred from one domain to another domain of a lower temperature.
The first line of the article on fundamental thermodynamic relation states: "In thermodynamics, the fundamental thermodynamic relation expresses an infinitesimal change in internal energy in terms of infinitesimal changes in entropy, and volume for a closed system."
This makes it clear that the interpretation in mainstream science is that this "fundamental thermodynamic relation" is for closed systems, implying that somehow that all the entropy change is manifested directly by the system. They have no concern for the possibility that entropy could flow into what they have instead considered as a "closed system" (which reads as "a system that is thermally isolated from the environment, in the sense of not being able to receive energy from the outside"). They've come to think that energy transfer is necessary for a transfer of entropy because they treat energy as the fundamental substance of matter. So to them, the entropy in these said "closed" systems arises solely due to the interactions of the energy already present, and of that, only the part which they know how to plug into the fundamental thermodynamic relation. Thus, they do not include entropy from external sources, nor even from subatomic particles themselves, for as of yet, the entropy of systems within atoms is not apparent to them. Additionally, they have not accepted the idea that entropy can flow into a system when that system is energetically isolated from the environment. Thus, it is no wonder that they think entropy can be created from nothing. Kmarinas86 (Expert Sectioneer of Wikipedia) 19+9+14 + karma = 19+9+14 + talk = 86 15:04, 27 January 2011 (UTC)
The article contains a number of inaccurate statements. Let me name a few:
"Historically, the concept of entropy evolved in order to explain why some processes are spontaneous and others are not..." Entropy, as defined by Clausius, is a conserved function in the Carnot cycle along with the internal energy. For efficiencies other than the Carnot efficiency the equality in the first formula in the section "Classical Thermodynamics" becomes an inequality.
"Thermodynamics is a non-conserved state function..." contradicts the first formula in the section "Classical Thermodynamics".
"For almost all practical purposes, this [Gibbs's formula for the entropy] can be taken as the fundamental definition of entropy since all other formulas for S can be derived from it, but not vice versa." This is definitely incorrect; just consider the entropy of degenerate gases which is proportional to the number of particles. Bernhlav ( talk) 08:20, 2 April 2011 (UTC)
Hello,
Thank you. I do not know how to make the changes because that will remove references that are cited. Let me give you an example of the first of the above statements.
The concept of entropy arose from Clausius's study of the Carnot cycle [1]. In a Carnot cycle, heat Q_1 is absorbed isothermally from a 'hot' reservoir at the higher temperature T_1, and heat Q_2 is given up isothermally to a 'cold' reservoir at the lower temperature T_2. Now according to Carnot's principle, work can only be done when there is a drop in the temperature, and the work should be some function of the difference in temperature and of the heat absorbed. Carnot did not distinguish between Q_1 and Q_2, since he was working under the hypothesis that caloric theory was valid and hence heat was conserved [2]. Through the efforts of Clausius and Kelvin, we know that the maximum work that can be done is the product of the Carnot efficiency and the heat absorbed at the hot reservoir: W = (1 - T_2/T_1) Q_1. In order to derive the Carnot efficiency, 1 - T_2/T_1, Kelvin had to evaluate the ratio of the work done to the heat absorbed in the isothermal expansion with the help of the Carnot-Clapeyron equation, which contained an unknown function known as the Carnot function. The fact that the Carnot function could be the temperature, measured from zero, was suggested by Joule in a letter to Kelvin, and this allowed Kelvin to establish his absolute temperature scale [3]. We also know that the work is the difference between the heat absorbed at the hot reservoir and that rejected at the cold one: W = Q_1 - Q_2. Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy, and this became the first law of thermodynamics [4].
Now equating the two expressions gives Q_1/T_1 - Q_2/T_2 = 0. If we allow Q_2 to incorporate the algebraic sign, this becomes a sum, Q_1/T_1 + Q_2/T_2 = 0, and implies that there is a function of state which is conserved over a complete cycle. Clausius called this state function entropy. This is the second law of thermodynamics.
Then Clausius asked what would happen if less work were done than that predicted by Carnot's principle. The right-hand side of the first equation would then be the upper bound of the work, and the equality is converted into an inequality: W < (1 - T_2/T_1) Q_1. When the second equation is used to express the work as a difference in heats, we get Q_1 - Q_2 < (1 - T_2/T_1) Q_1, or Q_2 > (T_2/T_1) Q_1. So more heat is given off to the cold reservoir than in the Carnot cycle. If we denote the entropies by S_i = Q_i/T_i for the two states, then the above inequality can be written as a decrease in the entropy, S_1 - S_2 < 0. The wasted heat implies that irreversible processes must have prevented the cycle from carrying out maximum work.
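A quick numerical check of the relations above may help; this is only a sketch, with the reservoir temperatures and heat input chosen arbitrarily:

# Sketch: entropy is conserved in the reversible Carnot cycle, and more heat is
# dumped (Q_1/T_1 < Q_2/T_2) when less than the maximum work is extracted.
T1, T2 = 500.0, 300.0   # hot and cold reservoir temperatures in kelvin (arbitrary)
Q1 = 1000.0             # heat absorbed at the hot reservoir in joules (arbitrary)

W_max = (1 - T2 / T1) * Q1    # Carnot's principle: maximum work
Q2_rev = Q1 - W_max           # heat rejected in the reversible cycle
print(Q1 / T1, Q2_rev / T2)   # 2.0 2.0 -- entropy in equals entropy out

W = 0.5 * W_max               # an irreversible cycle doing less work
Q2 = Q1 - W                   # more heat is rejected to the cold reservoir
print(Q1 / T1 < Q2 / T2)      # True -- the Clausius inequality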
Of all the thermodynamic functions, entropy has the unique role of having one foot in the macroscopic world and the other in the microscopic world of atoms. According to Boltzmann, the entropy is a logarithmic measure of the number of micro-complexions that correspond to, or are indistinguishable from, a given macroscopic state. Since the latter is a very large number, instead of being a proper fraction, Planck referred to it as a 'thermodynamic' probability [5]. Boltzmann was always of the opinion that large numbers gave better assurance than mere fractions.
As an illustration of how Boltzmann determined his entropy, let us consider the distribution of N particles in k urns [6]. The a priori probability of a particle being in the ith urn is p_i. The occupation numbers N_i are not all independent but must obey the condition N_1 + N_2 + ... + N_k = N, and since probability must be conserved, p_1 + p_2 + ... + p_k = 1. Then the probability that the first urn contains N_1 particles, the second urn contains N_2, and so on, is P = [N!/(N_1! N_2! ... N_k!)] p_1^(N_1) p_2^(N_2) ... p_k^(N_k), which is known as the multinomial distribution [7].
Now Boltzmann did not know what the a priori probabilities were, so he supposed that they were all equal, p_i = 1/k, for all the urns. According to Boltzmann we should not discriminate among the urns. [But when he attaches different energies to the different urns, he precisely does discriminate among them.] The probability P thus becomes a 'thermodynamic' probability whose logarithm, ln P, he set proportional to the entropy S. The constant of proportionality, k, the universal gas constant divided by Avogadro's number for one mole of gas, was later determined by Planck in his study of black-body radiation, at the same time that he derived his own constant [8]. Thus, Boltzmann's assumption of equal a priori probabilities [9] converts the multinomial distribution into a thermodynamic probability which, instead of being an actual probability, is a very big number.
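The counting argument can be illustrated numerically; the following is only a sketch, with the urn occupancies chosen arbitrarily, showing that (1/N) ln of the multinomial coefficient approaches the Gibbs-Shannon form as N grows (Stirling's approximation):

import math

# Sketch: Boltzmann's 'thermodynamic probability' W = N!/(N_1!...N_k!) for equal
# a priori probabilities; (1/N) ln W approaches -sum (N_i/N) ln(N_i/N).
N = 10_000
occupancies = [5000, 3000, 2000]   # N_i particles in k = 3 urns (arbitrary)

ln_W = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in occupancies)
gibbs = -sum((n / N) * math.log(n / N) for n in occupancies)

print(ln_W / N)   # roughly 1.028
print(gibbs)      # roughly 1.030, the large-N (Stirling) limit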
Entropy is a measure of the error that we commit when we observe a value different from its true value, which we know to be the average value. When we specify which average we are considering, the entropy will be determined from Gauss's error law, which equates the average value with the most probable value. This implies that the entropy must be the same function of the observations that it is of the average value. And if we take the number of particles as the observed quantity, we obtain the probability distribution f(n) = (m^n / n!) e^(-m), where m is the average value, which is the well-known Poisson distribution under the condition that Stirling's approximation for the factorial is valid.
If we consider the independent probabilities of the occupation of the k different urns, we must consider the product of the probabilities, where we must now attach an index to the average value to show that it is the average value for the ith urn. The product then gives the multinomial distribution, where the a priori probabilities are now seen to be the ratios of the average occupation numbers to the total number of particles. Thus we see that S = -Σ_i p_i ln p_i is the Gibbs entropy, which ensures that the distribution is a true probability, less than one, based on its strict concavity. The strict concavity of the entropy is its defining property, and corresponds to the thermodynamic criteria of stability. Consequently, the selection of which average is the true value, together with Gauss's error law, determines the form of the entropy. The entropy is thus said to be the potential for the law of error, and this completes Boltzmann's principle [10].
I think this is clearer than what is found in the text. The same holds true for the other points mentioned.
Bernhlav ( talk) 16:15, 3 April 2011 (UTC)
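On the strict concavity mentioned above, a minimal numerical sketch (distributions chosen arbitrarily) showing that the Gibbs-Shannon entropy of a mixture exceeds the mixture of the entropies:

import math

def gibbs_entropy(p):
    # Gibbs-Shannon entropy -sum p_i ln p_i (terms with p_i = 0 contribute nothing)
    return -sum(x * math.log(x) for x in p if x > 0)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]
lam = 0.4
mixed = [lam * a + (1 - lam) * b for a, b in zip(p, q)]

print(gibbs_entropy(mixed))                                   # roughly 1.08
print(lam * gibbs_entropy(p) + (1 - lam) * gibbs_entropy(q))  # roughly 0.86, smaller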
I don't understand "the complete state of the gas" as the problem concerns the combinatorics, i.e. how to distribute a certain number of particles among given states. Bernhlav ( talk) 16:47, 6 April 2011 (UTC)
If a system can be divided into two independent parts A and B, then S(A+B) = S(A) + S(B).
Thus the entropy is additive. The number of particles in the system is also additive. If the substance is uniform, consisting of the same type of particles at the same temperature, then any portion of it can be thought of as a sum of identical pieces, each containing the same number of particles and the same entropy. Thus the portions' entropy and particle number should be proportional to each other. So what is the problem? JRSpriggs ( talk) 00:42, 15 April 2011 (UTC)
where φ is some function of the temperature, and the prime means differentiation with respect to it. If I multiply all extensive variables by some constant λ, then it is true that the entropy will be increased λ times, but the entropy has a logarithmic form, so that it is not a linear function of N.
However, the entropy of black body radiation, S = (4/3) a V T^3 [which is eqn (60.13) in the same reference], where a is the radiation constant, is proportional to the volume and to the cube of the temperature. The entropy has lost its logarithmic form. Moreover, the Renyi entropy, S_α = [1/(1-α)] ln Σ_i p_i^α, for all α > 0 (α ≠ 1), is also additive (and concave in the interval 0 < α ≤ 1). It becomes the Gibbs-Shannon entropy in the limit α → 1, so that the Gibbs-Shannon entropy is a limiting form of the Renyi entropy. Not to mention the non-additive entropies, to which the Havrda-Charvat entropy, S_α = [1/(1-α)] (Σ_i p_i^α - 1), belongs, which is also mentioned in the text under the name of the Tsallis entropy.
There is still a more fundamental reason why the Gibbs-Shannon entropy cannot be the most general form of the entropy: it corresponds to the Poisson distribution, which is a limiting distribution of the binomial in the limit where the number of trials tends to infinity and the probability tends to zero such that their product is constant. So the Gibbs-Shannon entropy is not the most general form of entropy from which all others can be derived. Bernhlav ( talk) —Preceding unsigned comment added by 95.247.255.15 ( talk) 10:27, 17 April 2011 (UTC)
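For readers who want to see the limits mentioned above in numbers, here is a minimal sketch (the probability distribution is arbitrary):

import math

def shannon(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    # Renyi entropy: (1/(1 - alpha)) ln sum_i p_i^alpha, for alpha != 1
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def tsallis(p, alpha):
    # Havrda-Charvat / Tsallis entropy: (1/(alpha - 1)) (1 - sum_i p_i^alpha)
    return (1 - sum(x ** alpha for x in p)) / (alpha - 1)

p = [0.5, 0.3, 0.2]
print(shannon(p))                      # roughly 1.0297
print(renyi(p, 1.0001))                # approaches the Shannon value as alpha -> 1
print(renyi(p, 0.5), tsallis(p, 0.5))  # the two generalizations differ away from alpha = 1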
can be integrated to give an entropy of the form S = N s_0 + N R ln[(U/N)^c (V/N)], up to a constant, where s_0 is a constant having the dimensions of an entropy density. The exponents in the logarithm must be such that the entropy is an extensive quantity. This is explained in H. Callen, Thermodynamics, 2nd ed., pp. 67-68. The N in the denominators is nothing more than the inverse probabilities in the Gibbs-Shannon entropy, S = Σ_i p_i ln(1/p_i).
Bernhlav ( talk) 07:49, 18 April 2011 (UTC)
To Bernhlav: When varying N (quantity of substance), extensive quantities like E (energy) and V (volume) should not be held constant. Rather, it is the ratios e = E/N and v = V/N which are constant. Thus your formula becomes S = N [s_0 + R ln(e^c v)], which is proportional to N, as required. JRSpriggs ( talk) 10:06, 18 April 2011 (UTC)
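The point about holding e = E/N and v = V/N fixed can be checked numerically; this is only a sketch, using an ideal-gas-type entropy of the form discussed above with arbitrary constants:

import math

def S(N, E, V, c=1.5, s0=1.0):
    # Entropy of the form S = N*s0 + N*R*ln((E/N)^c (V/N)), with R set to 1
    return N * (s0 + math.log((E / N) ** c * (V / N)))

N, E, V = 1.0, 2.0, 3.0
lam = 5.0
print(S(lam * N, lam * E, lam * V), lam * S(N, E, V))  # equal: S is extensive when e and v are fixed
print(S(lam * N, E, V), lam * S(N, E, V))              # unequal: E and V wrongly held constant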
I think the most important one is the assumption that each microstate is equally probable (equal a priori probability). This is not as simple as it sounds, because for a thermodynamic process, in which things are changing, you will have to deal with the time element, i.e. how collisions produce and maintain this equal a priori probability. PAR ( talk) 17:07, 24 April 2011 (UTC)
I don't think that the traditional foundations of statistical mechanics are still being taken very seriously, see e.g. here. The reason why statistical mechanics works is still being vigorously debated. What I find interesting is the Eigenstate Thermalization Hypothesis (ETH), which basically boils down to assuming that a randomly chosen eigenstate of an isolated system with a large number of degrees of freedom will look like a thermal state. See here for an outline and here for a paper by Mark Srednicki on this topic. Count Iblis ( talk) 15:44, 25 April 2011 (UTC)
"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work entropy accumulates in the system, but has to be removed by dissipation in the form of waste heat"
The above first "Entropy" article(opening) paragraph is completely WRONG! Entropy is NOT “measure of the energy not available for useful work” since entropy is an equilibrium property and work is energy in transfer during a process. Actually, the higher system temperature the higher entropy (everything else the same) and more potential for work with reference to the same surroundings (so work is not property but process variable). The second sentence is confusing and ambiguous, thus not accurate. Maximum efficiency is NOT “when converting energy to work,” but if work is obtained in reversible processes (like ideal Carnot cycle). The last (third) sentence is confusing and plain WRONG: “During this(?) work entropy accumulates in the system” - the entropy does NOT accumulates (what system, where?) since entropy is not associated with work but with thermal energy per unit of absolute temperature. Actually, during ideal Carnot cycle (max work efficiency) the entropy is not generated, but conserved: entropy rate to the cycle is equal to the entropy rate out of the cycle. Not a good idea to start Entropy article with confusing and ambiguous, thus inaccurate statements. See entropy definition at: http://www.kostic.niu.edu/Fundamental_Nature_Laws_n_Definitions.htm#DEFINITION_of_ENTROPY —Preceding unsigned comment added by 24.14.178.97 ( talk) 04:25, 14 February 2011 (UTC)
I agree with this criticism, but a previous rewrite initiative to fix this failed, because most editors here want to edit from the classical thermodynamics perspective, and then you can't give a good definition of entropy. Perhaps we'll need to try again. Count Iblis ( talk) 21:39, 31 May 2011 (UTC)
The terms "in general" and "generally" are often ambiguous. For example, the section on The second law of thermodynamics begins:
The last part of this section indicates (without reference from the above quote) an apparent exception:
However, it is not made clear whether this is the only exception.
Also, any exceptions noted here should be made harmonious with the Second law of thermodynamics article.
The terms "in general" and "generally" appear elsewhere in the article, but this is arguably the one most in need of clarification.
Dorkenergy ( talk) 07:26, 4 July 2011 (UTC)
It is impossible for us to cause a decrease in total entropy. However, very small decreases occur spontaneously, but very rarely and very briefly. If the system is not already at equilibrium, this tendency would be overwhelmed by the increase in entropy due to dissipation of free energy. Any attempt to use the spontaneous decreases to do useful work is doomed to failure because more work would be expended to identify and capture them than could thereby be generated. JRSpriggs ( talk) 09:07, 14 July 2011 (UTC)
"Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to be entropically favored, or to proceed in a particular direction. It determines that thermal energy always flows spontaneously from regions of higher temperature to regions of lower temperature,"
The reality is that energy does not always remain thermal. In fact, a car's internal combustion engine converts some amount of thermal energy into vehicle motion. In no sense can a temperature be defined for a vehicle as a function of vehicle speed, tire speed, crankshaft speed, or any other speed of any rotating and/or reciprocating car part. It doesn't take long to realize that energy that leaves the system, or any energy that remains in the system in some form other than thermal energy, will contribute to TdS. All TdS is somehow accounted for in this way. I italicized the latter to emphasize the fact that not all TdS constitutes unusable "waste heat". The quantity TdS also encapsulates thermal energy that was lost due to expansion itself (which is in fact a mechanism capable of expansion work that requires no heat transfer whatsoever from any reservoir to any other reservoir for thermal energy to be lost; look it up at
adiabatic process). Recovering regeneratively any work done (kinetic energy of pistons, vehicles, etc.), such as by running electric motors in reverse (e.g. regenerative braking), can reduce the rate at which entropy increases. This recovered non-thermal energy can indeed flow spontaneously from a lower to a higher temperature, because the direction in which non-thermal energy flows is not principally determined by temperature, but rather by inertia as well as external forces on the body, which are largely unrelated to temperature, such as the force that pulls a cannonball to the Earth. Entropy generally rises because recovery of kinetic energy that was previously in the form of thermal energy derived from waste heat supplies only a tiny fraction of our energy resources. However, the idea that energy in general cannot flow from cold to hot spontaneously is clearly flawed. A windmill milling grain is a clear example of this fact, wherein only non-thermal energy plays a role as far as the work done is concerned. Ask yourself, "What temperature of wind and what temperature of grain is relevant to the mere fact that a windmill operates?" That limitation only exists for thermal energy. And there are some exceptions to the rule even for thermal energy. (Example: shine a warm light onto a much brighter light, rather than the other way around, by simply inserting a wavelength-specific filter in between; though such an effect is of limited utility at this time, it does constitute an example where thermal energy can flow spontaneously from cold to hot. Again, it is unlikely, but it is still spontaneous.)
siNkarma86—Expert Sectioneer of Wikipedia 86 = 19+9+14 + karma = 19+9+14 + talk 23:19, 22 August 2011 (UTC)
If entropy is considered an equilibrium property, as in energy physics, then it conflicts with the conservation of information. But the second law of thermodynamics may simply be an apparent effect of the conservation of information; that is, entropy is really the amount of information it takes to describe a system, and each reaction creates new information, but information cannot be destroyed. That means the second law of thermodynamics is not an independent law of physics at all, but just an apparent effect of the fact that information can be created but not destroyed. The arrow of time is thus not about destruction, but about the continuous creation of information. This explains how the same laws of physics can cause self-organization. Organized systems are not in any way less chaotic than non-organized systems, and the waste heat life produces can, in an information-physical sense, be considered beings eliminated by evolution rather than a step towards equilibrium. It is possible that an overload of information will cause the arrow of time to extract vacuum energy into usefulness rather than heat death. 217.28.207.226 ( talk) 10:49, 23 August 2011 (UTC)Martin J Sallberg
I actually meant that each state and relative position the particles or waves or strings or whatever have been in is information, so the total amount of information increases over time, since information can never be destroyed. You falsely conflated it with "information entropy"; what I mean is that entropy is not an independent property at all, but just an apparent effect of the fact that information is created but never destroyed. 95.209.181.217 ( talk) 18:28, 23 August 2011 (UTC)Martin J Sallberg
psychological entropy is bullshit — Preceding unsigned comment added by 68.51.78.62 ( talk) 01:42, 23 November 2011 (UTC)
This is nonsense. The Wikipedia article on energy defines it as "the ability a physical system has to do work on other physical systems". Therefore by definition, something which is not available for work is not energy.
Gcsnelgar ( talk) 23:39, 22 December 2011 (UTC)
Entropy is necessarily a concave function of the extensive variables. Saying that the entropy is proportional to the area is the same as saying that it is a convex function of the (Schwarzschild) radius, which is proportional to the mass. The putative second law leads to incorrect inequalities (cf. arXiv:1110.5322). As was brought out in the black hole thermodynamics article, black hole 'evaporation' would lead to a violation of the supposed second law, which says that the area of a black hole can only increase. In fact, black body radiation cannot be used as a mechanism for black hole evaporation, since a heat source is necessary to keep the walls of the cavity at the given temperature which produces the thermal radiation. Thermal radiation is an equilibrium process, as Einstein showed back in 1917, and does not lead to irreversible processes such as those that would occur in processes related to evaporation.
According to black hole thermodynamics, the temperature would decrease with energy, and this does violate the second law (cf. http://www.youtube.com/watch?v=dpzbDfqcZSw).
Since when does a personal opinion, which, as noted, is not even referenced, constitute a scientific publication? This section, together with reference 60, should be deleted from an otherwise respectable Wikipedia article. Bernhlav ( talk) 23:01, 28 December 2011 (UTC)bernhlav.
For nearly a century and a half, beginning with Clausius's 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy, or negentropy, was asserted by physicist Erwin Schrödinger in his 1944 book What is Life?. He posed: “How does the living organism avoid decay?” The obvious answer is: “By eating, drinking, breathing and (in the case of plants) assimilating.” Recent writings have used the concept of Gibbs free energy to elaborate on this issue. [11] While energy from nutrients is necessary to sustain an organism's order, there is also Schrödinger's prescience: “An organism's astonishing gift of concentrating a stream of order on itself and thus escaping the decay into atomic chaos – of drinking orderliness from a suitable environment – seems to be connected with the presence of the aperiodic solids…” (What is Life?). We now know that the 'aperiodic' crystal is DNA and that the irregular arrangement is a form of information. “The DNA in the cell nucleus contains the master copy of the software, in duplicate. This software seems to control by specifying an algorithm, or set of instructions, for creating and maintaining the entire organism containing the cell.” [12] DNA and other macromolecules determine an organism's life cycle: birth, growth, maturity, decline, and death. Nutrition is necessary but not sufficient to account for growth in size, as genetics is the governing factor. At some point, organisms normally decline and die even while remaining in environments that contain sufficient nutrients to sustain life. The controlling factor must be internal, and not nutrients or sunlight acting as causal exogenous variables. Organisms inherit the ability to create unique and complex biological structures; it is unlikely for those capabilities to be reinvented or to be taught to each generation. Therefore DNA must be operative as the prime cause in this characteristic as well. Applying Boltzmann's perspective of the second law, the change of state from a more probable, less ordered, high-entropy arrangement to one of less probability, more order, and lower entropy, as seen in biological ordering, calls for a function like that known of DNA. DNA's apparent information-processing function provides a resolution of the paradox posed by life and the entropy requirement of the second law. [13] LEBOLTZMANN2 ( talk) 15:51, 9 February 2012 (UTC)
Strenuous efforts are being made to correct the assumption that entropy equates to a degree of order (even in Schrödinger, as quoted above). Yet this article directly perpetuates that error (search 'order' in the main article). As WP is, for better or worse, likely to be the most-read source on entropy for non-physicists/chemists, can we please address this issue and assist the efforts to correct this popular misconception? Rather than perpetuating the misconception in the article, I feel a paragraph directly addressing it would be in order. I don't feel suitably qualified to make the correction myself, but a fundamental observation would be that the reagents in an exothermic chemical reaction are NOT (necessarily) more 'ordered' than the products plus the heat energy emitted. Likewise, the heat-emitting crystallisation (an increase in 'order') taking place in certain brands of hand-warmer is not likely to be a violation of the 2nd Law of Thermodynamics! Allangmiller ( talk) 09:10, 29 April 2012 (UTC)
I can't understand what entropy is from the article. I suggest that the article begin with a definition. What it begins with now I understand as a statement of what entropy can be used for, and not a definition.
Then there are multiple definitions further down but after reading them I still unfortunately have no idea what entropy is. — Preceding unsigned comment added by 77.58.143.68 ( talk) 14:08, 30 April 2012 (UTC)
An example in nature would be helpful for understanding this concept better: "Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such an event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system." Thanks 165.212.189.187 ( talk) 18:56, 23 July 2012 (UTC)
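As a concrete illustration of just how small that probability is, here is a sketch using the standard textbook example (all N independent molecules found in one half of a container at once):

# Probability that all N independent gas molecules happen to be in the left
# half of a container at the same instant is (1/2)^N.
for N in (10, 100, 1000):
    print(N, 0.5 ** N)
# 10 -> about 1e-3, 100 -> about 8e-31, 1000 -> about 9e-302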
Evolution can be seen as reverse entropy. Patrickwooldridge ( talk) 23:45, 13 September 2012 (UTC) ... Yes, an interesting point and suggestion, although I would be inclined to describe the development of Life (and its evolution) as a type of "compensation" rather than a reversal. -- DLMcN ( talk) 05:39, 14 September 2012 (UTC)
This is a great discussion. I would encourage everyone reading or participating to rigorously distinguish between "Life" (whether individual organisms, species, or life in general) and "Evolution" (the physical force or principle which has led life from simpler forms to more complex, more ordered forms). Patrickwooldridge ( talk) 23:58, 23 September 2012 (UTC)
I don't want to disturb what appears to be a stable article of long standing, so I will make a suggestion for an addition to the lead section here first. The lead should be optimally accessible to a general reader. Entropy can be explained to a general reader, even in elementary school, with examples that require no technical or semi-technical language. I first heard about it with an example of a balloon full of smoke popping, whereby the gas spread out and became less well ordered. "Don't cry over spilled milk" and "you can't take the cream out of the coffee" both mean that there is a direction in time by which you can't put the milk back, because of entropy, or in more extreme perspectives, that the direction of time is entropy (lots of WP:RS on that, Reichenbach, Sklar, etc.). Putting WP:MOS Plain English examples like this in the lead would allow a general reader to quickly get the idea, but would change the technical tone of the existing stable article. I will add something like it unless there is an objection here, and put it up top because of the MOS reasoning that the most accessible material should be up top. 64.134.221.228 ( talk) 12:56, 16 September 2012 (UTC)
Are people opposed to editing the first sentence? It looks like it has been stable for a long time, but, as far as I can tell, it's not even grammatically correct. It doesn't really say anything. Any thoughts? Sirsparksalot ( talk) 14:38, 18 September 2012 (UTC)
- Entropy is the thermodynamic property toward equilibrium/average/homogenization/dissipation: hotter, more dynamic areas of a system lose heat/energy while cooler areas (e.g., space) get warmer / gain energy; molecules of a solvent or gas tend to evenly distribute; material objects wear out; organisms die; the universe is cooling down.
- Entropy is the thermodynamic property toward equilibrium. Entropy is the property that produces average, creates homogenization, and causes dissipation. Entropy arises from the second law of thermodynamics, which says that uneven temperatures will attempt to equalize. Although, according to the first law of thermodynamics, energy cannot be created or destroyed, the hotter, more dynamic areas of a system will lose heat or energy, causing cooler areas to get warmer or gain energy. Examples of entropy include thermal conduction, in which heat transmits from hotter areas to cooler areas, until thermal equilibrium is reached. It also includes diffusion, where molecules of solute in a solvent, or mixtures of gases or liquids, tend to evenly and irreversibly distribute. Material objects wear out, organisms die, and the universe is cooling down.
Entropy is the thermodynamic property toward equilibrium.
Entropy is the property that produces average, creates homogenization, and causes dissipation.
Entropy arises from the second law of thermodynamics,...
... which says that uneven temperatures will attempt to equalize.
Examples of entropy include...
The only correct definition is this: "The entropy of a system is the amount of information needed to specify the exact physical state it is in." You can then decide in what units to measure this information, which defines the prefactor k in S = k Log(Omega). But it should be clear that if a system can be in Omega number of states, then Log(Omega) is proportional to the number of bits of information needed to point out exactly which of the Omega states the system actually is in.
With this fundamental definition of entropy it is also easy to make contact with thermodynamics, and then you also have a rigorous definition of heat, work and temperature in terms of fundamental concepts. The second law follows naturally from all of this. Count Iblis ( talk) 02:23, 21 September 2012 (UTC)
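The correspondence between the two unit choices can be made explicit in a few lines; this is a sketch only, with the microstate count chosen arbitrarily:

import math

# Information needed to single out one of Omega equally likely microstates,
# expressed in bits and as a thermodynamic entropy S = k_B ln(Omega) in J/K.
k_B = 1.380649e-23        # Boltzmann constant, J/K
Omega = 2 ** 100          # a system with 2^100 accessible microstates (arbitrary)

bits = math.log2(Omega)   # 100 bits specify the exact microstate
S = k_B * math.log(Omega) # about 9.6e-22 J/K

print(bits, S)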
I've been thinking about this since my last post here. I've read through lots of books and websites alike. I thought it would be easy enough to come up with one sentence which could sum up what entropy means, but, it turns out, there are as many different definitions as there are people giving them. I'm beginning to agree with Swinburne, in that perhaps we still lack the vocabulary, or even the fundamental understanding of the concept, to put it eloquently into words. (In fact, check this out if you want to see what this exact discussion looked like a hundred years ago.)
In all of this, however, there appears to be one definition that all branches of science can agree upon. It almost seems too obvious, yet it may be like the forest that cannot be seen for all of the trees. Perhaps it may be best to start by saying that entropy is a ratio (joules per kelvin) or a rate measurement, similar to miles per hour (velocity), cents per dollar (profit margin), or ounces per gallon (mixing rate). Maybe, if we start with a very simple statement like that, the rest of the lede and, hopefully, the article will make more sense to the newcomer. (Actually, I think the analogy to profit margin is a rather good one for negentropy, because it's easy for anyone to visualize, whereas entropy is more like the expense margin.) Zaereth ( talk) 01:28, 4 January 2013 (UTC)