Entropy does not increase indefinitely. A body of matter and radiation eventually will reach an unchanging state, with no detectable flows, and is then said to be in a state of [[thermodynamic equilibrium]]. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, then their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water. Such processes are irreversible: A glass of cool water will not [[Spontaneous process|spontaneously]] turn into a glass of warm water with an ice cube in it. Some processes in nature are almost reversible. For example, the orbiting of the planets around the Sun may be thought of as practically reversible: A movie of the planets orbiting the Sun which is run in reverse would not appear to be impossible.
While the second law, and thermodynamics in general, accurately predicts the intimate interactions of complex physical systems, scientists are not content with simply knowing how a system behaves; they also want to know ''why'' it behaves the way it does. The question of why entropy increases until equilibrium is reached was answered in 1877 by physicist [[Ludwig Boltzmann]]. The theory developed by Boltzmann and others, known as [[statistical mechanics]], explains thermodynamics in terms of the statistical behavior of the atoms and molecules that make up the system.
== Explanation ==
=== Thermodynamic entropy ===
The concept of [[Entropy (classical thermodynamics)|thermodynamic entropy]] arises from the [[second law of thermodynamics]]. This law of entropy increase quantifies the reduction in the capacity of an isolated compound thermodynamic system to do [[Work (thermodynamics)|thermodynamic work]] on its surroundings, or indicates whether a thermodynamic process may occur. For example, whenever there is a suitable pathway, [[heat]] spontaneously flows from a hotter body to a colder one.
Thermodynamic entropy is measured as a change in entropy (<math>\Delta S</math>) of a system containing a sub-system which undergoes heat transfer to its surroundings (inside the system of interest). It is based on the [[macroscopic]] relationship between [[heat flow]] into the sub-system and the [[temperature]] at which it occurs, summed over the boundary of that sub-system.
Following the [[Clausius theorem|formalism of Clausius]], the basic calculation can be mathematically stated as:<ref>I. Klotz, R. Rosenberg, ''Chemical Thermodynamics – Basic Concepts and Methods'', 7th ed., Wiley (2008), p. 125</ref>

: <math>\Delta S = \frac{q_\mathrm{rev}}{T},</math>

where <math>q_\mathrm{rev}</math> is the heat transferred reversibly to the sub-system and <math>T</math> is the absolute temperature at which the transfer takes place.
It turns out that if you flip a large number of coins, the macrostates at or near half heads and half tails account for almost all of the microstates. In other words, for a million coins, you can be fairly sure that about half will be heads and half tails. The macrostates around a 50–50 ratio of heads to tails will be the "equilibrium" macrostate. A real physical system in equilibrium has a huge number of possible microstates and almost all of them correspond to the equilibrium macrostate, which is the macrostate you will almost certainly see if you wait long enough. In the coin example, if you start out with a very unlikely macrostate (such as all heads, which has zero entropy) and begin flipping one coin at a time, the entropy of the macrostate will start increasing, just as thermodynamic entropy does, and after a while the coins will most likely be at or near the 50–50 macrostate, which has the greatest information entropy – the equilibrium entropy.
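Which macrostate dominates can be checked by direct counting. In the following short sketch, the choice of 1,000 coins is an arbitrary illustrative value (smaller than the million mentioned above, so that the count runs quickly), and the binomial coefficient <math>\binom{N}{k}</math> gives the number of microstates with exactly ''k'' heads:

<syntaxhighlight lang="python">
from math import comb

N = 1000        # number of coins; an illustrative choice, smaller than the million above
total = 2 ** N  # total number of equally likely microstates

# Count the microstates whose number of heads lies within 5% of an even split.
near_half = sum(comb(N, k) for k in range(int(0.45 * N), int(0.55 * N) + 1))

print(f"Fraction of microstates within 5% of half heads: {near_half / total:.4f}")
# Prints about 0.999: almost every microstate belongs to a macrostate close to
# half heads and half tails, which is why that is the "equilibrium" macrostate.
</syntaxhighlight>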
The macrostate of a system is what we know about the system from the outside, for example the temperature, pressure, and volume of a gas in a box; the microstate is a particular detailed arrangement – for a gas, the position and velocity of every molecule – that is consistent with that macrostate.
The concept of information entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. When it is applied to the problem of a large number of interacting particles, along with some other constraints, like the conservation of energy, and the assumption that all microstates are equally likely, the resultant theory of statistical mechanics is extremely successful in explaining the [[laws of thermodynamics]].
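The link between the two descriptions can be made explicit: if a macrostate is compatible with <math>W</math> microstates and each microstate is assumed equally likely, every microstate has probability <math>p_i = 1/W</math>, and the information entropy reduces to

: <math>H = -\sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = \ln W,</math>

which, multiplied by the Boltzmann constant, is the thermodynamic entropy of that macrostate (Boltzmann's equation, discussed below).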
== Example of increasing entropy ==
{{Main article|Disgregation}}
Ice melting provides an example of entropy increasing in a small 'universe': a thermodynamic system consisting of the surroundings (the warm room) and the glass container of ice and water, which has been allowed to reach [[thermodynamic equilibrium]] at the melting temperature of ice. In this system, some heat (δ''Q'') from the warmer surroundings at {{val|298|u=K}} (about room temperature) transfers to the cooler system of ice and water at its constant temperature ''T'' of {{val|273.15|u=K}}, the melting temperature of ice. The entropy of the ice and water increases by {{sfrac|δ''Q''|273.15 K}}.
The entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of {{val|298|u=K}} is larger than {{val|273.15|u=K}}, and therefore the ratio (entropy change) of {{sfrac|δ''Q''|298 K}} for the surroundings is smaller than the ratio (entropy change) of {{sfrac|δ''Q''|273.15 K}} for the ice and water system. The final net entropy after the ice has melted is therefore greater than the initial entropy, as the second law requires for a spontaneous process.
As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of {{sfrac|δ''Q''|''T''}} over the continuous range of temperatures, taken in many small increments as the initially cool water warms, can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
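For illustration, the bookkeeping for this example can be done numerically. The following sketch assumes 100 g of ice, a latent heat of fusion of about 334 J/g and a specific heat of liquid water of about 4.18 J/(g·K) – standard textbook values chosen for the illustration rather than figures given above – and treats the room as a large reservoir that stays at 298 K:

<syntaxhighlight lang="python">
import math

# Illustrative values (standard textbook figures, not specified in the text above)
m = 100.0        # grams of ice
L_f = 334.0      # latent heat of fusion of ice, J per gram
c_w = 4.18       # specific heat of liquid water, J per (gram*kelvin)
T_melt = 273.15  # K, melting temperature of ice
T_room = 298.0   # K, temperature of the warm room

# Melting: heat drawn from the room enters the ice/water at a constant 273.15 K
Q_melt = m * L_f
dS_ice = Q_melt / T_melt
dS_room = -Q_melt / T_room

# Warming the meltwater to room temperature: dS = C * ln(T_final / T_initial)
Q_warm = m * c_w * (T_room - T_melt)
dS_water = m * c_w * math.log(T_room / T_melt)
dS_room2 = -Q_warm / T_room  # the room is treated as a reservoir fixed at 298 K

dS_total = dS_ice + dS_room + dS_water + dS_room2
print(f"Melting: {dS_ice:+.1f} J/K (ice/water), {dS_room:+.1f} J/K (room)")
print(f"Warming: {dS_water:+.1f} J/K (water), {dS_room2:+.1f} J/K (room)")
print(f"Net entropy change of the miniature 'universe': {dS_total:+.1f} J/K")
</syntaxhighlight>

The entropy gained by the ice and water exceeds the entropy lost by the room at every stage, so the net change is positive, as the second law requires.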
* <math>\Delta S</math> is the change in entropy of a system (some physical substance of interest) after some motional energy ("heat") has been transferred to it by fast-moving molecules. So, <math>\Delta S = S_\mathrm{final} - S _\mathrm{initial}</math>.
* Then, <math> \Delta S = S_\mathrm{final} - S _\mathrm{initial} = \frac{q_\mathrm{rev}}{T}</math>, the quotient of the motional energy ("heat") q that is transferred "reversibly" (rev) to the system from the surroundings (or from another system in contact with the first system) divided by T, the absolute temperature at which the transfer occurs.
** "Reversible" or "reversibly" (rev) simply means that T, the temperature of the system, has to stay (almost) exactly the same while any energy is being transferred to or from it. That is easy in the case of phase changes, where the system absolutely must stay in the solid or liquid form until enough energy is given to it to break bonds between the molecules before it can change to a liquid or a gas. For example, in the melting of ice at {{val|273.15
** When the temperature is not at the melting or boiling point of a substance, no intermolecular bond-breaking is possible, and so any motional molecular energy ("heat") from the surroundings transferred to a system raises its temperature, making its molecules move faster and faster. As the temperature is constantly rising, there is no longer a particular value of "T" at which energy is transferred. However, a "reversible" energy transfer can be measured at a very small temperature increase, and a cumulative total can be found by adding each of many small temperature intervals or increments. For example, to find the entropy change <math>\frac{q_\mathrm{rev}}{T}</math> from {{val|300|u=K}} to {{val|310|u=K}}, measure the small amount of energy transferred over each of many small temperature increments, divide each amount by the temperature at which it is transferred, and sum the results.
** Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the entropy change is the energy transferred "per incremental change in temperature" (the heat capacity, <math>C_p</math>) multiplied by the [[integral]] of <math>\frac{dT}{T}</math> from <math>T_\mathrm{initial}</math> to <math>T_\mathrm{final}</math>, giving <math>\Delta S = C_p \ln\frac{T_\mathrm{final}}{T_\mathrm{initial}}</math>; a numerical check of this shortcut is given below.
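The shortcut can be checked numerically: summing {{sfrac|δ''q''|''T''}} over many small temperature increments reproduces <math>C_p \ln\frac{T_\mathrm{final}}{T_\mathrm{initial}}</math>. The heat capacity used below (75.3 J/K, roughly that of one mole of liquid water) is an illustrative assumption:

<syntaxhighlight lang="python">
import math

C_p = 75.3               # J/K, assumed constant over the range (illustrative value)
T_i, T_f = 300.0, 310.0  # heating from 300 K to 310 K, as in the example above

# Sum delta_q / T over many small temperature increments (midpoint of each step)
steps = 100_000
dT = (T_f - T_i) / steps
dS_sum = sum(C_p * dT / (T_i + (k + 0.5) * dT) for k in range(steps))

# Closed form obtained from the integral of dT/T
dS_exact = C_p * math.log(T_f / T_i)

print(f"Sum over increments: {dS_sum:.6f} J/K")
print(f"C_p * ln(T_f/T_i):   {dS_exact:.6f} J/K")
# The two results agree closely, which is what the calculus shortcut expresses.
</syntaxhighlight>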
* [[Entropy (order and disorder)|'''As a measure of disorder''']]: Traditionally, 20th century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]] so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the [[Entropy (information theory)|Shannon entropy]] of the probability distribution of microstates given a particular macrostate,<ref name="Callen1985">{{cite book|title=Thermodynamics and an Introduction to Thermostatistics|last=Callen|first=Herbert B.|date=1985|publisher=John Wiley & Sons|isbn=0-471-86256-8|edition=2nd|___location=New York|author-link=Herbert Callen}}</ref>{{rp|379}} in which case the [[Entropy in thermodynamics and information theory|connection of "disorder" to thermodynamic entropy]] is straightforward, but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful.
: If, instead of using the natural logarithm to define information entropy, we use the base-2 logarithm, then the information entropy is roughly equal to the average number of (carefully chosen<ref>For example, suppose a system has four possible states with probabilities {{sfrac|1|2}}, {{sfrac|1|4}}, {{sfrac|1|8}} and {{sfrac|1|8}}. Asking first whether the system is in the most probable state, then whether it is in the next most probable state, and so on, requires one question half of the time, two questions a quarter of the time and three questions the remaining quarter of the time, for an average of {{sfrac|1|2}}(1) + {{sfrac|1|4}}(2) + {{sfrac|1|8}}(3) + {{sfrac|1|8}}(3) = {{sfrac|7|4}} questions. The Shannon entropy is
: <math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> Sh
which is in agreement with the step-by-step procedure. In most cases, it is not clear how to continually divide the remaining options in half with each question, so the concept is strictly applicable only in special cases and becomes more accurate as the number of possible outcomes increases. Nevertheless, the Shannon expression for ''Q'' is valid even in these cases.</ref>) yes/no questions that would have to be asked to get complete information about the system under study. In the introductory example of two flipped coins, the macrostate which contains one head and one tail requires only one question to determine its exact state (e.g. "is the first one heads?"), and instead of expressing the entropy as ln(2) one could say, equivalently, that it is log<sub>2</sub>(2), which equals the number of binary questions we would need to ask: one. When measuring entropy using the natural logarithm (ln), the unit of information entropy is called a "nat", but when it is measured using the base-2 logarithm, the unit of information entropy is called a "shannon" (alternatively, "bit"). This is just a difference in units, much like the difference between inches and centimeters (1 nat = log<sub>2</sub>''e'' shannons). Thermodynamic entropy is equal to the Boltzmann constant times the information entropy expressed in nats. The information entropy expressed with the unit [[shannon (unit)|shannon]] (Sh) is equal to the number of yes–no questions that need to be answered in order to determine the microstate from the macrostate.
: The concepts of "disorder" and "spreading" can be analyzed with this information entropy concept in mind. For example, a new deck of cards taken out of the box is arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), and we may say that we then have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 shannons: we will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered" or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards, in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy would again be about 225.6 shannons, even if by some miracle it reshuffled to the same order as when it came out of the box, because even if it did, we would not know that. So the concept of "disorder" is useful if, by order, we mean maximal knowledge and by disorder we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feeling for what happens to the cards when they are shuffled. The probability of a card being in a particular place in an ordered deck is either 0 or 1; in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
: The connection between thermodynamic entropy and information entropy is given by Boltzmann's equation, which says that {{nowrap|1=''S'' = ''k''<sub>B</sub> ln ''W''}}. If we take the base-2 logarithm of ''W'', it will yield the average number of yes–no questions we must ask about the microstate of the physical system in order to determine its exact microstate – that is, the information entropy of the macrostate expressed in shannons.
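The figures quoted above can be reproduced with a few lines of arithmetic. The following sketch recomputes the one-question entropy of the two-coin macrostate, the roughly 225.6-shannon entropy of a shuffled deck, and the (extremely small) thermodynamic entropy that would correspond to it through the Boltzmann constant:

<syntaxhighlight lang="python">
import math

# Two flipped coins: the macrostate "one head, one tail" has W = 2 microstates (HT, TH)
W_coins = 2
print(f"Two-coin macrostate: {math.log2(W_coins):.0f} Sh = {math.log(W_coins):.3f} nat")

# A shuffled deck of 52 cards: W = 52! equally likely orderings
W_deck = math.factorial(52)
H_sh = math.log2(W_deck)   # information entropy in shannons (about 225.6)
H_nat = math.log(W_deck)   # the same entropy in nats (1 nat = log2(e) Sh)
print(f"Shuffled deck: {H_sh:.1f} Sh = {H_nat:.1f} nat")

# Boltzmann's relation S = k_B ln W converts the entropy in nats to J/K
k_B = 1.380649e-23         # Boltzmann constant, J/K
print(f"Equivalent thermodynamic entropy: {k_B * H_nat:.3e} J/K")
</syntaxhighlight>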
== See also ==