Introduction to entropy

=== Thermodynamic entropy ===
 
*'''A measure of energy unavailable for work''': This is an often-repeated phrase which, although it is true, requires considerable clarification in order to be understood. It is true only for cyclic reversible processes, and is in this sense misleading. By "work" is meant moving an object, for example lifting a weight, bringing a flywheel up to speed, or carrying a load up a hill. To convert heat into work, using a coal-burning steam engine for example, one must have two systems at different temperatures, and the amount of work that can be extracted depends on how large the temperature difference is and how large the systems are. If one of the systems is at room temperature and the other system is much larger and near absolute zero temperature, then almost all of the energy of the room-temperature system can be converted to work. If they are both at the same room temperature, then none of the energy of the room-temperature system can be converted to work. Entropy is then a measure of how much energy cannot be converted to work, given these conditions (see the worked example at the end of this subsection). More precisely, for an isolated system comprising two closed systems at different temperatures, in the process of reaching equilibrium the amount of entropy lost by the hot system, multiplied by the temperature of the hot system, is the amount of energy that cannot be converted to work.
 
*'''An indicator of irreversibility''': Fitting closely with the 'unavailability of energy' interpretation is the 'irreversibility' interpretation. Spontaneous thermodynamic processes are irreversible, in the sense that they do not spontaneously undo themselves. Thermodynamic processes artificially imposed by agents in the surroundings of a body also have irreversible effects on the body. For example, when [[James Prescott Joule]] used a device that delivered a measured amount of mechanical work from the surroundings through a paddle that stirred a body of water, the energy transferred was received by the water as heat. There was scarcely any expansion of the water to do thermodynamic work back on the surroundings. The body of water showed no sign of returning the energy by stirring the paddle in reverse. The work transfer appeared as heat, and was not recoverable without a suitably cold reservoir in the surroundings. Entropy gives a precise account of such irreversibility.
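:As a rough illustration of the "unavailable energy" bullet above, consider a sketch that assumes an ideal, reversible engine operating between the two systems. If an amount of heat ''Q'' is drawn from the hotter system at temperature ''T''<sub>H</sub> while the colder system remains at ''T''<sub>C</sub>, the maximum work obtainable and the portion of ''Q'' that remains unavailable are

:<math>W_\text{max} = Q\left(1 - \frac{T_C}{T_H}\right), \qquad Q_\text{unavailable} = Q\,\frac{T_C}{T_H} = T_C\,\Delta S_H ,</math>

:where <math>\Delta S_H = Q/T_H</math> is the entropy given up by the hot system. If ''T''<sub>C</sub> is near absolute zero, almost all of ''Q'' can be converted to work; if ''T''<sub>C</sub> = ''T''<sub>H</sub>, none of it can.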
* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful.
 
:If, instead of the natural logarithm, we use the base-2 logarithm to define information entropy, then the information entropy is roughly equal to the average number of (carefully chosen<ref name="questions">The minimum number of questions is achieved when each question either gives an answer with certainty or cuts the remaining uncertainty in half. For example, if we had a probability function <math>P_i = (1/8,1/2,1/8,1/4)</math> associated with a variable <math>x=(x_1,x_2,x_3,x_4)</math>, then the optimum mode of questioning would be to first ask "is ''x'' equal to ''x''<sub>2</sub>?" If the answer is "yes", then ''x'' is certainly equal to ''x''<sub>2</sub> after asking only one question, and the probability of this happening is ''P''<sub>2</sub> = 1/2. If the answer is "no", then the next question would be "is ''x'' equal to ''x''<sub>4</sub>?" If the answer is "yes", then ''x'' is certainly equal to ''x''<sub>4</sub> after asking two questions, and the probability of this happening is ''P''<sub>4</sub> = 1/4. If the answer is "no", we may finally ask "is ''x'' equal to ''x''<sub>1</sub>?" If the answer is "yes", then ''x'' is certainly equal to ''x''<sub>1</sub>, and if not, then ''x'' is certainly equal to ''x''<sub>3</sub>; the probability of requiring three questions is ''P''<sub>1</sub> + ''P''<sub>3</sub> = 1/4. The average number of questions asked is then ''Q'' = (1/2)(1) + (1/4)(2) + (1/4)(3) = 7/4. Calculating the Shannon information entropy:
 
:<math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> bits
 
which is in agreement with the step-by-step procedure. In most cases it is not clear how to continually divide the remaining options in half with each question, so the concept is strictly applicable only in special cases, and it becomes more accurate as the number of possible outcomes increases. Nevertheless, the Shannon expression for ''Q'' is valid even in these cases.</ref>) yes–no questions that would have to be asked to get complete information about the system under study. In the introductory example of two flipped coins, for the macrostate which contains one head and one tail, one would need only one question to determine its exact state (e.g. "is the first coin heads?"), and instead of expressing the entropy as ln(2) one could say, equivalently, that it is log<sub>2</sub>(2), which equals the number of questions that would need to be asked: one. When entropy is measured using the natural logarithm (ln), the unit of information entropy is called a "nat", but when it is measured using the base-2 logarithm, the unit is called a "bit". This is just a difference in units, much like the difference between inches and centimeters (1 nat = log<sub>2</sub>''e'' ≈ 1.44 bits). Thermodynamic entropy is equal to Boltzmann's constant times the information entropy expressed in nats. The information entropy expressed in bits is equal to the number of yes–no questions that need to be answered in order to determine the microstate from the macrostate.
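:The numbers above can be checked directly; a minimal Python sketch, using the probabilities given above, reproduces both the 7/4-bit value from the footnoted example and the one-bit entropy of the two-coin macrostate:

<syntaxhighlight lang="python">
import math

def entropy_bits(probs):
    """Shannon information entropy in bits: -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Footnoted example: P = (1/8, 1/2, 1/8, 1/4) gives 7/4 bits,
# matching the average number of well-chosen yes-no questions.
print(entropy_bits([1/8, 1/2, 1/8, 1/4]))  # 1.75

# Two-coin macrostate "one head, one tail": two equally likely
# microstates (HT and TH), so one yes-no question suffices.
print(entropy_bits([1/2, 1/2]))            # 1.0 bit

# Unit conversion: ln(2) nats expressed in bits is ln(2) * log2(e) = 1.0.
print(math.log(2) * math.log2(math.e))     # 1.0
</syntaxhighlight>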
 
:The concepts of "disorder" and "spreading" can be analyzed with this information-entropy concept in mind. For example, if we take a new deck of cards out of the box, arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), we may say that we have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 bits: we will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered", or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards, in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy would again be about 225.6 bits, even if by some miracle it reshuffled into the same order as when it came out of the box, because even if it did, we would not know that. So the concept of "disorder" is useful if, by order, we mean maximal knowledge and, by disorder, we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feeling for what happens to the cards when they are shuffled. The probability of a card being in a particular place in an ordered deck is either 0 or 1; in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
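:Explicitly, the figure of about 225.6 bits is the base-2 logarithm of the number of equally likely orderings of a 52-card deck, 52! ≈ 8.07 × 10<sup>67</sup>:

:<math>\log_2(52!) = \sum_{n=1}^{52} \log_2 n \approx 225.58 \text{ bits}.</math>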