=== Thermodynamic entropy ===
*'''A measure of energy unavailable for work''': This is an often-repeated phrase which, although true, requires considerable clarification.
*'''An indicator of irreversibility''': Fitting closely with the 'unavailability of energy' interpretation is the 'irreversibility' interpretation. Spontaneous thermodynamic processes are irreversible in the sense that they do not spontaneously undo themselves. Thermodynamic processes artificially imposed by agents in the surroundings of a body also have irreversible effects on the body. For example, when [[James Prescott Joule]] used a device that delivered a measured amount of mechanical work from the surroundings through a paddle that stirred a body of water, the energy transferred was received by the water as heat. The water expanded only negligibly, doing almost no thermodynamic work back on the surroundings, and it showed no sign of returning the energy by stirring the paddle in reverse. The work transfer appeared as heat and was not recoverable without a suitably cold reservoir in the surroundings. Entropy gives a precise account of such irreversibility.
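:This irreversibility can be made quantitative with the standard Clausius relation (a textbook result, stated here for illustration rather than drawn from the passage above): when a body at absolute temperature ''T'' receives a small quantity of energy <math>\delta Q</math> as heat, its entropy rises by <math>\delta Q/T</math>. In Joule's experiment the surroundings deliver work rather than heat and so lose no entropy, while the stirred water warms from ''T''<sub>1</sub> to ''T''<sub>2</sub> and gains
:<math>\Delta S_{\text{water}} = \int_{T_1}^{T_2} \frac{C\,dT}{T} = C \ln\frac{T_2}{T_1} > 0,</math>
:taking the heat capacity ''C'' as roughly constant over the temperature range. The total entropy of water plus surroundings therefore strictly increases, and this net increase is precisely what cannot be undone without a suitably cold reservoir to absorb the energy as heat.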
=== Information entropy ===
* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful.
:If, instead of the natural logarithm, we use the base-2 logarithm to define information entropy, then the information entropy is roughly equal to the average number of (carefully chosen<ref name="questions">The minimum number of questions is achieved when each question either gives an answer with certainty or cuts the remaining uncertainty in half. For example, if we had a probability function <math>P_i = (1/8,1/2,1/8,1/4)</math> associated with a variable <math>x=(x_1,x_2,x_3,x_4)</math>, then the optimal mode of questioning would be first to ask "Is ''x'' equal to ''x<sub>2</sub>''?" If the answer is "yes", then ''x'' is certainly equal to ''x<sub>2</sub>'' after asking only one question, and the probability of this happening is ''P<sub>2</sub>'' = 1/2. If the answer is "no", then the next question would be "Is ''x'' equal to ''x<sub>4</sub>''?" If the answer is "yes", then ''x'' is certainly equal to ''x<sub>4</sub>'' after asking two questions, and the probability of this happening is ''P<sub>4</sub>'' = 1/4. If the answer is "no", we may finally ask "Is ''x'' equal to ''x<sub>1</sub>''?" If the answer is "yes", then ''x'' is certainly equal to ''x<sub>1</sub>''; if not, then ''x'' is certainly equal to ''x<sub>3</sub>''; the probability of requiring three questions is ''P<sub>1</sub>'' + ''P<sub>3</sub>'' = 1/4. The average number of questions asked is then
:<math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> bits
which is in agreement with the step-by-step procedure. In most cases it is not clear how to continually divide the remaining options in half with each question, so this question-counting picture is strictly applicable only in special cases, and it becomes a better approximation as the number of possible outcomes increases. Nevertheless, the Shannon expression for ''Q'' remains valid even in these cases.</ref>) yes–no questions that must be asked to obtain complete information about the system.
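:The footnote's arithmetic is easy to verify. The following minimal sketch (Python is used here purely for illustration; nothing about it is specific to this article) evaluates the Shannon formula for the footnote's example distribution:
<syntaxhighlight lang="python">
import math

# Example distribution from the footnote: P = (1/8, 1/2, 1/8, 1/4)
P = [1/8, 1/2, 1/8, 1/4]

# Shannon information entropy in bits (base-2 logarithm)
Q = -sum(p * math.log2(p) for p in P)

print(Q)  # 1.75, i.e. 7/4 bits, matching the step-by-step question count
</syntaxhighlight>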
:The concepts of "disorder" and "spreading" can be analyzed with this information-entropy concept in mind. For example, a new deck of cards taken out of the box is arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), and we may say that we have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 bits: we will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered", or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards in order, then the information entropy becomes zero, because we now know the arrangement exactly. If we again shuffle the deck, the information entropy would again be about 225.6 bits, even if by some miracle the deck reshuffled into the same order as when it came out of the box, because we would have no way of knowing that it had. So the concept of "disorder" is useful if, by order, we mean maximal knowledge, and by disorder we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feel for what happens to the cards when they are shuffled: the probability of a card being in a particular place in an ordered deck is either 0 or 1, while in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
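:The figure of about 225.6 bits can be checked directly (a short aside, assuming only that a thorough shuffle makes every ordering equally likely): a deck of 52 distinct cards has 52! possible arrangements, so the base-2 information entropy of a uniformly shuffled deck is
:<math>Q = \log_2(52!) = \sum_{k=1}^{52} \log_2 k \approx 225.58 \text{ bits}.</math>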