* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful.
:If, instead of using the natural logarithm to define information entropy, we instead use the base 2 logarithm, then the information entropy is roughly equal to the average number of (carefully chosen<ref name="questions">The minimum number of questions is achieved when each question either gives an answer with certainty, or cuts the remaining uncertainty in half. For example, if we had a probability function <math>P_i = (1/8,1/2,1/8,1/4)</math> associated with a variable <math>x=(x_1,x_2,x_3,x_4)</math>, then the optimum mode of questioning would be to first ask "Is ''x'' equal to x<sub>2</sub>?" If the answer is "yes", then ''x'' is certainly equal to x<sub>2</sub> after asking only one question, and the probability of this happening is ''P<sub>2</sub>=1/2''. If the answer is "no", then the next question would be "Is ''x'' equal to x<sub>4</sub>?" If the answer is "yes", then ''x'' is certainly equal to x<sub>4</sub> after asking two questions, and the probability of this happening is ''P<sub>4</sub>=1/4''. If the answer is again "no", a third question, "Is ''x'' equal to x<sub>1</sub>?", settles the matter, since only x<sub>1</sub> and x<sub>3</sub> remain; either way, three questions are asked, and the probability of reaching this point is ''P<sub>1</sub>+P<sub>3</sub>=1/4''. The average number of questions is therefore (1/2)(1) + (1/4)(2) + (1/4)(3) = 7/4. The information entropy is
:<math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> bits
which is in agreement with the step-by-step procedure. In most cases it is not clear how to continually divide the remaining options in half with each question, so the information entropy is, strictly speaking, a lower bound on the average number of questions needed.</ref>) yes-no questions that must be asked to determine the exact state of the system.
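:The 7/4-bit figure from the footnote can be checked numerically. The following Python sketch is purely illustrative: it computes the entropy sum directly and then estimates the average question count by simulating the questioning strategy described above (the helper name <code>questions_needed</code> is ours, not part of the article).
<syntaxhighlight lang="python">
import math
import random

# Probabilities for x = (x1, x2, x3, x4) from the footnote above.
P = {"x1": 1/8, "x2": 1/2, "x3": 1/8, "x4": 1/4}

# Information entropy in bits: Q = -sum(P_i * log2(P_i)).
Q = -sum(p * math.log2(p) for p in P.values())
print(Q)  # 1.75, i.e. 7/4 bits

def questions_needed(x):
    """Questions asked under the strategy in the footnote: ask about
    x2 first, then x4, then x1 (a final "no" leaves only x3, so no
    fourth question is ever needed)."""
    for n, guess in enumerate(["x2", "x4", "x1"], start=1):
        if x == guess:
            return n
    return 3  # x was x3; settled by the third question

# Estimate the average number of questions by simulation.
values, weights = list(P), list(P.values())
trials = 200_000
avg = sum(questions_needed(random.choices(values, weights)[0])
          for _ in range(trials)) / trials
print(round(avg, 2))  # ≈ 1.75, matching the entropy
</syntaxhighlight>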
:The concepts of "disorder" and "spreading" can be analyzed with this information entropy concept in mind. For example, if we take a new deck of cards out of the box, arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), we may say that we have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 bits: we will need to ask, on average, about 225.6 questions to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered", or that the ordered cards have been "spread" throughout the deck.
:Information entropy, however, does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy would again be about 225.6 bits, even if by some miracle the deck reshuffled into the same order as when it came out of the box, because even if it did, we would not know it. So the concept of "disorder" is useful if, by order, we mean maximal knowledge, and by disorder we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feeling for what happens to the cards when they are shuffled: the probability of a card being in a particular place in an ordered deck is either 0 or 1, while in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
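:The 225.6-bit figure is just log<sub>2</sub>(52!), since a thorough shuffle makes all 52! orderings of the deck equally likely. A minimal Python check:
<syntaxhighlight lang="python">
import math

# Entropy of a well-shuffled deck: all 52! orderings equally likely,
# so the information entropy is log2(52!) bits.
print(math.log2(math.factorial(52)))  # ≈ 225.58
</syntaxhighlight>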