Introduction to entropy
:<math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> bits
 
which is in agreement with the step-by-step procedure. In most cases it is not clear how to continually divide the remaining options in half with each question, so the concept is strictly applicable only for special cases, and becomes more accurate as the number of possible outcomes increases. Nevertheless, the Shannon expression for ''Q'' is valid even in these cases.</ref>) yes–no questions we would have to ask in order to get complete information on the system we are dealing with. In the introductory example of two flipped coins, for the macrostate which contains one head and one tail, we would only need one question to determine its exact state (e.g., "is the first one heads?"), and instead of expressing the entropy as ln(2) we could say, equivalently, that it is log<sub>2</sub>(2), which equals the number of questions we would need to ask: one. When measuring entropy using the natural logarithm (ln), the unit of information entropy is called a "nat", but when it is measured using the base-2 logarithm, the unit of information entropy is called a "bit". This is just a difference in units, much like the difference between inches and centimeters (1 nat = log<sub>2</sub>''e'' ≈ 1.44 bits). Thermodynamic entropy is equal to Boltzmann's constant times the information entropy expressed in nats. The information entropy expressed in bits is equal to the number of yes–no questions that need to be answered in order to determine the microstate from the macrostate.
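As a worked illustration of this unit conversion, consider again the two-coin macrostate with one head and one tail, which is compatible with two equally likely microstates. Its information entropy can be written in either unit, and multiplying the value in nats by Boltzmann's constant gives the corresponding thermodynamic entropy:

:<math>Q = \log_2(2) = 1 \text{ bit} = \ln(2) \approx 0.693 \text{ nats}, \qquad S = k_\text{B}\ln(2) \approx 9.6 \times 10^{-24} \text{ J/K}</math>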
 
:The concepts of "disorder" and "spreading" can be analyzed with this information entropy concept in mind. For example, if we take a new deck of cards out of the box, it is arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), and we may say that we then have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 bits: we will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered" or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards, in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy would again be about 225.6 bits, even if by some miracle it reshuffled to the same order as when it came out of the box, because even if it did, we would not know that. So the concept of "disorder" is useful if, by order, we mean maximal knowledge and by disorder we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feeling for what happens to the cards when they are shuffled. The probability of a card being in a particular place in an ordered deck is either 0 or 1, whereas in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
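:The figure of about 225.6 bits follows from counting the 52! equally likely orderings of a thoroughly shuffled deck; a sketch of the arithmetic is

:<math>Q = \log_2(52!) = \sum_{k=1}^{52} \log_2 k \approx 225.58</math> bits.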