* [[Entropy (order and disorder)|'''As a measure of disorder''']]: Traditionally, 20th-century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]], presenting it as "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the [[Entropy (information theory)|Shannon entropy]] of the probability distribution of microstates given a particular macrostate,<ref name="Callen1985">{{cite book|title=Thermodynamics and an Introduction to Thermostatistics|last=Callen|first=Herbert B.|date=1985|publisher=John Wiley & Sons|isbn=0-471-86256-8|edition=2nd|___location=New York|author-link=Herbert Callen}}</ref>{{rp|379}} in which case the [[Entropy in thermodynamics and information theory|connection of "disorder" to thermodynamic entropy]] is straightforward, but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
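:As an illustration of that sharpened definition (stated here in the standard Gibbs–Shannon notation, with <math>p_i</math> denoting the probability of microstate <math>i</math> given the macrostate and <math>k_\text{B}</math> Boltzmann's constant), the thermodynamic entropy can be written
:<math>S = -k_\text{B} \sum_i p_i \ln p_i ,</math>
:i.e. Boltzmann's constant times the Shannon entropy measured in nats; when all <math>W</math> accessible microstates are equally probable, <math>p_i = 1/W</math> and the expression reduces to the familiar <math>S = k_\text{B} \ln W</math>.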
* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful.
:If, instead of using the natural logarithm to define information entropy, we instead use the base-2 logarithm, then the information entropy is roughly equal to the average number of (carefully chosen<ref>The questions must be chosen so that each answer eliminates, as nearly as possible, half of the remaining possibilities. For example, for four outcomes with probabilities 1/2, 1/4, 1/8 and 1/8, a step-by-step procedure that asks about the most probable remaining outcome first requires 1, 2, 3 and 3 questions respectively, for an average of 7/4 questions, while the Shannon entropy is <math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> bits, which is in agreement with the step-by-step procedure. In most cases, it is not clear how to continually divide the remaining options in half with each question, so the concept is strictly valid only for special cases and becomes more valid as the number of possible outcomes increases. Nevertheless, the Shannon expression for ''Q'' is valid even in these cases.</ref>) yes-no questions we would have to ask in order to get complete information on the system we are dealing with. In the introductory example of two flipped coins, for the macrostate which contains one head and one tail we would need only one question to determine its exact state (e.g. "Is the first one heads?"), and instead of expressing the entropy as ln(2) we could say, equivalently, that it is log<sub>2</sub>(2), which equals the number of questions we would need to ask: one. When entropy is measured using the natural logarithm (ln), the unit of information entropy is called a "nat"; when it is measured using the base-2 logarithm, the unit is called a "bit". This is just a difference in units, much like the difference between inches and centimeters (1 nat = log<sub>2</sub>(''e'') bits ≈ 1.44 bits). Thermodynamic entropy is equal to Boltzmann's constant times the information entropy expressed in nats. The information entropy expressed in bits is equal to the number of yes-no questions that need to be answered in order to determine the microstate from the macrostate.
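:To make the unit conversion just described concrete (a restatement of the two-coin numbers above, not an additional result): the macrostate with one head and one tail comprises <math>W = 2</math> equally likely microstates, so its information entropy is
:<math>Q = \log_2 2 = 1 \text{ bit} = \ln 2 \approx 0.693 \text{ nat},</math>
:and multiplying the value in nats by Boltzmann's constant formally gives the corresponding thermodynamic entropy, <math>S = k_\text{B} \ln 2 \approx 9.57 \times 10^{-24}\ \mathrm{J/K}</math>.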
:The concepts of "disorder" and "spreading" can be analyzed with this information entropy concept in mind. For example, if we take a new deck of cards out of the box, it is arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), and we may say that we then have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 bits: we will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered", or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy will again be about 225.6 bits, even if by some miracle it is reshuffled into the same order as when it came out of the box, because even then we would not know it. So the concept of "disorder" is useful if, by order, we mean maximal knowledge and, by disorder, we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feel for what happens to the cards when they are shuffled: the probability of a card being in a particular place in an ordered deck is either 0 or 1, while in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
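:The figure of about 225.6 bits can be checked with a short back-of-the-envelope calculation (not a sourced value): a thoroughly shuffled 52-card deck has <math>52! \approx 8.07 \times 10^{67}</math> equally likely orderings, so
:<math>Q = \log_2(52!) = \frac{\ln(52!)}{\ln 2} \approx \frac{156.4}{0.693} \approx 225.6 \text{ bits},</math>
:which is the average number of optimally chosen yes-no questions needed to identify the exact order of the deck.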