where ''S'' is the thermodynamic entropy, ''W'' is the number of microstates that may yield the macrostate, and <math>k_B</math> is [[Boltzmann's constant]]. The [[natural logarithm]] of the number of microstates (<math>\ln W</math>) is known as the [[information entropy]] of the system. This can be illustrated by a simple example:
If you flip two coins, there are four possible results. If ''H'' is heads and ''T'' is tails, these are (''H'',''H''), (''H'',''T''), (''T'',''H''), and (''T'',''T''). Each of these is a "microstate", for which we know the exact result of each flip. But what if we have less information? Suppose we only know the total number of heads? This can be 0, 1, or 2, and we can call these "macrostates". Only the microstate (''T'',''T'') gives macrostate 0, both (''H'',''T'') and (''T'',''H'') give macrostate 1, and only (''H'',''H'') gives macrostate 2. So the information entropy of macrostates 0 and 2 is ln(1), which is zero, while the information entropy of macrostate 1 is ln(2), which is about 0.69. Of the four microstates, macrostate 1 accounts for half of them.
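As a rough illustration, the following Python sketch enumerates the four microstates, groups them into macrostates by head count, and prints ln(''W'') for each:

<syntaxhighlight lang="python">
from itertools import product
from math import log

# Enumerate all microstates of two coin flips:
# (H,H), (H,T), (T,H), (T,T).
microstates = list(product("HT", repeat=2))

# Group microstates into macrostates by total number of heads.
macrostates = {}
for state in microstates:
    heads = state.count("H")
    macrostates.setdefault(heads, []).append(state)

# The information entropy of a macrostate is ln(W), where W is
# the number of microstates that yield it.
for heads, states in sorted(macrostates.items()):
    W = len(states)
    print(f"{heads} heads: W = {W}, ln(W) = {log(W):.2f}")

# Output: macrostates 0 and 2 have ln(1) = 0.00;
#         macrostate 1 has ln(2) = 0.69.
</syntaxhighlight>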
It turns out that if you flip a large number of coins, the macrostates at or near half heads and half tails account for almost all of the microstates. In other words, for a million coins, you can be fairly sure that about half will be heads and half tails. The macrostates around a 50-50 ratio of heads to tails are the "equilibrium" macrostates. A real physical system in equilibrium has a huge number of possible microstates, and almost all of them belong to the equilibrium macrostate; that is the macrostate you will almost certainly see if you wait long enough. In the coin example, if you start out with a very unlikely macrostate (for example all heads, which has zero entropy) and begin flipping one coin at a time, the entropy of the macrostate will start increasing, just as thermodynamic entropy does, and after a while the coins will most likely be at or near the 50-50 macrostate, which has the greatest information entropy: the equilibrium entropy.
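This approach to equilibrium can also be sketched in Python: starting from the all-heads macrostate and flipping one randomly chosen coin at a time, the macrostate entropy ln(''W'') climbs toward its maximum near the 50-50 macrostate. The coin count, step count, and random seed below are arbitrary illustrative choices.

<syntaxhighlight lang="python">
import random
from math import comb, log

N = 1000        # number of coins (arbitrary choice for illustration)
random.seed(0)  # fixed seed so the run is reproducible

# Start in the very unlikely all-heads macrostate; its entropy is
# ln(1) = 0 because only one microstate yields it.
coins = ["H"] * N

def macro_entropy(heads):
    # ln(W): W = C(N, heads) microstates share this head count.
    return log(comb(N, heads))

for step in range(1, 5001):
    i = random.randrange(N)                      # pick one coin at random
    coins[i] = "T" if coins[i] == "H" else "H"   # flip it
    if step % 1000 == 0:
        h = coins.count("H")
        print(f"step {step}: {h} heads, entropy = {macro_entropy(h):.1f}")

# Maximum (equilibrium) entropy, at the 50-50 macrostate:
print(f"equilibrium entropy = {macro_entropy(N // 2):.1f}")
</syntaxhighlight>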