Introduction to entropy

:<math>S=k_\text{B}\,\ln W</math>
 
where ''S'' is the thermodynamic entropy, ''W'' is the number of microstates that may yield the macrostate, and <math>k_\text{B}</math> is the [[Boltzmann constant]]. The [[natural logarithm]] of the number of microstates (<math>\ln W</math>) is known as the [[information entropy]] of the system. This can be illustrated by a simple example:
 
If you flip two coins, you can get four different results. If ''H'' is heads and ''T'' is tails, the possible outcomes are (''H'',''H''), (''H'',''T''), (''T'',''H''), and (''T'',''T''). Each of these is a "microstate", in which we know the exact result of each flip. But what if we have less information? Suppose we only know the total number of heads, which can be 0, 1, or 2. We can call these "macrostates". Only the microstate (''T'',''T'') gives macrostate 0, both (''H'',''T'') and (''T'',''H'') give macrostate 1, and only (''H'',''H'') gives macrostate 2. So the information entropy of macrostates 0 and 2 is ln(1), which is zero, while the information entropy of macrostate 1 is ln(2), which is about 0.69. Of all the microstates, macrostate 1 accounts for half of them.
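
The counting above can also be checked mechanically. The short Python sketch below (using only the standard <code>itertools</code> and <code>math</code> modules; the variable names are purely illustrative) enumerates the four microstates, tallies how many of them fall under each macrostate, and prints ln ''W'' for each:

<syntaxhighlight lang="python">
from itertools import product
from math import log

# Enumerate every microstate of two coin flips: (H,H), (H,T), (T,H), (T,T).
microstates = list(product("HT", repeat=2))

# Group microstates by the macrostate "total number of heads" (0, 1, or 2).
counts = {}
for outcome in microstates:
    heads = outcome.count("H")
    counts[heads] = counts.get(heads, 0) + 1

# The information entropy of a macrostate is ln(W), where W is the number of
# microstates consistent with it.
for heads, w in sorted(counts.items()):
    print(f"macrostate {heads}: W = {w}, ln W = {log(w):.2f}")
</syntaxhighlight>

Running it prints ''W'' = 1 and ln ''W'' = 0 for macrostates 0 and 2, and ''W'' = 2 and ln ''W'' ≈ 0.69 for macrostate 1, matching the values above.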