== Alternate explanations of entropy ==

=== Thermodynamic entropy ===
 
*'''A measure of energy unavailable for work''': This is an often-repeated phrase which requires considerable clarification in order to be understood: it is strictly true only for cyclic reversible processes and is in this sense misleading. By "work" is meant moving an object, for example, lifting a weight, bringing a flywheel up to speed, or carrying a load up a hill. In order to convert heat into work, using a coal-burning steam engine, for example, one must have two systems at different temperatures, and the amount of work that can be extracted depends on how large the temperature difference is and how large the systems are. If one of the systems is at room temperature and the other is much larger and near absolute zero temperature, then almost ''all'' of the energy of the room-temperature system can be converted to work. If both are at the same room temperature, then ''none'' of the energy of the room-temperature system can be converted to work. Entropy is then a measure of how much energy cannot be converted to work, given these conditions. More precisely, for an isolated system comprising two closed systems at different temperatures, in the process of reaching equilibrium the amount of entropy lost by the hot system, multiplied by the temperature of the cold system, is the amount of energy that cannot be converted to work, as the sketch below illustrates.
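*:As a sketch of the reversible limit (with <math>T_\text{H}</math> and <math>T_\text{C}</math> denoting the hot and cold temperatures, notation introduced here only for illustration), an ideal engine drawing heat <math>Q</math> from the hot system can deliver at most
*:<math>W_\text{max} = Q\left(1 - \frac{T_\text{C}}{T_\text{H}}\right),</math>
*:leaving <math>Q - W_\text{max} = T_\text{C}\,\Delta S</math> unavailable, where <math>\Delta S = Q/T_\text{H}</math> is the entropy lost by the hot system. As <math>T_\text{C} \to 0</math> almost all of <math>Q</math> can be converted to work, and when <math>T_\text{C} = T_\text{H}</math> none of it can, matching the two limiting cases described above.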
:Beyond such loose visualizations, in a general thermodynamic process, considered microscopically, spontaneous dispersal occurs in abstract microscopic [[phase space]]. According to Newton's and other laws of motion, phase space provides a systematic scheme for the description of the diversity of microscopic motion that occurs in bodies of matter and radiation. The second law of thermodynamics may be regarded as quantitatively accounting for the intimate interactions, dispersal, or mingling of such microscopic motions. In other words, entropy may be regarded as measuring the extent of diversity of motions of microscopic constituents of bodies of matter and radiation in their own states of internal thermodynamic equilibrium.
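:In the simplest sketch, where a macrostate corresponds to <math>\Omega</math> equally probable microstates (a definite volume of phase space), this counting of microscopic diversity reduces to the Boltzmann entropy
:<math>S = k_\text{B} \ln \Omega,</math>
:so entropy increases as the microscopic motions disperse over a larger region of phase space.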
 
=== Information entropy and statistical mechanics ===
 
* [[Entropy (order and disorder)|'''As a measure of disorder''']]: Traditionally, 20th-century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]] so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the [[Entropy (information theory)|Shannon entropy]] of the probability distribution of microstates given a particular macrostate,<ref name="Callen1985">{{cite book|title=Thermodynamics and an Introduction to Thermostatistics|last=Callen|first=Herbert B.|date=1985|publisher=John Wiley & Sons|isbn=0-471-86256-8|edition=2nd|___location=New York|author-link=Herbert Callen}}</ref>{{rp|379}} in which case the [[Entropy in thermodynamics and information theory|connection of "disorder" to thermodynamic entropy]] is straightforward (see the sketch below), but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
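*:As a sketch (with <math>p_i</math> denoting the probability of microstate <math>i</math> given the macrostate, following the definition just cited), this "disorder" is
*:<math>S = -k_\text{B} \sum_i p_i \ln p_i,</math>
*:the Gibbs entropy formula, which, apart from the factor of the Boltzmann constant <math>k_\text{B}</math> and the choice of logarithm base, coincides with the Shannon entropy <math>H = -\textstyle\sum_i p_i \log_2 p_i</math> of the same distribution; it is largest when all accessible microstates are equally probable.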