Introduction to entropy

** Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the entropy change is the energy being transferred “per incremental change in temperature” (the heat capacity, <math>C_p</math>) multiplied by the [[integral]] of <math>\frac{dT}{T}</math> from <math>T_\mathrm{initial}</math> to <math>T_\mathrm{final}</math>, which gives <math>\Delta S = C_p \ln\frac{T_\mathrm{final}}{T_\mathrm{initial}}</math>, as in the numerical sketch below.
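:A minimal numerical sketch of this formula. The values are illustrative assumptions, not from this article: one mole of liquid water with a roughly constant molar heat capacity of about 75.3 J/(mol·K), heated from 300 K to 350 K.
<syntaxhighlight lang="python">
import math

def entropy_change(c_p, t_initial, t_final):
    """Entropy change for simple heating at (roughly) constant heat
    capacity: delta_S = C_p * ln(T_final / T_initial)."""
    return c_p * math.log(t_final / t_initial)

# Assumed example: one mole of liquid water, C_p about 75.3 J/(mol*K),
# heated from 300 K to 350 K.
print(entropy_change(75.3, 300.0, 350.0))  # about 11.6 J/K
</syntaxhighlight>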
 
==Alternate explanations of entropy==
 
===Thermodynamic entropy===
 
*'''A measure of energy unavailable for work''': This is an often-repeated phrase which, although it is true, requires considerable clarification in order to be understood. It is not true except for cyclic reversible processes, and is in this sense misleading. By "work" is meant moving an object, for example, lifting a weight, or bringing a flywheel up to speed, or carrying a load up a hill. In order to convert heat into work, using a coal-burning steam engine, for example, one must have two systems at different temperatures, and the amount of work you can extract depends on how large the temperature difference is, and how large the systems are. If one of the systems is at room temperature, and the other system is much larger, and near absolute zero temperature, then almost ''all'' of the energy of the room-temperature system can be converted to work. If they are both at the same room temperature, then ''none'' of the energy of the room-temperature system can be converted to work. Entropy is then a measure of how much energy cannot be converted to work, given these conditions. More precisely, for an isolated system comprising two closed systems at different temperatures, in the process of reaching equilibrium the amount of entropy lost by the hot system, multiplied by the temperature of the hot system, is the amount of energy that cannot be converted to work.
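:The dependence of the extractable work on the two temperatures can be made concrete in a small sketch. The scenario is an assumption for illustration, not spelled out above: a finite hot body of constant heat capacity is cooled reversibly against a much larger cold reservoir, in which case the unavailable energy works out to the cold temperature multiplied by the entropy lost by the hot body.
<syntaxhighlight lang="python">
import math

def max_work_and_unavailable(c_hot, t_hot, t_cold):
    """Idealized reversible cooling of a finite hot body (constant heat
    capacity c_hot, in J/K) against a much larger reservoir at t_cold."""
    q_total = c_hot * (t_hot - t_cold)         # total heat released
    ds_hot = c_hot * math.log(t_hot / t_cold)  # entropy lost by the hot body
    unavailable = t_cold * ds_hot              # energy that cannot become work
    return q_total - unavailable, unavailable

# Assumed illustrative numbers: a 1000 J/K hot body at 600 K,
# with the surroundings at 300 K.
work, unavailable = max_work_and_unavailable(1000.0, 600.0, 300.0)
print(round(work), round(unavailable))  # about 92056 J and 207944 J

# As t_cold approaches absolute zero, almost all the energy becomes
# available as work, matching the claim above.
print(max_work_and_unavailable(1000.0, 600.0, 1.0))
</syntaxhighlight>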
 
*'''An indicator of irreversibility''': Fitting closely with the 'unavailability of energy' interpretation is the 'irreversibility' interpretation. Spontaneous thermodynamic processes are irreversible, in the sense that they do not spontaneously undo themselves. Thermodynamic processes artificially imposed by agents in the surroundings of a body also have irreversible effects on the body. For example, when [[James Prescott Joule]] used a device that delivered a measured amount of mechanical work from the surroundings through a paddle that stirred a body of water, the energy transferred was received by the water as heat. There was scarcely any expansion of the water that did thermodynamic work back on the surroundings. The body of water showed no sign of returning the energy by stirring the paddle in reverse. The work transfer appeared as heat, and was not recoverable without a suitably cold reservoir in the surroundings. Entropy gives a precise account of such irreversibility.
 
===Information entropy and statistical mechanics===
 
* [[Entropy (order and disorder)|'''As a measure of disorder''']]: Traditionally, 20th century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]] so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the [[Entropy (information theory)|Shannon entropy]] of the probability distribution of microstates given a particular macrostate,<ref name="Callen1985">{{cite book|title=Thermodynamics and an Introduction to Thermostatistics|last=Callen|first=Herbert B.|date=1985|publisher=John Wiley & Sons|isbn=0-471-86256-8|edition=2nd|___location=New York|author-link=Herbert Callen}}</ref>{{rp|379}} in which case the [[Entropy in thermodynamics and information theory|connection of "disorder" to thermodynamic entropy]] is straightforward, but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
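:Under that sharpened definition, the "disorder" of a macrostate can be computed directly. A minimal sketch (the two-coin distribution is the toy example used later in this article):
<syntaxhighlight lang="python">
import math

def shannon_entropy(probabilities):
    """Shannon entropy, in nats, of a probability distribution over
    microstates: H = -sum(p * ln(p))."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

# Toy example: two fair coins. Given the macrostate "one head, one tail",
# the two compatible microstates (HT and TH) are equally likely.
print(shannon_entropy([0.5, 0.5]))  # ln(2), about 0.693 nats
</syntaxhighlight>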
 
* [[Entropy (energy dispersal)|'''Dispersal''']]: [[Edward A. Guggenheim]] proposed an ordinary language interpretation of entropy that may be rendered as 'dispersal of modes of microscopic motion throughout their accessible range'.<ref name="Dugdale 101">Dugdale, J.S. (1996). ''Entropy and its Physical Meaning'', Taylor & Francis, London, {{ISBN|0748405682}}, Dugdale cites only Guggenheim, on page 101.</ref><ref name="Guggenheim1949">Guggenheim, E.A. (1949), Statistical basis of thermodynamics, ''Research: A Journal of Science and its Applications'', '''2''', Butterworths, London, pp. 450–454; p. 453, "If instead of entropy one reads number of accessible states, or spread, the physical significance becomes clear."</ref> Later, along with a criticism of the idea of entropy as 'disorder', the dispersal interpretation was advocated by [[Frank L. Lambert]],<ref name=Lambert/><ref name="Lambert2005">{{cite journal |last1=Kozliak |first1=Evguenii I. |last2=Lambert |first2=Frank L. |date=2005 |title=“Order-to-Disorder” for Entropy Change? Consider the Numbers! |journal=Chem. Educator |volume=10 |pages=24–25}}</ref> and is used in some student textbooks.<ref>For example: Atkins, P. W., de Paula J. Atkins' Physical Chemistry, 2006, W.H. Freeman and Company, 8th edition, {{ISBN|9780716787594}}. Brown, T. L., H. E. LeMay, B. E. Bursten, C.J. Murphy, P. Woodward, M.E. Stoltzfus 2017. Chemistry: The Central Science, 10th ed. Prentice Hall, 1248pp, {{ISBN|9780134414232}}. Ebbing, D.D., and S. D. Gammon, 2017. General Chemistry, 11th ed. Cengage Learning 1190pp, {{ISBN|9781305580343}}. Petrucci, Herring, Madura, Bissonnette 2011 General Chemistry: Principles and Modern Applications, 10th edition, 1426 pages, Pearson Canada {{ISBN|9780132064521}}.</ref>
:The interpretation properly refers to dispersal in abstract microstate spaces, but it may be loosely visualized in some simple examples of the spatial spread of matter or energy. If a partition is removed from between two different gases, the molecules of each gas spontaneously disperse as widely as possible into their respective newly accessible volumes; this may be thought of as mixing, and is quantified in the sketch after these paragraphs. If a partition that blocks heat transfer between two bodies of different temperatures is removed so that heat can pass between the bodies, then energy spontaneously disperses or spreads as heat from the hotter to the colder.
 
:Beyond such loose visualizations, in a general thermodynamic process, considered microscopically, spontaneous dispersal occurs in abstract microscopic [[phase space]]. According to Newton's and other laws of motion, phase space provides a systematic scheme for the description of the diversity of microscopic motion that occurs in bodies of matter and radiation. The second law of thermodynamics may be regarded as quantitatively accounting for the intimate interactions, dispersal, or mingling of such microscopic motions. In other words, entropy may be regarded as measuring the extent of diversity of motions of microscopic constituents of bodies of matter and radiation in their own states of internal thermodynamic equilibrium.
 
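:The gas-mixing visualization above can be quantified by the standard entropy of mixing of ideal gases, a result not derived in this article; a minimal sketch with assumed illustrative amounts:
<syntaxhighlight lang="python">
import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_of_mixing(moles):
    """Entropy increase when ideal gases, initially separated at the same
    temperature and pressure, disperse into the combined volume:
    delta_S = -R * sum(n_i * ln(x_i)), with mole fractions x_i."""
    total = sum(moles)
    return -R * sum(n * math.log(n / total) for n in moles)

# Assumed example: removing a partition between 1 mol each of two gases.
print(entropy_of_mixing([1.0, 1.0]))  # 2 * R * ln(2), about 11.5 J/K
</syntaxhighlight>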
 
*'''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful. Unfortunately, however, it gives less intuitive insight into the physical concept of thermodynamic entropy than other approaches to understanding entropy.
 
:If, instead of the natural logarithm, we use the base-2 logarithm to define information entropy, then the information entropy is equal to the average number of (carefully chosen) yes–no questions we would have to ask in order to get complete information about the system we are dealing with. In the introductory example of two flipped coins, for the macrostate which contains one head and one tail we would need only one question to determine the exact microstate (e.g. "Is the first one heads?"), so instead of expressing the entropy as ln(2) we could say, equivalently, that it is log<sub>2</sub>(2), which equals the number of questions we would need to ask: one. When entropy is measured using the natural logarithm (ln), the unit of information entropy is called a "nat"; when it is measured using the base-2 logarithm, the unit is called a "bit". This is just a difference in units, much like the difference between inches and centimeters.
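:A minimal sketch of this difference in units, extending the earlier entropy function with a logarithm base (the two-coin numbers are from the example above):
<syntaxhighlight lang="python">
import math

def entropy(probabilities, base=math.e):
    """Information entropy of a distribution; base e gives nats,
    base 2 gives bits (the average number of yes-no questions)."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Macrostate "one head, one tail": microstates HT and TH, equally likely.
p = [0.5, 0.5]
print(entropy(p))                # ln(2), about 0.693 nats
print(entropy(p, base=2))        # log2(2) = 1.0 bit: one yes-no question
print(entropy(p) / math.log(2))  # converting nats to bits, like units
</syntaxhighlight>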