Introduction to entropy

Thermodynamic entropy is measured only as a change in entropy (<math>\Delta S</math>) of a system containing a sub-system that undergoes heat transfer to its surroundings (within the system of interest). It is based on the [[macroscopic]] relationship between [[heat flow]] into the sub-system and the temperature at which it occurs, summed over the boundary of that sub-system.
 
Following the [[Clausius theorem|formalism of Clausius]], the basic calculation can be mathematically stated as:<ref>I. Klotz, R. Rosenberg, ''Chemical Thermodynamics – Basic Concepts and Methods'', 7th ed., Wiley (2008), p. 125</ref>
: <math>{\rm \delta}S = \frac{{\rm \delta}q}{T}.</math>
 
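For example, with illustrative figures, a reversible transfer of <math>q = 100~\mathrm{J}</math> of heat into a sub-system held at a constant temperature of <math>T = 300~\mathrm{K}</math> produces an entropy change of
: <math>\Delta S = \frac{q}{T} = \frac{100~\mathrm{J}}{300~\mathrm{K}} \approx 0.33~\mathrm{J/K}.</math>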
More generally, the entropy change of the sub-system satisfies the [[Clausius theorem|Clausius inequality]], with equality holding only for a reversible process:
: <math>{{\rm \delta}S} \ge {\frac{{\rm \delta}q}{T}}.</math>
 
According to the [[first law of thermodynamics]], which deals with the [[conservation of energy]], the loss <math>\delta q</math> of heat will result in a decrease in the [[internal energy]] of the [[thermodynamic system]]. Thermodynamic entropy provides a comparative measure of the amount of decrease in internal energy and the corresponding increase in internal energy of the surroundings at a given temperature. In many cases, a visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. When applicable, entropy increase is the quantitative measure of that kind of spontaneous process: how much energy has been effectively lost or become unavailable, by dispersing itself or spreading itself out, as assessed at a specific temperature. In this assessment, energy dispersed at a higher temperature 'costs' proportionately less entropy. This is because a hotter body is generally more able to do thermodynamic work, other factors such as internal energy being equal. This is why a steam engine has a hot firebox.
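A worked example with illustrative figures shows why: if <math>q = 600~\mathrm{J}</math> of heat flows from a firebox at <math>1200~\mathrm{K}</math> into surroundings at <math>300~\mathrm{K}</math>, the entropy lost by the hot body is smaller than the entropy gained by the cooler surroundings, so the total entropy increases:
: <math>\Delta S_\text{total} = -\frac{600~\mathrm{J}}{1200~\mathrm{K}} + \frac{600~\mathrm{J}}{300~\mathrm{K}} = -0.5~\mathrm{J/K} + 2.0~\mathrm{J/K} = +1.5~\mathrm{J/K}.</math>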
 
===Statistical mechanics and information entropy===