}}
{{Thermodynamics|cTopic=[[List of thermodynamic properties|System properties]]}}
In [[thermodynamics]], '''entropy''' is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, you can pour cream into coffee and mix it, but you cannot "unmix" it; you can burn a piece of wood, but you cannot "unburn" it.
If you reversed a movie of coffee being mixed or wood being burned, you would see things that are impossible in the real world. Another way of saying that those reverse processes are impossible is to say that mixing coffee and burning wood are "irreversible". Irreversibility is described by an important law of nature known as the [[second law of thermodynamics]], which says that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time.<ref>Theoretically, coffee can be "unmixed" and wood can be "unburned", but this would require a "machine" that would generate more entropy than was lost in the original process. This is why the second law holds only for isolated systems, which cannot be connected to any such external "machine".</ref>
Entropy does not increase indefinitely. A body of matter and radiation eventually will reach an unchanging state, with no detectable flows, and is then said to be in a state of [[thermodynamic equilibrium]]. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, then their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water. Such processes are irreversible: An ice cube in a glass of warm water will not spontaneously form from a glass of cool water. Some processes in nature are almost reversible. For example, the orbiting of the planets around the sun may be thought of as practically reversible: A movie of the planets orbiting the sun which is run in reverse would not appear to be impossible.
While the second law, and thermodynamics in general, is accurate in its predictions of how complex physical systems behave, scientists are not content with simply knowing how a system behaves, but want to know also ''why'' it behaves the way it does.
==Explanation==
The concept of information entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. When it is applied to the problem of a large number of interacting particles, along with some other constraints, like the conservation of energy, and the assumption that all microstates are equally likely, the resultant theory of statistical mechanics is extremely successful in explaining the [[laws of thermodynamics]].
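The following is a minimal sketch of that equal-likelihood assumption (an illustrative aside, not drawn from the article's sources: the two-solid model, oscillator counts, and quanta are all assumed values). It counts the equally likely microstates of two small [[Einstein solid]]s sharing energy quanta and evaluates Boltzmann's <math>S = k \ln \Omega</math> for several splits; the even split has by far the most microstates, and hence the highest entropy:

<syntaxhighlight lang="python">
# Illustrative sketch: equally likely microstates for two small "Einstein
# solids" A and B sharing q_total energy quanta (all values assumed).
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(n_oscillators: int, q_quanta: int) -> int:
    """Ways to distribute q indistinguishable quanta among n oscillators."""
    return comb(q_quanta + n_oscillators - 1, q_quanta)

def boltzmann_entropy(omega: int) -> float:
    """Boltzmann's S = k ln(omega), in J/K."""
    return k_B * log(omega)

n_a, n_b, q_total = 50, 50, 10
for q_a in (0, 5, 10):  # how many quanta sit in solid A
    omega = multiplicity(n_a, q_a) * multiplicity(n_b, q_total - q_a)
    print(f"q_a={q_a:2d}  microstates={omega:.3e}  S={boltzmann_entropy(omega):.3e} J/K")
</syntaxhighlight>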
[[Image:Ice water.jpg|thumb|Ice melting provides an example of entropy ''increasing''.]]
==Example of increasing entropy==
It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298 K is larger than 273 K, and therefore the ratio (entropy change) of {{sfrac|δ''Q''|298 K}} for the surroundings is smaller than the ratio (entropy change) of {{sfrac|δ''Q''|273 K}} for the ice and water system. This is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than was the initial entropy.
As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the {{sfrac|δ''Q''|''T''}} over the continuous range, evaluated at many small increments from the initially cool to the finally warm water, can be found by calculus; the entropy of the combined system of room and water has increased.
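As a rough numeric sketch of this example (an illustrative aside with assumed values: a 10 g ice cube and the standard latent heat of fusion for ice), the entropy gained by the melting ice at 273 K exceeds the entropy lost by the 298 K room that supplies the heat:

<syntaxhighlight lang="python">
# Rough numeric sketch (assumed values): heat delta_Q leaves the 298 K room
# and melts ice at 273 K, so the system gains more entropy than the room loses.
latent_heat_fusion = 334_000.0  # J/kg for ice (standard value)
mass_ice = 0.010                # kg, an assumed small ice cube
delta_Q = latent_heat_fusion * mass_ice

T_room, T_ice = 298.0, 273.0    # K
dS_room = -delta_Q / T_room     # entropy lost by the surroundings
dS_system = delta_Q / T_ice     # entropy gained by the ice/water system

print(f"room:   {dS_room:+.2f} J/K")
print(f"system: {dS_system:+.2f} J/K")
print(f"net:    {dS_room + dS_system:+.2f} J/K  (positive, as the second law requires)")
</syntaxhighlight>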
==Origins and uses==
Originally, entropy was named to describe the "waste heat", or more accurately, energy loss, from heat engines and other mechanical devices which can never run with 100% efficiency in converting energy into work.
For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the [[kinetic energy|"motional" (i.e. kinetic) energy]] of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe [[Entropy (energy dispersal)|entropy as energy dispersal]].<ref name=Lambert>[http://entropysite.oxy.edu Entropy Sites — A Guide] Content selected by [[Frank L. Lambert]]</ref> Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.
===Classical calculation of entropy===
When the word 'entropy' was first defined and used in 1865, the very existence of atoms was still controversial, though it had long been speculated that temperature was due to the motion of microscopic constituents and that heat was the transfer of that motion from one place to another.
* <math>\Delta S</math> is the change in entropy of a system (some physical substance of interest) after some motional energy ("heat") has been transferred to it by fast-moving molecules.
* Then, <math> \Delta S = S_\mathrm{final} - S_\mathrm{initial} = \frac{q_\mathrm{rev}}{T}</math>, the quotient of the motional energy ("heat") ''q'' that is transferred "reversibly" (rev) to the system from the surroundings, divided by ''T'', the absolute temperature at which the transfer occurs.
** "Reversible" means that ''T'', the temperature of the system, has to stay (almost) the same while the energy is being transferred.
** When the temperature of the system is at a phase-change point, such as melting or boiling, this condition is met automatically: the temperature stays constant while the transferred energy goes into breaking intermolecular bonds.
** Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the energy transferred per incremental change in temperature (the heat capacity, <math>C_p</math>), integrated over <math>\frac{dT}{T}</math> from <math>T_\mathrm{initial}</math> to <math>T_\mathrm{final}</math>, gives <math>\Delta S = C_p \ln\frac{T_\mathrm{final}}{T_\mathrm{initial}}</math>, as in the sketch below.
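A minimal sketch of that closed form (an illustrative aside with assumed values: about 1 kg of water warmed from 283 K to 298 K, using water's standard heat capacity) compares <math>C_p \ln(T_\mathrm{final}/T_\mathrm{initial})</math> against the same sum of small {{sfrac|δ''Q''|''T''}} increments done numerically:

<syntaxhighlight lang="python">
# Minimal sketch (assumed values): for constant heat capacity C_p, integrating
# dS = C_p dT / T gives delta_S = C_p ln(T_final / T_initial).
from math import log

C_p = 4184.0             # J/K, heat capacity of ~1 kg of water (standard value)
T_i, T_f = 283.0, 298.0  # K, cool water warming to room temperature

dS_exact = C_p * log(T_f / T_i)  # closed form from the integral

# The same quantity as a sum of many small increments delta_Q / T:
steps = 100_000
dT = (T_f - T_i) / steps
dS_numeric = sum(C_p * dT / (T_i + (i + 0.5) * dT) for i in range(steps))

print(f"C_p ln(T_f/T_i) = {dS_exact:.4f} J/K")
print(f"sum of dQ/T     = {dS_numeric:.4f} J/K")
</syntaxhighlight>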
==Alternate explanations of entropy==
*'''An indicator of irreversibility''': fitting closely with the 'unavailability of energy' interpretation is the 'irreversibility' interpretation. Spontaneous thermodynamic processes are irreversible, in the sense that they do not spontaneously undo themselves. Thermodynamic processes artificially imposed by agents in the surroundings of a body also have irreversible effects on the body. For example, when [[James Prescott Joule]] used a device that delivered a measured amount of mechanical work from the surroundings through a paddle that stirred a body of water, the energy transferred was received by the water as heat. There was scarcely any expansion of the water to do thermodynamic work back on the surroundings. The body of water showed no sign of returning the energy by stirring the paddle in reverse. The work transfer appeared as heat, and was not recoverable without a suitably cold reservoir in the surroundings. Entropy gives a precise account of such irreversibility.
* [[Entropy (energy dispersal)|'''Dispersal''']]: [[Edward A. Guggenheim]] proposed an ordinary language interpretation of entropy that may be rendered as "dispersal" of modes of microscopic motion throughout their accessible range.
:The interpretation properly refers to dispersal in abstract microstate spaces, but it may be loosely visualised in some simple examples of spatial spread of matter or energy. If a partition is removed from between two different gases, the molecules of each gas spontaneously disperse as widely as possible into their respective newly accessible volumes; this may be thought of as mixing. If a partition that blocks heat transfer between two bodies of different temperatures is removed so that heat can pass between the bodies, then energy spontaneously disperses or spreads as heat from the hotter to the colder.
:Beyond such loose visualizations, in a general thermodynamic process, considered microscopically, spontaneous dispersal occurs in abstract microscopic [[phase space]]. According to Newton's and other laws of motion, phase space provides a systematic scheme for the description of the diversity of microscopic motion that occurs in bodies of matter and radiation. The second law of thermodynamics may be regarded as quantitatively accounting for the intimate interactions, dispersal, or mingling of such microscopic motions. In other words, entropy may be regarded as measuring the extent of diversity of motions of microscopic constituents of bodies of matter and radiation in their own states of internal thermodynamic equilibrium.
===Information entropy and statistical mechanics===
* [[Entropy (order and disorder)|'''As a measure of disorder''']]: Traditionally, 20th-century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]] so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the [[Shannon entropy]] of the probability distribution of microstates with respect to a particular macrostate, as in the sketch following this list.
* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful. Unfortunately, however, it gives less intuitive insight into the physical concept of thermodynamic entropy than other approaches to understanding entropy.
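The following minimal sketch (an illustrative aside; the four-microstate distributions are assumed, not from the article's sources) shows that Shannon entropy <math>H = -\sum_i p_i \log_2 p_i</math> is largest when all microstates are equally likely, that is, when the least is known about the system:

<syntaxhighlight lang="python">
# Minimal sketch (assumed distributions): Shannon entropy H = -sum(p log2 p)
# over microstate probabilities, largest when all microstates are equally likely.
from math import log2

def shannon_entropy(probs: list[float]) -> float:
    """Shannon entropy in bits; zero-probability states contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

distributions = {
    "certain": [1.0, 0.0, 0.0, 0.0],      # one microstate known for sure
    "skewed":  [0.7, 0.1, 0.1, 0.1],      # partial knowledge
    "uniform": [0.25, 0.25, 0.25, 0.25],  # all microstates equally likely
}
for name, dist in distributions.items():
    print(f"{name:8s} H = {shannon_entropy(dist):.3f} bits")
</syntaxhighlight>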