Introduction to entropy
}}
{{Thermodynamics|cTopic=[[List of thermodynamic properties|System properties]]}}
In [[thermodynamics]], '''entropy''' is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, someone can pour cream into coffee and mix it, but they cannot "unmix" it; they can burn a piece of wood, but they cannot "unburn" it. The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or a gradual decline into disorder.<ref name="lexico">{{cite web |title=Definition of entropy in English |url=https://www.lexico.com/en/definition/entropy |archive-url=https://web.archive.org/web/20190711005908/https://www.lexico.com/en/definition/entropy |url-status=dead |archive-date=July 11, 2019 |website=Lexico Powered By Oxford |access-date=18 November 2020}}</ref> A more physical interpretation of thermodynamic entropy refers to the spread of energy or matter, or to the extent and diversity of microscopic motion.
 
A movie that showed coffee being mixed or wood being burned, played in reverse, would depict things that are impossible in the real world. Mixing coffee and burning wood are "irreversible". Irreversibility is described by an important law of nature known as the [[second law of thermodynamics]], which says that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time.<ref>Theoretically, coffee can be "unmixed" and wood can be "unburned", but this would require a "machine" that would generate more entropy than was lost in the original process. This is why the second law only holds for isolated systems, which cannot be connected to an external "machine".</ref>
 
Entropy does not increase indefinitely. A body of matter and radiation eventually reaches an unchanging state, with no detectable flows, and is then said to be in a state of [[thermodynamic equilibrium]]. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted, leaving a glass of cool water. Such processes are irreversible: An ice cube in a glass of warm water will not spontaneously form from a glass of cool water. Some processes in nature are almost reversible. For example, the orbiting of the planets around the Sun may be thought of as practically reversible: A movie of the planets orbiting the Sun run in reverse would not appear to be impossible.
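The entropy increase in the ice-cube example can be sketched numerically. The following is a simplified illustration, not part of the article: it treats the warm water as a reservoir at a fixed temperature and uses the standard latent heat of fusion of ice; the function name and the simplifying assumptions are chosen here for illustration only.

```python
# Simplified sketch of the ice-in-warm-water example: heat Q leaves the warm
# water at temperature T_warm and is absorbed by melting ice at T_melt.
# Because T_melt < T_warm, the ice gains more entropy (Q / T_melt) than the
# water loses (Q / T_warm), so the total entropy increases.

LATENT_HEAT_FUSION = 334.0  # J per gram of ice (standard value)

def entropy_change_melting(mass_g, t_warm_k, t_melt_k=273.15):
    """Approximate total entropy change (J/K) when mass_g grams of ice melt,
    treating the warm water as a reservoir at fixed temperature t_warm_k."""
    q = mass_g * LATENT_HEAT_FUSION   # heat absorbed by the melting ice
    ds_ice = q / t_melt_k             # entropy gained by the ice
    ds_water = -q / t_warm_k          # entropy lost by the warm water
    return ds_ice + ds_water

# 10 g of ice melting in water at 300 K: the total entropy change is positive.
print(entropy_change_melting(10.0, 300.0) > 0)  # True
```

Because the reverse process would require the total entropy to decrease, the sketch also shows why a glass of cool water never spontaneously separates into warm water and an ice cube.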
 
While the second law, and thermodynamics in general, is accurate in its predictions of how the intimate interactions of complex physical systems behave, scientists are not content with simply knowing how a system behaves; they also want to know ''why'' it behaves the way it does. The question of why entropy increases until equilibrium is reached was answered very successfully in 1877 by the physicist [[Ludwig Boltzmann]]. The theory developed by Boltzmann and others is known as [[statistical mechanics]]. Statistical mechanics is a physical theory which explains thermodynamics in terms of the statistical behavior of the atoms and molecules which make up the system. The theory not only explains thermodynamics, but also a host of other phenomena which are outside the scope of thermodynamics.
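Boltzmann's statistical explanation is commonly summarized by his entropy formula, which relates the entropy ''S'' of a macroscopic state to the number ''W'' of microscopic arrangements (microstates) consistent with that state, with ''k''<sub>B</sub> the Boltzmann constant:

:<math>S = k_\text{B} \ln W</math>

States that can be realized in more microscopic ways have higher entropy, which is why an isolated system drifts toward, and then stays near, its most probable (highest-entropy) macroscopic state.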