Originally, entropy was named to describe the "waste heat", or more accurately, energy loss, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word "disorder" was used by [[Ludwig Boltzmann]] in developing [[Entropy (statistical views)|statistical views of entropy]] using [[probability theory]] to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by [[Werner Heisenberg]] and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and [[statistical mechanics]].
For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the [[kinetic energy|"motional" (i.e. kinetic) energy]] of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe [[Entropy (energy dispersal)|entropy as energy dispersal]].<ref name=Lambert>[http://entropysite.oxy.edu Entropy Sites — A Guide] {{Webarchive|url=https://web.archive.org/web/20130114223933/http://entropysite.oxy.edu/ |date=2013-01-14 }} Content selected by [[Frank L. Lambert]]</ref> Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.
The mathematics developed in statistical thermodynamics was found to be applicable in other disciplines. In particular, the information sciences developed the concept of [[information entropy]], which lacks the Boltzmann constant inherent in thermodynamic entropy.
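The relationship can be sketched concretely: the Gibbs entropy of statistical thermodynamics, ''S'' = −''k''<sub>B</sub> Σ ''p''<sub>''i''</sub> ln ''p''<sub>''i''</sub>, and the Shannon information entropy, ''H'' = −Σ ''p''<sub>''i''</sub> log<sub>2</sub> ''p''<sub>''i''</sub>, differ only by the Boltzmann constant and the base of the logarithm. The following Python sketch (an illustration, not from the sources cited above; the function names are chosen for clarity) computes both for the same probability distribution:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by the 2019 SI definition)

def shannon_entropy(probs):
    """Information entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Thermodynamic entropy in J/K: S = -k_B * sum(p_i * ln(p_i))."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Two equally likely microstates (e.g. a fair coin):
probs = [0.5, 0.5]
print(shannon_entropy(probs))  # 1.0 bit
print(gibbs_entropy(probs))    # k_B * ln(2), about 9.57e-24 J/K
```

The two functions apply the same formula up to a constant factor, which is why the information-theoretic quantity is dimensionless while the thermodynamic one carries units of joules per kelvin.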