Introduction to entropy: Difference between revisions

In calculations, entropy is symbolised by S and is a [[state function]], a measure at a particular instant. Entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. More often, a change in entropy, symbolised by ΔS, is given in relation to a change in energy transferred as heat, δQ, so that ΔS = δQ/T.
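The relation above can be illustrated with a short numerical sketch; the figures used here (100 J of heat, 300 K) are illustrative values, not from the article:

```python
# Illustrative values: 100 J of heat transferred reversibly at 300 K.
delta_Q = 100.0  # heat transferred, in joules
T = 300.0        # absolute temperature, in kelvin

# Change in entropy: ΔS = δQ / T
delta_S = delta_Q / T
print(delta_S, "J/K")  # about 0.333 J/K
```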
 
[[Statistical mechanics]] introduces calculation of entropy using [[probability theory]] to find the number of possible [[Microstate (statistical mechanics)|microstates]] at an instant, any one of which will contain all the energy of the system at that instant. The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules.
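The statistical-mechanical calculation follows Boltzmann's formula, S = k<sub>B</sub> ln W, where W is the number of microstates; the count of microstates used below is an illustrative assumption:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K (exact, by SI definition)
W = 10**20          # illustrative number of accessible microstates

# Boltzmann entropy: S = k_B * ln(W)
S = k_B * math.log(W)
print(S, "J/K")
```

Because W enters through a logarithm, even astronomically large microstate counts give modest entropy values in J/K.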
 
Statistical mechanical entropy is mathematically similar to [[Shannon entropy]] which is part of [[information theory]], where energy is not involved. This similarity means that some probabilistic aspects of thermodynamics are replicated in information theory.
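The mathematical similarity can be seen in the Shannon entropy formula, H = −Σ p<sub>i</sub> log<sub>2</sub> p<sub>i</sub>, which has the same form as the statistical-mechanical expression but uses probabilities alone, with no energy term. A minimal sketch, using a fair coin as the example distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has two equally likely outcomes: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```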