Partition function (mathematics): Difference between revisions

m The partition function has no application (that I know of) in information science, but definitely has application in information theory. I think this was a simple misnomer. I also un-linked the previous "first" occurrence of the hyperlink.
Tag: gettingstarted edit
m clean up, typo(s) fixed: Equiping → Equipping using AWB
Line 121:
:<math>g_{ij}(\beta) = \frac{\partial^2}{\partial \beta^i\,\partial \beta^j} \log Z(\beta) =
\langle \left(H_i-\langle H_i\rangle\right)\left( H_j-\langle H_j\rangle\right)\rangle</math>
This matrix is positive semi-definite, and may be interpreted as a [[metric tensor]], specifically, a [[Riemannian metric]]. Equipping the space of Lagrange multipliers with a metric in this way turns it into a [[Riemannian manifold]].<ref>Gavin E. Crooks, "Measuring thermodynamic length" (2007), [http://arxiv.org/abs/0706.0559 ArXiv 0706.0559]</ref> The study of such manifolds is referred to as [[information geometry]]; the metric above is the [[Fisher information metric]]. Here, <math>\beta</math> serves as a coordinate on the manifold. It is interesting to compare the above definition to the simpler [[Fisher information]], which inspired it.
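The identity between the second derivatives of <math>\log Z</math> and the covariance of the observables can be checked numerically. The following is a minimal sketch in JAX, assuming a small illustrative system (five states, two arbitrarily chosen observables <math>H_1, H_2</math>, and an arbitrary value of <math>\beta</math>); it computes the metric once by automatic differentiation of <math>\log Z(\beta)</math> and once as the covariance matrix of the <math>H_i</math> under the Gibbs distribution.

<syntaxhighlight lang="python">
import jax
import jax.numpy as jnp

# Illustrative setup (values chosen arbitrarily): a discrete state space
# with 5 states and two observables H_1, H_2.
H = jnp.array([[0.0, 1.0, 2.0, 3.0, 4.0],    # H_1(x) for each state x
               [1.0, 0.0, 1.0, 0.0, 1.0]])   # H_2(x) for each state x

def log_Z(beta):
    # log Z(beta) = log sum_x exp(-beta . H(x))
    return jax.scipy.special.logsumexp(-beta @ H)

beta = jnp.array([0.3, 0.7])  # arbitrary point in the space of Lagrange multipliers

# Metric as the Hessian of log Z at beta, via automatic differentiation.
g = jax.hessian(log_Z)(beta)

# The same metric as the covariance of the observables under the
# Gibbs distribution p(x) = exp(-beta . H(x)) / Z(beta).
p = jnp.exp(-beta @ H - log_Z(beta))      # probabilities, sum to 1
mean = H @ p                              # <H_i>
cov = ((H - mean[:, None]) * p) @ (H - mean[:, None]).T

print(g)
print(cov)
print(jnp.allclose(g, cov, atol=1e-5))    # True: both give the Fisher metric
</syntaxhighlight>

Since <math>\log Z</math> is a log-sum-exp of functions linear in <math>\beta</math>, it is convex in <math>\beta</math>, which is another way of seeing that the metric is positive semi-definite.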
 
That the above defines the Fisher information metric can be readily seen by explicitly substituting for the expectation value: