Where <math>Z = (2\pi)^{n/2}|\Sigma|^{1/2}</math> is the closed form for the [[Partition function (mathematics)|partition function]]. The parameters of this distribution are <math>\mu</math> and <math>\Sigma</math>. <math>\mu</math> is the vector of [[mean values]] of the variables, and <math>\Sigma^{-1}</math>, the inverse of the [[covariance matrix]], is also known as the [[precision matrix]]. The precision matrix contains the pairwise dependencies between the variables: a zero value in <math>\Sigma^{-1}</math> means that, conditioned on the values of the other variables, the two corresponding variables are independent of each other.
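The following is a minimal sketch (not part of the original text) illustrating this property on a hypothetical three-variable chain: the <math>(0,2)</math> entry of the precision matrix is zero, so the first and third variables are conditionally independent given the second, even though their marginal covariance is nonzero.

<syntaxhighlight lang="python">
# Minimal sketch: a zero entry in the precision matrix encodes conditional
# independence in a Gaussian graphical model (hypothetical 3-variable chain X0-X1-X2).
import numpy as np

precision = np.array([[ 2.0, -1.0,  0.0],
                      [-1.0,  2.0, -1.0],
                      [ 0.0, -1.0,  2.0]])

sigma = np.linalg.inv(precision)   # covariance matrix
print(sigma[0, 2])                 # nonzero: X0 and X2 are marginally correlated
print(precision[0, 2])             # zero: X0 and X2 are independent given X1
</syntaxhighlight>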
To learn the graph structure as a multivariate Gaussian graphical model, we can use either [[L-1 regularization]] or neighborhood selection algorithms. A sketch of the L-1 approach is shown below.
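The following is a minimal sketch, assuming the observations of the node variables are stacked row-wise in a matrix <code>X</code>; it uses scikit-learn's graphical lasso, an L-1-regularized estimator, to learn a sparse precision matrix whose nonzero pattern defines the graph structure. The data and the penalty value <code>alpha</code> are placeholders, not values from the original text.

<syntaxhighlight lang="python">
# Minimal sketch: learn a sparse precision matrix with an L1-regularized
# (graphical lasso) estimator; nonzero off-diagonal entries are graph edges.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))      # placeholder data: 200 samples, 5 variables

model = GraphicalLasso(alpha=0.1)      # alpha controls the strength of the L1 penalty
model.fit(X)

precision = model.precision_           # estimated sparse inverse covariance
edges = np.argwhere(np.abs(np.triu(precision, k=1)) > 1e-6)
print(edges)                           # variable pairs connected in the learned graph
</syntaxhighlight>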
Once the model is learned, we can repeat the same step as in the discrete case to get the density functions at each node, and use the analytical form to calculate the free energy. Here, the [[Partition function (mathematics)|partition function]] already has a [[closed form]], so the [[inference]], at least for Gaussian graphical models, is trivial. If the analytical form of the partition function is not available, [[particle filtering]] or [[expectation propagation]] can be used to approximate ''Z'', and then perform the inference and calculate the free energy.
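As a minimal sketch of the closed form above, the log partition function <math>\ln Z = \tfrac{n}{2}\ln(2\pi) + \tfrac{1}{2}\ln|\Sigma|</math> can be evaluated directly from a learned covariance matrix. The function name and the example covariance below are hypothetical, not from the original text.

<syntaxhighlight lang="python">
# Minimal sketch: evaluate the closed-form log partition function
# ln Z = (n/2) ln(2*pi) + (1/2) ln|Sigma| for a learned Gaussian model.
import numpy as np

def gaussian_log_partition(covariance):
    n = covariance.shape[0]
    sign, logdet = np.linalg.slogdet(covariance)   # numerically stable log-determinant
    assert sign > 0, "covariance must be positive definite"
    return 0.5 * n * np.log(2.0 * np.pi) + 0.5 * logdet

covariance = np.array([[1.0, 0.3],                 # placeholder estimated covariance
                       [0.3, 1.0]])
print(gaussian_log_partition(covariance))          # ln Z used in the free-energy calculation
</syntaxhighlight>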