In other words, the [[probability distribution|joint distribution]] factors into a product of conditional distributions. The graph structure indicates direct dependencies among the random variables: each node is [[Conditional independence|conditionally independent]] of its non-descendants given the values of its parents.
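As a minimal sketch of this factorization (with hypothetical probability values), consider a three-node Bayesian network A → B, A → C over binary variables, where the joint distribution factors as P(a, b, c) = P(a)·P(b | a)·P(c | a):

```python
import itertools

# Hypothetical conditional probability tables for the network A -> B, A -> C.
p_a = {0: 0.6, 1: 0.4}                    # P(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3},       # P(B | A = a)
               1: {0: 0.2, 1: 0.8}}
p_c_given_a = {0: {0: 0.9, 1: 0.1},       # P(C | A = a)
               1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """Joint probability as the product of the conditional distributions."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_a[a][c]

# The factored joint is a valid distribution: it sums to 1 over all assignments.
total = sum(joint(a, b, c) for a, b, c in itertools.product([0, 1], repeat=3))

# B and C are conditionally independent given their common parent A:
# P(b, c | a) = P(b | a) * P(c | a) for every assignment.
for a, b, c in itertools.product([0, 1], repeat=3):
    p_bc_given_a = joint(a, b, c) / p_a[a]
    assert abs(p_bc_given_a - p_b_given_a[a][b] * p_c_given_a[a][c]) < 1e-12
```

Here the conditional independence of B and C follows directly from the factorization, illustrating how the graph structure encodes independence assumptions.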
This type of graphical model is known as a directed graphical model, [[Bayesian network]], or belief network. Classic [[Machine learning|machine learning]] methods such as [[Hidden Markov Models|hidden Markov models]] or [[Neural networks|neural networks]] can be considered special cases of Bayesian networks.
There are also undirected graphical models, generally called [[Markov random fields]].
Applications of graphical models include modelling of [[gene regulatory network]]s, speech recognition, gene finding, computer vision and the diagnosis of diseases.
A good reference for learning the basics of graphical models is written by Neapolitan, Learning Bayesian networks (2004). A more advanced and more statistically oriented book is by Lauritzen, Graphical models (1996).
[[Category:Probability theory]]