Graphical model
In other words, the [[probability distribution|joint distribution]] factors into a product of conditional distributions. The graph structure indicates direct dependencies among random variables: each node is [[conditional independence|conditionally independent]] of its non-descendants given the values of its parents.
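This factorization can be made concrete with a small example. The following sketch uses a hypothetical three-variable network (Rain → Sprinkler, and Rain and Sprinkler → WetGrass); the network and its probability tables are illustrative assumptions, not taken from the article. The joint probability is simply the product of one conditional term per node.

```python
# Hypothetical Bayesian network: Rain -> Sprinkler, {Rain, Sprinkler} -> WetGrass.
# Each node carries a conditional probability table given its parents.

p_rain = {True: 0.2, False: 0.8}                      # P(Rain)
p_sprinkler = {                                       # P(Sprinkler | Rain)
    (True, True): 0.01, (False, True): 0.99,
    (True, False): 0.4,  (False, False): 0.6,
}
p_wet = {                                             # P(WetGrass | Rain, Sprinkler)
    (True, True, True): 0.99,  (False, True, True): 0.01,
    (True, True, False): 0.8,  (False, True, False): 0.2,
    (True, False, True): 0.9,  (False, False, True): 0.1,
    (True, False, False): 0.0, (False, False, False): 1.0,
}

def joint(rain, sprinkler, wet):
    """P(rain, sprinkler, wet) = P(rain) * P(sprinkler|rain) * P(wet|rain, sprinkler)."""
    return (p_rain[rain]
            * p_sprinkler[(sprinkler, rain)]
            * p_wet[(wet, rain, sprinkler)])

# The eight joint probabilities sum to 1, as a valid distribution must.
total = sum(joint(r, s, w)
            for r in (True, False)
            for s in (True, False)
            for w in (True, False))
```

Because the graph has only three nodes, the full joint table needs eight numbers here, but the factorized form needs only the (smaller) conditional tables; this saving is the practical point of the factorization.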
 
This type of graphical model is known as a directed graphical model, [[Bayesian network]], or belief network. Classic [[machine learning]] methods like [[hidden Markov models]] or [[neural networks]] can be considered as special cases of Bayesian networks.
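To see the hidden Markov model as a Bayesian network, note that its joint distribution factors as P(z_1) · Π P(z_t | z_{t-1}) · Π P(x_t | z_t), i.e. each hidden state depends only on the previous state, and each observation only on its state. The sketch below evaluates this product directly; the two-state transition and emission numbers are illustrative assumptions.

```python
# Hypothetical two-state HMM with observations 'a' and 'b'.
init = {0: 0.6, 1: 0.4}                       # P(z_1)
trans = {(0, 0): 0.7, (0, 1): 0.3,            # P(z_t | z_{t-1}), keyed (prev, next)
         (1, 0): 0.4, (1, 1): 0.6}
emit = {(0, 'a'): 0.9, (0, 'b'): 0.1,         # P(x_t | z_t), keyed (state, symbol)
        (1, 'a'): 0.2, (1, 'b'): 0.8}

def hmm_joint(states, observations):
    """P(z_1..z_T, x_1..x_T) as the product of the HMM's conditional factors."""
    p = init[states[0]] * emit[(states[0], observations[0])]
    for t in range(1, len(states)):
        p *= trans[(states[t - 1], states[t])] * emit[(states[t], observations[t])]
    return p
```

Summing `hmm_joint` over every state and observation sequence of a fixed length gives 1, exactly as for any other Bayesian network.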
 
There are also undirected graphical models, generally called [[Markov random fields]] or [[Markov networks]].
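In an undirected model the joint distribution is proportional to a product of potential functions over the graph's cliques, normalized by a partition function Z. The sketch below uses a hypothetical three-node chain A–B–C with a pairwise potential that favours agreement between neighbours; the graph and the potential values are illustrative assumptions.

```python
import itertools

def phi(x, y):
    """Pairwise potential on an edge: favour agreement between binary neighbours."""
    return 2.0 if x == y else 1.0

def unnormalized(a, b, c):
    # One potential factor per edge of the chain A - B - C.
    return phi(a, b) * phi(b, c)

# The partition function Z sums the unnormalized product over all assignments,
# turning it into a probability distribution.
Z = sum(unnormalized(a, b, c)
        for a, b, c in itertools.product((0, 1), repeat=3))

def prob(a, b, c):
    return unnormalized(a, b, c) / Z
```

Unlike the directed case, the factors here are not conditional probabilities, which is why the explicit normalization by Z is needed.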
 
Applications of graphical models include the modelling of [[gene regulatory network]]s, [[speech recognition]], gene finding, [[computer vision]], and the diagnosis of diseases.
 
A good reference for learning the basics of graphical models is Neapolitan, ''Learning Bayesian Networks'' (2004). A more advanced and statistically oriented book is Cowell, Dawid, Lauritzen and Spiegelhalter, ''Probabilistic Networks and Expert Systems'' (1999). See also [[belief propagation]].