:In other words, the future state depends on the past ''m'' states. It is possible to construct a chain <math>(Y_n)</math> from <math>(X_n)</math> which has the 'classical' Markov property by taking as state space the ordered ''m''-tuples of ''X'' values, i.e. <math>Y_n= \left( X_n,X_{n-1},\ldots,X_{n-m+1} \right)</math>.{{cn|date=August 2020}}
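The lifting construction can be sketched numerically. The following is an illustrative example, not from the article: a hypothetical second-order (''m'' = 2) binary chain, where the next state depends on the previous two states, lifted to a first-order chain on ordered pairs.

```python
import numpy as np

# Hypothetical second-order (m = 2) binary chain: the next state depends
# on the previous two states, so (X_n) alone is not Markov, but the
# lifted process Y_n = (X_n, X_{n-1}) is.
rng = np.random.default_rng(seed=0)

def next_state(x_prev, x_cur):
    # illustrative second-order rule (an assumption for this sketch):
    # the chain tends to stay at 1 only when the last two states were both 1
    p_one = 0.9 if (x_prev, x_cur) == (1, 1) else 0.3
    return int(rng.random() < p_one)

x = [0, 1]                                  # X_0, X_1
for _ in range(20):
    x.append(next_state(x[-2], x[-1]))

# lifted chain: Y_n = (X_n, X_{n-1}) for n >= 1, a first-order Markov
# chain on the four ordered pairs (0,0), (0,1), (1,0), (1,1)
y = [(x[n], x[n - 1]) for n in range(1, len(x))]
```

Each transition of ''Y'' only moves to a pair whose second coordinate matches the previous pair's first coordinate, which is how the ''m''-step memory is encoded in the enlarged state space.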
==''n''-step transitions==
The probability of going from state ''i'' to state ''j'' in ''n'' time steps is
:<math>p_{ij}^{(n)} = \Pr(X_n=j\mid X_0=i) </math>
and the single-step transition is
:<math>p_{ij} = \Pr(X_1=j\mid X_0=i). </math>
For a time-homogeneous Markov chain:
:<math>p_{ij}^{(n)} = \Pr(X_{k+n}=j \mid X_{k}=i) </math>
and
:<math>p_{ij} = \Pr(X_{k+1}=j \mid X_k=i). </math>
The ''n''-step transition probabilities satisfy the [[Chapman–Kolmogorov equation]]: for any ''k'' such that 0 < ''k'' < ''n'',
:<math>p_{ij}^{(n)} = \sum_{r \in S} p_{ir}^{(k)} p_{rj}^{(n-k)}</math>
where ''S'' is the state space of the Markov chain.<ref name="PRS"/>
The [[marginal distribution]] Pr(''X''<sub>''n''</sub> = ''x'') is the distribution over states at time ''n''; the initial distribution is Pr(''X''<sub>0</sub> = ''x''). The evolution of the process through one time step is described by
:<math> \Pr(X_n=j) = \sum_{r \in S} p_{rj} \Pr(X_{n-1}=r) = \sum_{r \in S} p_{rj}^{(n)} \Pr(X_0=r).</math>
(The superscript (''n'') is an [[indexed family|index]], not an [[exponent]].)
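In matrix form, the ''n''-step transition probabilities are the entries of the ''n''-th power of the transition matrix, and the Chapman–Kolmogorov equation is the identity ''P''<sup>''n''</sup> = ''P''<sup>''k''</sup>''P''<sup>''n−k''</sup>. A sketch with an assumed two-state transition matrix (not from the article):

```python
import numpy as np

# Illustrative two-state stochastic matrix (an assumption): rows sum to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# n-step transition probabilities p_ij^(n) are the entries of P^n.
n = 4
Pn = np.linalg.matrix_power(P, n)

# Chapman-Kolmogorov: P^n = P^k @ P^(n-k) for any 0 < k < n.
k = 1
assert np.allclose(Pn, np.linalg.matrix_power(P, k)
                       @ np.linalg.matrix_power(P, n - k))

# Evolving a marginal distribution: Pr(X_n) = Pr(X_0) P^n.
mu0 = np.array([1.0, 0.0])      # start in state 0
mu_n = mu0 @ Pn                 # distribution over states at time n
```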
==Communicating classes and properties==
Reversible Markov chains are common in Markov chain Monte Carlo (MCMC) approaches because the detailed balance equation for a desired distribution '''{{pi}}''' guarantees that '''{{pi}}''' is a steady-state distribution of the chain so constructed. Even for time-inhomogeneous Markov chains, where multiple transition matrices are used, '''{{pi}}''' remains a steady-state distribution provided each transition matrix satisfies detailed balance with '''{{pi}}'''.
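The implication can be checked numerically: summing the detailed balance equation {{pi}}<sub>''i''</sub>''p''<sub>''ij''</sub> = {{pi}}<sub>''j''</sub>''p''<sub>''ji''</sub> over ''i'' gives {{pi}}''P'' = {{pi}}. A sketch with an assumed three-state birth–death chain (such chains are reversible):

```python
import numpy as np

# Illustrative birth-death chain on {0, 1, 2} (an assumption, not from
# the article) together with its stationary distribution pi.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])

# detailed balance: pi_i p_ij == pi_j p_ji for all i, j, i.e. the
# matrix of probability fluxes pi_i p_ij is symmetric
flux = pi[:, None] * P
assert np.allclose(flux, flux.T)

# detailed balance implies stationarity: summing over i gives pi P = pi
assert np.allclose(pi @ P, pi)
```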
==Stationary distributions==
A distribution <math>\pi</math> is a stationary distribution of the Markov chain with stochastic matrix <math>P</math> if and only if <math>\pi P = \pi</math>. This can be written as:<ref name="PRS"/>
:<math>\forall j\in S: \sum_{i\in S} \pi_i p_{ij} = \pi_j</math>
This condition implies that <math>\pi P^n=\pi</math> for all <math>n\in\mathbb{N}</math>, and hence if the Markov chain <math>(X_n, n\in \mathbb{N})</math> has initial distribution <math>X_0 \sim \pi</math> then <math>X_n \sim \pi</math> for all <math>n\in\mathbb{N}</math>.<ref name="PRS"/>
If a Markov chain is irreducible then it has a stationary distribution if and only if it is positive recurrent, in which case the unique such distribution is given by <math>\pi_i=\frac{1}{M_i}</math> where <math>M_i=\mathbb{E}(T_i)</math> is the mean recurrence time of ''i''.<ref name="PRS"/>
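The defining condition {{pi}}''P'' = {{pi}}, together with the normalization Σ{{pi}}<sub>''i''</sub> = 1, is a linear system that can be solved directly. A sketch with an assumed two-state chain (not from the article):

```python
import numpy as np

# Illustrative irreducible two-state chain (an assumption).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Solve pi P = pi with sum(pi) = 1, i.e. the overdetermined but
# consistent linear system (P - I)^T pi = 0, 1^T pi = 1.
n = P.shape[0]
A = np.vstack([(P - np.eye(n)).T, np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]   # -> [5/6, 1/6]

# mean recurrence times: M_i = 1 / pi_i
M = 1.0 / pi                                # -> [1.2, 6.0]
```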
===Hitting times===
{{Main|Phase-type distribution}}The hitting time is the time, starting from a given state, until the chain arrives in a given state or set of states. The distribution of such a time period is a phase-type distribution; the simplest such distribution is that of a single exponentially distributed transition.{{cn|date=August 2020}}
For a subset of states ''A'' ⊆ ''S'', the vector ''k''<sup>''A''</sup> of hitting times (where element <math> k_i^A </math> is the [[expected value|expected]] time, starting in state ''i'', until the chain enters one of the states in the set ''A'') is the minimal non-negative solution to<ref name="norris2">{{cite book|title=Markov Chains|year=1997|isbn=9780511810633|pages=108–127|chapter=Continuous-time Markov chains II|doi=10.1017/CBO9780511810633.005|last1=Norris|first1=J. R.|authorlink1=James R. Norris}}</ref>
:<math>\begin{align}
k_i^A = 0 & \text{ for } i \in A\\
-\sum_{j \in S} q_{ij} k_j^A = 1&\text{ for } i \notin A.
\end{align}</math>
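After eliminating the entries that are fixed to zero, the hitting-time conditions become a finite linear system in the remaining states. A sketch with an assumed three-state generator matrix ''Q'' (rows sum to 0, off-diagonal entries are transition rates; both the matrix and the target set are assumptions for illustration):

```python
import numpy as np

# Assumed generator of a continuous-time chain on {0, 1, 2};
# state 2 is absorbing (its row of rates is zero).
Q = np.array([[-2.0,  1.0,  1.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  0.0,  0.0]])
A = {2}                                   # target set of states
n = Q.shape[0]
free = [i for i in range(n) if i not in A]

# k_i = 0 for i in A; for i not in A, -sum_j q_ij k_j = 1 reduces to
# a square linear system on the non-target states once the zeros are
# substituted in.
k = np.zeros(n)
k[free] = np.linalg.solve(-Q[np.ix_(free, free)], np.ones(len(free)))
# k -> [0.8, 0.6, 0.0]: expected times to reach state 2 from each state
```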
==Ergodic theorem==
An instance of [[ergodic theory]], the ergodic theorem states that for an irreducible aperiodic Markov chain, for any two states ''i'' and ''j'',<ref name="PRS"/>
:<math>p_{ij}^{(n)}\rightarrow \frac{1}{M_j}</math> as <math>n\rightarrow \infty</math>
where <math>M_j</math> is the mean recurrence time of ''j''.
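The convergence can be observed by raising the transition matrix to a large power: every row approaches the same limiting distribution, whose ''j''-th entry is 1/''M''<sub>''j''</sub>. A sketch with an assumed two-state chain (irreducible and aperiodic, so the theorem applies):

```python
import numpy as np

# Illustrative two-state chain (an assumption, not from the article).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# For large n, p_ij^(n) -> 1/M_j independently of the starting state i,
# so all rows of P^n converge to the stationary distribution (5/6, 1/6).
Pn = np.linalg.matrix_power(P, 200)
```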
==Notes==