:<math> \Pr(X_{n_{ij}}=j \mid X_0=i) = p_{ij}^{(n_{ij})} > 0.</math>
A state ''i'' is said to communicate with state ''j'' (written ''i'' ↔ ''j'') if both ''i'' → ''j'' and ''j'' → ''i''. A communicating class is a [[maximal set]] of states ''C'' such that every pair of states in ''C'' communicates with each other. Communication is an [[equivalence relation]], and communicating classes are the [[equivalence class]]es of this relation.<ref name="PRS"/>
A communicating class is closed if the probability of leaving the class is zero: that is, if ''i'' is in ''C'' but ''j'' is not, then ''j'' is not accessible from ''i''.<ref name="PRS"/> The set of communicating classes forms a [[directed acyclic graph|directed, acyclic graph]] by inheriting the arrows from the original state space. A communicating class is closed if and only if it has no outgoing arrows in this graph.
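The communicating classes and their closedness can be read off the transition matrix directly. The following sketch, which assumes NumPy and uses an illustrative helper name and example matrix not taken from the article, groups mutually accessible states into classes and flags a class as closed when nothing outside it is accessible from inside it:

<syntaxhighlight lang="python">
import numpy as np

def communicating_classes(P, tol=1e-12):
    """Illustrative helper: partition the states of a stochastic matrix P
    into communicating classes and flag the closed ones."""
    n = P.shape[0]
    # Directed graph of the chain: an arrow i -> j iff p_ij > 0
    # (self-loops added so that every state trivially reaches itself).
    adj = (P > tol).astype(int) + np.eye(n, dtype=int)
    # Transitive closure by repeated squaring:
    # reach[i, j] is True iff state j is accessible from state i.
    reach = adj.copy()
    for _ in range(max(1, int(np.ceil(np.log2(n))))):
        reach = ((reach @ reach) > 0).astype(int)
    reach = reach > 0
    # i communicates with j iff i -> j and j -> i; the communicating
    # classes are the equivalence classes of this relation.
    communicate = reach & reach.T
    classes, seen = [], set()
    for i in range(n):
        if i not in seen:
            cls = set(np.flatnonzero(communicate[i]).tolist())
            classes.append(cls)
            seen |= cls
    # A class is closed iff only states inside it are accessible from it.
    closed = [all(set(np.flatnonzero(reach[i]).tolist()) <= cls for i in cls)
              for cls in classes]
    return classes, closed

# Example: two closed classes {0, 1} and {3}, and one open class {2}
# that can leak into both of them.
P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.3, 0.7, 0.0, 0.0],
              [0.2, 0.0, 0.5, 0.3],
              [0.0, 0.0, 0.0, 1.0]])
print(communicating_classes(P))   # ([{0, 1}, {2}, {3}], [True, False, True])
</syntaxhighlight>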
:<math>\pi_i \Pr(X_{n+1} = j \mid X_{n} = i) = \pi_j \Pr(X_{n+1} = i \mid X_{n} = j)</math>
for all times ''n'' and all states ''i'' and ''j''. This condition is known as the [[detailed balance]] condition (or local [[balance equation]]).
Considering a fixed arbitrary time ''n'' and using the shorthand
:<math>p_{ij} = \Pr(X_{n+1} = j \mid X_n = i)\,,</math>
the detailed balance equation can be written more compactly as
:<math>\pi_i p_{ij} = \pi_j p_{ji}\,.</math>
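Detailed balance is easy to verify numerically. A minimal sketch, assuming NumPy and using an illustrative weighted undirected graph (a random walk on such a graph is a standard construction of a reversible chain; the weights below are not from the article), checks the condition and that <math>\pi</math> is then stationary:

<syntaxhighlight lang="python">
import numpy as np

# Random walk on a weighted undirected graph (symmetric edge weights).
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 3.0, 0.0]])
deg = W.sum(axis=1)
P = W / deg[:, None]             # p_ij = w_ij / sum_k w_ik
pi = deg / deg.sum()             # pi_i proportional to the weighted degree

flow = pi[:, None] * P           # flow[i, j] = pi_i * p_ij
assert np.allclose(flow, flow.T) # detailed balance: pi_i p_ij = pi_j p_ji
assert np.allclose(pi @ P, pi)   # hence pi is a stationary distribution of P
</syntaxhighlight>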
==== Closest reversible Markov chain ====
For any time-homogeneous Markov chain given by a transition matrix <math>P \in \mathbb{R}^{n \times n}</math>, any norm <math>||\cdot||</math> on <math>\mathbb{R}^{n \times n}</math> that is induced by a [[Inner product space|scalar product]], and any probability vector <math>\pi</math>, there exists a unique transition matrix <math>P^*</math> that is reversible with respect to <math>\pi</math> and closest to <math>P</math> according to the norm <math>||\cdot||.</math> The matrix <math>P^*</math> can be computed by solving a quadratic-convex [[optimization problem]].<ref>A. Nielsen, M. Weber. Online publication in Numerical Linear Algebra with Applications, DOI:10.1002/nla.1967, 2015.</ref>
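A minimal sketch of this optimization is shown below. It assumes the convex-modelling package CVXPY, picks the Frobenius norm as one particular scalar-product-induced norm (the cited work allows any such norm), and uses an illustrative function name and example matrices:

<syntaxhighlight lang="python">
import numpy as np
import cvxpy as cp

def closest_reversible(P, pi):
    """Illustrative sketch: project P onto the set of stochastic matrices
    that are reversible with respect to pi, in the Frobenius norm."""
    n = P.shape[0]
    X = cp.Variable((n, n), nonneg=True)   # candidate transition matrix
    D = np.diag(pi)
    constraints = [
        cp.sum(X, axis=1) == 1,            # rows sum to one (stochastic)
        D @ X == (D @ X).T,                # detailed balance: pi_i x_ij = pi_j x_ji
    ]
    problem = cp.Problem(cp.Minimize(cp.sum_squares(X - P)), constraints)
    problem.solve()                        # convex quadratic program
    return X.value

# A non-reversible cyclic chain with uniform stationary distribution.
P = np.array([[0.1, 0.9, 0.0],
              [0.0, 0.1, 0.9],
              [0.9, 0.0, 0.1]])
pi = np.full(3, 1.0 / 3.0)
print(np.round(closest_reversible(P, pi), 3))
</syntaxhighlight>

For a uniform <math>\pi</math>, reversibility reduces to symmetry of the transition matrix, so the program above essentially returns the symmetrized chain <math>(P + P^{\mathsf T})/2</math>.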
For example, consider the following Markov chain: