For a continuous-time [[Markov chain]] with [[transition rate matrix]] ''Q'', the uniformized discrete-time Markov chain has probability transition matrix <math>P:=(p_{ij})_{i,j}</math>, which is defined by<ref name="stewart">{{cite book |title=Probability, Markov chains, queues, and simulation: the mathematical basis of performance modeling|url=https://archive.org/details/probabilitymarko00stew|url-access=limited|last=Stewart |first=William J. |year=2009 |publisher=[[Princeton University Press]] |isbn=0-691-14062-6 |page=[https://archive.org/details/probabilitymarko00stew/page/n379 361]}}</ref><ref name="cass">{{cite book |title=Introduction to discrete event systems|last=Cassandras |first=Christos G. |last2=Lafortune| first2=Stéphane|year=2008 |publisher=Springer |isbn=0-387-33332-0}}</ref><ref name="ross">{{cite book |title=Introduction to probability models|last=Ross |first=Sheldon M. |year=2007 |publisher=Academic Press |isbn=0-12-598062-0}}</ref>
::<math>p_{ij} = \begin{cases} q_{ij}/\gamma &\text{ if } i \neq j \\ 1 - \sum_{k \neq i} q_{ik}/\gamma &\text{ if } i = j \end{cases}</math>
with ''γ'', the uniform rate parameter, chosen such that
::<math>\gamma \geq \max_i |q_{ii}|.</math>
Equivalently, in matrix notation,
::<math>P=I+\frac{1}{\gamma}Q.</math>
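A minimal illustration of this construction (in Python with NumPy, using a small hypothetical two-state generator matrix; the specific rates are chosen only for the example):
<syntaxhighlight lang="python">
import numpy as np

# Hypothetical two-state generator matrix Q (rows sum to zero).
Q = np.array([[-3.0,  3.0],
              [ 1.0, -1.0]])

# Uniform rate parameter: any gamma >= max_i |q_ii| works; the maximum is the usual choice.
gamma = np.max(np.abs(np.diag(Q)))

# Transition matrix of the uniformized discrete-time chain: P = I + Q/gamma.
P = np.eye(Q.shape[0]) + Q / gamma

print(P)              # [[0.    1.   ]
                      #  [0.333 0.667]]
print(P.sum(axis=1))  # each row sums to 1, so P is a stochastic matrix
</syntaxhighlight>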
For a starting distribution ''π''(0), the distribution at time ''t'', ''π''(''t''), is given by
::<math>\pi(t) = \sum_{n=0}^\infty \pi(0) P^n \frac{(\gamma t)^n}{n!}e^{-\gamma t}.</math>
This representation shows that a continuous-time Markov chain can be described by a discrete-time Markov chain with transition matrix ''P'' as defined above, where jumps occur according to a [[Poisson process]] with rate ''γ'' (so the number of jumps by time ''t'' is Poisson-distributed with mean ''γt'').
In practice this [[series (mathematics)|series]] is truncated after finitely many terms; since the Poisson weights sum to one and each term ''π''(0)''P''<sup>''n''</sup> is a probability distribution, the truncation error is bounded by the neglected Poisson tail mass.
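A short sketch of this computation (Python with NumPy; the function name, stopping rule and tolerance <code>eps</code> are illustrative choices, not part of any standard library):
<syntaxhighlight lang="python">
import numpy as np

def uniformized_distribution(Q, pi0, t, eps=1e-10):
    """Approximate pi(t) = pi(0) exp(Qt) by the uniformization series.

    Illustrative sketch: the series is truncated once the accumulated
    Poisson weights cover all but eps of the probability mass, which
    bounds the truncation error in each component by eps.
    Assumes gamma*t is moderate, so exp(-gamma*t) does not underflow.
    """
    Q = np.asarray(Q, dtype=float)
    gamma = np.max(np.abs(np.diag(Q)))   # uniform rate parameter
    P = np.eye(Q.shape[0]) + Q / gamma   # uniformized discrete-time chain

    weight = np.exp(-gamma * t)          # Poisson(gamma*t) pmf at n = 0
    term = np.asarray(pi0, dtype=float)  # pi(0) P^0
    result = weight * term
    covered = weight
    n = 0
    while covered < 1.0 - eps:
        n += 1
        term = term @ P                  # pi(0) P^n
        weight *= gamma * t / n          # Poisson pmf at n
        result += weight * term
        covered += weight
    return result

# Example with a hypothetical two-state chain; for small problems the result
# can be cross-checked against pi(0) @ scipy.linalg.expm(Q * t).
Q = np.array([[-3.0, 3.0],
              [ 1.0, -1.0]])
print(uniformized_distribution(Q, [1.0, 0.0], t=0.5))
</syntaxhighlight>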