For a continuous-time Markov chain with [[transition rate matrix]] ''Q'', the uniformized discrete-time Markov chain has probability transition matrix <math>P:=(p_{ij})_{i,j}</math>, which is defined by<ref name="stewart">{{cite book |title=Probability, Markov chains, queues, and simulation: the mathematical basis of performance modeling|last=Stewart |first=William J. |year=2009 |publisher=[[Princeton University Press]] |isbn=0-691-14062-6 |page=361}}</ref><ref name="cass">{{cite book |title=Introduction to discrete event systems|last=Cassandras |first=Christos G. |last2=Lafortune| first2=Stéphane|year=2008 |publisher=Springer |isbn=0-387-33332-0}}</ref><ref name="ross">{{cite book |title=Introduction to probability models|last=Ross |first=Sheldon M. |year=2007 |publisher=Academic Press |isbn=0-12-598062-0}}</ref>
::<math>p_{ij} = \begin{cases} q_{ij}/\gamma &\text{ if } i \neq j \\ 1 - \sum_{j \neq i} q_{ij}/\gamma &\text{ if } i=j \end{cases}</math>
where ''γ'', the uniform rate parameter, is chosen such that <math>\gamma \geq \max_i |q_{ii}|</math>, which guarantees that the diagonal entries of ''P'' are non-negative. In matrix notation, <math>P = I + \tfrac{1}{\gamma}Q</math>.
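The matrix form lends itself to a direct numerical construction. The following is a minimal sketch in Python with NumPy (the function name <code>uniformized_P</code> and the default choice <math>\gamma = \max_i |q_{ii}|</math> are illustrative assumptions, not taken from the cited sources):

<syntaxhighlight lang="python">
import numpy as np

def uniformized_P(Q, gamma=None):
    """Transition matrix P of the uniformized DTMC for rate matrix Q.

    gamma must satisfy gamma >= max_i |q_ii|; by default that maximum is used.
    """
    Q = np.asarray(Q, dtype=float)
    if gamma is None:
        gamma = np.max(np.abs(np.diag(Q)))
    # The entrywise definition above is equivalent to P = I + Q / gamma,
    # since each diagonal rate satisfies q_ii = -sum_{j != i} q_ij.
    return np.eye(Q.shape[0]) + Q / gamma
</syntaxhighlight>

For example, <code>uniformized_P(np.array([[-2.0, 2.0], [1.0, -1.0]]))</code> uses ''γ'' = 2 and returns <code>[[0, 1], [0.5, 0.5]]</code>.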
This representation shows that a continuous-time Markov chain can be described by a discrete-time Markov chain with transition matrix ''P'' as defined above, where jumps occur according to a Poisson process with intensity ''γ''.
For a starting distribution <math>\pi(0)</math>, the distribution at time ''t'' is then given by the [[series (mathematics)|series]]
::<math>\pi(t) = \sum_{n=0}^{\infty} e^{-\gamma t} \frac{(\gamma t)^n}{n!} \, \pi(0) P^n.</math>
In practice this series is terminated after finitely many terms.
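The truncation can be sketched as follows (again an illustrative NumPy implementation, not taken from the cited references; the fixed cut-off <code>n_terms</code> is an assumed parameter, whereas in practice the number of terms is typically chosen so that the neglected Poisson tail is below a given tolerance):

<syntaxhighlight lang="python">
import numpy as np

def transient_distribution(pi0, Q, t, n_terms=50):
    """Approximate pi(t) by truncating the uniformization series after n_terms terms."""
    Q = np.asarray(Q, dtype=float)
    gamma = np.max(np.abs(np.diag(Q)))     # uniform rate parameter
    P = np.eye(Q.shape[0]) + Q / gamma     # uniformized transition matrix
    pi_n = np.asarray(pi0, dtype=float)    # pi(0) P^0
    weight = np.exp(-gamma * t)            # e^{-gamma*t} (gamma*t)^0 / 0!
    result = weight * pi_n
    for n in range(1, n_terms + 1):
        pi_n = pi_n @ P                    # pi(0) P^n
        weight *= gamma * t / n            # Poisson weight e^{-gamma*t} (gamma*t)^n / n!
        result += weight * pi_n
    return result
</syntaxhighlight>

The Poisson weights <math>e^{-\gamma t}(\gamma t)^n/n!</math> are computed recursively rather than via the factorial, which avoids overflow for large ''n''.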