{{short description|Consistent set of finite-dimensional distributions will define a stochastic process}}
{{About|a theorem on stochastic processes|a theorem on extension of pre-measure|Hahn–Kolmogorov theorem}}

In [[mathematics]], the '''Kolmogorov extension theorem''' or '''Daniell–Kolmogorov extension theorem''' (also known as '''Kolmogorov existence theorem''' or '''Kolmogorov consistency theorem''') is a [[theorem]] that guarantees that a suitably "consistent" collection of [[finite-dimensional distribution]]s will define a [[stochastic process]]. It is credited to the [[Soviet Union|Soviet]] [[mathematician]] [[Andrey Kolmogorov|Andrey Nikolaevich Kolmogorov]]<ref>{{cite book | author=Øksendal, Bernt | title=Stochastic Differential Equations: An Introduction with Applications | publisher=Springer, Berlin | year=2003 | isbn=3-540-04758-1}}</ref> and also to the [[United Kingdom|British]] mathematician [[Percy John Daniell]], who discovered it independently in the slightly different setting of integration theory.<ref>J. Aldrich, But you have to remember PJ Daniell of Sheffield, Electronic Journal for History of Probability and Statistics, Vol. 3, number 2, 2007</ref>
==Statement of the theorem==
Let <math>T</math> denote some [[Interval (mathematics)|interval]] (thought of as "[[time]]"), and let <math>n \in \mathbb{N}</math>. For each <math>k \in \mathbb{N}</math> and finite [[sequence]] of distinct times <math>t_{1}, \dots, t_{k} \in T</math>, let <math>\nu_{t_{1} \dots t_{k}}</math> be a [[probability measure]] on <math>(\mathbb{R}^{n})^{k}.</math>
Suppose that these measures satisfy two consistency conditions:

1. for all [[permutation]]s <math>\pi</math> of <math>\{ 1, \dots, k \}</math> and measurable sets <math>F_{i} \subseteq \mathbb{R}^{n}</math>,
:<math>\nu_{t_{\pi(1)} \dots t_{\pi(k)}} \left( F_{\pi(1)} \times \dots \times F_{\pi(k)} \right) = \nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right);</math>
2. for all measurable sets <math>F_{i} \subseteq \mathbb{R}^{n}</math> and all <math>m \in \mathbb{N}</math>,
:<math>\nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right) = \nu_{t_{1} \dots t_{k}, t_{k + 1}, \dots , t_{k+m}} \left( F_{1} \times \dots \times F_{k} \times \underbrace{\mathbb{R}^{n} \times \dots \times \mathbb{R}^{n}}_{m} \right).</math>
Then there exists a [[probability space]] <math>(\Omega, \mathcal{F}, \mathbb{P})</math> and a stochastic process <math>X : T \times \Omega \to \mathbb{R}^{n}</math> such that
:<math>\nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right) = \mathbb{P} \left( X_{t_{1}} \in F_{1}, \dots, X_{t_{k}} \in F_{k} \right)</math>
for all <math>t_{i} \in T</math>, <math>k \in \mathbb{N}</math> and measurable sets <math>F_{i} \subseteq \mathbb{R}^{n}</math>, i.e. <math>X</math> has <math>\nu_{t_{1} \dots t_{k}}</math> as its finite-dimensional distributions relative to times <math>t_{1} \dots t_{k}</math>.
In fact, it is always possible to take as the underlying probability space <math>\Omega = (\mathbb{R}^n)^T</math> and to take for <math>X</math> the canonical process <math>X\colon (t,Y) \mapsto Y_t</math>. Therefore, an alternative way of stating Kolmogorov's extension theorem is that, provided that the above consistency conditions hold, there exists a (unique) measure <math>\nu</math> on <math>(\mathbb{R}^n)^T</math> with marginals <math>\nu_{t_{1} \dots t_{k}}</math> for any finite collection of times <math>t_{1}, \dots, t_{k}</math>. Kolmogorov's extension theorem applies when <math>T</math> is uncountable, but the price to pay for this level of generality is that the measure <math>\nu</math> is only defined on the product [[Sigma-algebra|σ-algebra]] of <math>(\mathbb{R}^n)^T</math>, which is not very rich.
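For instance, if <math>\mu</math> is any probability measure on <math>\mathbb{R}^{n}</math>, the product family
:<math>\nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right) = \mu(F_{1}) \cdots \mu(F_{k})</math>
satisfies both conditions: the first because a permutation of the times merely reorders the factors, and the second because <math>\mu(\mathbb{R}^{n}) = 1</math>. The corresponding process <math>(X_t)_{t \in T}</math> is then a family of [[Independent and identically distributed random variables|independent, identically distributed]] random variables, each with law <math>\mu</math>.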
==Explanation of the conditions==
The two conditions required by the theorem are trivially satisfied by any stochastic process. For example, consider a real-valued discrete-time stochastic process <math>X</math>. Then the probability <math>\mathbb{P}(X_1 >0, X_2<0)</math> can be computed either as <math>\nu_{1,2}( \mathbb{R}_+ \times \mathbb{R}_-)</math> or as <math>\nu_{2,1}( \mathbb{R}_- \times \mathbb{R}_+)</math>. Hence, for the finite-dimensional distributions to be consistent, it must hold that <math>\nu_{1,2}( \mathbb{R}_+ \times \mathbb{R}_-) = \nu_{2,1}( \mathbb{R}_- \times \mathbb{R}_+)</math>. The first condition generalizes this statement to hold for any number of time points <math>t_i</math>, and any measurable sets <math>F_i</math>.

Continuing the example, the second condition implies that <math>\mathbb{P}(X_1>0) = \mathbb{P}(X_1>0, X_2 \in \mathbb{R})</math>. This, too, is a trivial condition that will be satisfied by any consistent family of finite-dimensional distributions.
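Taken together, the two conditions also allow times that are not listed last to be integrated out; in the same example,
:<math>\mathbb{P}(X_2<0) = \nu_{2}(\mathbb{R}_-) = \nu_{2,1}(\mathbb{R}_- \times \mathbb{R}) = \nu_{1,2}(\mathbb{R} \times \mathbb{R}_-),</math>
where the second equality uses the second condition and the third uses the first.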
==Implications of the theorem==
Since the two conditions are trivially satisfied for any stochastic process, the power of the theorem is that no other conditions are required: For any reasonable (i.e., consistent) family of finite-dimensional distributions, there exists a stochastic process with these distributions.
The measure-theoretic approach to stochastic processes starts with a probability space and defines a stochastic process as a family of functions on this probability space. However, in many applications the starting point is really the finite-dimensional distributions of the stochastic process. The theorem says that provided the finite-dimensional distributions satisfy the obvious consistency requirements, one can always identify a probability space to match the purpose. In many situations, this means that one does not have to be explicit about what the probability space is. Many texts on stochastic processes do, indeed, assume a probability space but never state explicitly what it is.
The theorem is used in one of the standard proofs of existence of a [[Brownian motion]], by specifying the finite-dimensional distributions to be Gaussian random variables, satisfying the consistency conditions above. As in most of the definitions of [[Brownian motion]] it is required that the sample paths are continuous almost surely, and one then uses the [[Kolmogorov continuity theorem]] to construct a continuous modification of the process constructed by the Kolmogorov extension theorem.
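For example, for distinct times <math>t_1, \dots, t_k \ge 0</math> one may take <math>\nu_{t_1 \dots t_k}</math> to be the law of a centered [[Multivariate normal distribution|Gaussian vector]] <math>(Z_1, \dots, Z_k)</math> with covariances
:<math>\operatorname{Cov}(Z_i, Z_j) = \min(t_i, t_j), \qquad 1 \le i, j \le k.</math>
Permuting the times merely permutes the coordinates of this vector, so the first consistency condition holds, and integrating out the last <math>m</math> coordinates leaves a centered Gaussian vector with the same covariances for the remaining times, so the second condition holds as well.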
==A more general form of the theorem==
The Kolmogorov extension theorem gives us conditions for a collection of measures on Euclidean spaces to be the finite-dimensional distributions of some <math>\mathbb{R}^{n}</math>-valued stochastic process, but the assumption that the state space be <math>\mathbb{R}^{n}</math> is unnecessary. In fact, any collection of measurable spaces together with a collection of [[inner regular]] measures defined on the finite products of these spaces would suffice, provided that these measures satisfy a certain compatibility relation. The formal statement of the general theorem is as follows.

Let <math>T</math> be any set. Let <math> \{ (\Omega_t, \mathcal{F}_t) \}_{t \in T} </math> be some collection of measurable spaces, and for each <math> t \in T </math>, let <math> \tau_t</math> be a [[Hausdorff space|Hausdorff topology]] on <math> \Omega_t</math>. For each finite subset <math>J \subset T</math>, define
:<math>\Omega_J := \prod_{t \in J} \Omega_t.</math>

For subsets <math>I \subset J \subset T</math>, let <math>\pi^J_I : \Omega_J \to \Omega_I</math> denote the canonical projection map <math>\omega \mapsto \omega|_I</math>.

For each finite subset <math> F \subset T</math>, suppose we have a probability measure <math> \mu_F </math> on <math> \Omega_F </math> which is [[inner regular]] with respect to the [[product topology]] (induced by the <math> \tau_t </math>) on <math> \Omega_F</math>. Suppose also that this collection of measures satisfies the following compatibility relation: for finite subsets <math> F \subset G \subset T</math>, we have that
:<math>\mu_F = \left( \pi^G_F \right)_* \mu_G,</math>
where <math>\left( \pi^G_F \right)_* \mu_G</math> denotes the [[pushforward measure]] of <math>\mu_G</math> induced by the canonical projection map <math>\pi^G_F</math>.

Then there exists a unique probability measure <math>\mu</math> on <math>\Omega_T</math> such that <math>\mu_F = \left( \pi^T_F \right)_* \mu</math> for every finite subset <math>F \subset T</math>.

As a remark, all of the measures <math>\mu_F, \mu</math> are defined on the product [[Sigma-algebra|sigma algebra]] on their respective spaces, which (as mentioned before) is rather coarse. The measure <math>\mu</math> may sometimes be extended appropriately to a larger sigma algebra, if there is additional structure involved.
Note that the original statement of the theorem is just a special case of this theorem with <math>\Omega_t = \mathbb{R}^n </math> for all <math>t \in T</math>, and <math> \mu_{\{t_1,...,t_k\}}=\nu_{t_1 \dots t_k}</math> for <math> t_1,...,t_k \in T</math>. The stochastic process would simply be the canonical process <math> (\pi_t)_{t \in T}</math>, defined on <math>\Omega=(\mathbb{R}^n)^T</math> with probability measure <math>P=\mu</math>. The reason that the original statement of the theorem does not mention inner regularity of the measures <math>\nu_{t_1\dots t_k}</math> is that this would automatically follow, since Borel probability measures on [[Polish space]]s are automatically [[Radon measure|Radon]].
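For example, suppose that for each <math>t \in T</math> one is given an inner-regular probability measure <math>\mu_t</math> on <math>(\Omega_t, \mathcal{F}_t)</math>, and that each finite product <math>\mu_F := \bigotimes_{t \in F} \mu_t</math> is again inner regular with respect to the product topology. This family satisfies the compatibility relation, since for finite subsets <math>F \subset G \subset T</math>
:<math>\left( \pi^G_F \right)_* \left( \bigotimes_{t \in G} \mu_t \right) = \bigotimes_{t \in F} \mu_t,</math>
and the theorem then produces the infinite product measure <math>\bigotimes_{t \in T} \mu_t</math> on <math>\Omega_T</math>; this gives the infinite products of (inner-regular) probability spaces listed below.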
This theorem has many far-reaching consequences; for example it can be used to prove the existence of the following, among others:
*Brownian motion, i.e., the [[Wiener process]],
*a [[Markov chain]] taking values in a given state space with a given transition matrix,
*infinite products of (inner-regular) probability spaces.
==History==
According to John Aldrich, the theorem was independently discovered by [[United Kingdom|British]] mathematician [[Percy John Daniell]] in the slightly different setting of integration theory.<ref>J. Aldrich, But you have to remember PJ Daniell of Sheffield, Electronic Journal for History of Probability and Statistics, Vol. 3, number 2, 2007</ref>
==References==
{{Reflist}}
==External links==
* Aldrich, J. (2007) [http://www.emis.de/journals/JEHPS/Decembre2007/Aldrich.pdf "But you have to remember P.J.Daniell of Sheffield"] [http://www.emis.de/journals/JEHPS/indexang.html Electronic Journ@l for History of Probability and Statistics] December 2007.
[[Category:Stochastic processes]]
[[Category:Theorems about stochastic processes]]