Kolmogorov extension theorem

{{short description|Consistent set of finite-dimensional distributions will define a stochastic process}}
{{About|a theorem on stochastic processes|a theorem on extension of pre-measure|Hahn–Kolmogorov theorem}}
 
In [[mathematics]], the '''Kolmogorov extension theorem''' (also known as the '''Kolmogorov existence theorem''', the '''Kolmogorov consistency theorem''' or the '''Daniell–Kolmogorov theorem''') is a [[theorem]] that guarantees that a suitably "consistent" collection of [[finite-dimensional distribution]]s will define a [[stochastic process]]. It is credited to the English mathematician [[Percy John Daniell]] and the [[Russia|Russian]] [[mathematician]] [[Andrey Kolmogorov|Andrey Nikolaevich Kolmogorov]].<ref>{{cite book | author=Øksendal, Bernt | title=Stochastic Differential Equations: An Introduction with Applications | publisher=Springer |___location=Berlin | year=2003 |edition=Sixth | isbn=3-540-04758-1 |page=11 |url=https://books.google.com/books?id=VgQDWyihxKYC&pg=PA11 }}</ref>
 
==Statement of the theorem==
 
Let <math>T</math> denote some [[Interval (mathematics)|interval]] (thought of as "[[time]]"), and let <math>n \in \mathbb{N}</math>. For each <math>k \in \mathbb{N}</math> and finite [[sequence]] of distinct times <math>t_{1}, \dots, t_{k} \in T</math>, let <math>\nu_{t_{1} \dots t_{k}}</math> be a [[probability measure]] on <math>(\mathbb{R}^{n})^{k}</math>. Suppose that these measures satisfy two consistency conditions:
 
1. for all [[permutation]]s <math>\pi</math> of <math>\{ 1, \dots, k \}</math> and measurable sets <math>F_{i} \subseteq \mathbb{R}^{n}</math>,
:<math>\nu_{t_{\pi(1)} \dots t_{\pi(k)}} \left( F_{\pi(1)} \times \dots \times F_{\pi(k)} \right) = \nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right);</math>
 
2. for all measurable sets <math>F_{i} \subseteq \mathbb{R}^{n}</math> and all <math>m \in \mathbb{N}</math>,
:<math>\nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right) = \nu_{t_{1} \dots t_{k}, t_{k + 1}, \dots , t_{k+m}} \left( F_{1} \times \dots \times F_{k} \times \underbrace{\mathbb{R}^{n} \times \dots \times \mathbb{R}^{n}}_{m} \right).</math>
Then there exists a [[probability space]] <math>(\Omega, \mathcal{F}, \mathbb{P})</math> and a stochastic process <math>X : T \times \Omega \to \mathbb{R}^{n}</math> such that
:<math>\nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right) = \mathbb{P} \left( X_{t_{1}} \in F_{1}, \dots, X_{t_{k}} \in F_{k} \right)</math>
for all <math>k \in \mathbb{N}</math>, all times <math>t_{1}, \dots, t_{k} \in T</math> and all measurable sets <math>F_{i} \subseteq \mathbb{R}^{n}</math>; in other words, <math>X</math> has <math>\nu_{t_{1} \dots t_{k}}</math> as its finite-dimensional distributions relative to the times <math>t_{1}, \dots, t_{k}</math>.
 
In fact, it is always possible to take as the underlying probability space <math>\Omega = (\mathbb{R}^n)^T</math> and to take for <math>X</math> the canonical process <math>X\colon (t,Y) \mapsto Y_t</math>. Therefore, an alternative way of stating Kolmogorov's extension theorem is that, provided that the above consistency conditions hold, there exists a (unique) measure <math>\nu</math> on <math>(\mathbb{R}^n)^T</math> with marginals <math>\nu_{t_{1} \dots t_{k}}</math> for any finite collection of times <math>t_{1} \dots t_{k}</math>. Kolmogorov's extension theorem applies when <math>T</math> is uncountable, but the price to pay
for this level of generality is that the measure <math>\nu</math> is only defined on the [[Σ-algebra#Product_σ-algebra|product σ-algebra]] of <math>(\mathbb{R}^n)^T</math>, which is not very rich.
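
For example, let <math>\mu</math> be a probability measure on <math>\mathbb{R}^{n}</math> and, for every choice of times, let <math>\nu_{t_{1} \dots t_{k}} := \mu^{\otimes k}</math> be the <math>k</math>-fold product measure, so that
:<math>\nu_{t_{1} \dots t_{k}} \left( F_{1} \times \dots \times F_{k} \right) = \mu(F_{1}) \cdots \mu(F_{k}).</math>
Condition 1 holds because the right-hand side is unchanged when the factors are permuted, and condition 2 holds because the additional factors <math>\mu(\mathbb{R}^{n}) = 1</math> do not affect the product. The theorem therefore yields a process <math>(X_t)_{t \in T}</math> whose coordinates are independent and identically distributed with common law <math>\mu</math>, for an arbitrary index set <math>T</math>.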
 
==Explanation of the conditions==
The two conditions required by the theorem are trivially satisfied by any stochastic process. For example, consider a real-valued discrete-time stochastic process <math>X</math>. Then the probability <math>\mathbb{P}(X_1 >0, X_2<0)</math> can be computed either as <math>\nu_{1,2}( \mathbb{R}_+ \times \mathbb{R}_-)</math> or as <math>\nu_{2,1}( \mathbb{R}_- \times \mathbb{R}_+)</math>. Hence, for the finite-dimensional distributions to be consistent, it must hold that
<math>\nu_{1,2}( \mathbb{R}_+ \times \mathbb{R}_-) = \nu_{2,1}( \mathbb{R}_- \times \mathbb{R}_+)</math>.
The first condition generalizes this obvious statement to hold for any number of time points <math>t_i</math>, and any control sets <math>F_i</math>.
 
Continuing the example, the second condition implies that <math>\mathbb{P}(X_1>0) = \mathbb{P}(X_1>0, X_2 \in \mathbb{R})</math>. This too is a trivial condition that will be satisfied by any consistent family of finite-dimensional distributions.
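
These two identities can also be checked numerically for a concrete family, for instance the centred Gaussian family with covariances <math>\operatorname{Cov}(X_s, X_t) = \min(s,t)</math> used for Brownian motion below. The following Monte Carlo sketch is purely illustrative; the helper <code>nu</code> is specific to this sketch and not a library function.

<syntaxhighlight lang="python">
# Illustrative Monte Carlo check of the two consistency conditions for the
# centred Gaussian family with Cov(X_s, X_t) = min(s, t).
import numpy as np

rng = np.random.default_rng(0)

def nu(times, size):
    """Sample from nu_{t_1 ... t_k}: a centred Gaussian on R^k whose
    covariance matrix has entries min(t_i, t_j)."""
    t = np.asarray(times, dtype=float)
    cov = np.minimum.outer(t, t)
    return rng.multivariate_normal(np.zeros(len(t)), cov, size=size)

N = 200_000
s12 = nu([1.0, 2.0], N)   # samples of (X_1, X_2)
s21 = nu([2.0, 1.0], N)   # samples of (X_2, X_1)

# Condition 1: nu_{1,2}(R_+ x R_-) should equal nu_{2,1}(R_- x R_+).
p12 = np.mean((s12[:, 0] > 0) & (s12[:, 1] < 0))
p21 = np.mean((s21[:, 0] < 0) & (s21[:, 1] > 0))

# Condition 2: nu_1(R_+) should equal nu_{1,2}(R_+ x R).
p1 = np.mean(nu([1.0], N)[:, 0] > 0)
p1_pair = np.mean(s12[:, 0] > 0)

print(p12, p21)     # the two estimates agree up to Monte Carlo error
print(p1, p1_pair)  # both are approximately 1/2
</syntaxhighlight>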
==Implications of the theorem==
The measure-theoretic approach to stochastic processes starts with a probability space and defines a stochastic process as a family of functions on this probability space. However, in many applications the starting point is really the finite-dimensional distributions of the stochastic process. The theorem says that provided the finite-dimensional distributions satisfy the obvious consistency requirements, one can always identify a probability space to match the purpose. In many situations, this means that one does not have to be explicit about what the probability space is. Many texts on stochastic processes do, indeed, assume a probability space but never state explicitly what it is.
 
The theorem is used in one of the standard proofs of existence of a [[Brownian motion]], by specifying the finite-dimensional distributions to be Gaussian measures satisfying the consistency conditions above. As in most definitions of [[Brownian motion]] it is required that the sample paths are continuous almost surely, and one then uses the [[Kolmogorov continuity theorem]] to construct a continuous modification of the process constructed by the Kolmogorov extension theorem.
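
Concretely, for a standard one-dimensional Brownian motion started at <math>0</math> one takes <math>\nu_{t_{1} \dots t_{k}}</math>, for times <math>0 < t_{1} < \dots < t_{k}</math>, to be the centred Gaussian measure on <math>\mathbb{R}^{k}</math> with covariance matrix
:<math>\Sigma_{ij} = \min(t_{i}, t_{j}), \qquad 1 \le i, j \le k,</math>
so that <math>\operatorname{Cov}(X_{t_i}, X_{t_j}) = \min(t_i, t_j)</math>. Condition 2 holds because integrating a Gaussian measure over some of its coordinates yields the Gaussian measure whose covariance matrix is obtained by deleting the corresponding rows and columns of <math>\Sigma</math>, and condition 1 holds because permuting the time points simply permutes the rows and columns of <math>\Sigma</math>.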
 
==General form of the theorem==
The Kolmogorov extension theorem gives us conditions for a collection of measures on Euclidean spaces to be the finite-dimensional distributions of some <math>\mathbb{R}^{n}</math>-valued stochastic process, but the assumption that the state space be <math>\mathbb{R}^{n}</math> is unnecessary. In fact, any collection of measurable spaces together with a collection of [[inner regular measure]]s defined on the finite products of these spaces would suffice, provided that these measures satisfy a certain compatibility relation. The formal statement of the general theorem is as follows.<ref>{{cite book |first=T. |last=Tao |authorlink=Terence Tao |title=An Introduction to Measure Theory |series=[[Graduate Studies in Mathematics]] |volume=126 |___location=Providence |publisher=American Mathematical Society |year=2011 |isbn=978-0-8218-6919-2 |page=195 |url=https://books.google.com/books?id=HoGDAwAAQBAJ&pg=PA195 }}</ref>
 
Let <math>T</math> be any set. Let <math> \{ (\Omega_t, \mathcal{F}_t) \}_{t \in T} </math> be some collection of measurable spaces, and for each <math> t \in T </math>, let <math> \tau_t</math> be a [[Hausdorff space|Hausdorff topology]] on <math> \Omega_t</math>. For each finite subset <math>J \subset T</math>, define
 
:<math>\Omega_J := \prod_{t\in J} \Omega_t</math>.
==External links==
* Aldrich, J. (2007) [http://www.emis.de/journals/JEHPS/Decembre2007/Aldrich.pdf "But you have to remember P.J.Daniell of Sheffield"] [http://www.emis.de/journals/JEHPS/indexang.html Electronic Journ@l for History of Probability and Statistics] December 2007.
 
[[Category:Theorems about stochastic processes]]
[[Category:Probability theorems]]