In probability theory, the continuous mapping theorem states that continuous functions preserve limits even when their arguments are sequences of random variables. A continuous function, in Heine’s definition, is one that maps convergent sequences into convergent sequences: if xn → x then g(xn) → g(x). The continuous mapping theorem states that this remains true if the deterministic sequence {xn} is replaced by a sequence of random variables {Xn}, and the standard notion of convergence of real numbers “→” is replaced by one of the types of convergence of random variables.
This theorem was first proved by Mann & Wald (1943), and is therefore sometimes called the Mann–Wald theorem.[1]
Statement
Let {Xn}, X be random elements defined on a metric space S. Suppose a function g: S → S′ (where S′ is another metric space) has the set of discontinuity points Dg such that Pr[X ∈ Dg] = 0. Then[2][3][4]

- Xn →d X implies g(Xn) →d g(X);
- Xn →p X implies g(Xn) →p g(X);
- Xn →a.s. X implies g(Xn) →a.s. g(X),

where →d, →p, and →a.s. denote convergence in distribution, convergence in probability, and almost sure convergence, respectively.
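The statement can be illustrated by simulation. The sketch below is only an illustration under assumed choices (the map g(x) = x², the distribution X ~ N(0, 1), and the sequence Xn = X + Zn/n are not taken from the sources cited here); it estimates Pr[|g(Xn) − g(X)| > ε] for increasing n, which the convergence-in-probability part of the theorem predicts should shrink toward zero.

```python
import numpy as np

# Monte Carlo illustration (assumed example, not from the cited sources):
# g(x) = x**2 is continuous everywhere, X ~ N(0, 1), and X_n = X + Z_n/n with
# Z_n ~ N(0, 1), so X_n -> X in probability.  The continuous mapping theorem
# then predicts Pr[|g(X_n) - g(X)| > eps] -> 0 as n grows.

rng = np.random.default_rng(0)

def prob_deviation(n, eps=0.1, trials=100_000):
    """Estimate Pr[|X_n**2 - X**2| > eps] by simulation."""
    x = rng.normal(size=trials)              # draws of X
    x_n = x + rng.normal(size=trials) / n    # draws of X_n, close to X for large n
    return np.mean(np.abs(x_n**2 - x**2) > eps)

for n in (1, 10, 100, 1000):
    print(n, prob_deviation(n))              # estimated probability shrinks toward zero
```

The printed estimates decrease toward zero as n grows, consistent with the second implication above.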
Proof [5]
Convergence in distribution
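A proof can be based on the portmanteau lemma; the following is only an outline using its closed-set formulation (see Billingsley 1999, Theorem 2.7, for a complete treatment). For every closed set F ⊆ S′ we have {g(Xn) ∈ F} = {Xn ∈ g⁻¹(F)}, and the closure of g⁻¹(F) satisfies cl(g⁻¹(F)) ⊆ g⁻¹(F) ∪ Dg, because a limit point x of g⁻¹(F) at which g is continuous must satisfy g(x) ∈ F, F being closed. Applying the portmanteau lemma to Xn →d X gives

limsupn→∞ Pr[g(Xn) ∈ F] ≤ limsupn→∞ Pr[Xn ∈ cl(g⁻¹(F))] ≤ Pr[X ∈ cl(g⁻¹(F))] ≤ Pr[X ∈ g⁻¹(F)] + Pr[X ∈ Dg] = Pr[g(X) ∈ F],

the last term being zero by the assumption of the theorem. A second application of the portmanteau lemma, now to the sequence {g(Xn)}, yields g(Xn) →d g(X).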
Convergence in probability
Fix an arbitrary ε > 0. Then for any δ > 0 consider the set Bδ defined as

Bδ = {x ∈ S \ Dg : there exists y ∈ S with |x − y| < δ and |g(x) − g(y)| > ε}.
This is the set of continuity points x of the function g(·) for which it is possible to find, within the δ-neighborhood of x, a point which maps outside the ε-neighborhood of g(x). By definition of continuity, this set shrinks as δ goes to zero, so that limδ→0 Bδ = ∅.
Now suppose that |g(X) − g(Xn)| > ε. This implies that at least one of the following is true: either |X − Xn| ≥ δ, or X ∈ Dg, or X ∈ Bδ. In terms of probabilities this can be written as

Pr[|g(Xn) − g(X)| > ε] ≤ Pr[|Xn − X| ≥ δ] + Pr[X ∈ Bδ] + Pr[X ∈ Dg].
On the right-hand side, the first term converges to zero as n → ∞ for any fixed δ, by the definition of convergence in probability of the sequence {Xn}. The second term converges to zero as δ → 0, since the set Bδ shrinks to an empty set. And the last term is identically equal to zero by the assumption of the theorem. Therefore we can conclude that

limn→∞ Pr[|g(Xn) − g(X)| > ε] = 0,
which means that g(Xn) converges to g(X) in probability.
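The key inequality above can also be checked numerically. In the sketch below (again an assumed example, not from the cited sources) we take g(x) = x², X ~ N(0, 1), and Xn = X + Z/n, so that Dg is empty and Bδ has the closed form {x : δ(2|x| + δ) > ε}; the empirical frequencies then satisfy the bound on every simulated sample, since the event on the left is contained in the union of the events on the right.

```python
import numpy as np

# Numerical check of the inequality
#   Pr[|g(X_n) - g(X)| > eps] <= Pr[|X_n - X| >= delta] + Pr[X in B_delta] + Pr[X in D_g]
# for the assumed example g(x) = x**2, X ~ N(0, 1), X_n = X + Z/n (so D_g is empty).
# For this g:  x is in B_delta  iff  delta * (2|x| + delta) > eps,
# i.e.  |x| > (eps/delta - delta) / 2.

rng = np.random.default_rng(1)
eps, delta, n, trials = 0.5, 0.1, 5, 200_000

x = rng.normal(size=trials)                 # draws of X
x_n = x + rng.normal(size=trials) / n       # draws of X_n

lhs = np.mean(np.abs(x_n**2 - x**2) > eps)              # Pr[|g(X_n) - g(X)| > eps]
term1 = np.mean(np.abs(x_n - x) >= delta)               # Pr[|X_n - X| >= delta]
term2 = np.mean(np.abs(x) > (eps / delta - delta) / 2)  # Pr[X in B_delta]

print(f"{lhs:.4f} <= {term1 + term2:.4f}: {lhs <= term1 + term2}")
```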
Convergence almost surely
By definition of the continuity of the function g(·),

limn→∞ Xn(ω) = X(ω) implies limn→∞ g(Xn(ω)) = g(X(ω))

at each point X(ω) where g(·) is continuous. Therefore,

Pr[limn→∞ g(Xn) = g(X)] ≥ Pr[limn→∞ g(Xn) = g(X), X ∉ Dg] ≥ Pr[limn→∞ Xn = X, X ∉ Dg] = 1,

where the last equality holds because the events {limn→∞ Xn = X} and {X ∉ Dg} both have probability one.
By the definition of almost sure convergence, we conclude that g(Xn) converges to g(X) almost surely.
References
See also
Literature
- Amemiya, Takeshi (1985). Advanced Econometrics. Cambridge, MA: Harvard University Press. ISBN 0674005600.
- Billingsley, Patrick (1969). Convergence of Probability Measures. John Wiley & Sons. ISBN 0471072427.
- Billingsley, Patrick (1999). Convergence of Probability Measures (2nd ed.). John Wiley & Sons. ISBN 0471197459.
- Mann, H. B.; Wald, A. (1943). "On Stochastic Limit and Order Relationships". The Annals of Mathematical Statistics. 14 (3): 217–226.
- van der Vaart, A. W. (1998). Asymptotic statistics. New York: Cambridge University Press. ISBN 9780521496032.
Notes
- ^ Amemiya 1985, p. 88
- ^ van der Vaart 1998, Theorem 2.3
- ^ Billingsley 1969, p. 31, Corollary 1
- ^ Billingsley 1999, p. 21, Theorem 2.7
- ^ This proof has been adapted from (van der Vaart 1998, Theorem 2.3)