{{Probability fundamentals}}
'''Probability theory''' or '''probability calculus''' is the branch of [[mathematics]] concerned with [[probability]]. Although there are several different [[probability interpretations]], probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of [[axioms of probability|axioms]]. Typically these axioms formalise probability in terms of a [[probability space]], which assigns a [[measure (mathematics)|measure]] taking values between 0 and 1, termed the [[probability measure]], to a set of outcomes called the [[sample space]]. Any specified subset of the sample space is called an [[event (probability theory)|event]].
Central subjects in probability theory include discrete and continuous [[random variable]]s, [[probability distributions]], and [[stochastic process]]es (which provide mathematical abstractions of [[determinism|non-deterministic]] or uncertain processes or measured [[Quantity|quantities]] that may either be single occurrences or evolve over time in a random fashion).
Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability theory describing such behaviour are the [[law of large numbers]] and the [[central limit theorem]].
As a mathematical foundation for [[statistics]], probability theory is essential to many human activities that involve quantitative analysis of data.<ref>[http://home.ubalt.edu/ntsbarsh/stat-data/Topics.htm Inferring From Data]</ref> Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in [[statistical mechanics]] or [[sequential estimation]]. A great discovery of twentieth-century [[physics]] was the probabilistic nature of physical phenomena at atomic scales, described in [[quantum mechanics]].
==History of probability==
===Motivation===
Consider an [[Experiment (probability theory)|experiment]] that can produce a number of outcomes. The set of all outcomes is called the ''[[sample space]]'' of the experiment. The ''[[power set]]'' of the sample space (or equivalently, the event space) is formed by considering all different collections of possible results. For example, rolling an honest die produces one of six possible results. One collection of possible results corresponds to getting an odd number. Thus, the subset {1,3,5} is an element of the power set of the sample space of dice rolls. These collections are called ''events''. In this case, {1,3,5} is the event that the die falls on some odd number. If the results that actually occur fall in a given event, that event is said to have occurred.
Probability is a [[Function (mathematics)|way of assigning]] every "event" a value between zero and one, with the requirement that the event made up of all possible results (in our example, the event {1,2,3,4,5,6}) be assigned a value of one. To qualify as a [[probability distribution]], the assignment of values must satisfy the requirement that if you look at a collection of mutually exclusive events (events that contain no common results, e.g., the events {1,6}, {3}, and {2,4} are all mutually exclusive), the probability that any of these events occurs is given by the sum of the probabilities of the events.<ref>{{cite book |last=Ross |first=Sheldon |title=A First Course in Probability |publisher=Pearson Prentice Hall |edition=8th |year=2010 |isbn=978-0-13-603313-4 |pages=26–27 |url=https://books.google.com/books?id=Bc1FAQAAIAAJ&pg=PA26 |access-date=2016-02-28 }}</ref>
The probability that any one of the events {1,6}, {3}, or {2,4} will occur is 5/6. This is the same as saying that the probability of event {1,2,3,4,6} is 5/6. This event encompasses the possibility of any number except five being rolled. The mutually exclusive event {5} has a probability of 1/6, and the event {1,2,3,4,5,6} has a probability of 1, that is, absolute certainty.
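For illustration, the die example above can be encoded directly; the following is a minimal sketch (Python standard library only, with illustrative names) checking the additivity requirement for mutually exclusive events.

<syntaxhighlight lang="python">
from fractions import Fraction

# Sample space of a fair six-sided die: each outcome has probability 1/6.
sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: Fraction(1, 6) for outcome in sample_space}

def prob(event):
    """Probability of an event (a subset of the sample space)."""
    return sum(p[outcome] for outcome in event)

# Mutually exclusive events: no two share an outcome.
a, b, c = {1, 6}, {3}, {2, 4}
assert a & b == a & c == b & c == set()

# Additivity: the probability of the union is the sum of the probabilities.
assert prob(a | b | c) == prob(a) + prob(b) + prob(c) == Fraction(5, 6)

# The event made up of all possible results has probability 1 (certainty).
assert prob(sample_space) == 1
</syntaxhighlight>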
When doing calculations using the outcomes of an experiment, it is necessary that all those [[elementary event]]s have a number assigned to them. This is done using a [[random variable]]. A random variable is a function that assigns to each elementary event in the sample space a [[real number]]. This function is usually denoted by a capital letter.<ref>{{Cite book |title =Introduction to Probability and Mathematical Statistics |last1 =Bain |first1 =Lee J. |last2 =Engelhardt |first2 =Max |publisher =Brooks/Cole |___location =[[Belmont, California]] |page =53 |isbn =978-0-534-38020-5 |edition =2nd |date =1992 }}</ref> In the case of a die, the assignment of a number to certain elementary events can be done using the [[identity function]]. This does not always work. For example, when [[coin flipping|flipping a coin]] the two possible outcomes are "heads" and "tails". In this example, the random variable ''X'' could assign to the outcome "heads" the number "0" (<math>X(\text{heads})=0</math>) and to the outcome "tails" the number "1" (<math>X(\text{tails})=1</math>).
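In code terms, a random variable is nothing more than a function from outcomes to numbers; a minimal sketch of the coin example (hypothetical names):

<syntaxhighlight lang="python">
# A random variable assigns a real number to each elementary event.
# Here, a coin-flip sample space {"heads", "tails"} is mapped to {0, 1}.
def X(outcome: str) -> float:
    return {"heads": 0.0, "tails": 1.0}[outcome]

assert X("heads") == 0.0 and X("tails") == 1.0
</syntaxhighlight>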
===Discrete probability distributions===
{{Main|Discrete probability distribution}}
[[File:NYW-DK-Poisson(5).svg|thumb|300px|The [[Poisson distribution]], a discrete probability distribution]]
{{em|Discrete probability theory}} deals with events that occur in [[countable]] sample spaces.
So, the probability of the entire sample space is 1, and the probability of the null event is 0.
The function <math>f(x)\,</math> mapping a point in the sample space to the "probability" value is called a {{em|probability mass function}}, abbreviated as {{em|pmf}}.
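For illustration, the following is a minimal sketch (Python standard library only; the truncation point is arbitrary) of the pmf of the Poisson distribution pictured above, checking that its values sum to 1 over the countable sample space.

<syntaxhighlight lang="python">
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Probability mass function of a Poisson(lam) random variable."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# The pmf values over the countable sample space {0, 1, 2, ...} sum to 1;
# truncating at k = 100 is more than enough for lam = 5.
total = sum(poisson_pmf(k, 5.0) for k in range(100))
assert abs(total - 1.0) < 1e-12
</syntaxhighlight>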
===Continuous probability distributions===
{{Main|Continuous probability distribution}}
[[File:Gaussian distribution 2.jpg|thumb|300px|The [[normal distribution]], a continuous probability distribution]]
{{em|Continuous probability theory}} deals with events that occur in a continuous sample space.
# <math>\lim_{x\rightarrow \infty} F(x)=1\,.</math>
The random variable <math>X</math> is said to have a continuous probability distribution if the corresponding CDF <math>F</math> is continuous. If <math>F\,</math> is [[absolutely continuous]], then its derivative exists almost everywhere, and integrating the derivative gives the CDF back again; in this case, the random variable ''X'' is said to have a {{em|probability density function}} ({{em|PDF}}) or simply {{em|density}} <math>f(x)=\frac{dF(x)}{dx}\,.</math>
For a set <math>E \subseteq \mathbb{R}</math>, the probability of the random variable ''X'' being in <math>E\,</math> is

:<math>P(X\in E) = \int_{x\in E} dF(x)\,.</math>
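For illustration, a minimal sketch (standard library only) computing such a probability for a standard normal random variable, whose CDF can be expressed through the [[error function]]:

<syntaxhighlight lang="python">
import math

def normal_cdf(x: float) -> float:
    """CDF of the standard normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# P(X in [-1, 1]) for X ~ N(0, 1): about 0.6827 (the "one sigma" rule).
p = normal_cdf(1.0) - normal_cdf(-1.0)
assert abs(p - 0.6827) < 1e-3
</syntaxhighlight>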
===Measure-theoretic probability theory===
The ''raison d'être'' of the measure-theoretic treatment of probability is that it unifies the discrete and the continuous cases, and makes the difference a question of which measure is used. Furthermore, it covers distributions that are neither discrete nor continuous nor mixtures of the two.
An example of such distributions could be a mix of discrete and continuous distributions—for example, a random variable that is 0 with probability 1/2, and takes a random value from a normal distribution with probability 1/2. It can still be studied to some extent by considering it to have a PDF of <math>(\delta[x] + \varphi(x))/2</math>, where <math>\delta[x]</math> is the [[Dirac delta function]].
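The following simulation sketch (illustrative, standard library only) draws from this mixed distribution; about half of the samples land exactly on 0, an atom that no purely continuous distribution produces.

<syntaxhighlight lang="python">
import random

random.seed(0)

def mixed_sample() -> float:
    """Draw from the mixed distribution: 0 w.p. 1/2, else N(0, 1)."""
    return 0.0 if random.random() < 0.5 else random.gauss(0.0, 1.0)

samples = [mixed_sample() for _ in range(100_000)]

# The distribution has an atom at 0: about half the draws are exactly 0.
atom_fraction = sum(1 for s in samples if s == 0.0) / len(samples)
assert abs(atom_fraction - 0.5) < 0.01
</syntaxhighlight>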
Along with providing better understanding and unification of discrete and continuous probabilities, measure-theoretic treatment also allows us to work on probabilities outside <math>\mathbb{R}^n</math>, as in the theory of [[stochastic process]]es. For example, to study [[Brownian motion]], probability is defined on a space of functions.
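For illustration, a sketch (not a rigorous construction) approximating one such random function, a Brownian path on [0, 1], by a scaled Gaussian random walk:

<syntaxhighlight lang="python">
import math
import random

random.seed(1)

def brownian_path(n: int) -> list[float]:
    """Approximate a Brownian path on [0, 1] by n scaled Gaussian steps."""
    path = [0.0]
    for _ in range(n):
        # Each increment over a step of length 1/n has variance 1/n.
        path.append(path[-1] + random.gauss(0.0, math.sqrt(1.0 / n)))
    return path

# Each call draws one "outcome" from a probability space whose points
# are entire functions [0, 1] -> R, not single real numbers.
w = brownian_path(1000)
print(f"W(1) ≈ {w[-1]:+.3f}")  # distributed approximately N(0, 1)
</syntaxhighlight>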
When it is convenient to work with a dominating measure, the [[Radon–Nikodym theorem]] is used to define a density as the Radon–Nikodym derivative of the probability distribution of interest with respect to this dominating measure. Discrete densities are usually defined as this derivative with respect to a [[counting measure]] over the set of all possible outcomes. Densities for [[absolutely continuous]] distributions are usually defined as this derivative with respect to the [[Lebesgue measure]]. If a theorem can be proved in this general setting, it holds for both discrete and continuous distributions as well as others; separate proofs are not required for discrete and continuous distributions.
==Classical probability distributions==
In probability theory, there are several notions of convergence for [[random variable]]s. They are listed below in the order of strength, i.e., any subsequent notion of convergence in the list implies convergence according to all of the preceding notions.
;Weak convergence: A sequence of random variables <math>X_1,X_2,\dots,\,</math> converges {{em|weakly}} to the random variable <math>X\,</math> if their respective [[cumulative distribution function]]s <math>F_1,F_2,\dots\,</math> converge to the cumulative distribution function <math>F\,</math> of <math>X\,</math>, wherever <math>F\,</math> is [[continuous function|continuous]]. Weak convergence is also called {{em|convergence in distribution}}.
:Most common shorthand notation: <math>\displaystyle X_n \, \xrightarrow{\mathcal D} \, X</math>
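For illustration, a small numerical sketch (illustrative names): if <math>X_n</math> is uniform on <math>\{1/n, 2/n, \dots, 1\}</math>, its CDF converges pointwise to <math>F(x) = x</math>, the CDF of the uniform distribution on <math>[0,1]</math>, so <math>X_n</math> converges weakly to that uniform random variable.

<syntaxhighlight lang="python">
def cdf_discrete_uniform(x: float, n: int) -> float:
    """CDF of a random variable uniform on {1/n, 2/n, ..., 1}."""
    return min(max(int(x * n), 0), n) / n

# The discrete CDFs approach F(x) = x on [0, 1] as n grows.
for n in (10, 100, 1000):
    gap = max(abs(cdf_discrete_uniform(x / 100, n) - x / 100)
              for x in range(101))
    print(f"n = {n:4d}: max CDF gap = {gap:.4f}")  # shrinks like 1/n
</syntaxhighlight>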
The {{em|law of large numbers}} (LLN) states that the sample average
:<math>\overline{X}_n=\frac1n{\sum_{k=1}^n X_k}</math>
of a [[sequence]] of [[independent and identically distributed random variables]] <math>X_k</math> converges towards their common [[Expected value|expectation]] (expected value) <math>\mu</math>, provided that the expectation of <math>|X_k|</math> is finite.
It is the different forms of [[convergence of random variables]] that separate the ''weak'' and the ''strong'' law of large numbers.<ref>{{Cite book|last=Dekking|first=Michel|url=http://archive.org/details/modernintroducti00fmde|title=A modern introduction to probability and statistics : understanding why and how|date=2005|publisher=London : Springer|others=Library Genesis|isbn=978-1-85233-896-1|pages=180–194|chapter=Chapter 13: The law of large numbers}}</ref>
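For illustration, a simulation sketch (standard library only) in which the sample average of i.i.d. fair-coin flips approaches the common expectation <math>\mu = 1/2</math>:

<syntaxhighlight lang="python">
import random

random.seed(42)

# X_k: i.i.d. Bernoulli(1/2) coin flips, with common expectation mu = 0.5.
flips = [random.randint(0, 1) for _ in range(100_000)]

# The sample average over the first n flips approaches 0.5 as n grows.
for n in (10, 1000, 100_000):
    avg = sum(flips[:n]) / n
    print(f"n = {n:6d}: sample average = {avg:.4f}")
</syntaxhighlight>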
==See also==
{{Portal|Mathematics}}
* {{Annotated link|Expected value}}
* {{Annotated link|Fuzzy measure theory}}
* {{Annotated link|Mathematical statistics}}
* [[Notation in probability]]
* [[Pairwise independence#Probability of the union of pairwise independent events|Probability of the union of pairwise independent events]]
* [[Predictive modelling]]
* [[Probabilistic logic]] – a combination of probability theory and logic
* [[Probabilistic proofs of non-probabilistic theorems]]
* {{Annotated link|Probability axioms}}
* {{Annotated link|Probability distribution}}
* [[Statistical independence]]
* [[Statistical physics]]
* [[Subjective logic]]

=== Lists ===
* [[Catalog of articles in probability theory]]
* [[List of probability topics]]
* [[List of publications in statistics]]
* [[List of statistical topics]]
== References ==
{{DEFAULTSORT:Probability Theory}}
[[Category:Probability theory| ]]