Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability theory describing such behaviour are the [[law of large numbers]] and the [[central limit theorem]].
As a mathematical foundation for [[statistics]], probability theory is essential to many human activities that involve quantitative analysis of data.<ref>[http://home.ubalt.edu/ntsbarsh/stat-data/Topics.htm Inferring From Data]</ref> Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in [[statistical mechanics]] or [[sequential estimation]]. A great discovery of twentieth-century [[physics]] was the probabilistic nature of physical phenomena at atomic scales, described in [[quantum mechanics]].
==History of probability==
===Motivation===
Consider an [[Experiment (probability theory)|experiment]] that can produce a number of outcomes. The set of all outcomes is called the ''[[sample space]]'' of the experiment. The ''[[power set]]'' of the sample space (or equivalently, the event space) is formed by considering all different collections of possible results. For example, rolling an honest die produces one of six possible results. One collection of possible results corresponds to getting an odd number. Thus, the subset {1,3,5} is an element of the power set of the sample space of dice rolls. These collections are called ''events''. In this case, {1,3,5} is the event that the die falls on some odd number. If the results that actually occur fall in a given event, that event is said to have occurred.
Probability is a [[Function (mathematics)|way of assigning]] every "event" a value between zero and one, with the requirement that the event made up of all possible results (in our example, the event {1,2,3,4,5,6}) be assigned a value of one. To qualify as a [[probability distribution]], the assignment of values must satisfy the requirement that, for any collection of mutually exclusive events (events that contain no common results, e.g., the events {1,6}, {3}, and {2,4} are all mutually exclusive), the probability that at least one of these events occurs is given by the sum of the probabilities of the individual events.<ref>{{cite book |last=Ross |first=Sheldon |title=A First Course in Probability |publisher=Pearson Prentice Hall |edition=8th |year=2010 |isbn=978-0-13-603313-4 |pages=26–27 |url=https://books.google.com/books?id=Bc1FAQAAIAAJ&pg=PA26 |access-date=2016-02-28 }}</ref>
The probability that any one of the events {1,6}, {3}, or {2,4} will occur is 5/6. This is the same as saying that the probability of event {1,2,3,4,6} is 5/6. This event encompasses the possibility of any number except five being rolled. The mutually exclusive event {5} has a probability of 1/6, and the event {1,2,3,4,5,6} has a probability of 1, that is, absolute certainty.
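The additivity rule for mutually exclusive events can be checked directly for the die example above. The following is a minimal Python sketch using exact rational arithmetic (the names `prob` and `events` are illustrative, not from any standard library):

```python
from fractions import Fraction

# Sample space of a fair die: each outcome has probability 1/6.
sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: Fraction(1, 6) for outcome in sample_space}

def prob(event):
    """Probability of an event (a subset of the sample space)."""
    return sum(p[outcome] for outcome in event)

# Mutually exclusive events: the probability of their union is the
# sum of their individual probabilities.
events = [{1, 6}, {3}, {2, 4}]
union = set().union(*events)  # {1, 2, 3, 4, 6}
assert prob(union) == sum(prob(e) for e in events) == Fraction(5, 6)

# The certain event has probability 1; the complement {5} has 1/6.
assert prob(sample_space) == 1
assert prob({5}) == Fraction(1, 6)
```

Using `Fraction` rather than floating point keeps the check exact, so 2/6 + 1/6 + 2/6 equals 5/6 without rounding error.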
When doing calculations using the outcomes of an experiment, it is necessary that all those [[elementary event]]s have a number assigned to them. This is done using a [[random variable]]. A random variable is a function that assigns to each elementary event in the sample space a [[real number]]. This function is usually denoted by a capital letter.<ref>{{Cite book |title =Introduction to Probability and Mathematical Statistics |last1 =Bain |first1 =Lee J. |last2 =Engelhardt |first2 =Max |publisher =Brooks/Cole |___location =[[Belmont, California]] |page =53 |isbn =978-0-534-38020-5 |edition =2nd |date =1992 }}</ref> In the case of a die, the assignment of a number to certain elementary events can be done using the [[identity function]].
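As an illustration, a random variable is just a function from the sample space to the reals; a short Python sketch for the die-rolling experiment (both function names are illustrative):

```python
# A random variable assigns a real number to each elementary event.
# For a die, the identity map is a natural choice; Y below is a second
# random variable defined on the same sample space.
sample_space = [1, 2, 3, 4, 5, 6]

def X(outcome):
    """Identity random variable: the number shown on the die."""
    return float(outcome)

def Y(outcome):
    """Indicator random variable: 1.0 if the roll is even, else 0.0."""
    return 1.0 if outcome % 2 == 0 else 0.0

values_X = [X(w) for w in sample_space]  # [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
values_Y = [Y(w) for w in sample_space]  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
```

The capitalized names mirror the convention, noted above, of denoting random variables by capital letters.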
===Discrete probability distributions===
{{Main|Discrete probability distribution}}
[[File:NYW-DK-Poisson(5).svg|thumb|300px|The [[Poisson distribution]], a discrete probability distribution.]]
{{em|Discrete probability theory}} deals with events that occur in [[countable]] sample spaces.
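As a concrete discrete example, the probability mass function of the Poisson distribution pictured above can be evaluated directly; a minimal sketch (the function name is illustrative):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with rate lam."""
    return lam**k * exp(-lam) / factorial(k)

# The sample space {0, 1, 2, ...} is countable; summing the pmf over a
# long initial segment recovers (essentially) the total probability 1,
# since the tail beyond k = 100 is negligible for lam = 5.
lam = 5.0
total = sum(poisson_pmf(k, lam) for k in range(100))
assert abs(total - 1.0) < 1e-12
```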
# <math>\lim_{x\rightarrow \infty} F(x)=1\,.</math>
The random variable <math>X</math> is said to have a continuous probability distribution if the corresponding CDF <math>F</math> is continuous. If <math>F\,</math> is [[absolutely continuous]], then its derivative exists almost everywhere, and integrating the derivative gives the CDF back. In this case, the random variable <math>X</math> is said to have a ''[[probability density function]]'' (PDF) or simply ''density'' <math>f(x)=\frac{dF(x)}{dx}\,.</math>
For a set <math>E \subseteq \mathbb{R}</math>, the probability of the random variable ''X'' being in <math>E\,</math> is
:<math>P(X\in E) = \int_{x\in E} dF(x)\,.</math>
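The relationship between a density and its CDF can be checked numerically for a specific distribution. The sketch below uses the exponential distribution with rate 1 as an illustrative example (not one singled out in the article): its density is <math>f(x)=e^{-x}</math> and its CDF is <math>F(x)=1-e^{-x}</math> for <math>x \ge 0</math>.

```python
from math import exp

def f(x):
    """Density of the exponential distribution with rate 1."""
    return exp(-x) if x >= 0 else 0.0

def F(x):
    """CDF of the exponential distribution with rate 1."""
    return 1.0 - exp(-x) if x >= 0 else 0.0

# P(X in [a, b]) = F(b) - F(a); check against a midpoint-rule
# numerical integration of the density over [a, b].
a, b = 0.5, 2.0
n = 100_000
dx = (b - a) / n
integral = sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx
assert abs(integral - (F(b) - F(a))) < 1e-6

# F satisfies the CDF properties: F(0) = 0 and F(x) -> 1 as x grows.
assert F(0) == 0.0
assert abs(F(50.0) - 1.0) < 1e-12
```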
Along with providing better understanding and unification of discrete and continuous probabilities, measure-theoretic treatment also allows us to work on probabilities outside <math>\mathbb{R}^n</math>, as in the theory of [[stochastic process]]es. For example, to study [[Brownian motion]], probability is defined on a space of functions.
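A sample path of Brownian motion can be approximated on a grid by summing independent Gaussian increments; the following is a minimal sketch (the step count and seed are arbitrary choices):

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Approximate one sample path of Brownian motion on [0, 1] with n steps:
# each increment over an interval of length dt is Gaussian with mean 0
# and variance dt (standard deviation sqrt(dt)).
n = 1000
dt = 1.0 / n
path = [0.0]
for _ in range(n):
    path.append(path[-1] + random.gauss(0.0, dt ** 0.5))

# Each realization is an entire function t -> B(t); "path" holds its
# values on the grid 0, dt, 2*dt, ..., 1, which is why probability for
# Brownian motion is defined on a space of functions.
assert len(path) == n + 1 and path[0] == 0.0
```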
When it is convenient to work with a dominating measure, the [[Radon–Nikodym theorem]] is used to define a density as the Radon–Nikodym derivative of the probability distribution of interest with respect to this dominating measure. Discrete densities are usually defined as this derivative with respect to a [[counting measure]] over the set of all possible outcomes, while densities for absolutely continuous distributions are usually defined as this derivative with respect to the [[Lebesgue measure]].
==Classical probability distributions==
In probability theory, there are several notions of convergence for [[random variable]]s. They are listed below in the order of strength, i.e., any subsequent notion of convergence in the list implies convergence according to all of the preceding notions.
;Weak convergence: A sequence of random variables <math>X_1,X_2,\dots,\,</math> converges {{em|weakly}} to the random variable <math>X\,</math> if their respective cumulative distribution functions <math>F_1,F_2,\dots\,</math> converge to the cumulative distribution function <math>F\,</math> of <math>X\,</math> wherever <math>F\,</math> is [[continuous function|continuous]]. Weak convergence is also called {{em|convergence in distribution}}.
:Most common shorthand notation: <math>\displaystyle X_n \, \xrightarrow{\mathcal D} \, X</math>
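The role of the continuity points of <math>F\,</math> in this definition can be illustrated with the point masses <math>X_n = 1/n</math>, which converge weakly to <math>X = 0</math>; a small sketch (this example is chosen for illustration, not taken from the article):

```python
def F_n(x, n):
    """CDF of X_n, a point mass at 1/n: jumps from 0 to 1 at x = 1/n."""
    return 1.0 if x >= 1.0 / n else 0.0

def F(x):
    """CDF of X, a point mass at 0: jumps from 0 to 1 at x = 0."""
    return 1.0 if x >= 0 else 0.0

# At every continuity point of F, F_n(x) converges to F(x):
assert F_n(-0.5, 10**6) == F(-0.5) == 0.0
assert F_n(0.5, 10**6) == F(0.5) == 1.0

# At the discontinuity x = 0, F_n(0) = 0 for every n while F(0) = 1,
# which is exactly why the definition only requires convergence
# wherever F is continuous.
assert F_n(0.0, 10**6) == 0.0 and F(0.0) == 1.0
```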
{{DEFAULTSORT:Probability Theory}}
[[Category:Probability theory| ]]