Probability bounds analysis

| isbn = 0-444-11037-2 }}
</ref> Also dating from the latter half of the [[19th century]], the [[Chebyshev_inequality|inequality]] attributed to [[Chebyshev]] described bounds on a distribution when only the mean and
variance of the variable are known, and the related [[Markov_inequality|inequality]] attributed to [[Andrey Markov|Markov]] found bounds on a
positive variable when only the mean is known.
[[Henry E. Kyburg Jr.|Kyburg]]<ref name="kyburg99">Kyburg, H.E., Jr. (1999). [http://www.sipta.org/documentation/interval_prob/kyburg.pdf Interval valued probabilities]. SIPTA Documentation on Imprecise Probability.</ref> reviewed the history
of interval probabilities and traced the development of the critical ideas through the [[20th century]], including the important notion of incomparable probabilities favored by [[Keynes]].
Of particular note is [[Maurice René Fréchet|Fréchet]]'s derivation in the [[1930s]] of bounds on calculations involving total probabilities without
 
The methods of probability bounds analysis that could be routinely used in
risk assessments were developed in the [[1980s]]. Hailperin<ref name=Hailperin86 /> described a computational scheme for bounding logical calculations extending the ideas of Boole. Yager<ref name=Yager>Yager, R.R. (1986). Arithmetic and other operations on Dempster–Shafer structures. ''International Journal of Man-Machine Studies'' '''25''': 357–366.</ref> described the elementary procedures by which bounds on [[convolution of probability distributions|convolutions]] can be computed under an assumption of independence. At about the same time, Makarov<ref name=Makarov>Makarov, G.D. (1981). Estimates for the distribution function of a sum of two random variables when the marginal distributions are fixed. ''Theory of Probability and Its Applications'' '''26''': 803–806.</ref>, and independently, Rüschendorf<ref>Rüschendorf, L. (1982). Random variables with maximum sums. ''Advances in Applied Probability'' '''14''': 623–632.</ref> solved the problem, originally posed by [[Kolmogorov]], of how to find the upper and lower bounds for the probability distribution of a sum of random variables whose marginal distributions, but not their joint distribution, are known. Frank et al.<ref name=Franketal87>Frank, M.J., R.B. Nelsen and B. Schweizer (1987). Best-possible bounds for the distribution of a sum&mdash;a problem of Kolmogorov. ''Probability Theory and Related Fields'' '''74''': 199–211.</ref> generalized the result of Makarov and expressed it in terms of [[Copula (probability theory)|copulas]]. Since that time, formulas and algorithms for sums have been generalized and extended to differences, products, quotients and other binary and unary functions under various dependence assumptions.<ref name=WilliamsonDowns>Williamson, R.C., and T. Downs (1990). Probabilistic arithmetic I: Numerical methods for calculating convolutions and dependency bounds. 
''International Journal of Approximate Reasoning'' '''4''': 89–158.</ref><ref name=Fersonetal03>Ferson, S., V. Kreinovich, L. Ginzburg, D.S. Myers, and K. Sentz. (2003). [http://www.ramas.com/unabridged.zip ''Constructing Probability Boxes and Dempster–Shafer Structures'']. SAND2002-4015. Sandia National Laboratories, Albuquerque, NM.</ref><ref>Berleant, D. (1993). Automatically verified reasoning with both intervals and probability density functions. ''Interval Computations'' '''1993 (2) ''': 48–70.</ref><ref>Berleant, D., G. Anderson, and C. Goodman-Strauss (2008). Arithmetic on bounded families of distributions: a DEnv algorithm tutorial. Pages 183–210 in ''Knowledge Processing with Interval and Soft Computing'', edited by C. Hu, R.B. Kearfott, A. de Korvin and V. Kreinovich, Springer (ISBN 978-1-84800-325-5).</ref><ref name=BerleantGoodmanStrauss>Berleant, D., and C. Goodman-Strauss (1998). Bounding the results of arithmetic operations on random variables of unknown dependency using intervals. ''Reliable Computing'' '''4''': 147–165.</ref><ref name=Fersonetal04>Ferson, S., R. Nelsen, J. Hajagos, D. Berleant, J. Zhang, W.T. Tucker, L. Ginzburg and W.L. Oberkampf (2004). [http://www.ramas.com/depend.pdf ''Dependence in Probabilistic Modeling, Dempster–Shafer Theory, and Probability Bounds Analysis'']. Sandia National Laboratories, SAND2004-3072, Albuquerque, NM.</ref> <!--
 
It is possible to mix very different kinds of knowledge together in a bounding analysis. For instance,
:''Z'' ~ <big>[ sup</big><sub>''x''+''y''=''z''</sub> max(''F''(''x'') + ''G''(''y'') &minus; 1, 0), <big>inf</big><sub>''x''+''y''=''z''</sub> min(''F''(''x'') + ''G''(''y''), 1) <big>]</big>.
 
These bounds are implied by the [[Copula (probability theory)#Fr.C3.A9chet.E2.80.93Hoeffding_copula_bounds|Fréchet–Hoeffding]] [[copula (probability theory)|copula]] bounds. The problem can also be solved using the methods of [[mathematical programming]].<ref name=BerleantGoodmanStrauss />
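These pointwise bounds can be evaluated numerically by scanning a grid of split points ''x'' (with ''y'' = ''z'' &minus; ''x''). The following is a minimal sketch, not an implementation from the cited literature; the function name and the choice of grid are illustrative assumptions.

```python
def makarov_bounds(F, G, z, xs):
    """Best-possible bounds on P(X + Y <= z) when only the marginal
    CDFs F and G are known and nothing is assumed about dependence.
    xs is a grid of candidate split points x (with y = z - x);
    it should cover the supports of both marginals."""
    # lower bound: sup over x+y=z of max(F(x) + G(y) - 1, 0)
    lower = max(max(F(x) + G(z - x) - 1.0, 0.0) for x in xs)
    # upper bound: inf over x+y=z of min(F(x) + G(y), 1)
    upper = min(min(F(x) + G(z - x), 1.0) for x in xs)
    return lower, upper

# Example: X, Y ~ Uniform(0, 1), bounds on P(X + Y <= 0.5)
F = lambda x: min(max(x, 0.0), 1.0)  # CDF of Uniform(0, 1)
G = F
xs = [i / 1000 for i in range(1001)]
lo, hi = makarov_bounds(F, G, 0.5, xs)  # lo, hi ~ (0.0, 0.5)
```

The interval [0, 0.5] brackets every dependence structure; under independence, for instance, P(''X''+''Y'' ≤ 0.5) = 0.125, which lies inside it.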
 
The convolution under the intermediate assumption that ''X'' and ''Y'' have [[positive quadrant dependence|positive dependence]] is likewise easy to compute, as is the convolution under the extreme assumptions of [[Comonotonicity|perfect positive]] or [[countermonotonicity|perfect negative]] dependency between ''X'' and ''Y''.<ref name=Fersonetal04 />
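The perfect-dependence cases are especially simple because quantile functions add under comonotonicity, and perfect negative dependence pairs the ''p''-th quantile of one variable with the (1 &minus; ''p'')-th quantile of the other. A minimal numerical sketch under these standard facts (the function names are illustrative, not taken from the references):

```python
def sum_under_dependence(qx, qy, comonotonic=True, n=10000):
    """Distribution of X + Y, as a sorted list of n quantiles, under
    perfect positive (comonotonic) or perfect negative
    (countermonotonic) dependence, from marginal quantile functions."""
    ps = [(i + 0.5) / n for i in range(n)]
    if comonotonic:
        # perfect positive dependence: Z(p) = qx(p) + qy(p)
        zs = [qx(p) + qy(p) for p in ps]
    else:
        # perfect negative dependence: Z has the law of qx(U) + qy(1 - U)
        zs = [qx(p) + qy(1.0 - p) for p in ps]
    return sorted(zs)

# Example with X, Y ~ Uniform(0, 1):
qu = lambda p: p                                          # quantile function
z_pos = sum_under_dependence(qu, qu, comonotonic=True)    # spreads over (0, 2)
z_neg = sum_under_dependence(qu, qu, comonotonic=False)   # concentrates at 1
```

In the uniform example the countermonotonic sum is the constant 1, while the comonotonic sum behaves like 2''X''; both lie within the Fréchet bounds above.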