{{no footnotes|date=October 2014}}
'''Information-based complexity''' ('''IBC''') studies optimal [[algorithms]] and [[Analysis of algorithms|computational complexity]] for continuous problems.
The goal of information-based complexity is to create a theory of computational complexity and optimal algorithms for problems with partial, contaminated and priced information, and to apply the results to answering questions in various disciplines. Examples of such disciplines include [[physics]], economics, mathematical finance, [[computer vision]], [[control theory]], [[geophysics]], [[medical imaging]], [[weather forecasting]] and [[Numerical weather prediction|climate prediction]], and [[statistics]]. The theory is developed over abstract spaces, typically [[Hilbert space|Hilbert]] or [[Banach space|Banach]] spaces, while the applications are usually for multivariate problems.
Since the information is partial and contaminated, only approximate solutions can be obtained. IBC studies computational complexity and optimal algorithms for approximate solutions in various settings. Since the worst case setting often leads to negative results such as unsolvability and intractability, settings with weaker assurances such as average, probabilistic and randomized are also studied. A fairly new area of IBC research is [[Continuous-variable quantum information|continuous quantum computing]].
==Overview==
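The following is a minimal illustrative sketch, not drawn from the IBC literature, of what partial and contaminated information means in the simplest case of univariate integration: the algorithm may evaluate the integrand at only finitely many points, each evaluation is corrupted by noise, and so only an approximate solution can be produced. The integrand, the noise level and the number of samples are arbitrary choices made for the example.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """The underlying integrand; the algorithm never sees it exactly."""
    return np.exp(-x ** 2)

n = 100        # partial information: only n function values are available
delta = 1e-3   # contaminated information: each value carries noise of size about delta

# The information: n noisy samples of f at points chosen by the algorithm.
nodes = (np.arange(n) + 0.5) / n                  # midpoint-rule nodes in [0, 1]
samples = f(nodes) + delta * rng.standard_normal(n)

# Combine the information into an approximate solution (a midpoint-rule estimate).
approximation = samples.mean()

exact = 0.7468241328124270                        # integral of exp(-x^2) over [0, 1]
print(f"error of the approximation: {abs(approximation - exact):.2e}")
</syntaxhighlight>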
Very high-dimensional integrals are common in finance. For example, computing expected cash flows for a [[collateralized mortgage obligation]] (CMO) requires the calculation of a number of <math>360</math>-dimensional integrals, the <math>360</math> being the number of months in <math>30</math> years. If a worst case assurance is required, the time is of order <math>\epsilon^{-d}</math> time units. Even if the error is not small, say <math>\epsilon=10^{-2}</math>, this is <math>10^{720}</math> time units. People in finance have long been using the [[Monte Carlo method]] (MC), an instance of a randomized algorithm. Then in 1994 a research group at [[Columbia University]] ([https://www.cs.columbia.edu/~ap Papageorgiou], Paskov, [https://www.cs.columbia.edu/~traub Traub], Woźniakowski) discovered that the [[quasi-Monte Carlo method|quasi-Monte Carlo]] (QMC) method using [[low-discrepancy sequence|low discrepancy sequences]] beat MC by one to three orders of magnitude. The results were reported to a number of Wall Street firms and were met with considerable initial skepticism. The results were first published by Paskov and [[Joseph F Traub|Traub]], ''Faster Valuation of Financial Derivatives'', Journal of Portfolio Management 22, 113-120. Today QMC is widely used in the financial sector to value financial derivatives.
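A minimal sketch of such a comparison is given below. It uses an illustrative smooth product integrand with decaying coordinate weights (a stand-in for the CMO integrand, whose early months matter most) and SciPy's scrambled Sobol generator as the low-discrepancy sequence; the dimension matches the CMO example, but the integrand, weights and sample size are assumptions made for the illustration, not details of the Paskov–Traub computation. The exact value of the test integral is <math>1</math>, so the two errors can be compared directly.
<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import qmc

d = 360                                   # number of monthly time steps in 30 years
gamma = 1.0 / np.arange(1, d + 1) ** 2    # illustrative decaying coordinate weights

def f(x):
    """Smooth product test integrand; its integral over [0,1]^d is exactly 1."""
    return np.prod(1.0 + gamma * (x - 0.5), axis=1)

n = 2 ** 14                               # number of sample points for both methods
rng = np.random.default_rng(0)

# Monte Carlo: independent uniform random points.
mc_error = abs(f(rng.random((n, d))).mean() - 1.0)

# Quasi-Monte Carlo: scrambled Sobol (low-discrepancy) points.
qmc_points = qmc.Sobol(d=d, scramble=True, seed=0).random(n)
qmc_error = abs(f(qmc_points).mean() - 1.0)

print(f"MC  error: {mc_error:.2e}")
print(f"QMC error: {qmc_error:.2e}")
</syntaxhighlight>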
These results are empirical; where does computational complexity come in? QMC is not a panacea for all high-dimensional integrals. What is special about financial derivatives? Here's a possible explanation. The <math>360</math> dimensions in the CMO represent monthly future times. Due to the discounted value of money, variables representing times far in the future are less important than variables representing nearby times. Thus the integrals are non-isotropic. Sloan and Woźniakowski introduced the very powerful idea of weighted spaces, which is a formalization of the above observation. They were able to show that, with this additional ___domain knowledge, high-dimensional integrals satisfying certain conditions were tractable even in the worst case! In contrast, the Monte Carlo method gives only a stochastic assurance. See Sloan and Woźniakowski, ''When are Quasi-Monte Carlo Algorithms Efficient for High Dimensional Integration?'', J. Complexity 14, 1-33, 1998. For which classes of integrals is QMC superior to MC? This continues to be a major research problem.
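The effect of non-isotropy can be illustrated with a small experiment of the kind sketched below; the weights used are arbitrary illustrative choices and are not the precise conditions of the Sloan–Woźniakowski theorem. The same low-discrepancy rule is applied to a product integrand twice: once with all coordinates equally important, and once with the importance of coordinate <math>j</math> decaying like <math>1/j^2</math>.
<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import qmc

d, n = 64, 2 ** 13                                  # illustrative dimension and sample size

def product_integrand(x, weights):
    """Product integrand; its integral over [0,1]^d is exactly 1 for any weights."""
    return np.prod(1.0 + weights * (x - 0.5), axis=1)

points = qmc.Sobol(d=d, scramble=True, seed=1).random(n)

equal_weights = np.full(d, 0.5)                     # isotropic: every coordinate matters equally
decaying_weights = 0.5 / np.arange(1, d + 1) ** 2   # non-isotropic: later coordinates matter less

for name, w in [("equal weights", equal_weights), ("decaying weights", decaying_weights)]:
    error = abs(product_integrand(points, w).mean() - 1.0)
    print(f"QMC error with {name}: {error:.2e}")
</syntaxhighlight>
With decaying weights the quasi-Monte Carlo error is typically orders of magnitude smaller than with equal weights, mirroring the observation that the CMO integrand is dominated by its early, near-term coordinates.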
==Brief history==
Precursors to IBC may be found in the 1950s in the work of Kiefer, Sard, and Nikolskij. In 1959 [[Joseph F Traub|Traub]] had the key insight that the optimal algorithm and the computational complexity of solving a continuous problem depended on the available information. He applied this insight to the solution of [[nonlinearity|nonlinear equations]], which started the area of optimal iteration theory. This research was published in the 1964 monograph ''Iterative Methods for the Solution of Equations.''
The general setting for information-based complexity was formulated by Traub and Woźniakowski in 1980 in ''A General Theory of Optimal Algorithms.'' For a list of more recent monographs and pointers to the extensive literature, see To Learn More below.