Wiener–Khinchin theorem

{{Short description|Spectral decomposition theorem of stationary processes' autocorrelations}}
In [[applied mathematics]], the '''Wiener–Khinchin theorem''' or '''Wiener–Khintchine theorem''', also known as the '''Wiener–Khinchin–Einstein theorem''' or the '''Khinchin–Kolmogorov theorem''', states that the [[autocorrelation]] function of a [[wide-sense-stationary random process]] has a [[Spectral theorem|spectral decomposition]] given by the [[power spectrum|power spectral density]] of that process.<ref name="C. Chatfield 1989 94–95">{{cite book | title = The Analysis of Time Series—An Introduction | author = C. Chatfield | edition = fourth | publisher = Chapman and Hall, London | year = 1989 | isbn=0-412-31820-2 | pages = 94–95}}</ref><ref>{{cite book | title = Time Series | author = Norbert Wiener | publisher = M.I.T. Press, Cambridge, Massachusetts | year = 1964 | page = 42}}</ref><ref>Hannan, E.J., "Stationary Time Series", in: John Eatwell, Murray Milgate, and Peter Newman, editors, ''The New Palgrave: A Dictionary of Economics. Time Series and Statistics'', Macmillan, London, 1990, p. 271.</ref><ref>{{cite book | title = Echo Signal Processing | author = Dennis Ward Ricker | publisher = Springer | year = 2003 | isbn = 1-4020-7395-X | url = https://books.google.com/books?id=NF2Tmty9nugC&pg=PA23 }}</ref><ref>{{cite book | title = Digital and Analog Communications Systems | author = Leon W. Couch II | edition = sixth | publisher = Prentice Hall, New Jersey | year = 2001 | isbn=0-13-522583-3| pages = 406–409}}</ref><ref>{{cite book | title = Wireless Technologies: Circuits, Systems, and Devices | author = Krzysztof Iniewski | publisher = CRC Press | year = 2007 | isbn = 978-0-8493-7996-3 | url = https://books.google.com/books?id=JJXrpazX9FkC&pg=PA390 }}</ref><ref>{{cite book | title = Statistical Optics | author = Joseph W. Goodman | publisher = Wiley-Interscience | year = 1985 | isbn=0-471-01502-4}}</ref>
 
==History==
[[Norbert Wiener]] proved this [[theorem]] for the case of a deterministic function in 1930;<ref>{{cite journal|last=Wiener|first=Norbert|title=Generalized Harmonic Analysis|journal=Acta Mathematica|year=1930|volume=55|pages=117–258|doi=10.1007/bf02546511|doi-access=free}}</ref> [[Aleksandr Khinchin]] later formulated an analogous result for stationary stochastic processes and published that probabilistic analogue in 1934.<ref name="Champeney">{{cite book|author=D.C. Champeney|title=A Handbook of Fourier Theorems|url=https://archive.org/details/handbookoffourie00cham_0|url-access=registration|year=1987|publisher=Cambridge University Press|chapter=Power spectra and Wiener's theorems|page=[https://archive.org/details/handbookoffourie00cham_0/page/102 102] |isbn=9780521265034 |quote=Wiener's basic theory of 'generalised harmonic analysis' is in no way probabilistic, and the theorems apply to single well defined functions rather than to ensembles of functions [...] A further development of these ideas occurs in the work of A. I. Khintchine (1894–1959) on stationary random processes (or stochastic processes) [...] in contexts in which it is not important to distinguish the two approaches the theory is often referred to as the Wiener—Khintchine theory. }}</ref><ref>{{cite journal|last=Khintchine|first=Alexander|title=Korrelationstheorie der stationären stochastischen Prozesse |journal=[[Mathematische Annalen]]|year=1934 |volume=109 |issue=1 |pages=604–615 |doi=10.1007/BF01449156 |s2cid=122842868}}</ref> [[Albert Einstein]] explained, without proofs, the idea in a brief two-page memo in 1914.<ref name="Einstein1914">{{cite journal |last=Einstein |first=Albert |year=1914 |title=Méthode pour la détermination de valeurs statistiques d'observations concernant des grandeurs soumises à des fluctuations irrégulières |url=https://gallica.bnf.fr/ark:/12148/bpt6k2991413 |journal=Archives des Sciences |volume=37 |pages=254–256 |bibcode=1914ArS....37..254E}}</ref><ref>{{cite book|title=The Legacy of Norbert Wiener: A Centennial Symposium (Proceedings of Symposia in Pure Mathematics)|page=95|first1=David|last1=Jerison|first2=Isadore Manuel|last2=Singer|first3=Daniel W.|last3=Stroock|publisher=American Mathematical Society|year=1997|isbn=0-8218-0415-4}}</ref>
 
==Continuous-time process==
{{see also|Fourier transform#Fourier-Stieltjes transform}}
 
For continuous time, the Wiener–Khinchin theorem says that if <math> x </math> is a [[wide-sense-stationary random process]] whose [[autocorrelation function]] (sometimes called [[autocovariance]]), defined in terms of the statistical [[expected value]], satisfies
<math display="block">r_{xx}(\tau) = \mathbb{E}\big[x(t)^* x(t - \tau)\big] < \infty, \quad \forall \tau,t \in \mathbb{R},</math>
where the asterisk denotes [[complex conjugate]], then there exists a [[monotonic function|monotone function]] <math> F(f) </math> in the frequency ___domain <math> -\infty < f < \infty </math>, or [[Riesz–Markov–Kakutani representation theorem|equivalently]] a non-negative [[Radon measure]] <math>\mu</math> on the frequency ___domain, such that
<math display="block"> r_{xx} (\tau) = \int_{-\infty}^\infty e^{2\pi i\tau f}\mu(df) = \int_{-\infty}^\infty e^{2\pi i\tau f} dF(f) , </math>
where the integral is a [[Riemann–Stieltjes integral]].<ref name="C. Chatfield 1989 94–95"/><ref>{{cite book |last=Hannan |first=E. J. |chapter=Stationary Time Series |editor-first=John |editor-last=Eatwell |editor2-first=Murray |editor2-last=Milgate |editor3-first=Peter |editor3-last=Newman |title=The New Palgrave: A Dictionary of Economics. Time Series and Statistics |publisher=Macmillan |___location=London |year=1990 |page=271 |isbn=9781349208654 |chapter-url=https://books.google.com/books?id=CUevCwAAQBAJ&pg=PA271 }}</ref> This is a kind of [[Spectral_theorem|spectral decomposition]] of the auto-correlation function. <math>F</math> is called the power spectral distribution function and is a statistical distribution function. It is sometimes called the integrated spectrum.
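
A simple example in which <math>F</math> has a jump, so that no ordinary spectral density exists, is a sinusoid with random phase, <math>x(t) = A\cos(2\pi f_0 t + \theta)</math> with <math>\theta</math> uniformly distributed on <math>[0, 2\pi)</math>. This process is wide-sense stationary with
<math display="block"> r_{xx}(\tau) = \frac{A^2}{2}\cos(2\pi f_0 \tau) = \int_{-\infty}^\infty e^{2\pi i\tau f}\,\mu(df), \qquad \mu = \frac{A^2}{4}\left(\delta_{-f_0} + \delta_{f_0}\right), </math>
so the spectral measure consists of two point masses and <math>F</math> is a step function with jumps of height <math>A^2/4</math> at <math>\pm f_0</math>. The Riemann–Stieltjes formulation is needed precisely to accommodate such discrete spectral components.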
 
The ordinary Fourier transform of <math>x(t)</math> does not exist in general, because the sample functions of a stochastic process are usually not [[Absolutely integrable function|absolutely integrable]]. Nor is <math> r_{xx} </math> assumed to be absolutely integrable, so it need not have a Fourier transform either.
 
However, if the measure <math> \mu(df) = dF(f) </math> is [[absolute continuity | absolutely continuous]] (e.g. if the process is purely indeterministic), then <math> F </math> is differentiable [[almost everywhere]] and has a [[Radon-Nikodym derivative]] given by
<math display="block"> S(f)= \frac{\mu(df)}{df}.</math>
In this case, one can determine <math> S(f)</math>, the [[power spectral density]] of <math>x(t)</math>, by taking the averaged derivative of <math> F </math>. Because the left and right derivatives of <math> F </math> exist everywhere, we can put
<math display="block"> S(f) = \frac12 \left(\lim_{\varepsilon \downarrow 0} \frac1\varepsilon \big(F(f + \varepsilon) - F(f)\big) + \lim_{\varepsilon \uparrow 0} \frac1\varepsilon \big(F(f + \varepsilon) - F(f)\big)\right)</math>
everywhere,<ref>{{cite book | title = The Analysis of Time Series—An Introduction |first = C. |last=Chatfield | edition = Fourth | publisher = Chapman and Hall |___location=London | year = 1989 | isbn=0-412-31820-2 | page = 96 |url=https://books.google.com/books?id=qKzyAbdaDFAC }}</ref> (obtaining that ''F'' is the integral of its averaged derivative<ref>{{cite book | title = A Handbook of Fourier Theorems | first = D. C. |last=Champeney | publisher = Cambridge Univ. Press | year = 1987 | pages = 20–22 |url=https://books.google.com/books?id=IkQoC1ava5sC&pg=PA20 | isbn = 9780521366885 }}</ref>), and the theorem simplifies to
<math display="block"> r_{xx} (\tau) = \int_{-\infty}^\infty e^{2\pi i\tau f} \, S(f)df. </math>
Assuming that <math>r</math> and <math>S</math> are [[Fourier_inversion_theorem#Conditions_on_the_function|"sufficiently nice"]] such that the Fourier inversion theorem is valid, the Wiener–Khinchin theorem takes the simple form of saying that <math>r</math> and <math>S</math> are a [[Fourier_transform#Definition|Fourier transform pair]], and
<math display="block"> S(f) = \int_{-\infty}^\infty r_{xx} (\tau) e^{-2\pi i f\tau} \,d\tau. </math>
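
For example, an exponentially decaying autocorrelation function, such as that of the [[Ornstein–Uhlenbeck process]], corresponds to a Lorentzian power spectral density:
<math display="block"> r_{xx}(\tau) = \sigma^2 e^{-a|\tau|}, \quad a > 0, \qquad \Longrightarrow \qquad S(f) = \int_{-\infty}^\infty \sigma^2 e^{-a|\tau|} e^{-2\pi i f\tau}\, d\tau = \frac{2a\sigma^2}{a^2 + (2\pi f)^2}. </math>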
 
==Discrete-time process==
{{see also|Wiener's lemma}}
 
For the discrete-time case, the power spectral density of the function with discrete values <math>x_n</math> is
 
:<math> S(\omega)=\frac{1}{2\pi} \sum_{k=-\infty}^\infty r_{xx}(k)e^{-i \omega k} </math>
 
where <math>\omega = 2 \pi f</math> is the angular frequency, <math>i</math> is used to denote the [[imaginary unit]] (in engineering, sometimes the letter <math>j</math> is used instead) and <math>r_{xx}(k) </math> is the discrete [[autocorrelation function]] of <math>x_n</math>, defined in its deterministic or stochastic formulation.
 
Provided <math>r_{xx}</math> is absolutely summable, i.e.
 
:<math> \sum_{k=-\infty}^\infty |r_{xx}(k)| < +\infty, </math>
 
the result of the theorem then can be written as
 
:<math> r_{xx}(k) = \int_{-\pi}^{\pi} e^{i k \omega} S(\omega) \, d\omega </math>
 
Because <math>x_n</math> is a discrete-time sequence, the spectral density is periodic in the frequency ___domain. For this reason, the ___domain of the function <math> S </math> is usually restricted to <math> (-\pi, \pi] </math> (note that the interval is open at one end).
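
The discrete-time relation can be illustrated numerically. The following minimal sketch (illustrative only; it relies on NumPy, and the process parameters, sample length and maximum lag are arbitrary choices) simulates a first-order autoregressive process, forms the truncated sum <math>\tfrac{1}{2\pi}\sum_{|k|\le K} r_{xx}(k)e^{-i\omega k}</math> from the sample autocovariance, and compares it with the known spectral density of that process:

<syntaxhighlight lang="python">
# Numerical illustration of the discrete-time Wiener-Khinchin relation
#   S(w) = (1/2pi) * sum_k r_xx(k) * exp(-i w k)
# for the AR(1) process x[n] = phi*x[n-1] + e[n] (NumPy only; all constants arbitrary).
import numpy as np

rng = np.random.default_rng(0)
phi, sigma = 0.6, 1.0          # AR(1) coefficient and innovation standard deviation
N, K = 200_000, 50             # sample length and maximum lag retained

# simulate the wide-sense-stationary AR(1) process
e = rng.normal(scale=sigma, size=N)
x = np.empty(N)
x[0] = e[0] / np.sqrt(1 - phi**2)        # start with the stationary variance
for n in range(1, N):
    x[n] = phi * x[n - 1] + e[n]

# sample autocovariance r_xx(k) for lags 0..K
r = np.array([np.mean(x[:N - k] * x[k:]) for k in range(K + 1)])

# left-hand side: truncated sum (1/2pi) * sum_{|k| <= K} r(k) * exp(-i w k)
w = np.linspace(-np.pi, np.pi, 513)
k = np.arange(1, K + 1)
S_from_r = (r[0] + 2 * np.cos(np.outer(w, k)) @ r[1:]) / (2 * np.pi)

# right-hand side: known spectral density of the AR(1) process
S_true = sigma**2 / (2 * np.pi * (1 - 2 * phi * np.cos(w) + phi**2))

print("max abs deviation:", np.max(np.abs(S_from_r - S_true)))   # small for large N
</syntaxhighlight>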
 
==Application==
The theorem is useful for analyzing [[Linear time-invariant system|linear time-invariant systems]] (LTI systems) when the inputs and outputs are not square-integrable, so their Fourier transforms do not exist. A corollary is that the Fourier transform of the autocorrelation function of the output of an LTI system is equal to the product of the Fourier transform of the autocorrelation function of the input of the system times the squared magnitude of the Fourier transform of the system impulse response.<ref>
{{cite book
| title = Random signals and noise: a mathematical introduction
| author = Shlomo Engelberg
| publisher = CRC Press
| year = 2007
| isbn = 978-0-8493-7554-5
| page = 130
| url = https://books.google.com/books?id=Zl51JGnoww4C&pg=PA130
}}</ref> This works even when the Fourier transforms of the input and output signals do not exist because these signals are not square-integrable, so the system inputs and outputs cannot be directly related by the Fourier transform of the impulse response.
 
Since the Fourier transform of the autocorrelation function of a signal is the power spectrum of the signal, this corollary is equivalent to saying that the power spectrum of the output is equal to the power spectrum of the input times the energy [[transfer function]].
 
This corollary is used in the parametric method for power spectrum estimation.
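
The corollary can likewise be checked numerically. The sketch below is again illustrative only: the finite-impulse-response taps and the sizes are arbitrary, and it uses a normalized-frequency convention without the <math>1/2\pi</math> factor, which affects both sides of the comparison identically. It passes unit-variance white noise through a filter and compares the Fourier transform of the output's sample autocorrelation with the squared magnitude of the filter's frequency response:

<syntaxhighlight lang="python">
# Illustration of the LTI corollary: the power spectrum of the output of a
# linear time-invariant filter equals the input power spectrum multiplied by
# the squared magnitude of the filter's frequency response (NumPy only).
import numpy as np

rng = np.random.default_rng(1)
N, K = 400_000, 64                       # sample length and maximum lag retained
h = np.array([0.5, 1.0, -0.3, 0.2])      # arbitrary FIR impulse response

x = rng.normal(size=N)                   # unit-variance white noise: flat power spectrum equal to 1
y = np.convolve(x, h, mode="full")[:N]   # filter output

def autocorr(z, max_lag):
    """Sample autocorrelation r_zz(k) for k = 0..max_lag."""
    n = len(z)
    return np.array([np.mean(z[:n - k] * z[k:]) for k in range(max_lag + 1)])

r_y = autocorr(y, K)

# Fourier transform of the output autocorrelation (a real, even sequence)
f = np.linspace(-0.5, 0.5, 401)          # normalized frequency, cycles per sample
k = np.arange(1, K + 1)
S_y = r_y[0] + 2 * np.cos(2 * np.pi * np.outer(f, k)) @ r_y[1:]

# squared magnitude of the filter frequency response times the input power spectrum (= 1)
H = np.exp(-2j * np.pi * np.outer(f, np.arange(len(h)))) @ h
S_pred = np.abs(H) ** 2

print("max abs deviation:", np.max(np.abs(S_y - S_pred)))   # small for large N
</syntaxhighlight>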
 
==Discrepancies in terminology==
 
In many textbooks and in much of the technical literature, it is tacitly assumed that Fourier inversion of the [[autocorrelation]] function and the power spectral density is valid, and the Wiener–Khinchin theorem is stated, very simply, as if it said that the Fourier transform of the autocorrelation function was equal to the power [[spectral density]], ignoring all questions of convergence<ref>{{cite book | title = The Analysis of Time Series—An Introduction | author = C. Chatfield | edition = fourth | publisher = Chapman and Hall, London | year = 1989 | isbn=0-412-31820-2 | page = 98}}</ref> (similar to Einstein's paper<ref name="Einstein1914" />).
But the theorem (as stated here) was applied by [[Norbert Wiener]] and [[Aleksandr Khinchin]] to the sample functions (signals) of [[wide-sense-stationary random process]]es, signals whose Fourier transforms do not exist.
Wiener's contribution was to make sense of the spectral decomposition of the autocorrelation function of a sample function of a wide-sense-stationary random process even when the integrals for the Fourier transform and Fourier inversion do not make sense.
 
Further complicating the issue is that the [[discrete Fourier transform]] always exists for digital, finite-length sequences, meaning that the theorem can be blindly applied to calculate autocorrelations of numerical sequences. As mentioned earlier, the relation of this discrete sampled data to a mathematical model is often misleading, and related errors can show up as a divergence when the sequence length is modified.
 
Some authors refer to <math>r_{xx}</math> as the autocovariance function. They then proceed to normalize it by dividing by <math>r_{xx}(0)</math>, to obtain what they refer to as the autocorrelation function.
 
==References==
{{Reflist|30em}}
 
==Further reading==
*{{cite book |last1=Brockwell |first1=Peter A. |last2=Davis |first2=Richard J. |title=Introduction to Time Series and Forecasting |edition=Second |publisher=Springer-Verlag |___location=New York |year=2002 |isbn=038721657X }}
*{{cite book |last=Chatfield |first=C. |title=The Analysis of Time Series—An Introduction |edition=Fourth |publisher=Chapman and Hall |___location=London |year=1989 |isbn=0412318202 }}
*{{cite book |last=Fuller |first=Wayne |title=Introduction to Statistical Time Series |series=Wiley Series in Probability and Statistics |edition=Second |publisher=Wiley |___location=New York |year=1996 |isbn=0471552399 }}
*{{cite book |last=Wiener |first=Norbert |author-link=Norbert Wiener |year=1949 |title=Extrapolation, Interpolation, and Smoothing of Stationary Time Series: With Engineering Applications |url=https://direct.mit.edu/books/oa-monograph/4361/Extrapolation-Interpolation-and-Smoothing-of |publisher=[[MIT Press]] |isbn=9780262257190}}
*{{cite book |last=Yaglom |first=A. M. |title=An Introduction to the Theory of Stationary Random Functions |publisher=Prentice–Hall |___location=Englewood Cliffs, New Jersey |year=1962 }}
 
{{DEFAULTSORT:Wiener-Khinchin theorem}}
[[Category:Theorems in Fourier analysis]]
[[Category:Signal processing]]
[[Category:Fourier analysis]]
[[Category:Theorems in probability theory]]